Update: FCoE


So I have done some more work on the Fibre Channel over Ethernet whitebox switch build! I got enough cheap cards to test the idea, and man, it is wonderful and a piece of crap at the same time. I'm pretty sure it's because I'm testing this on some really old hardware, but problems just keep cropping up. For example, if I reboot the machine, the LAN bridge defaults to 10.10.10.1 instead of the 192.168 scheme for some reason. Then the SFP+ ports will randomly stop working, even though the same modules work just fine in the two SFP+ ports on my gigabit switch (which shows why whitebox builds are a bad idea: sometimes they just don't work).

So what does this mean? I will still build the box, test it out, and giggle a little, but eventually I will turn it into my next-gen Hyper-V environment. I will run my router on it as a VM, have it be the 10 Gbps connection to my switch, and save up to buy a small SFP+ switch to connect my SAN and servers together. Right now I just have my gaming/rendering box and my media server connected to the two SFP+ ports on my switch, so I can have that 10 Gbps goodness for editing pictures and videos on the fly.

Would I recommend this project to people? No. Unless you really want to play around with what such a build can do, I would not. It was nice when it worked; I got up to five computers with 200 MB/s speeds between them! But it got to the point where I was rebooting that box twice a day just to keep it working, because it kept running out of memory during transfers. Not worth it, since I needed that box to stay up and provide services to the outside world, such as web and VPN. I will make a video about it eventually, but only once I get the new build up and running. Till then! :)

Fibre Channel over Ethernet


So I have been looking into getting 10 Gbps connections between my servers and gaming rig, with a 10 Gbps backbone link feeding the rest of the 1 Gbps links on my network, such as gaming consoles and wireless access points. I want the 10 gig because I would like to consolidate all my storage onto one or two servers and have those servers be iSCSI targets for my VMs and services to store data on. That would also let me deduplicate data and make everything easier to compress and back up to redundant storage, then to the cloud via Backblaze. Right now my VM server has its own SSD storage for databases and HDDs for websites and other services, backed up manually. The media server has a butt load of HDDs for storing and serving up videos, music, and so on; this is where I keep most of my data and backups. And my cold storage server has 7x 500 GB drives for secondary storage of VMs, games, and impossible-to-re-download data. It's a big mess, really.

So I started looking at prices for network cards, cabling, and switches to see what one would have to pay to get some 10 gig awesomeness in the homelab. I discovered that fiber cable costs $3 for 50 ft and $7 for 100 ft if you look on eBay. Awesome. I bought one of each length and am looking to get some shorter 1–3 ft cables as well, which have the SFP+ modules already attached on the ends. I will use the 100 ft cable to connect my gaming rig and the 50 ft to connect anything else I want across the house. Maybe a cheap 16-port switch with an SFP+ uplink for the gaming consoles?

NIC prices vary based on the condition of the card. I got two Intel X520-DA2 NICs for $100 because one of the clips that holds the SFP+ module in is broken, but the cards still work. These cards usually go for $150–$200 new. I also got an Intel X520-DA1, a single-SFP+-port card, which I will be testing in my servers.

Now for the SFP+ modules. These are usually expensive too, but you can find them for around $25 on eBay as well, so I got four of those to test transfer speeds. Still waiting on them to arrive.

Now for the switch. I determined that getting a pure SFP+-only switch is insane; the price of an 8-port unit is so high that I could do better at 1/5 the cost and have more control over the traffic. So I decided to make my own whitebox SFP+ switch. I took my old i7-920 computer, put both of the two-port NICs in it, and bridged the ports together to act as a switch, which I currently have connected to my Ubiquiti ES-48-Lite switch, which has a 10 gig SFP+ backbone port. Great cheap switch; can't say enough good things about it.

Now I have connected my media server to this whitebox build using the single-port NIC that I have. Can't test anything else until I get those SFP+ modules and another single-port NIC.

Running on this whitebox switch is pfSense, which is really meant to be a firewall but can act as a bridge and route traffic as well. This will add some latency to these connections, but I don't mind, so long as I can get above the 1 Gbps limit (roughly 100 MB/s) during transfers.
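For anyone curious what the bridging actually looks like under the hood: pfSense sets bridges up through its web GUI (Interfaces > Assignments > Bridges), but since pfSense is FreeBSD underneath, it boils down to an if_bridge device. Here is a rough sketch of the equivalent shell commands, assuming the X520 ports show up as ix0 through ix3 (the ixgbe driver's naming); your interface names may differ:

# Create a software bridge and add all four 10 Gbps ports as members.
# Each member interface also needs to be up for frames to forward.
ifconfig bridge0 create
ifconfig bridge0 addm ix0 addm ix1 addm ix2 addm ix3 up

In pfSense itself you would do this from the GUI so the config persists; shell changes like the above are lost on reboot.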

So far the bridge between the switch and the media server works! I have also noticed much snappier connections and transfers between my gaming rig and the media server, such as sustained speeds of 100 MB/s, whereas before I was getting 60–75 MB/s. I really blame that on the crap unmanaged switches I had everywhere just to connect everything; all those hops were a nightmare. Having everything connected to a single managed switch has been the best upgrade I have bought for my homelab in a long, long time.

Well, that's all I have for now. I will upload pictures when things start to come together! I will also post more specifics about what I have done, so you can be better informed if you want to try this yourself! 😀

Compressing and Backing Up Steam Games


So lately I have been trying to build a system where I can store Steam games locally in my data center, so my gaming rig can have more free space for recording Let's Plays and editing the extreme amount of doggy pictures that I now have. I have learned a lot about how Steam stores games and what I can do to move and compress them.

The first thing I learned was that Steam stores games at

C:\Program Files (x86)\Steam\steamapps\common\

by default. In my setup, however, I have it stored on a storage pool with duplication turned on, so I don't lose any data (games, pictures, or otherwise) to a single drive failure. I will write more about this later in a different post.

So let's say I have my Steam games stored in

D:\Games\Steam\steamapps\common\

I need to know this so I can put a .bat file there that I wrote to compress the games and store them on my server. I use the command-line version of 7-Zip to compress each individual folder under this directory, then manually move the compressed games. I do it manually because I don't want 7-Zip taking up all my computing power while I'm trying to do other things, and the server I store them on would take forever and a half to get any of it done. In the future I plan to have an automated transfer of this folder to a crunch box that can handle the compression and automate the moving and backing up, but for now this will have to do.

So here is my one-liner, which I have saved in the same directory as the game folders:

for /d %%X in (*) do "c:\Program Files\7-Zip\7z.exe" a -t7z -m0=lzma2 -mx9 -aoa "%%X.7z" "%%X\"

Save this line to a .bat file such as "compress.bat". It will loop through each folder in the directory it lives in and compress each one into its own .7z archive. Be sure to check that 7z.exe is in the same location as mine, or change the path to match your install.
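Since I currently move the finished archives to the server by hand, here is a hypothetical one-liner for that step too; the UNC path is just a placeholder, so point it at wherever your own backup share lives:

rem Move every finished .7z archive to the backup share (path is an example).
robocopy "D:\Games\Steam\steamapps\common" "\\freenas\backups\steam" *.7z /MOV

The /MOV flag makes robocopy delete the source files after copying, which keeps the games directory from filling up with archives.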

Now for the size savings: I have been seeing most games compress to between 1/3 and 3/4 of their original size, which helps me out a lot, because the server I am currently storing these backups on is an old dual-core machine running FreeNAS with 7x 500 GB drives that are older than God himself. I am not worried about losing this data, because I can re-download it if needed, but I would like to avoid that if possible; I abuse my internet connection way too much as it is.
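If I ever do need one of these games back, extraction is just the reverse. Here is a sketch of that restore step, assuming the archives were built with the compress one-liner above (so each archive contains the game's folder) and that your library lives at the same path as mine:

rem Unpack every archive back into the Steam library, keeping folder paths.
for %%X in (*.7z) do "c:\Program Files\7-Zip\7z.exe" x "%%X" -o"D:\Games\Steam\steamapps\common" -y

After extracting, Steam should pick the game back up once you tell it to install to that library folder and it verifies the existing files.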

So let's see the storage difference between the uncompressed games on my gaming rig and the compressed games on my FreeNAS box. On my gaming rig, checking the Size on disk property, I see this:

1.04 TB (1,152,646,602,752 bytes)

So all my currently installed games take up 1.04 terabytes of storage space. Quite a bit. Now let's see how much it is when compressed:

503 GB (540,267,249,664 bytes)

That is less than HALF the space the uncompressed games take up. This lets me save a lot of room on my cold storage FreeNAS server, which only has 3,255 GB of storage space (actually more like 2 TB usable after ZFS does RAIDZ2 to it). This also frees up enough space to use that server for VM backups as well.

Thank you for reading! Let me know if you think there is a better way, or if you would like to know more! I am still working on a way to automate this process, either monthly or whenever a folder changes (in case Steam downloads updates to games).
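For the monthly option, the built-in Windows Task Scheduler could already handle it; something like this (the task name, day, and time are just placeholders) would run the compress script on the first of every month at 3 AM:

rem Schedule compress.bat to run monthly (name, path, and time are examples).
schtasks /Create /TN "SteamGameBackup" /TR "D:\Games\Steam\steamapps\common\compress.bat" /SC MONTHLY /D 1 /ST 03:00

The "when a folder changes" trigger is trickier, since it would take something watching the directory, which is part of why I am still working on it.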