Category Archives: Hardware

Update: FCoE

By | Blog, Computers, Hardware, Homelab | No Comments

So I have done some more work on the Fibre Channel over Ethernet whitebox switch build! I got enough cheap cards to test the idea, and man, it is wonderful and a piece of crap at the same time. I'm pretty sure that's because I'm testing this on some really old hardware, but problems just keep cropping up. For example, if I reboot the machine, the LAN bridge resets itself to the default 192.168.x.x addressing scheme for some reason. Then the SFP+ ports will randomly stop working, even though the same links work fine off the two SFP+ ports on my gigabit switch (which shows why whitebox builds are a risky idea: sometimes they just don't work).

So what does this mean? I will still build the box, test it out, and giggle a little, but eventually I will just turn it into my next-generation Hyper-V environment. I will run my router on it as a VM, have it be the 10 Gbps connection to my switch, and then save up to buy a small SFP+ switch to connect my SAN and servers together. Right now I just have my gaming/rendering box and my media server connected to the two SFP+ ports on my switch, so I get that 10 Gbps goodness for editing pictures and videos on the fly.

Would I recommend this project? No, not unless you really want to play around with what such a build can do. It was nice when it worked: I got up to five computers hitting 200 MBps between them! But it got to the point where I was rebooting the box twice a day just to keep it going, because it kept running out of memory during transfers. Not worth it, since I needed that box to stay up and provide services like WWW and VPN to the outside world. I will make a video about it eventually, but only once the new build is up and running. Till then! :)

Fibre Channel over Ethernet


So I have been looking into getting 10 Gbps connections between my servers and gaming rig, with a 10 Gbps backbone link feeding the rest of the 1 Gbps links on my network, such as gaming consoles, wireless access points, and so on. I want the 10 gig because I would like to consolidate all my storage onto one or two servers and have those servers be iSCSI targets for my VMs and services to store data on. That would also let me deduplicate data and make everything easier to compress and back up to redundant storage, then to the cloud via Backblaze. Right now my VM server has its own SSD storage for databases and HDDs for websites and other services, backed up manually. The media server has a butt load of HDDs for storing and serving videos, music, and so on; this is where I keep most of my data and backups. And my cold storage server has 7x500GB drives for secondary storage of VMs, games, and impossible-to-re-download data. It's a big mess, really.
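On the dedup idea: before consolidating everything, a quick way to see how much duplicate data is scattered across the boxes is to group files by content hash. Here is a minimal Python sketch of that; real block-level dedup would happen in the filesystem or backup tool, not a script like this, and the paths are just placeholders:

```python
import hashlib
import os

def sha256_of(path, chunk_size=1 << 20):
    """Hash a file in 1 MiB chunks so big media files never load fully into RAM."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            h.update(chunk)
    return h.hexdigest()

def find_duplicates(root):
    """Walk `root`, group files by content hash, and return only the groups
    with more than one member (i.e. the actual duplicates)."""
    by_hash = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            by_hash.setdefault(sha256_of(path), []).append(path)
    return {h: paths for h, paths in by_hash.items() if len(paths) > 1}
```

Pointing `find_duplicates` at a media share gives a rough idea of how much space consolidation would actually reclaim.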

So I started looking at prices for network cards, cabling, and switches to see what one would have to pay for some 10 gig awesomeness in a homelab. It turns out fiber cable costs about $3 for 50 ft and $7 for 100 ft if you look on eBay. Awesome. I bought one of each length, and I am also looking for some shorter 1 - 3 ft cables that come with the SFP+ modules already attached on the ends. I will use the 100 ft cable to connect my gaming rig and the 50 ft to connect anything else I want across the house. Maybe a cheap 16-port switch with an SFP+ uplink for the gaming consoles?

NIC prices vary based on condition. I got two Intel X520-DA2 NICs for $100 because one of the clips that holds the SFP+ module in place is broken, but the cards still work; these usually go for $150 - $200 new. I have also gotten an Intel X520-DA1, a single-SFP+-port card that I will be testing in my servers.

Now for the SFP+ modules. These are usually expensive too, but you can find them for around $25 on eBay, so I have gotten four of those to test transfer speeds. Still waiting on them to arrive.

Now for the switch. I have determined that a pure SFP+-only switch is out of the question: the price of even an 8-port model is so insane that I could do better at a fifth of the cost and have finer control over the traffic. So I decided to build my own whitebox SFP+ switch. I took my old i7-920 computer, installed both of the two-port NICs, and bridged the ports together so the box acts as a switch, which I currently have connected to my Ubiquiti ES-48-US Lite switch and its 10 gig SFP+ backbone port. Great cheap switch; can't say enough good things about it.

I have connected my media server to this whitebox build using the single-port NIC that I have. I can't test anything else until I get those SFP+ modules and another single-port NIC.

Running on this whitebox switch is pfSense, which is really meant to be a firewall but can act as a bridge and route traffic as well. This will add some latency to these connections, but I don't mind, so long as I can get above the 1 Gbps limit (about 100 MBps) during transfers.
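That 100 MBps ceiling is just line-rate arithmetic: 1 Gbps is 10^9 bits per second, which works out to 125 MB/s before framing overhead eats a few percent. A quick back-of-the-envelope in Python, where the 94% efficiency factor is only a rough assumption for TCP over standard 1500-byte Ethernet frames:

```python
def line_rate_mb_per_s(gbps, efficiency=0.94):
    """Convert a link's line rate in Gbps to approximate usable MB/s.

    `efficiency` is a rough allowance for Ethernet/IP/TCP framing
    overhead (about 6% with 1500-byte frames); tune it for your setup.
    """
    bits_per_second = gbps * 1_000_000_000
    return bits_per_second / 8 / 1_000_000 * efficiency

print(round(line_rate_mb_per_s(1)))   # ~118 MB/s: why 1 Gbps tops out near 100 MBps
print(round(line_rate_mb_per_s(10)))  # ~1175 MB/s: the headroom 10 Gbps buys
```

In practice, disk speed and protocol chatter usually shave the real numbers down further, which is why sustained gigabit transfers land closer to 100 MBps than 118.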

So far, the bridge between the switch and the media server works! I have also noticed much snappier connections and transfers between my gaming rig and the media server, with sustained speeds of 100 MBps where before I was getting 60 - 75 MBps. I really blame that on the crap unmanaged switches I had scattered everywhere just to connect everything; all those hops were a nightmare. Having everything connected to a single managed switch has been the best upgrade I have bought for my homelab in a long, long time.
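If you want to sanity-check transfer speeds yourself, iperf is the real tool for the job, but the idea is simple enough to sketch in Python. This toy probe pushes data through a TCP socket and times it; here it runs over loopback, and on a real network you would run the receiver half on the far host and point the sender at its address:

```python
import socket
import threading
import time

def measure_throughput(payload_mb=64):
    """Push `payload_mb` MiB through a loopback TCP socket, return MB/s."""
    server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    server.bind(("127.0.0.1", 0))        # port 0: let the OS pick a free port
    server.listen(1)
    port = server.getsockname()[1]

    received = []

    def receiver():
        # Drain the connection until the sender closes it, counting bytes.
        conn, _ = server.accept()
        total = 0
        while True:
            chunk = conn.recv(1 << 16)
            if not chunk:
                break
            total += len(chunk)
        conn.close()
        received.append(total)

    t = threading.Thread(target=receiver)
    t.start()

    payload = b"\x00" * (1 << 20)        # send in 1 MiB blocks
    sender = socket.create_connection(("127.0.0.1", port))
    start = time.perf_counter()
    for _ in range(payload_mb):
        sender.sendall(payload)
    sender.close()
    t.join()
    elapsed = time.perf_counter() - start
    server.close()
    return received[0] / elapsed / 1_000_000
```

Over loopback this mostly measures CPU and memory speed, which is exactly why a real test needs two machines on opposite ends of the link.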

Well, that's all I've got for now. I will upload pictures when things start to come together! I will also post more specifics about what I have done so you can be better informed if you want to try this yourself! 😀

ASRock C2750D4I Update



Many of you will remember this board I posted about a year ago, and that I wanted to grab one to test in my environment to see if it could replace my hoss daddies that suck up power. Well, I finally got one... about five months ago. I meant to write something about it, but I have been busy. As always.

So what have I been using the board for? Well, I tried out FreeNAS on it first with the handful of 500 GB HDDs I had lying around doing nothing. It works great, but since I don't have any ECC memory or bigger drives that would make that setup more useful, I decided to put it to work as my virtualization host, since I want to cut down power consumption in my homelab. I backed up the VMs from my old AMD FX-8320 build (350 watts, yikes!) and set them back up on the ASRock board. On my Kill A Watt I mostly see only 30 to 40 watts being pulled at around 10% load; the highest I have seen is 42 watts at 95% load. So far I am running my external and internal web servers, DNS server, Minecraft server, MySQL server, Cassandra server, MSSQL server, task scheduler server, a render farm box, a development test server, the SpiderBot scraper, and a reverse proxy server. I have many more VMs I plan to set up soon so I can turn off some more computers, but I just don't have the equipment yet.
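For the curious, the savings from retiring the 350-watt box in favor of this one are easy to work out. The $0.12/kWh rate below is just an assumed average; plug in your own utility's price:

```python
def annual_cost_usd(watts, usd_per_kwh=0.12):
    """Yearly electricity cost of a box drawing `watts` continuously, 24/7.

    $0.12/kWh is an assumed average rate; substitute your own.
    """
    kwh_per_year = watts * 24 * 365 / 1000
    return kwh_per_year * usd_per_kwh

old = annual_cost_usd(350)   # the AMD FX-8320 build
new = annual_cost_usd(40)    # the C2750D4I at typical load
print(f"saves about ${old - new:.0f}/year")   # prints "saves about $326/year"
```

At roughly $326 a year in savings, a deal-priced board pays for itself in about a year.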

While this has worked great for me, it is not my final use for the board. I plan to move it back into the SAN once I get something better for the virtualization host, because I plan to virtualize the pfSense box that routes my network (and its 10 GbE fiber backbone). While I could do that now and put my X520-DA2 fiber card on this board, I wouldn't be able to add anything else PCIe-wise, because it has only one PCIe slot. And I would really like to add another gigabit network card to act as the WAN port for the virtual pfSense box, with the fiber card as the LAN port, so that the two onboard 1 GbE ports could be teamed together. That just is not possible with this board, so I will be looking at other low-power hardware to do it.

All in all, a really great board! I really want to get another one and put it into a Silverstone DS380 NAS case as a tiny network rendering box for my Let's Plays. But at $400 a pop, that won't happen until far into the future. Maybe I can score another deal on eBay like the $300 I paid for my current one. Probably not, though.