So I have been looking into getting 10Gbps connections between my servers and gaming rig, with a 10Gbps backbone link for the rest of the 1Gbps devices on my network, such as gaming consoles, wireless access points, and so on. I want the 10gig because I would like to consolidate all my storage onto one or two servers and have those servers act as iSCSI targets for my VMs and services to store data on. That would also let me deduplicate data and make everything easier to compress and back up to redundant storage, then to the cloud via Backblaze. Right now, my VM server has its own SSD storage for databases and HDDs for websites and other services, backed up manually. The media server has a butt load of HDDs for storing and serving videos, music, etc.; this is where I keep most of my data and backups. And my cold storage server has 7x500GB drives for secondary storage of VMs, games, and impossible-to-re-download data. It's a big mess, really.
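For anyone curious what the iSCSI consolidation side looks like, exposing a disk on a Linux storage box as a target is only a few commands with targetcli. This is just a sketch under assumptions: the backing device `/dev/sdb` and both IQNs below are placeholders I made up, not anything from my actual setup.

```shell
# Minimal iSCSI target sketch using targetcli (Linux LIO target).
# /dev/sdb and the iqn.* names are placeholders -- substitute your own.
targetcli /backstores/block create name=vmstore dev=/dev/sdb
targetcli /iscsi create iqn.2024-01.lab.home:vmstore
targetcli /iscsi/iqn.2024-01.lab.home:vmstore/tpg1/luns create /backstores/block/vmstore
targetcli /iscsi/iqn.2024-01.lab.home:vmstore/tpg1/acls create iqn.2024-01.lab.home:vmhost
targetcli saveconfig
```

The ACL line is what lets a specific initiator (your VM host) log in; without it the target stays locked down by default.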
So I started looking at prices for network cards, cables, and switches to see what one would have to pay to get some 10gig awesomeness for my homelab. I have discovered that fiber cable costs $3 for 50ft and $7 for 100ft if you look on eBay. Awesome. I bought one of each length and am also looking to get some shorter 1–3 ft cables that have the SFP+ modules already attached on the ends (DAC cables). I will use the 100ft cable to connect my gaming rig and the 50ft to connect anything else I want across the house. Maybe a cheap 16-port switch with an SFP+ uplink for the gaming consoles?
NIC prices vary based on condition. I got two Intel X520-DA2 NICs for $100 because one of the clips that holds the SFP+ module is broken, but the cards still work. These cards usually go for $150–$200 new. I have also gotten an Intel X520-DA1, a single SFP+ port card, that I will be testing in my servers.
Now for the SFP+ modules. These are usually expensive too, but you can typically find them for around $25 on eBay as well, so I bought four of those to test transfer speeds. Still waiting on them to arrive.
Now for the switch. I have determined that getting a pure SFP+ switch is out of the question, because the price of even an 8-port model is so insane that I could do better at 1/5 the cost and have more control over the traffic. So I decided to build my own whitebox SFP+ switch. I took my old i7-920 computer, installed both of the dual-port NICs, and bridged the ports together so it acts as a switch, which I currently have connected to my Ubiquiti ES-48-Lite switch via its 10gig SFP+ backbone port. Great cheap switch, can't say enough good things about it.
I have now connected my media server to this whitebox build using the single-port NIC that I have. Can't test anything else until I get those SFP+ modules and another single-port NIC.
Running on this whitebox switch is pfSense, which is really meant to be a firewall but can act as a bridge and route traffic as well. This will add some latency to these connections, but I don't mind so long as I can get above the 1Gbps limit (~118MB/s in practice) during transfers.
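Under the hood pfSense is FreeBSD, so the bridge the GUI builds is roughly equivalent to the following from a shell. The interface names `ix0`/`ix1` are an assumption on my part (they're what FreeBSD's Intel 10GbE driver typically assigns to X520 ports); check `ifconfig` on your own box first.

```shell
# Rough FreeBSD equivalent of pfSense's GUI bridge setup.
# ix0/ix1 are typical names for Intel X520 ports; yours may differ.
ifconfig bridge0 create
ifconfig bridge0 addm ix0 addm ix1 up
ifconfig ix0 up
ifconfig ix1 up
```

A quick way to confirm you're actually getting past gigabit across the bridge is an iperf3 run between two hosts on it: `iperf3 -s` on one end, `iperf3 -c <server-ip>` on the other.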
So far the bridge between the switch and the media server works! I have also noticed a much snappier connection, with transfers between my gaming rig and the media server hitting sustained speeds of 100MB/s, whereas before I was getting 60–75MB/s. I really blame that on the crap unmanaged switches I had everywhere just so I could connect everything; all those hops were a nightmare. Having everything connected to a single managed switch has been the best upgrade I have bought for my homelab in a long, long time.
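For context on those numbers, here's the back-of-the-envelope math on link ceilings; nothing specific to my setup, just bits-to-bytes arithmetic:

```shell
# Raw line-rate ceilings, ignoring Ethernet/TCP overhead:
# divide megabits per second by 8 bits per byte.
echo "1GbE  raw ceiling: $(( 1000 / 8 )) MB/s"    # 125 MB/s
echo "10GbE raw ceiling: $(( 10000 / 8 )) MB/s"   # 1250 MB/s
# Real-world TCP throughput on 1GbE lands around 110-118 MB/s after
# framing and protocol overhead, so 100 MB/s sustained is near the wall.
```

In other words, my old 60–75MB/s transfers were leaving a lot on the table, while 100MB/s means the gigabit link itself is now the bottleneck, which is exactly why the 10gig upgrade matters.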
Well, that's all I've got for now. I will upload pictures when things start to come together! I will also post more specifics about what I have done so you can be better informed if you want to try this yourself! 😀