UPDATED Feb 22nd 2011, 9:40am – note that this build uses a MB that has a Realtek NIC (or more accurately MAC/PHY), while in my build notes I purchased/repurposed Intel Pro GT/PT NICs from older whiteboxes. See the NIC note further down for the exact error you'll hit and the two easy fixes.
So, as I’ve said before, when things get political and all messy, I like going into the lab and spending a solid day just playing with tech. It’s how I get back into a zen state. That home lab powers a ton of VMs, and lets me play, learn and stay fresh – not just on EMC stuff (using Virtual Appliances) and VMware itself, but on stuff from a lot of folks across the industry. The home lab has 5 “mainstream” ESX hosts – and some other loosey-goosey ones I use for various purposes.
Staying technical… Ooommm.
So – I’ve been playing with the latest ESXi 4.1 bits, and also with “future VMware software”, and noticed that my old whiteboxes – one of the most popular blogs I’ve done had the “parts list” (basically Intel Q6600, Intel Q/P35 chipsets, 8GB of DDR2 memory, boot from USB, Intel PCI PRO/1000 GT and PCIe PT NICs) – were starting to have some problems.
Specifically, with future VMware software, there were various UEFI/chipset issues – but I can’t blame VMware, we’re talking about very old, very non-HCL hardware.
So – I did a refresh – and was stunned by how much you can get for how little. With the disclaimer that I can’t vouch for anything other than my own experience: if you want the shopping list and the results, read on….
So – I wasn’t running into too many CPU issues with the Q6600, and going to Nehalem or Sandybridge would have upped the cost substantially. What I was running into were forward-compatibility issues – and you can always use more RAM.
I do plan on doing a Sandybridge-based home lab build in a bit (when Intel fixes the P67 chipset issues), and will post those findings too.
So, without further ado – here are the details of my shopping list. All prices are in CDN dollars, and I bought everything at www.Newegg.ca.
Step 1) I was looking for a new motherboard with LGA775 support (to reuse my existing CPUs), support for DDR3 and 16GB max memory configs, all at the lowest cost. Ideally, it would be ATX (lots of PCIe/PCI slots for NICs). I settled on the MSI P43-C51 LGA 775 Intel P43 ATX Intel Motherboard – for $84.99:
- Low cost
- 16GB max mem capacity
- Support for my existing Q6600 CPUs
- Right form factor (ATX, lots of space, 4 DIMM slots)
- The one drawback: no onboard video – necessitating an additional VGA card.
Step 2) It’s a bummer that there are so few low-cost ATX MBs with onboard VGA. If you go micro-ATX, they are a dime a dozen (but you get fewer IO slots), and at the other end of the extreme, the Sandybridge H67 boards can use the CPU’s onboard video, but come at a price point that doesn’t meet my “el cheapo” target.
Since the MB doesn’t have video, I threw in a cheap $29 VGA card (MSI N8400GS-D256H GeForce 8400 GS 256MB 64-bit DDR2 PCI Express 2.0 x16 HDCP Ready Low Profile Ready Video Card). The only key things with an el-cheapo VGA card (since you really don’t need anything fancy for this function) are to make sure it’s low profile, and that the heatsink/fan doesn’t stop you from putting a PCIe/PCI card into the adjacent slot. Personally, I wanted something that used passive cooling as well (just a little less noise).
Step 3) The BIG upside of the new MB is that you can use 4GB DDR3 DIMMs. It’s crazy, but I picked up 16GB (a 2 x 4GB kit for $94.99, x 2) for less than $200. Think about that for a second. Man. It wasn’t a year ago that 6GB was “x-treme!”. I picked the Patriot Signature 8GB (2 x 4GB) 240-Pin DDR3 SDRAM DDR3 1600 (PC3 12800) Desktop Memory with heatshield. I went with DDR3 1600 because while it is way faster than my older Q6600 procs can use, eventually when I jump the whole set of systems up to Sandybridge, the RAM will still be fast enough. I might be crazy, but I tend towards names like Kingston, Corsair, Crucial and Patriot rather than the rock-bottom prices. Advice: run memtest on your systems before you go too far into the build. I have a little USB utility flash thumbdrive I boot the systems from, and run a few tests after the physical build. Memory issues will make you pull your hair out later, so check up front.
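If you want to script that thumbdrive prep, here’s a minimal sketch – assuming a Linux workstation, a bootable raw memtest86+ USB image (the filename below is a placeholder), and that you’ve double-checked which /dev/sdX is actually the USB stick, because this overwrites it:

```python
#!/usr/bin/env python
# Minimal sketch: raw-write a bootable memtest86+ image to a USB stick.
# ASSUMPTIONS: Linux workstation, IMAGE is a bootable raw .img you downloaded,
# and DEVICE is the USB stick's block device (this WIPES it - check twice!).
import os
import sys

IMAGE = "memtest86+-usb.img"   # placeholder filename - use your actual image
DEVICE = "/dev/sdX"            # replace with your USB stick, e.g. /dev/sdc

CHUNK = 1024 * 1024  # copy in 1 MiB chunks, dd-style

def main():
    if "sdX" in DEVICE:
        sys.exit("Edit DEVICE to point at your real USB stick first.")
    with open(IMAGE, "rb") as src, open(DEVICE, "wb") as dst:
        while True:
            buf = src.read(CHUNK)
            if not buf:
                break
            dst.write(buf)
        dst.flush()
        os.fsync(dst.fileno())  # make sure everything hits the stick
    print("Done - boot the box from the stick and let a few passes run.")

if __name__ == "__main__":
    main()
```

Boot each freshly-built box from the stick and let it run a few full passes before you move on to the ESXi install.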
I reused my previous heatsinks, HDDs and CD-R/DVD-R drives. The latter two are insanely low-cost these days – you can get a 1TB HDD for around $50, and a Lite-On CD/DVD-R for $15.
For a case, this may seem a bit of overkill (but in the immortal words of Colonel John “Hannibal” Smith from the A-Team, “overkill is underrated”), but I use rack-mount cases even in the home lab. You can get surprisingly good rackmount cases pretty darn cheap. I picked the iStarUSA D-300-PFS Black Steel 3U Rackmount Server Case for $89.99. They are pretty solid. A little bit of a weird layout IMO, but not bad.
NICs… This is another item that gets VMware whiteboxes in trouble. The onboard NIC on the MB worked fine (and technically you could get by without adding any more – NOTE THE COMMENT BELOW), but I like having more interfaces on my whiteboxes. I’ve found that if you stick with Intel GT (PCI, ~$40 each) or Intel PT (PCIe, ~$80 each) you can count on them working well with ESX.
Note that this build uses a MB that has a Realtek NIC (or more accurately MAC/PHY). If vSphere boots, but can’t load any network drivers, it will report an error like this: “vmkctl.HostCtlException Unable to load module /usr/lib/vmware/vmkmod/vmfs3: Failure”. There are two easy solutions: 1) customize your vSphere install ISO to support the Realtek NIC (great instructions here); 2) just buy the Intel NICs. I **personally** recommend option 2 – not because option 1 doesn’t work (it does, I tried it), but because it means one more strange thing on an already Frankenstein whitebox :-)
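For the curious, option 1 boils down to unpacking the stock install ISO, dropping a driver-enabled oem.tgz into the image (the linked instructions cover that part), and rebuilding a bootable ISO. Here’s a rough sketch of just the rebuild step – the directory and output names are placeholders, and it assumes mkisofs/genisoimage is installed:

```python
#!/usr/bin/env python
# Rough sketch: rebuild a customized ESXi install ISO after swapping oem.tgz.
# ASSUMPTIONS: you've already extracted the stock ISO into ./esxi-cd/ and
# replaced its oem.tgz with one containing the community Realtek driver.
# Requires mkisofs (or genisoimage) on the workstation.
import subprocess

SRC_DIR = "esxi-cd"           # extracted contents of the stock install ISO
OUT_ISO = "esxi-custom.iso"   # the customized, bootable result

# Standard El Torito options so the rebuilt ISO boots like the original.
cmd = [
    "mkisofs",
    "-relaxed-filenames", "-J", "-R",
    "-o", OUT_ISO,
    "-b", "isolinux.bin",      # the boot loader on the ESXi install CD
    "-c", "boot.cat",
    "-no-emul-boot",
    "-boot-load-size", "4",
    "-boot-info-table",
    SRC_DIR,
]
subprocess.check_call(cmd)
print("Wrote %s - burn it and install as usual." % OUT_ISO)
```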
So, tallying it all up:
- Motherboard: $84.99: https://www.newegg.ca/Product/Product.aspx?Item=N82E16813130252
- Video Card: $29.99: https://www.newegg.ca/Product/Product.aspx?Item=N82E16814127473
- 16GB Memory (two 2 x 4GB kits): $94.99 x 2: https://www.newegg.ca/Product/Product.aspx?Item=N82E16820220570
- Case: $89.99 https://www.newegg.ca/Product/Product.aspx?Item=N82E16811165083
- I re-used my power supplies, but if you didn’t have one, you can get a decent power supply for about $70. Don’t scrimp on power supplies. You don’t actually need huge wattage for this use case, so buy a better 500W PS rather than a cheap 700W or 800W PS.
- I had the USB flash drive (for a USB-based ESXi install), HDD (for an HDD-based install) and the CD/DVD-R – but if you didn’t, you would need to spend another ~$80.
- I re-used my old Intel Q6600 CPUs, but if you didn’t have CPUs to reuse, the key with this config is that you would need the older LGA775 socket type (remember, I picked the MB to house my older CPUs). You can’t get the Q6600 anymore, but a Wolfdale-series Intel proc should work. Here’s one as an example (a dual core for $89.00): https://www.newegg.ca/Product/Product.aspx?Item=N82E16819116093. It’s worth pointing out two things: 1) MAKE SURE the proc supports Intel VT; 2) you might be able to find another CPU/MB pairing that gives you more juice for less $$, but in my experience the key to a happy ESX whitebox is the chipset (HDD controller, BIOS, onboard network), so deviate at your own risk.
Grand total for a great rack-mount quad-core ESXi host with 16GB of RAM? $632.96 CDN.
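If you’re checking the math, here’s the tally as I read it – the new-parts subtotal is exact from the list above, and the reused-parts figures are the approximate replacement prices mentioned in the notes, which is why the full-build estimate lands within a dollar of the quoted grand total:

```python
# Back-of-envelope check on the build cost (all CDN$).
# New purchases are exact prices from the list above; the reused items use
# the approximate replacement prices mentioned in the notes.
new_parts = {
    "motherboard": 84.99,
    "video card": 29.99,
    "memory (2 x 8GB kits)": 94.99 * 2,
    "case": 89.99,
}
reused_estimates = {
    "power supply (~)": 70.00,
    "USB stick / HDD / DVD-R (~)": 80.00,
    "LGA775 CPU (~)": 89.00,
}
subtotal = sum(new_parts.values())
estimate = subtotal + sum(reused_estimates.values())
print("New parts per host:  $%.2f" % subtotal)   # $394.95
print("Full-build estimate: $%.2f" % estimate)   # $633.95, vs. $632.96 quoted
```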
I ordered several of these (5 in total) to update the home-lab completely, and here’s the kit all at home. It’s like Christmas :-)
So – after assembling and testing, I’ve found this config to work excellently not only with the current vSphere 4.1 ESXi (both before and after Update 1), but also with future vSphere releases. For me, that latter part is important, as I know I won’t need to update the homelab again for a while.
For shared storage, you have a ton of options. You can go Iomega IX4, you can homebrew an Openfiler box, or you can use VSAs like the EMC Celerra UBER VSA. There is a great updated “Using the EMC Celerra UBER VSA” guide here, and a demo video series here.
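Once your VSA (or Openfiler box) is exporting NFS, attaching it to each host is basically a one-liner per box. Here’s a sketch that loops it across the lab from a workstation – the IP, export path, datastore label and host names below are all placeholders, and it assumes you’ve enabled SSH/Tech Support Mode so the classic esxcfg-nas command is reachable on each host:

```python
#!/usr/bin/env python
# Sketch: mount the same NFS export on every lab host as a shared datastore.
# ASSUMPTIONS: SSH (Tech Support Mode) is enabled on the hosts, and the NFS
# server IP, export path, label, and host list below are placeholders.
import subprocess

NFS_HOST = "192.168.1.50"      # your VSA / Openfiler box
NFS_SHARE = "/nfs/datastore1"  # the export it publishes
LABEL = "uberVSA"              # datastore name as vSphere will show it
ESX_HOSTS = ["esx1", "esx2", "esx3", "esx4", "esx5"]  # the 5 whiteboxes

for host in ESX_HOSTS:
    # esxcfg-nas -a adds an NFS datastore: -o = server, -s = share, then label.
    cmd = ["ssh", "root@%s" % host,
           "esxcfg-nas", "-a", "-o", NFS_HOST, "-s", NFS_SHARE, LABEL]
    print("Mounting %s:%s on %s..." % (NFS_HOST, NFS_SHARE, host))
    subprocess.check_call(cmd)
```

With all 5 hosts pointed at the same datastore, you’ve got the shared storage you need to play with vMotion, HA and DRS.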
Enjoy – and remember, no self-respecting VMware fanboi should go without a home vSphere lab :-)