I have developed a strange fondness for my home lab that is wholly unexpected.  It started as a way for me to get some real stick time with some of the technologies that I work with and talk to customers about every day, and has transformed into part Zen garden, part hobby and part pseudo competition.  To my wife’s chagrin, this particular hobby appears to be just getting started!  It’s been a while since I posted an update on where things are, and there have been some exciting (to me, of course) new developments, so I wanted to share with those who are interested.  If you DON’T want to see pictures of gear, read about why I chose one motherboard over another or laugh at me while I retell stories of haunted power supplies, you should probably move on now.  I hear there’s a new Dilbert cartoon or something…

If, on the other hand, you want to know whether you can rack Cisco, HP and Dell servers in the same cabinet without having anything burst into flame…if pictures of cable management make your day…if you like to know how much power your lab is using at any time…if you want to know how cool it is to boot between three different hypervisors on demand in the lab…well, then I’ve got something for you!

Before we get to the lab, let’s talk about my baby.  Back in 2007 I purchased an HP/Voodoo Blackbird that has proven to be the best home workstation I’ve ever owned.  It was an actual HP product, but appears to be almost 100% a product that Voodoo brought over when they were acquired (yet another example of HP acquiring and then destroying something great).  The specs on the system were top-of-the-line at the time, with dual NVIDIA GTX 280 GPUs, 4GB of 1066MHz DDR2 RAM, lots and lots of USB ports and an Intel QX9770 quad-core CPU that sat under a custom-made Corsair liquid-cooling rig that was capable of letting the CPU overclock to over 4GHz with no problems.  It had two Raptor boot drives, three 1TB drives in a RAID5 as the data partition, and generally just screamed through whatever I asked it to do.  With the included TV Tuner card it served as DVR, DVD ripping station, web development workstation, VMware workstation guinea pig lab, video editor and audio streaming host.  After suffering through scratch-built problems and a nightmare couple years with the Dell XPS workstations, the Blackbird was everything I needed it to be.


Despite the internals, however, the real star of the show was the case.  Holy cow, it’s a work of art.  I’ve never seen a PC case before or since that is as functional, as easy to work on or that generates as much interest and commentary just based on appearance.  The pictures don’t do it justice.  It’s huge, solid, well-built and cools better than anything I’ve ever seen. If possible, it’ll be the last home PC case I ever buy.  Because it’s so modular I’ll just keep replacing the components!

Of course old age hits us all sooner or later.  4GB of RAM was awesome in 2007, but couldn’t handle the heavy lifting needed today.  The Raptors were super-fast, but ended up being the first component to fail, taking the RAID0 array with them.  After some other small issues, I went into the office one morning and the entire system was hard down.  Pressing the power button would make everything spin up for about 5 seconds, and then nothing.  Well shit.

If the system was new, I would have replaced one component at a time until I found the issue.  The CPU, motherboard and power supply were the most likely suspects, so it wouldn’t have been hard.  But with the age of the system, I decided to go ahead and replace everything and use some components that would put the ‘Bird back in the company it was used to from a performance standpoint.  After getting approval from the CFO (my wife) I started putting together a parts list…

Motherboard:

This one is always the touchiest for me.  You want a motherboard that supports the newest features and has all the cool gizmos, but you also need it to be rock solid forever.  External eSATA ports were a must for me, the ability to overclock (and back out gracefully if I bork it) was important, and best-case it would support SATA3 drives in an on-board RAID configuration.  There were a couple of good options out there, including the eVGA 131-GT, the ASUS Rampage III and the Gigabyte G1.Assassin, all of which are built on the Intel X58 chipset.  In the end, I went with the Gigabyte GA-X58A-UD3R because it’s got a reputation for being rock solid, and because it’s about half the price of the top-end boards while still supporting all of the features I wanted.  It doesn’t have dual NICs on-board, the SATA3 RAID chip is from Marvell (which makes it shit, I’m sure) and the placement of the PCI slots could be better, but overall I’ve been extremely happy.

Processor:

Now that the chassis is built, what kind of engine are we putting in this thing?  For the lab I’m OK with using AMD, but not in this rig.  I went straight to the top and bought the Intel Core i7-990X Extreme Edition, which uses the 32nm Gulftown core to deliver six cores at a ludicrous 3.46GHz.  Sure, it uses 130W to do it, but with 12MB of L3 cache and 6x256KB of L2 cache, there shouldn’t be ANYTHING this bad boy can’t handle.  Yes, I paid a premium.  No, I don’t care.

Memory:

The motherboard supports up to 24GB of DDR3 RAM, and even though the 24GB kit was only about $50 more than the 12GB kit, it’s stepped down to run at 1600MHz rather than 1866MHz.  I decided to stay at 12GB since I won’t have to use this machine to run virtualized ESXi hosts anymore.  I picked the Corsair Dominator GT set and have been pretty happy.  With 9-9-9-24 timings and a nifty heat sink/fan combo to keep them cool, I haven’t had any issues at all.

Cooling/Environmentals:


Like I said earlier, the original Blackbird came liquid cooled with a custom-made Corsair setup.  The cooling block had the Voodoo logo etched into it, which was cool, but it also included a double-sized radiator/pump combo, allowing it to keep the coolant closer to room temperature.  I was bummed about having to get rid of the OEM setup, but I found that, although it’s custom, the cooling block will actually fit into the new Corsair H50 brackets.  So I bought one, threw everything out except the bracket, and reused the original setup!  For $70 I figured it was cheaper than trying to find a replacement.  I used some Arctic Silver 5 Thermal Compound after cleaning the old stuff off the block and it’s been awesome.  Currently sitting at a toasty 37°C.

Power Supply:

I hadn’t actually planned on replacing the power supply.  I put an ammeter on the original 1100W version that came with the Blackbird, and couldn’t find any issues with it.  With that in mind I got everything re-cabled, got it all plugged in and then banged my head on the desk when I pushed the power button and everything spun up.  For 5 seconds.  And then powered off.  For the love of Pete.  As a replacement I was going to use the Corsair Professional Series Gold AX1200, but after doing the math I realized I wasn’t ever going to need that much unless I went out and got three PCI-e video cards (and big ones at that). I decided to step down a couple notches and picked the Corsair Professional Series Gold AX850 instead. The thing that sucked the most was that the new power supply required all new cables, and that meant having to open the right-side panel and rip out all of my meticulously anal cabling.  Bah.  Once complete, everything powered right up and ran beautifully!
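For the curious, "doing the math" here was just a back-of-the-envelope power budget. A minimal sketch of that kind of calculation, where every wattage is a ballpark TDP-style figure I'm assuming for illustration rather than a measured value:

```python
# Rough power-budget sanity check for the rebuilt Blackbird.
# All wattages are assumed ballpark TDP figures, not measurements.
components = {
    "i7-990X CPU (TDP)": 130,
    "X58 board + 12GB DDR3": 60,
    "2x GTX 280 GPUs": 2 * 236,   # ~236W TDP each
    "drives, fans, pump": 60,
}
total = sum(components.values())
headroom = 850 - total            # Corsair AX850 rated capacity
third_gpu_total = total + 236     # what a third big PCI-e card would add
print(f"Estimated draw: {total}W; AX850 headroom: {headroom}W")
```

Even with both GPUs pegged, the estimate lands comfortably under 850W; it's only a hypothetical third card that would push the total past the AX850's rating, which is what made the AX1200 overkill.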

Overall, I’ve been ecstatic.  While there’s a small issue where the SATA3 RAID controller steps on the PATA (parallel ATA) port, rendering the two slot-loading DVD drives useless until I get some SATA adapters, it’s been rock-solid.  The included Gigabyte overclocking tools are usable if a little clunky, and the performance is unbelievable, especially for CPU-heavy tasks.

I love Handbrake, but I love it even more now!  I am a bit of a digital pack-rat, and I’ve been storing 1:1 rips of Blu-Ray movies for years.  I wanted to compress them down into something more reasonably sized, since the raw discs can be upwards of 40GB each.  The issue was always that it would (literally) take days for Handbrake to rip, compress and encode the video with my old Blackbird.  With the new one I have been converting about 5 per day.  It’s a sight to behold when all six of those cores peg out.

I’m still trying to decide what to use long-term for the boot drives.  I’ve thought about going back to the Western Digital VelociRaptor, but spending $200 on a 450GB hard drive, no matter how fast it spins, doesn’t sit well with me. I’m probably going to save my pennies (and wait until the CFO forgets how much I’ve spent) and go with a pair of 3.5” SSDs in a RAID0 set.  I’ve had good luck with Windows 7 on top of the OCZ Vertex 2 drives, so I will probably go with those. I wish there were a 3.5” hybrid SSD option; that would be interesting to try.  I don’t need a ton of space, since it’s just the boot volume, so we’ll see.
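That throughput jump works out to roughly an order of magnitude. A quick sketch of the arithmetic, where the 5-per-day figure comes from above but the two-days-per-rip baseline and the encoded target size are assumptions for illustration:

```python
# Back-of-the-envelope encode throughput on the new rig.
old_days_per_rip = 2.0      # assumed; the post just says "days"
new_rips_per_day = 5        # figure from the post
# Rips the new box finishes in the time the old one took for ONE:
speedup = old_days_per_rip * new_rips_per_day
raw_gb, encoded_gb = 40, 8  # ~40GB raw disc; ~8GB target is assumed
saved_per_disc_gb = raw_gb - encoded_gb
print(f"~{speedup:.0f}x throughput; ~{saved_per_disc_gb}GB saved per disc")
```

At five discs a day, that's north of 150GB of disk space reclaimed daily, which is why the backlog finally started shrinking.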

So part one of the great household computer refresh is complete.  I still need to get a good 30” monitor for this beast, but the PC itself is done.  Next, it’s time to tackle the home lab, and then there are some surprises in store!


5 Responses to PCs and Home Labs and Data Centers, Oh My (Part 1)…

  1. Nice read. Almost makes me want to put together a high-end rig myself.

  2. Shaunguthrie says:

Very cool. Makes me drool and wish I wasn’t in management. When we buy our new place in Kelowna, BC, it will be wired up the wazoo and will have a few labs so I can tinker.

  3. Honestly, Mike? Don’t do it. I think the golden age of high-end kit is over, and all of the veteran groups are owned by a major manufacturer. Voodoo, Alienware, everyone is corporate now. You are better off just buying something pre-built and getting a 3-year warranty. In 2007, the Blackbird was my way of getting out of the home-built game, even though it seems to have led me right back again…

Well, I’ve been all Mac at home for several years, and while I buy a top-end iMac every year or so, and trickle down mine to the CFO, then hers to a family member, I’ve had this itch to build a computer lately. Probably partly due to the Maximum PC subscription I’ve maintained.
    Now, though, having seen your next post about your home lab, I think I may be able to kill two birds with one stone. Scratch the building itch and also put together a home lab.

  5. The home lab (and the rest which will get posted this week, hopefully) has been wonderful. I’m always finding myself talking to a customer or a co-worker and saying “I wonder what would happen if…we decided to try a vCD implementation with only external networks” or “Can we present resources from multiple clusters to the same vCD?” Being able to then log into my view desktop and build what I want to see, without having to clear it with anyone or worry about messing anything up is AWESOME. If I had to pick the gaming rig or the lab, I’d keep the lab in a heartbeat. Luckily my CFO is nice to me!