25 December 2015

Trip Down Memory Lane

I got into building my own computers when Nvidia announced GPGPU support on their G80 GPU (think GeForce 8800 GTS/GTX/Ultra). Good times in 2008.

My one-upping started after this fellow rubbed his Core i7-920 in my Core 2 Quad Q9550's face. Game on! But really, I didn't have the funds or the knowledge to beat that.

Then a good friend Bayley showed up at MIT, and things got serious reasonably fast. In particular, my upgrade path looked something like this:
  • Core 2 Quad Q9550 4-core/8GiB DDR2
  • Some Intel Xeon W35xx 4-core/24GiB DDR3
  • 2x Intel Xeon X5650 6-core/24GiB DDR3
  • 4x AMD Opteron 6164 HE 12-core/128GiB DDR3 ECC
Then Bayley and I both realized that we had to switch to rackable computers for any sort of ease in managing hardware. I took a break from single-image systems and built half a blade server:
  • 2x Intel Xeon L5320 4-core/8GiB DDR2 FB-DIMM
While Bayley got a Sun Box®:
  • 8x AMD Opteron ??? 4-core/128GiB DDR2 ECC
In a valiant attempt to dethrone me, he patched the Sun's BIOS with the microcode for hex-core CPUs, but the computer only recognized one core of one processor of the eight that he put in. Much wow.

My AMD box cemented 1st place in sheer awesomeness for student-owned computers for a little over a year, until Bayley discovered mainframes on eBay:
  • Some IBM thing with 16x Intel Xeon E75xx 6-core/256GiB DDR2 ECC
Mind you, I have cemented awesomeness density, since my AMD box fits perfectly in 1U, while Bayley's monstrosity needs 16U, and a whole lot more power. :-) Not to mention that it tripped a circuit breaker while he was trying to install Ubuntu…

Leave it to me, my worst enemy, to dethrone myself. I wanted to repackage the AMD box into a legitimate 1U chassis that I found on eBay for not too much money, since it had sat on a piece of cardboard (for insulation!) atop the dorm dresser. After the computer happily booted in the chassis, I jiggled one of the unscrewed heatsinks and shorted the motherboard. A nice spark flew from a voltage regulator to the copper heatsink and poof!

gg no re Bayley


There's also a GPU story, which I shall not neglect. I went to MIT with my trusty aforementioned 8800GTS 320MB. A few swapfests later, Bayley wound up with a quartet of GTX 460s. The quartet was mainly used for benchmarking Holy Balls, a demanding multi-GPU raytracer, and got really toasty after a while. Good old Fermi.

The cryptocurrency scene started gaining traction with the advent of profitable scrypt-coin mining. AMD cards were far superior in hashes/watt and competent in hashes/sec. Bayley got his hands on a few AMD 7850s, the preferred mining card, among other cards. I went extra long and got 11 cards from various sources (Newegg, eBay, etc.), since retailers had caught on to the mining demand. Luckily I was able to dump my cards at cost (after eBay commission) after Mt. Gox blew up and before everyone else wanted to dump their cards. Bayley kept mining, and even upgraded to 7950s as well as a 6990 when prices decreased and good deals appeared.

But wait, I'm back! I had the privilege of acquiring a new Mac Pro (affectionately known as the "trash can") with dual AMD D700 graphics cards. For all intents and purposes, they are two workstation-grade cards (think neutered W9000s), each with 6GiB VRAM. So I can confidently say that I retake the GPU crown.


Meanwhile, with the abundance of swapfests at MIT, Bayley acquired multiple storage nodes for not too much money (and the networking infrastructure to back the system). To top that, he accidentally won half an SGI Altix on eBay, soon to boast a glorious 1TiB of RAM.


GiB and TiB stand for gibibytes and tebibytes, because I'm anal and care about these distinctions.
