Building a Cryptocurrency Mining Rig - Part 1

When Ethereum exploded in popularity (and value) in the summer of 2017, I decided to educate myself about cryptocurrencies and blockchain technology. As part of that process, I built an Ethereum (Classic) mining rig.

Here I’ll discuss how I designed and built my miner, focusing primarily on the construction of the chassis using OpenBeam, Fusion 360, and a Shapeoko 3. (Later, I’ll discuss BIOS configuration in Part 2, mining strategy optimization in Part 3, and compute performance optimization in Part 4.)

If you’re not interested in the design/build process, click here to skip to the end result.

Google quickly turned up CryptoBadger’s blog, which I found enormously instructive. I learned the following from his series on building a mining rig:

The power supply should have ample headroom beyond the GPUs’ combined consumption.

Few motherboards have enough (6+) PCI-E slots for attaching GPUs.

Few GPUs are optimal for mining in terms of hashes per watt.

Aside from the GPUs, powerful hardware is of little importance.
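The headroom rule above can be roughed out with a little arithmetic. Here’s a minimal sketch; the wattage figures, GPU count, and headroom percentage are placeholders for illustration, not the numbers from my build:

```shell
# Hypothetical figures -- substitute your own hardware's numbers.
GPU_TDP=120        # watts per GPU
NUM_GPUS=6         # GPUs on the rig
BASE_SYSTEM=100    # motherboard, CPU, drives, fans (rough estimate)
HEADROOM_PCT=20    # keep the PSU well below its rated maximum

LOAD=$(( GPU_TDP * NUM_GPUS + BASE_SYSTEM ))
PSU_MIN=$(( LOAD * (100 + HEADROOM_PCT) / 100 ))

echo "Estimated load: ${LOAD}W"
echo "Minimum PSU rating: ${PSU_MIN}W"
```

With these example numbers, a 1000W unit would be cutting it close, which is part of why builders tend to reach for 1200W supplies.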

The GPUs I chose are energy-efficient, and thus inexpensive to power over the long term.

Because they are low-power GPUs, I assumed they would run cooler than other options. (This was important to me because I live in Florida, and I didn’t want my miner to be a space heater.)

These GPUs will be used in Linux workstations when retired from mining, and I prefer NVIDIA to AMD for that use case.

The remaining hardware decisions were relatively unimportant. If you’re interested in those details, see the bill of materials at the end of this article.

Most of the mining-rig builds I encountered follow a common lay-flat pattern:

Build a lay-flat rectangular frame

Zip-tie GPUs side-by-side inside the frame

Mount the motherboard and power supply beneath the GPUs

Here’s an example from Motherboard:

The lay-flat solution seemed practical and inexpensive, but I had a few issues with it:

Spatial constraints in my apartment forced me to store my miner on my desk, and I was unwilling to sacrifice the amount of desk-space that the lay-flat design would require.

The lay-flat design places all GPUs side-by-side, such that only the outermost GPU has access to cold air - the others’ intake fans draw directly from their neighbors’ heat sinks. I feared that this arrangement could contribute to overheating.

Subjectively, I thought many of the lay-flat builds were ugly.

Given the above, I decided to eschew convention and design a vertical miner that resembled a common PC tower. I settled on a few design requirements:

The power supply should be mounted at the bottom of the chassis to provide a low center of gravity and minimize the risk of tipping.

Six GPUs should be positioned above the power supply, in pairs, in three 150mm bays. This arrangement would give three of the six GPUs direct access to cold air, and no GPU would be sandwiched between two others.

The motherboard should be mounted off the side of the chassis, opposite the GPU intake fans (so as not to obstruct cold-air intake).

Satisfied with the plan, I began to research construction materials, hoping to find an attractive material that facilitated rapid prototyping. I eventually discovered OpenBeam, which touted itself as “a low-cost, open-source extruded aluminum construction system”:

It fit the bill perfectly.

OpenBeam was ideal for this application. While I have some experience with SketchUp and Fusion 360, I find it easier to reason through spatial problems in “meatspace” when possible. OpenBeam spared me from having to model GPUs and other components in CAD, and let me iterate with real, physical components.

After a few days’ worth of experimentation, I settled upon the following design:

(You’ll see a 500W power supply in the photo. I was using it as a placeholder while I waited for a 1200W unit to arrive in the mail.)

(These pictures were taken before the final two GPUs arrived in the mail.)

Initially, nvidia-smi could not detect two of the new GPUs, and I suspected that the PCI-E risers were defective. After experimentation, however, I determined that the risers themselves were fine, but that the USB cables they shipped with were faulty.

I resolved the problem by replacing all of the PCI-E riser cables with new ones from Amazon, having lost confidence in the factory cables.

I also had to update my motherboard’s BIOS in order to detect more than four GPUs. I’ll discuss that process in detail in Part 2.
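If you hit a similar detection problem, comparing what the PCI bus reports against what the NVIDIA driver reports helps isolate the fault: a bad riser or cable usually makes the card vanish from the bus entirely, while a driver or BIOS issue leaves it on the bus but invisible to nvidia-smi. A rough sketch (it degrades gracefully if lspci or nvidia-smi isn’t installed):

```shell
# Count display controllers on the PCI bus vs. GPUs the NVIDIA driver reports.
# A mismatch points at a riser/cable fault (missing from the bus) or a
# driver/BIOS issue (on the bus, but invisible to nvidia-smi).
PCI_GPUS=$(lspci 2>/dev/null | grep -ci 'vga\|3d controller' || true)
DRIVER_GPUS=$(nvidia-smi -L 2>/dev/null | grep -c '^GPU' || true)
echo "PCI bus sees ${PCI_GPUS} GPU(s); NVIDIA driver sees ${DRIVER_GPUS}"
```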

Two problems remained. First, cable tension would sometimes pull the GPUs together. (You can see this happening in the photos above.) Second, some GPUs were running hot (~80C).

The first problem was easily solved. I purchased aluminum screws and stand-offs from McMaster-Carr and constructed locking-lugs to hold the GPUs in place:

(This would have looked nicer had I purchased shorter screws or taller stand-offs. Oh well.)

The overheating problem required more thought. The obvious solution was to install fans on the chassis, but I was unsure how best to mount them.

As a short-term kludge, I zip-tied a 140mm fan in front of each GPU bay. This cooled the GPUs sufficiently for me to begin mining full-time, and freed me to solve the remaining problems at my leisure.

Side note: I was surprised by how much more effective it was to “suck” hot air away from the GPUs than to “blow” cold air over them - a difference of roughly 7C. I oriented the fans to “suck” air accordingly.
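If you want to run the same before/after comparison on your own build, the driver exposes temperature and fan data directly, so there’s no need to guess. A minimal sketch (the query fields are standard nvidia-smi options; the fallback message is my own):

```shell
# One-shot temperature/fan report for every GPU the driver can see.
if command -v nvidia-smi >/dev/null 2>&1; then
  REPORT=$(nvidia-smi --query-gpu=index,temperature.gpu,fan.speed --format=csv)
else
  REPORT="nvidia-smi not found; install the NVIDIA driver first"
fi
echo "$REPORT"
```

Wrapping the same query in `watch -n 5` gives a live view while you reposition fans.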

Wanting to make the job easier, I started by purchasing a double mounting bracket for the hard drives:

I then designed a panel for the top of the chassis, and cut it from 1/4” Lexan:

I attached the power switch and hard-drive mounting bracket to the panel, and then attached the panel to the chassis. It held everything neatly in place:

The mar in the photo occurred because my mill failed to withdraw to its “retract height” when traveling to make its first cut. I’m not sure why, but I suspect I overlooked a parameter in Fusion 360 that I should have set. (I’m really enjoying Fusion 360 so far, but it has a steep learning curve.)

Milling blemish aside, I thought the panel turned out really well.

I wanted to position a fan in front of each GPU bay. Each bay was 150mm wide, and each fan was 140mm wide (with mounting holes slightly closer together than that).

Because I was happy with the Lexan mounting panel for the hard-drives and power switch, my first instinct was to take a similar approach with the fans. I designed and cut a prototype mounting panel:

The panel milled perfectly. After experimenting with positioning it, though, I observed some problems:

The holes that mounted the fan to the panel and the holes that mounted the panel to the chassis were so close together (out of necessity) that their screws competed for space.

Each GPU bay had somewhat different geometry due to the hardware used to mount the motherboard to the side of the chassis. Accommodating this would have required me to either mill a distinct panel for each bay, or to “jump” some of the mounting hardware with stand-offs.

After staring at the chassis for a bit, I decided to ditch the Lexan panel and opt for a simpler solution - mounting the fans directly to a chassis rail:

(The fans are white, though they look somewhat blue in the photos.)

(The blue device sitting on top of the chassis is a fan controller. It hadn’t been mounted at this point.)

This approach positioned the fans slightly (5mm) off-center, which is why I avoided it initially. Functionally, though, that was inconsequential, and I found that the aesthetics didn’t bother me. So, I declared that solution “good enough” and moved on.

I had initially hoped to mount the fan controller above the topmost GPU fan, but the enclosure was too deep and collided with the power switch.

I then changed plans and decided to mount it off the side of the chassis, above the motherboard. There was no convenient way to attach the fan controller’s enclosure to the chassis, though, so I disassembled the fan controller to see whether I could drill two holes in its enclosure to use as mount points:

The enclosure turned out to be mostly empty. The useful part of the fan controller was a circuit board attached to the front panel, which lifted effortlessly out of the enclosure:

Having discovered this, I discarded the enclosure and returned to my original plan of mounting the fan controller above the topmost GPU fan. I only had to make one modification to the chassis: the fan controller collided with a horizontal strut that connected the tops of the front chassis rails.

The offending strut was structurally unimportant, so I simply removed it. (Thanks to OpenBeam, it only took a few seconds to make this change.)

Next, I once again used Fusion 360 and my Shapeoko to design and cut a Lexan mounting panel for the fan controller:

The fan controller snapped cleanly into the mounting panel, and I reinforced its fit with Superglue. (In hindsight, I should have been tidier with the glue.) I then mounted the panel to the chassis:

With the fan controller installed, I powered on the machine and tested the fan acoustics. Depending on the fan controller’s setting, the noise ranged from “barely audible” to “oscillating desk fan”. Even at the highest setting, it never became unpleasant.

With that, physical construction of the chassis was complete. In Part 2, I’ll discuss the BIOS changes that were necessary to bring all six GPUs online and fix corrupted video output.

If you have any questions or comments, tweet to me at @chrisallenlane.

Proceed to Part 2 »
