One of the most common questions I get asked in the Gridcoin forums and social channels is: 'How much can I make with hardware X?' It is a difficult question to answer because Gridcoin does not function like a conventional cryptocurrency when it comes to mining. Your hardware is not put to work endlessly hashing data; it contributes to actual science, which makes its performance far less predictable.

Just like mining other crypto, with Gridcoin:

The amount you earn depends on how much compute you contribute to the network relative to everyone else.

You can join a pool to get paid more steadily, rather than relying on the luck of staking a block.

Generally speaking, more powerful hardware outcompetes less powerful hardware.

GPUs outcompete CPUs.

However, unlike mining other crypto, with Gridcoin:

Your compute power is used to solve research problems, primarily for the scientific community. These problems are complex and vary wildly between the projects the Gridcoin network rewards contributions to. As a result, hardware performance will vary between projects, and so will your mining rate.

Hashing power is meaningless, so hardware profitability cannot be readily compared.

The daily Gridcoin mint is spread across the 'whitelisted projects' - the projects eligible for rewards (run these projects and you get paid GRC). If the number of projects on the whitelist changes, your daily GRC mint will change significantly.
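To see why the whitelist size matters so much, here is a minimal sketch of the dilution effect. It assumes the mint is split evenly across whitelisted projects and that your share of a project's reward is proportional to your recent average credit (RAC); all the numbers are hypothetical placeholders, not real network figures.

```python
def daily_grc(daily_mint, n_whitelisted, your_rac, project_total_rac):
    """Estimate daily GRC earned on one project, assuming the mint is
    split evenly across whitelisted projects and your share of a
    project's reward is proportional to your RAC."""
    per_project = daily_mint / n_whitelisted
    return per_project * (your_rac / project_total_rac)

# If the whitelist grows from 20 to 30 projects, the same contribution
# earns a third less per day (hypothetical mint and RAC figures):
before = daily_grc(50_000, 20, your_rac=1_000, project_total_rac=100_000)
after = daily_grc(50_000, 30, your_rac=1_000, project_total_rac=100_000)
```

The exact reward mechanics are more involved than this, but the proportional dilution is the point: more projects on the whitelist means a smaller slice per project for the same work.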

As a result of the above, the best way to maximise your yield is to contribute compute to the whitelisted project with the least competition from other Team Gridcoin members. For the past month and a half I have been running a significant number of different GPUs on several of the most efficient projects to mine, and collecting data on the income they generated. By calculating the average yield per GFLOP of compute and linearly extrapolating, we can predict the earnings of most GeForce cards in each series. Cards highlighted in yellow are those for which I have actual data; they were used to linearly interpolate the rest of each series. Note that all values are averages, so none of the raw data remains in these tables. Further, all hardware was run at stock settings - no overclocking or overvolting, which could further increase yield.
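The extrapolation described above can be sketched in a few lines. The card names and measured figures below are illustrative placeholders, not my actual mining data; the method is simply average yield per FP32 GFLOP across measured cards, scaled by each card's spec-sheet GFLOPS.

```python
# Illustrative measurements: card -> (FP32 GFLOPS, observed GRC/day).
# These figures are made up for the example.
measured = {
    "GTX 1060": (4375, 10.0),
    "GTX 1080": (8873, 20.3),
}

# Average yield per GFLOP across the measured cards.
yield_per_gflop = sum(grc / gf for gf, grc in measured.values()) / len(measured)

def estimate_grc_per_day(gflops):
    """Linearly extrapolate daily GRC from a card's FP32 GFLOPS rating."""
    return gflops * yield_per_gflop

# Predict an unmeasured card from its spec-sheet GFLOPS:
print(estimate_grc_per_day(6463))
```

This is why the tables only need a handful of measured cards per series: once the GRC-per-GFLOP rate is known, the rest of the series follows from published GFLOPS numbers.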

NVIDIA GeForce 10 Series

NVIDIA GeForce 900 Series

We will be skipping the GeForce 800 series, as it only includes mobile models. NVIDIA had intended the 800 series to use the Maxwell microarchitecture, but had already released 800 series mobile GPUs based on Kepler, so the intended desktop 800 series GPUs were released as the GeForce 900 series instead.

NVIDIA GeForce 700 Series

Please note that some liberties have been taken in making these estimates:

I have not taken into account power costs as power prices vary significantly depending on where you live.

I have assumed that the GPU will be running 24/7.

I have not optimised CUDA vs OpenCL (doing so could yield roughly a 10% increase, depending on the project).

I have not considered FP64 performance, only FP32 (FP64 is relevant when running [email protected])

These estimates are given for the current GRC price (USD $0.032) and the ATH (USD $0.13, reached in June). You will have to adjust the income estimates based on market rates at the time of reading.
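Rescaling the table figures to a different GRC price is a single multiplication. A minimal sketch, using the post's reference price of $0.032 and a made-up "current" price as the example:

```python
REF_PRICE = 0.032  # USD per GRC used for the table estimates in this post

def adjust_income(table_usd_per_day, current_price):
    """Rescale a table's USD/day figure to a different GRC price."""
    return table_usd_per_day * (current_price / REF_PRICE)

# A card listed at $0.50/day would earn about $2.03/day
# at the June ATH price of $0.13:
ath_income = adjust_income(0.50, 0.13)
```

The GRC/day figures themselves do not change with price - only the USD columns need rescaling.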

If you are not sure which GPU is in your machine, you can find out in less than a minute! Hit the Start button and type 'run'. Press Enter to open the Run dialogue box, then type 'dxdiag' without the speech marks. If asked whether you would like to check that your drivers are digitally signed, select 'no'. In the window that opens, navigate to the 'Display 1' tab to find your GPU. Note that on my computer, shown below, I am running an ancient Quadro 600, which was not listed above and has pretty abysmal performance.

If you would like to get involved, you can get started by installing the research software BOINC and the Gridcoin Wallet.

Image credit, in order of appearance:

Banner, @joshoeah

Tables by me, based on my own mining data

Banner, @vortac