The Bremermann limit (named after the late H. J. Bremermann) is approximately 1.356 x 10^50 bits per second per kilogram. It is the physical limit, given a certain mass, upon the maximum processing speed of a self-contained computer of that mass.

Now, in English:

Say that you want to build a fantastically powerful computer from a limited amount of matter -- let's say 10 kg (about 22 pounds). To do this, you devise a highly efficient power source and invent a fiendishly clever computer architecture -- the details don't matter much. What matters is that you've built something that can and does process bits, for the ability to process a bit of data is the heart of computing. In order to process bits, you need a signal that can send bits in quick succession. The number of bits sent in one second is the frequency of the signal; at one bit per cycle, bits per second and hertz (abbreviated Hz) are the same thing. The higher the frequency, the smaller the interval between each bit, the more data is transmitted per second, and the faster your computer can calculate.

Now for the physics. No matter what your signaling method is, each bit requires a certain amount of energy to be sent. Since you want your machine to be small and efficient, you'd like to use as little energy per bit as possible -- without sacrificing any computing power, that is. Quantum mechanics says that everything comes in discrete packets; there is a smallest amount that you can have of anything, be it energy, light or electrical charge. These packets are called quanta (singular is quantum), hence the name "quantum mechanics." Your signal, just like everything else in the universe, must be quantized. Specifically, the size of the signal energy quantum is directly proportional to the frequency of the signal:

E = hf

"E" is the signal energy quantum, "f" is the frequency of your signal (i.e. your bitrate), and "h" is Planck's constant, a fundamental constant of the universe equal to approximately 6.6x10^-34 joule-seconds. This equation limits the minimum energy that you can use to send your signal. Why? Because in order to send the signal at all, you must be using at least one quantum of signal energy to send each bit. In other words, you must use at least one "chunk" of energy for each bit sent. To rephrase it once more (and turning it around in the process), given an amount of energy, the maximum rate at which you can process data with said energy is given by:

E/h = f
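To make this concrete, here is a short Python sketch (using the CODATA value of Planck's constant; the function name is my own, not from the article) that turns an energy budget into a maximum bit rate via f = E/h:

```python
# Planck's constant (CODATA 2018 exact value), in joule-seconds
h = 6.62607015e-34

def max_bitrate(energy_joules):
    """Maximum bits per second achievable with the given signal energy,
    assuming each bit costs at least one quantum E = h * f."""
    return energy_joules / h

# A single joule of signal energy already allows a staggering bit rate:
f = max_bitrate(1.0)
print(f"{f:.3e}")  # -> 1.509e+33
```

One joule per second, in other words, is enough to push about 1.5 x 10^33 bits per second -- the quantum "tax" per bit is tiny.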

Given our 10kg to work with, there's a maximum amount of energy that we can possibly use in our computer (since we're talking a WHOLE computer here, power source included), and we can find this energy using an equation you've probably seen before:

E = mc^2

So, we substitute this in for the E in the previous equation, and we get:

mc^2/h = f

Since both c and h are constants, we can call c^2/h a new constant, B, and the equation is now:

Bm = f

B is the Bremermann limit (as you can see if you substitute in the values of c and h in the equation above and compare it to the number at the top of this page), and we now know how fast a 10 kg computer can go, in theory: 1.3 x 10^51 Hz, or about a million billion billion billion billion GHz.
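As a sanity check, the whole derivation can be run numerically. This sketch computes B = c^2/h from the defined values of c and h, then the clock ceiling for our hypothetical 10 kg machine:

```python
c = 299792458.0       # speed of light, m/s (exact by definition)
h = 6.62607015e-34    # Planck's constant, J*s (exact by definition)

B = c**2 / h          # Bremermann limit, bits per second per kilogram
f = B * 10            # maximum processing rate for a 10 kg computer, Hz

print(f"B = {B:.4e} bits/s/kg")   # -> B = 1.3564e+50 bits/s/kg
print(f"f = {f:.2e} Hz")          # -> f = 1.36e+51 Hz
```

The computed B matches the figure at the top of the page, and the 10 kg result matches the 1.3 x 10^51 Hz quoted above.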

This is a big number. Really big. I mean, you might think it's a long way to the chemist's, but that's just peanuts compared to this. Despite this fact, there are still some computational tasks that are so complicated that the Bremermann limit makes it essentially impossible to perform them. For example, if we turned all the matter in the universe into the best computer possible, it would take at least 100,000 times the current age of the universe to compute all possible games of chess that can be played.

I believe you will join me in saying "Holy fuck."

For the motivated student: the original paper in which Bremermann first proved all of this is available here: http://www.aeiveos.com/~bradbury/Authors/Computing/Bremermann-HJ/QNaI.html.

It's also worth noting that if Moore's Law continues to hold for much longer (unlikely for various reasons), we will hit Bremermann's limit in approximately 300 years.