Bremermann’s Limit

It’s time for another issue of “Intractable Problems”, a column designed to challenge and stimulate your thinking skills. As I promised you last issue, we will examine what limits, if any, exist on the amount of information we can hope to process in a given interval of time.

A limit, you say? How can there be a limit for something like information, which is not real? Well, whether it is real or not, information needs to be moved around. We interpret the ‘processing of x bits’ to mean the transmission of that many bits over one or more communication channels within the computing system. To do that, the information obviously has to be physically encoded in some form.

In 1962, Hans Bremermann assumed that we would encode the information in terms of an energy level[1]. Of course, Heisenberg’s Uncertainty Principle tells us that we can only measure that energy to a certain accuracy, and the measurement accuracy limits the number of distinguishable markers into which the energy level can be divided. Now, Hartley’s information measure relates the information content of N markers to the logarithm of N. Using this, we can calculate the maximum amount of information that can be represented within our energy level.
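
To make Hartley’s measure concrete: N distinguishable markers carry log₂ N bits. Here is a tiny Python sketch of my own (an illustration, not anything from Bremermann’s paper):

    import math

    def hartley_bits(n_markers: int) -> float:
        # Hartley's measure: N distinguishable markers carry log2(N) bits.
        return math.log2(n_markers)

    # If the measurement accuracy lets us resolve an energy level into,
    # say, 1024 distinguishable markers, it can represent about 10 bits.
    print(hartley_bits(1024))   # prints 10.0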

Now, using Heisenberg’s uncertainty principle, Planck’s constant, and Einstein’s famous E = mc² formula, we can easily show Bremermann’s conjecture that “no data processing system, whether artificial or living, can process more than 2 × 10⁴⁷ bits per second per gram of its mass.”

There is a problem when mathematicians say things can be done easily. Mathematicians are not normal people. To explain: Einstein said that E = mc², where E is energy, m is mass, and c is the speed of light. Planck said that E = hf, where E is energy, h is Planck’s constant, and f is the frequency. Planck’s equation relating energy to frequency is where Heisenberg’s uncertainty principle enters the mix, and Hartley’s information measure is what lets us treat Planck’s frequency as if it were information bits per second. So, by equating Einstein and Planck on the common energy term, we have:

mc² = hf, or f/m = c²/h. The numerical value of c²/h is about 1.36 × 10⁴⁷ bits per second per gram.
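
If you would rather let a machine do the arithmetic, a few lines of Python will check it (a sketch of my own, in CGS units so the answer comes out per gram):

    # c^2 / h in CGS units (centimetres, grams, seconds),
    # which gives bits per second per gram.
    c = 2.998e10     # speed of light, in cm/s
    h = 6.626e-27    # Planck's constant, in erg*s

    print(f"{c**2 / h:.3e}")   # prints roughly 1.36e+47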

Oh my goodness! What does this mean? Well, hypothetically, let’s assume that we have a computer the size of the Earth and a time period equal to the estimated age of the Earth. Then this imaginary computer would not be able to process more than about 10⁹³ bits. This number, 10⁹³, is called Bremermann’s Limit, and problems that require processing more than 10⁹³ bits of information are called transcomputational problems.

Again, for those of us who have feet planted firmly on the ground: The Earth is about 4.5 billion years old (1.4 × 10¹⁷ seconds). The Earth’s mass is about 6 × 10²⁷ grams. That gives 8.4 × 10⁴⁴ gram-seconds for the mass-time product, and when we multiply this by Bremermann’s Limit we get something around 10⁹² or 10⁹³. Close enough in my books.
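
Here is the same back-of-the-envelope sum as a Python sketch of my own, using the rough figures quoted above:

    # An Earth-sized computer running for the age of the Earth.
    seconds    = 4.5e9 * 3.15e7   # about 4.5 billion years, roughly 1.4e17 seconds
    mass_grams = 6e27             # mass of the Earth, in grams
    rate       = 1.36e47          # Bremermann's limit, in bits per second per gram

    total_bits = seconds * mass_grams * rate
    print(f"{total_bits:.1e}")    # prints about 1.2e+92, i.e. around 10^92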

The funny thing is, these transcomputational problems do exist in the real world. For example, exhaustively testing all combinations of an integrated circuit with 308 inputs and 1 output is a transcomputational problem. For those of us who dislike doing arithmetic, what we are saying is that 2³⁰⁸ is on the order of 10⁹³. Did you know that such circuits exist in today’s computer processors? Do you think the circuit works correctly for all inputs? You now know that there is no way the engineers ever tested all combinations of their circuit!
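
Python’s integers have arbitrary precision, so you can do this particular piece of arithmetic exactly rather than take my word for it (again, just an illustrative sketch):

    import math

    # Exhaustive test count for a 308-input, 1-output circuit,
    # compared against Bremermann's Limit of roughly 10^93 bits.
    combinations = 2**308
    print(f"{combinations:.2e}")      # prints about 5.2e+92
    print(math.log10(combinations))   # prints about 92.7, i.e. on the order of 10^93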

Bremermann, in the conclusion of his paper, wrote the following:

“The experiences of various groups who work on problem solving, theorem proving and pattern recognition all seem to point in the same direction: These problems are tough. There does not seem to be a royal road or a simple method which at one stroke will solve all our problems. My discussion of ultimate limitations on the speed and amount of data processing may be summarized like this: Problems involving vast numbers of possibilities will not be solved by sheer data processing quantity. We must look for quality, for refinements, for tricks, for every ingenuity that we can think of. Computers faster than those of today will be of great help. We will need them. However, when we are concerned with problems in principle, present day computers are about as fast as they will ever be.”

Ah well, that’s life. If nothing else, at least it helps support my conjecture that cats must be dumber than dogs. On average, cats’ brains are not as massive as dogs’ brains and therefore cannot hold as much information. On the other hand, cats are all known to be tricky and devious, so it is possible that they have ingeniously discovered ways to work around their limitations.

Next issue, we will take a critical look at the serious problems involved in systems engineering.





Footnotes

1. Bremermann, H.J. “Optimization through evolution and recombination.” In: Self-Organizing Systems, edited by M.C. Yovits et al., Spartan Books, 1962, pp. 93–106.