What is Reality's Bandwidth?
Posted on April 12, 2012 by oubiwann

(Image: original source photo by Stewart Leiwakabessy)
It wasn't long before I was deep into wondering about the bits encoded in the bacterial DNA of my large intestine and the number of possible states for all the interacting systems at the quantum level. When I got home, I brushed the dust off some links I hadn't visited in a while, and things got even more interesting.
A Naïve Snapshot in Time
The first draft of this section detailed the bits contained in the genomes of humans, resident bacteria, and viruses, plus the CAD drawings for the car (everything from bolts, screws, and parts to the engine, electrical system, and GPS). Needless to say, there was lots of boring multiplication and addition, plus hand-wavy guesses wherever I couldn't find hard data.
I also started exploring the visual data of the landscape coming in through the windows and rear-view mirrors... and this opens up the "perception" can of worms.
The end result was an estimate of 427 terabits of "blueprint" data implicitly hurtling down the highway at 70 mph.
But what about data in time?
A One-second Interval
We can do obvious things, like calculating the data being pushed into the ambient environment by a CD (1,411,200 bits per second [2]) or a DVD (27,000,000 bits per second for HDTV [3]) being viewed by passengers. Of course, since we're dealing with averages, we've only got 0.3 passengers (27,000,000 × 0.3 bps).
We can also turn those snapshots of the scenery out the windows into "streaming, high-resolution media." Imagining 6 windows and 3 mirrors, and assuming that each is "generating" (allowing to pass) HDTV-quality visual data, we have 9 × 27,000,000 × 1.3 ≈ 316,000,000 bits per second.
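To keep the back-of-envelope honest, here's a minimal sketch of that one-second budget in Python. It uses only the figures assumed above (the CD and HDTV bit rates and the 1.3-occupant average); nothing here is measured.

```python
# One-second ambient data budget for the car, using the rates
# assumed above; a back-of-envelope sketch, not a measurement.

CD_BPS = 1_411_200       # CD audio bit rate [2]
HDTV_BPS = 27_000_000    # HDTV-quality video bit rate [3]
OCCUPANTS = 1.3          # average occupants per vehicle
PASSENGERS = OCCUPANTS - 1.0  # the statistical 0.3 non-drivers

music = CD_BPS                      # one CD playing
movie = HDTV_BPS * PASSENGERS       # a DVD watched by 0.3 passengers
scenery = 9 * HDTV_BPS * OCCUPANTS  # 6 windows + 3 mirrors, per occupant

total = music + movie + scenery
print(f"{total:,.0f} bits per second")  # ~325,411,200 bits per second
```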
However, before we go too much further, there are some problems:
- we no longer have a simple, static data problem: we have included time, so now we get changes in state
- we're still referring to the data of the objects as the minimum required to "assemble" them, not the data of the completed units themselves, replete with possibly uncountable phenomenal interactions
- we have assumed that generators of information are something we should measure
- we have ignored perceivers of information
- we are ignoring scale
The Coast of Norway Problem
If we take the full spectrum of data-generating events into account, there is a LOT of information to be accounted for. Here is a brief overview of some of the data entering the car or being generated inside of it:
- viewing the stars (astronomical scale)
- viewing mountains in the distance
- watching the environment near the car (animals, other cars, trees)
- listening to conversation, music, or passengers watching a movie
- the presence and activity of bugs, dust, lint in the car
- similarly, bacteria and air moisture in the car
- air-borne viruses
That takes the external environment into account. Now, the internal:
- body organs
- internal flora and fauna, cells
- neurological, mechanical (microtubules), and chemical signals
- molecules
- atoms
- subatomic particles
- Planck-level interactions and quantum foam
These conceptually fractal fjords portray an obvious case of vastly increasing rates of data when additional levels of scale are taken into consideration.
So when do we stop? At what point do we say "that's enough data" and abandon further explorations up and down the spectrum of scale? Is there such a thing as a "good approximation" of data in a finite portion of spacetime? Is there an upper bound we can use?
Indeed there is. Sort of. For the quantum states within a particular volume, anyway.
Working with Maximum Info
Based upon his 1972 work on black holes, Jacob Bekenstein demonstrated that there is a maximum amount of information that can be stored in a finite region of space using a finite amount of energy. He phrased this bound in terms of entropy:

\[ S \le \frac{2 \pi k R E}{\hbar c} \]

where \(S\) is the entropy, \(k\) is Boltzmann's constant, \(R\) is the radius of a sphere enclosing the system, \(E\) is the total mass-energy, \(\hbar\) is the reduced Planck constant, and \(c\) is the speed of light.
When recast in information-theoretic terms (using \(I = S / (k \ln 2)\)) and after substituting \(E = mc^2\), we have this inequality:

\[ I \le \frac{2 \pi c m R}{\hbar \ln 2} \approx 2.5769 \times 10^{43} \cdot m \cdot R \text{ bits} \]

with the mass \(m\) in kilograms and the radius \(R\) in meters.
With this, we can estimate the total possible information carried within an average car, the passengers, and some stuff in the car:
- 2000 kg car, 4 cubic meters = 5e+46 bits
- 1.3 * 70 kg human, 0.005 cubic meters = 2e+44 bits
- 50 kg of junk, 0.001 cubic meters = 1e+44 bits
If we just add them all together (which I'm not sure is kosher, since the distribution of information is decidedly non-uniform and almost certainly interdependent), that's 4.7e+37 gigabits, or 5.9e+36 gigabytes. The largest named prefix in our current scale for numbers is yotta (10^24), and this comes to roughly 5.9e+21 yottabytes. The mind truly boggles.
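For anyone who wants to check the arithmetic, here's a minimal sketch of those estimates in Python. It treats each object as a sphere of the stated volume (generous for a car) and uses the mass-and-radius form of the bound derived above; the figures it prints land in the same ballpark as the estimates in the list.

```python
import math

HBAR = 1.0545718e-34  # reduced Planck constant (J*s)
C = 299_792_458       # speed of light (m/s)

def bekenstein_bits(mass_kg, volume_m3):
    """Bekenstein bound in bits for a mass confined to a sphere of
    the given volume: I <= 2*pi*c*m*R / (hbar * ln 2)."""
    radius = (3 * volume_m3 / (4 * math.pi)) ** (1 / 3)
    return 2 * math.pi * C * mass_kg * radius / (HBAR * math.log(2))

car = bekenstein_bits(2000, 4)             # ~5e46 bits
people = 1.3 * bekenstein_bits(70, 0.005)  # ~2.5e44 bits
junk = bekenstein_bits(50, 0.001)          # ~8e43 bits

total = car + people + junk
print(f"total: {total:.1e} bits ({total / 8e9:.1e} gigabytes)")
```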
That's just a Sunday drive. What about the road? The landscape? The entire planet?
The famous gravitational physicist John Archibald Wheeler said it well in his paper *Information, Physics, Quantum: The Search for Links*:
"Enough bits to structure a universe so rich in features as we know this world to be? Preposterous! Mice and men and all on Earth who may ever come to rank as intercornmunicating < span class="s1">meaning-establishing observer-participantsBack to the car, though: we're talking about maximum possible states at the quantum level, and bits per second no longer makes any sense. But we'll continue anyway :-)will never mount a bit count sufficient to bear so great a burden."
What about subjective information? Perception? Memories? The total inhabitants of the vehicle and their constituents, including their complete microbiomes? What about anything capable of sensing the environment, processing data, or consuming information – at any scale?
If I have interpreted Bekenstein properly, one could state that if the maximum amount of information "stored" in a human body is 2e+44 bits, then that data would have to represent all possible data entering the gates of perception and all memory of and reaction to said data. As such, it doesn't really matter how much perceiving is going on inside the car or our bodies – the bits already have that covered.
So whatever I'm seeing – inside the car or out – is already taken into consideration by the Bekenstein bound. But there's a mighty big tangent here: what role do perception and cognition play in a universe of bits? Wheeler's got an answer for this, too:
"... every 'it'—every particle, every field of force, even the space-time continuum itself—derives its function, its meaning, its very existence entirely ... from the apparatus-elicited answers to yes-or-no questions, binary choices, bits... that which we call reality arises in the last analysis from the posing of yes–no questions ... in short, that all things physical are information-theoretic in origin and that this is a participatory universe."
Inevitably, M-Theory
http://en.wikipedia.org/wiki/Holographic_principle
Bekenstein Bound -> black holes -> volumes encoded on 2D surfaces
"The holographic principle is a property of quantum gravity and string theories which states that the description of a volume of space can be thought of as encoded on a boundary to the region—preferably a light-like boundary like a gravitational horizon. "
"First proposed by Gerard 't Hooft, it was given a precise string-theory interpretation by Leonard Susskind[1] who combined his ideas with previous ones of 't Hooft and Charles Thorn.[1][2] As pointed out by Raphael Bousso,[3] Thorn observed in 1978 that string theory admits a lower dimensional description in which gravity emerges from it in what would now be called a holographic way.In a larger and more speculative sense, the theory suggests that the entire universe can be seen as a two-dimensional information structure "painted" on the cosmological horizon, such that the three dimensions we observe are only an effective description at macroscopic scales and at low energies. Cosmological holography has not been made mathematically precise, partly because the cosmological horizon has a finite area and grows with time.[4][5]"
"The holographic principle was inspired by black hole thermodynamics, which implies that the maximal entropy in any region scales with the radius squared, and not cubed as might be expected. In the case of a black hole, the insight was that the informational content of all the objects which have fallen into the hole can be entirely contained in surface fluctuations of the event horizon. The holographic principle resolves the black hole information paradox within the framework of string theory.[6]"
"The holographic principle states that the entropy of ordinary mass (not just black holes) is also proportional to surface area and not volume; that volume itself is illusory and the universe is really a hologram which is isomorphic to the information "inscribed" on the surface of its boundary."
"In 1995, Susskind, along with collaborators Tom Banks, Willy Fischler, and Stephen Shenker, presented a formulation of the new M-theory using a holographic description in terms of charged point black holes, the D0 branes of type IIA string theory. The Matrix theory they proposed was first suggested as a description of two branes in 11-dimensional supergravity by Bernard de Wit, Jens Hoppe, and Hermann Nicolai. The later authors reinterpreted the same matrix models as a description of the dynamics of point black holes in particular limits. Holography allowed them to conclude that the dynamics of these black holes give a complete non-perturbative formulation of M-theory. In 1997, Juan Maldacena gave the first holographic descriptions of a higher dimensional object, the 3+1 dimensional type IIB membrane, which resolved a long-standing problem of finding a string description which describes a gauge theory.These developments simultaneously explained how string theory is related to quantum chromodynamics."
Into Being
A random musing about information velocity has provided an unexpected impetus for exploring these murky edges of physics, metaphysics, and philosophy. The musings have led, quite unexpectedly, to the question posed by Wheeler, "Why existence?" (in the context of "it from bit").
Experts in the fields of quantum mechanics, cosmology, and the various string-theoretic pursuits seem to be setting a trend of upholding the Anthropic Principle. Having grown up on a steady diet of the scientific method, I initially found this rather shocking, even grotesque at first glance. However, as Leonard Susskind puts it, the Anthropic Principle is really just a matter of common sense:
"what is it that decides which kind of environment we live in—the temperature, chemistry and so on? ... The answer is that nothing does... Nothing determines the nature of our environment—except for the fact that we are here to ask the question! The temperature is between freezing and boiling because life (at least our kind) requires liquid water. That's it. That's all. There is no other explanation."But Wheeler takes this even further:
I've spent a couple nights pondering Wheeler's paper, having felt from the first reading that he was hitting upon something so subtle it was hard to capture... and even harder to communicate. Today, this is what I think he was trying to convey:
"Observers are necessary to bring the Universe into being."Barrow and Tipler believe that this is a valid conclusion from quantum mechanics, as John Archibald Wheeler has suggested, especially via his participatory universe and Participatory Anthropic Principle (PAP).
Links
[1] http://en.wikipedia.org/wiki/Fuel_efficiency_in_transportation#Automobiles
[2] http://en.wikipedia.org/wiki/KiB/s#Examples
http://www.wired.com/wiredenterprise/2012/03/ibm-networking/
[] http://en.wikipedia.org/wiki/Genome#Comparison_of_different_genome_sizes
[] http://en.wikipedia.org/wiki/Human_microbiome
[] http://en.wikipedia.org/wiki/Human_microbiome_project
Comments?
This blog doesn't use standard (embedded) comments; however, since the site is hosted on Github, if there is something you'd like to share, please do so by opening a "comment" ticket!