Admit it. You’re an infomaniac. Your life is an orgy of information.
Now you’re no info-hussy. You’re discerning. You privilege only certain kinds of information. Still, you have instant access to such a glut you’re bound to find an abundance that suits your lusty tastes. Your computer is brimming with gigabytes of it and the Internet serves up info for any info-fetish. This is the 21st century ADD. With this much information flowing, what infomaniac wouldn’t experience attention deficit?
If you’re like most people though, a piece of crucial information eludes you.
Information please: What exactly is information?
We measure info in bits and bytes as though it were a thing. You can buy a bushel of apples, a pallet of rocks. Is a gigabyte of info a specific quantity like that?
Bits and bytes originate in information theory, developed by the great Claude Shannon, an engineer at Bell Labs in the 1940s. Shannon noticed that information is not something added but something subtracted: a reduction in possibilities.
For example, heads or tails? Before the flip you don’t know which it’s going to be. After the flip you’ve got information, having reduced the possibilities from two to one. That’s one bit of information.
A single bit of information is any “maybe” that turns into either a yes or a no. Think of a bit as a toggling switch whose position you didn’t know before, but now know. Heads or tails? Zero or one? Yes or no? On or off? Each of these questions is like a toggle switch equally likely to be in either of two positions. Once you know its position you’ve got a bit of information. A two terabyte hard drive, which today sells for about $100, is a bank of about 16 trillion such switches.
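The arithmetic above can be sketched in a few lines of code. This is a minimal illustration, not anything from Shannon: it assumes equally likely outcomes, where the information gained when the possibilities collapse to one is the base-2 logarithm of how many there were.

```python
import math

def bits_of_information(possibilities: int) -> float:
    """Shannon information gained when `possibilities` equally
    likely outcomes are reduced to a single known outcome."""
    return math.log2(possibilities)

print(bits_of_information(2))   # a coin flip: 1.0 bit
print(bits_of_information(52))  # drawing one card: about 5.7 bits

# A 2 TB drive as a bank of switches: bytes times 8 bits per byte.
terabyte = 10**12
switches = 2 * terabyte * 8
print(switches)  # 16,000,000,000,000 -- about 16 trillion
```

The coin flip works out to exactly one bit because two possibilities were reduced to one; the card draw carries more information because fifty-two were.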
However, the bit of information is not the switch itself. It’s not a thing like an apple or a rock, but an event: getting the news that turns a maybe into a yes or a no. Information can’t be distinguished from becoming informed. Weirdly, “receiving information” is redundant, since information is not a thing but the event of receiving the news, of discovering the reduced possibility.
Is a terabyte more like the bushel, a container that holds information? We talk about empty and full hard drives as though it is, but that’s not right either. For one thing, information is a reduction in possibilities. Heads or tails is more like someone holding a bushel with two things maybe in it—heads and tails—and letting you peek in to see that there’s only one in there.
There are empty bushels but there are no blank switches. Every bit or switch is in a specific position. An empty hard drive’s switches are always in one position or another, and those positions are all readily accessible with the right technology. The emptiest hard drive is always full, its switches in one or the other discoverable position.
So why do we freak out when a full hard drive is erased but not when an empty one is? What’s the difference?
A bit or byte is a measure of capacity, but isn’t a measure of information’s worth to us discerning infomaniacs. Consider the worth of yes or no answers to the following questions:
Will you marry me?
Do you want fries with that?
Is my son dead?
Should I put your groceries in plastic bags?
Did a nuclear bomb just level Philadelphia?
Is Jessica Simpson married these days?
Did the tests show I have lung cancer?
Did you like that movie?
Was he wrongly executed?
Is Nickelback a good band?
Though Shannon is credited with founding information theory, his contribution wasn’t really about information but communication. He was interested in how much could be communicated over a Bell Telephone line, and he simply assumed a sender and receiver who find the reduction in possibility meaningful. For his purposes he didn’t need to pay attention to you, the discerning infomaniac, and your preference for some info over other info. The difference between useful and useless information is left entirely unexplained by Shannon’s theory.
If information were just switches then what isn’t information? Every physical state, the position of an apple, a rock, a galaxy or a subatomic particle is just one of many states the thing could be in. By this reckoning everything in the universe is information, just because it’s in one position and not another. Maybe it would take more than 20 questions to discover the state of a thing, but its state could be translated into answers to a series of yes/no questions. The majority of researchers working in info theory treat the origin of information as the origin of stuff in positions, as though the universe is just an unbelievably huge bank of switches. They conclude everything is information.
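The “20 questions” idea above can be made concrete. Here is a hypothetical sketch: each yes/no answer halves the remaining possibilities, so pinpointing one state among N takes about log2(N) questions—twenty questions suffice for a little over a million states.

```python
import math

def questions_needed(n_states: int) -> int:
    """Yes/no questions needed to pin down one of n_states possibilities."""
    return math.ceil(math.log2(n_states))

def play_twenty_questions(secret: int, n_states: int):
    """Find `secret` among n_states states by asking 'is it in
    the lower half?' until one possibility remains."""
    lo, hi = 0, n_states - 1
    asked = 0
    while lo < hi:
        mid = (lo + hi) // 2
        asked += 1
        if secret <= mid:   # answer: yes, lower half
            hi = mid
        else:               # answer: no, upper half
            lo = mid + 1
    return lo, asked

found, asked = play_twenty_questions(secret=700_000, n_states=2**20)
print(found, asked)  # 700000 20 -- exactly 20 questions for ~1 million states
```

On this view, every answer discovered is one switch position learned, which is why the researchers mentioned above can recast any physical state as a string of bits.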
We laugh at past cultures for their monumentally misguided assumptions, yet we’re not exempt. We are certainly misguided about what information is. To give you a sense of the folly, consider the great science writer James Gleick’s 2011 best seller “The Information: A History, a Theory, a Flood.”
Gleick deftly presents a history of information theory and concludes that everything is information. Then he talks about the flood of information we face today. But if everything is information, how can there be more of it today? Gleick not only doesn’t answer this question, he doesn’t seem to notice it.
That we don’t understand information doesn’t necessarily mean it is difficult or impossible to understand. In medieval times, people thought lightning was an angry message sent by God. Today we understand lightning as electrical discharge, an explanation not much more complicated or counter-intuitive than God’s wrath.
And anyway, you traffic in information far more than you traffic in lightning. Infomaniac that you are, you have well-honed intuitions about what information is.
And it turns out that there is an intuitive scientific theory of information, one that builds on Shannon’s insight but fits your intuitions much better. I’ll present it in my next article. And then in a follow up, I’ll show you how to use this more complete information theory to hone your intuitions, satisfying your infomania all the more completely.