A while ago, I mentioned semiotics and information theory, both of which are studies of how information, meaning, sign reading, and, yes, mind reading happen. Semiotics is the more philosophical approach. Information theory, more scientific and quantified, is the brainchild of Claude Shannon, a mathematician working for Bell Labs. At its core is a measure of information content as the probability of a signal being sent in comparison to all the signals that could have been sent.

Let’s say I decided to write a really short essay this week: a single letter from A to Z. Whichever letter I sent would be one of the 26 letters I could have sent, a probability of 1/26, and therefore its Shannonian information content would be log₂ 26, about 4.7 bits.*

Information theory provides a powerful toolkit for the computer and telecommunications industries, as well as insights for a broad array of sciences. Still, to provide a precise mathematical measurement of information, Shannon had to leave out information’s most important aspect: what it’s about.

See, here’s the problem: A signal is not its meaning. These clusters of letters I send you are, by themselves, just squiggles. If you didn’t read English, they would be utterly meaningless. And the same is true for all signals, both the ones we actively send and the ones we sense in our environment. (Hmmm . . . looks like rain!) The medium is not the message. Information is inseparable from the mind of the beholder. Information is a relationship, not a thing: a relationship that depends on an infinite number of factors related to both its source and its recipient.

Think of it in terms of mind reading. A guy’s eyebrows go up. Now, what’s that about? Well, it could be about an infinite number of things, from a strange sensation in his toe to something you said two weeks ago that he only now misconstrues as an insult, or anything else, depending on what prompts it, but also on what interests you about it. With infinite possible meanings, the signal’s “aboutness” makes for unmanageable math.

All effects, including this guy’s eyebrow raising, are the products of causes that extend back infinitely in time. But it’s even more complex than that. Nothing happens for just one reason, so the causes cannot be tracked backward through a single linear progression. Think of how many factors had to converge to cause a gesture. Indeed, the guy’s eyebrow raising wouldn’t happen if he didn’t have an eyebrow to raise, so its causes really extend back through all of natural history, through all of the evolution of his eyebrow. Not that all of history is likely to be about raising an eyebrow, but technically, where do you draw the line? There is no statute of limitations on aboutness.

So, information theory surrenders to a simplification: Given that signals are about something (never mind what; make it anything), whatever signal gets sent is one of many signals that could have been sent. Concentrate on the signal in relation to the possible signals. Ignore the aboutness. Then you have some tractable math to work with.

The tractable math has distracted science from the aboutness problem. Take cognitive science, a field that studies the mind as though it were just a very elaborate computer. Both computers and minds produce information, right? Wrong. Computers switch zeroes and ones in reliable ways. They are elaborate electrical-signal transformers that would have nothing to do with information without humans to program them and interpret the zeroes and ones that humans program them to transform. Ignoring the aboutness (which depends on the computer programmer and the user) has led cognitive scientists to treat computer signals (zeroes and ones) as information itself.

The same is true for biologists and DNA. We treat DNA as information, but, again, it’s information only in the context of something to interpret it. Otherwise, it’s just a complex set of inert chemicals. And yet biologists will talk about genes for speech, for learning disabilities, for intelligence, ignoring the complex of interpretive factors that turn inert chemicals into meaningful substances for the building of bodies.

By ignoring aboutness, information theory overlooks another aspect of information exchange that matters a lot in mind reading. We humans have very complex motives regarding the information we send and receive.

Maybe the guy doesn’t really want you to know what his raised eyebrow is about, or maybe he does and you really don’t—for example, if he’s flirting with you and you’re not interested. Or maybe he sort of wants you to know and yet also doesn’t, so he raises it just enough that you say, “What was that about?” and he says, “What was what about?” Or maybe he very much wants you to think it’s about something other than what it’s about. Or he wants to convince you it’s not what you think it’s about. Human information exchange is extremely complex.

I want to close with a few standard moves we make given the infinitude of things a signal could be about, and given our ambivalences about particular information. These are terms I coined a while ago (a few with accompanying limericks—there was a time when I thought limericks were the ideal informational form):

Veridical stance: For all the evidence to the contrary, we often pretend that conversation is really just a Shannonian exchange, that we’re only trying to exchange information about what’s true, that we have no biases, and that if we found a truer idea than one we have cherished and lived by for years, we would show no loyalty to our cherished idea and would drop it instantly for the more accurate idea. The veridical stance is the common social posture that signals, “Just the facts. We all just want the facts.”

Cogito ergo summarize: This is a reminder that among the many other features that make some ideas preferable to others, brevity and simplicity are really appreciated. If I have a choice between a simple inaccurate interpretation and a complex accurate one, I’m likely to choose the simpler one.

I think, therefore, I must shrink
Every thought, or it goes down the sink.
I need short, sweet, and sassy
Affirming and classy
Synced with thoughts that I already think.

Self-authoritative norm: A social standard by which we assume, for the sake of polite social conduct, that the message sender is the final authority on what he or she means to say. If you think someone meant to insult you and they deny it, their word is treated as the last word. The dispute ends with “Don’t tell me what I meant! I know what I meant!” even though it’s not always the case that we know what we mean. Indeed, on sensitive personal matters, our self-defensive biases are so strong that we may be the least qualified authorities on our own motives.

Beliteralling: Belittling an interpretation of something we said by demanding that what we said be taken literally. For example, “Look, all I said was, ‘Your dress looks tighter on you now than last month.’ I didn’t say you were getting fat.”

We assume that words can be flexed,
By inflection, rhetorically hexed
This assumption we shed
With “Hey, I merely said . . .”
As though all we convey is the text.

Aspirational tense: The present tense employed ambiguously to represent either the present state or the wished-for future. For example, when people say, “I’ve really improved,” it is often part description of improvement that has occurred and part prayer that the improvement will occur.

“I’ve changed,” he cried, “and today hence,
my life will start making more sense.”
Is he fixed or still broken?
You can’t tell ’cause he spoke in
The aspirational tense.

Onetruesation: An accusation focused on a single bad motive: “You did that just for selfish reasons.” Onetruesations are dubious, because we rarely do anything for just one reason. Onetruesations are often countered with equally dubious single-motive defenses: “No, I didn’t—I did it just to help you.” Debates about single motives are duels over aboutnesses, with both sides selectively focusing on what something was about without exploring the wider field of possible aboutnesses.

I’ll win the debate if I focus
Our thoughts on my preferred locus
On your basest needs
So my argument leads
With that old “You just want . . .” hocus-pocus.

*This probability is just the input to the core calculation. Shannon’s formula takes its base-2 logarithm to find how many on-off switches you would need in order to send the information. For example, you would need at least five switches to send 26 different signals, since five switches can distinguish 32. (See “Information Theory” on Wikipedia.)
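The footnote’s switch-counting can be sketched in a few lines of Python. This is just an illustration of the arithmetic described above, not anything from the essay itself: the information content of one equally likely signal out of N is log₂ N bits, and rounding up gives the number of on-off switches needed.

```python
import math

# Shannon information content of one equally likely signal out of N:
# I = log2(N) bits, where a "bit" is one of Shannon's on-off switches.
def bits_for(num_signals):
    return math.log2(num_signals)

letters = 26
bits = bits_for(letters)       # about 4.7 bits per letter
switches = math.ceil(bits)     # whole switches needed: 5, since 2**5 = 32 >= 26
print(f"{bits:.2f} bits -> {switches} switches")
```

Running it confirms the footnote: a single letter from a 26-letter alphabet carries about 4.7 bits, so at least five switches are needed.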