One of my favorite decision theory jokes goes like this: A man is boasting about his successful marriage. A friend asks him what’s the secret. “It’s a simple division of labor,” the happy husband says. “She makes all the little decisions and I make all the big ones.”

“For example?” the friend asks.

“She decides where we should live, where I should work, where the kids should go to school, and how to invest our money, and I decide whether the U.S. should invade Iraq.”

Her little decisions and his big ones are distinguished by more than just scale. The little ones are practical; the big ones are abstract. The little ones have direct consequences. Assuming the husband is not a government leader, the big ones have no practical consequence.

Lately I’ve been interested in another difference. Often, the bigger the decision, the less potential for verifying whether it is made correctly, and the less confident we should be about it.

As I noted in the last couple of columns, it’s relatively easy by trial and error to figure out that raccoons are trespassing in the kitchen and upsetting and eating the garbage. It’s harder to figure out what’s eating at you psychologically, upsetting your internal garbage. You can guess at reasons, but how would you test whether your guesses are correct? The more complex the potential interactions involved, the harder it is by trial and error to figure out what causes what. Deciding whether the U.S. should invade Iraq is a big decision—big, as in ungainly and complex. It’s hard to know what the consequences will be.

By “complex” I mean something specific, well understood in the field of complexity theory. To illustrate, imagine that you have a computer simulation of a forest in which you can set the average distance between trees. If the trees are set close together, then the probability is very high that what affects one tree will affect its neighbors. A fire in one tree will soon spread uniformly to all the trees in the forest, producing a simple, uncomplex burn pattern. If instead the trees are set far enough apart, then what affects one tree won’t affect the others. A fire in one tree doesn’t spread, and again the burn pattern is simple, not complex.

Complexity arises right at the transition between this low and high probability of trees’ affecting one another. When the trees aren’t entirely interdependent or independent, but instead are partially interdependent, a fire started in one tree may spread in a complex pattern to the trees around it. The more trees and the more ambiguous the influences of one tree upon another, the harder it is to predict the path of a fire through the forest.
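The forest model above can be sketched in a few lines of code. This is a toy illustration, not a faithful forest simulation: it uses a one-dimensional chain of trees, and a single probability `p` stands in for tree spacing (the chance that fire jumps from one tree to its neighbor). The names and parameters here are my own invention for the sketch.

```python
import random

def burn(n, p, seed=0):
    """Spread a fire along a line of n trees, jumping to each
    neighbor with probability p (a stand-in for tree spacing).
    Returns the number of trees burned."""
    rng = random.Random(seed)
    burned = {0}              # fire starts at tree 0
    frontier = [0]
    while frontier:
        tree = frontier.pop()
        for nb in (tree - 1, tree + 1):
            if 0 <= nb < n and nb not in burned and rng.random() < p:
                burned.add(nb)
                frontier.append(nb)
    return len(burned)

# Sparse forest (low p): fire rarely spreads beyond a few trees.
# Dense forest (high p): near-total burn, every time.
# Near the transition, burn sizes swing wildly from run to run.
for p in (0.1, 0.5, 0.9):
    sizes = [burn(100, p, seed=s) for s in range(20)]
    print(f"p={p}: burns ranged from {min(sizes)} to {max(sizes)} trees")
```

At the extremes the outcome is simple and predictable; it is only in the middle range, where trees are partially interdependent, that the burn pattern becomes hard to predict, which is the point of the passage above.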

Complexity happens at the edge of interactivity—and that’s precisely the edge we live on with one another.

I think the U.S. should not have invaded Iraq, but—unlike some who share my opinion—I have to admit that it’s not a no-brainer. When I wish to be a political force, my respect for the complexity is a disadvantage. People who respect the complexity have more reason to doubt their own opinions. As the world has become more complex, more of us doubt, and therefore either voice subtler opinions or don’t voice them at all. The people who tend to voice opinions are those who can ignore the complexity and insist that they are right. In this respect, complexity undermines democracy. The more complex the world becomes, the more democracies are driven by the confident few who can ignore the complexity.

Psychologists measure confidence levels—people’s estimates of how likely it is that they have answered a question correctly—and have come up with some fascinating work on the topic in recent years. For example, in controlled experiments, confidence levels go up when people are standing in front of funeral homes or thinking about death.

Confidence levels are clearly influenced by many factors having nothing to do with the complexity of the situation involved. People can be as confident that the U.S. should invade Iraq as they are that the car will start tomorrow morning, even though invading Iraq is complex compared to starting a car.

Your confidence levels have an enormous effect on how actively you’ll promote your opinion. If you’re 100 percent sure you’re right, there’s nothing to lose and a lot to gain in imposing your opinion on others. If instead your confidence level about a particular opinion is low, your doubt constrains your assertiveness.

That guy just insulted you, or maybe you’re just being too sensitive. If you’re sure he insulted you, you should assert yourself. If you’re being too sensitive, you should just take it in stride. But if you can’t tell whether he insulted you or you’re being too sensitive, you’re stuck in the middle, torn about what to do, because asserting yourself and taking it in stride are mutually undermining acts. It’s difficult to do both at once.

Auto mechanics is an intrinsically high-confidence-level profession as compared to, say, psychology. Cars are human-made for straightforward reliability. At the level at which they operate there really are no mysteries that can’t be solved by trial and error. But in psychology it’s far harder to know what influences what, and there are so very many influences. It’s the realm of the big, ungainly, complex, and intractable. Confidence levels should be lower.

To get a sense of how attitudinal and circumstantial confidence levels interact, consider the metaphysician who posits a new paranormal force, and then “proves” its existence with an instance of it at work: “I’m sure that people’s thoughts can affect things at a distance because I knew someone who imagined a train wreck and the next week there was one.”

If human thought could cause a train wreck, we’d be living in an impossibly complex universe. Confidence levels should plummet, not rise. A more realistic assertion would be, “I knew someone who imagined a train wreck and the next week there was one. As a result, I’m not sure of anything anymore. The world is way too complex, because really, if someone’s thoughts can cause a train wreck, there’s no knowing what causes what.”

If confidence levels rise just as we posit new interconnectedness and therefore new complexity, it’s clearly attitude and not circumstance that suggests greater confidence.

On the big decisions, I’ve believed some pretty silly things over the years. I take some comfort in recognizing that all of us are likely to believe silly things about big, complex matters. Philosophers, theologians, metaphysicians, and psychologists are notorious for following ideas to illogical conclusions. At face value they look dumber than auto mechanics, doctors, or bankers, who seem to possess knowledge that is much more grounded. But a lot of it is circumstantial. The big, complex questions don’t impose a lot of corrective feedback. On the big decisions, you can believe, without consequence, whatever comes to mind.