What’s the probability that your co-worker will steal your work? That your mom will stress you out on her upcoming visit? That your computer will lock up? That your partner is going to be cold to you tonight? That your son is failing again?

I mean, given the history, what are the odds?

We intuit answers to such questions. We do because they matter. A blow you’re expecting hurts less than one that takes you by surprise. If you know it’s coming, you don’t get thrown as much when it happens. You don’t have to re-think the whole relationship. Knowing the odds is like having a nice tidy box to sort the problem into when it arises. You say, ‘Yeah, they do that sometimes. Got to take the fleas with the dog.’ You know how many fleas to expect.

In statistics such guesstimates are called ‘Bayesian priors’: probabilities estimated prior to an event that take into account past experience, not just for whole populations of co-workers or moms, but for the particular history of a sub-population, even a sub-population of one: your co-worker; your mom. They’re named after the Reverend Thomas Bayes, whose formula for calculating them was published in 1763, two years after his death.

Since Bayesian priors are based on past history, which accumulates with every encounter, they’re subject to change. Mom comes to town and this time she’s particularly hard to deal with. So you change your guesstimate: cranky not 40% but 50% of the time. This is called Bayesian updating. Rather than simply sorting her crankiness into the existing 40% box, you trade in the crankiness box for a larger one that can hold more instances.
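If you want to see the arithmetic behind that resizing, here’s a minimal sketch in Python. The Beta distribution standing in for the ‘crankiness box’, the visit counts, and the `beta_update` function are my illustration, not anything from the essay: the prior encodes roughly a 40% crankiness guess, and one more cranky visit nudges the estimate upward.

```python
from fractions import Fraction

def beta_update(alpha, beta, cranky_visits, calm_visits):
    """One Bayesian update: fold new observations into a Beta prior.

    alpha and beta encode the prior as pseudo-counts: alpha cranky
    visits and beta calm visits "already seen". The posterior mean,
    alpha / (alpha + beta), is the updated crankiness estimate.
    """
    alpha += cranky_visits
    beta += calm_visits
    return alpha, beta, Fraction(alpha, alpha + beta)

# A prior matching the essay's 40% guess: as if 4 cranky, 6 calm visits.
alpha, beta = 4, 6
print(float(Fraction(alpha, alpha + beta)))  # prints 0.4 -- the prior

# Mom's latest visit turns out cranky, so the box gets resized a little.
alpha, beta, estimate = beta_update(alpha, beta, cranky_visits=1, calm_visits=0)
print(float(estimate))  # roughly 0.45 -- the box got a bit bigger
```

One design note: with pseudo-counts, the more history the prior encodes (40 and 60 instead of 4 and 6), the less a single visit moves the estimate, which matches the intuition that long experience is hard to budge.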

For a long time, every time my computer crashed I got agitated. I’d wonder whether the gods were out to get me, whether I should switch to a Mac, whether I shouldn’t be using computers at all, whether I was jinxed or doomed.

When I’d call tech support, I’d express my agitation, hoping to inspire them to pay special attention to my problem. They never did. Our Bayesian priors were different. I expected computers to work 100% of the time. Tech support expects nothing but failing computers. For them, freaking out over a failing computer would be as absurd as a postman going postal upon discovering that, even though he already delivered the mail yesterday, there’s more to deliver today.

Instead of my Bayesian priors rubbing off on tech support, theirs rubbed off on me. Noticing the difference in our priors, I asked myself: really, what percentage of the time is my computer down? About 2%, I estimated. Then I asked whether, with a 2% fail rate, I still wanted to use computers. The answer, of course, was yes.
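That 2% guesstimate is nothing fancier than a frequency count, as in this small sketch (the session log and the 5% tolerance are made up for illustration):

```python
# Hypothetical log of recent computer sessions: True = worked, False = crashed.
sessions = [True] * 49 + [False]

# The guesstimated fail rate is just the observed frequency of crashes.
fail_rate = sessions.count(False) / len(sessions)
print(f"{fail_rate:.0%}")  # prints 2%

# The judgment call from the essay: is that rate low enough to keep going?
still_worth_using = fail_rate <= 0.05  # 5% is an arbitrary tolerance
```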

I built myself a new expectation, a 2% box for computer problems, and ever since, when they come up, I know what to do with them. I just slot them into that little box. I don’t wonder what size box I should have. I say, ‘Yeah, the 2%. Fleas with the dog.’ I stay calmer now because I expect the formerly unexpected. Because I updated my Bayesian priors, I don’t have to update them each time there’s a problem.

There’s a tough judgment call built into our relationship with Bayesian priors. The box is there to put the question to rest, to keep you from reassessing every time something bad happens. Having the box reduces agitation. But having the wrong size box causes agitation too.

So when do you slot the latest problem into the existing box, and when do you re-size the box? When do you stick with your expectations and when do you change them? When do you rely on your Bayesian priors and when do you update them? The dance we do in response to this tough judgment call is complex and fascinating. From the outside it’s even pretty.