If you’re reading this post, you’re probably interested in forecasting future outcomes and basing strategy on those forecasts. If that’s true, then you’ve probably fallen victim to the Rational Actor Error.
The entire field of economics is built on the presumption that people act rationally. So we lay out our decision models and employ backward induction on that premise, determining which actions we should take on the assumption that our friends on the other side of the table are doing the same. And then we are continually surprised when the other party takes us down an illogical path.
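The backward-induction reasoning described above can be sketched in a few lines. The game tree and payoffs below are purely illustrative (they are not from any example in this post); the point is the mechanism: each player, working from the end of the game backward, picks the branch that maximizes their own payoff.

```python
# Backward induction on a toy two-player game tree.
# Leaves hold payoff tuples (player 0, player 1); internal nodes
# name the player who moves there. All numbers are illustrative.

def backward_induct(node):
    """Return the payoff pair reached if both players act rationally."""
    children = node.get("children")
    if not children:                     # leaf: the game is over
        return node["payoffs"]
    # The mover picks the child whose induced outcome is best for them.
    outcomes = [backward_induct(c) for c in children]
    return max(outcomes, key=lambda p: p[node["player"]])

tree = {
    "player": 0,  # first mover
    "children": [
        {"player": 1, "children": [{"payoffs": (3, 1)}, {"payoffs": (0, 0)}]},
        {"player": 1, "children": [{"payoffs": (2, 2)}, {"payoffs": (1, 3)}]},
    ],
}
print(backward_induct(tree))  # (3, 1)
```

The trouble the post describes starts when the person across the table is not running this computation at all.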
This happened to me just the other day. Again, I was first surprised, then frustrated, then embarrassed that I had fallen into the same trap yet again. Obviously, people are not machines. We make mistakes. We make bad decisions. It’s part of being human. But when we study decision analysis and economics, we seem to forget this basic truth. As we improve our own decision-making capabilities, we start to assume that everyone else has learned what we now know. This leads us down a paradoxical path where we must simultaneously study what decisions we should make and what decisions others will make. To do so, we must assess not only our counterparts’ options and payoffs, but also their knowledge of decision analysis and the extent of their emotional bias.
There are very good reasons why we have these biases. They evolve naturally out of the social structure our civilization has grown into. Most notable are the tribal constructs that place us in repeat interactions with others. These repeated interactions mean that our actions today affect our ability to achieve future success. Our reputations follow us into future decisions, and our trustworthiness becomes a valuable commodity. So our default emotional state encourages us to punish cheating and reward fair dealing. Even if a deal would benefit both parties, we get bogged down in how it will affect our ability to negotiate in the future. For example, vampire bats can often be seen sharing their take after a night of hunting. On its face, this seems illogical. Why would one bat give up part of its hard-earned haul to another?
It turns out these hunting expeditions are a numbers game. Sometimes the bats have success; other times they come up empty. The social structure of the colony encourages sharing so that when you strike out, you don’t starve. But occasionally you can observe certain bats being ostracized: the others simply won’t share with them. This is often a tit-for-tat exercise to punish a bat for not sharing on an earlier occasion. Or sometimes a bat brings in no blood for many nights, calling into question the effort it is putting in. This natural tendency to punish permeates many social species, including humans.
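The tit-for-tat pattern seen in the bat colony is a well-studied strategy from repeated games: cooperate first, then mirror whatever the partner did last. A minimal sketch, with the move names ("share", "hoard") chosen purely for illustration:

```python
# Tit-for-tat: cooperate on the first round, then copy the
# partner's previous move. Move labels are illustrative.

def tit_for_tat(history):
    """history: list of the partner's past moves ('share' or 'hoard')."""
    if not history:
        return "share"      # start by cooperating
    return history[-1]      # then mirror the partner's last move

# A partner that hoards once is punished exactly once, and
# cooperation resumes as soon as the partner shares again.
partner_moves = ["share", "hoard", "share"]
responses = [tit_for_tat(partner_moves[:i]) for i in range(len(partner_moves) + 1)]
print(responses)  # ['share', 'share', 'hoard', 'share']
```

The punishment is costly to the punisher too, which is exactly the behavior the rest of this post explores.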
The Ultimatum Game
Perhaps the most famous observation of this phenomenon is what has become known as the “ultimatum game.” In this experiment, two people are paired and one is given 10 units of something of value (with college students, it’s usually dollar bills; with children, it’s usually chocolate). That person then offers the other party some of those units. If the offer is accepted, the split stands. But if it is rejected, neither party gets anything.
When this game is played without any prompting, training, or further instruction, people tend to offer 5 units to the other party. This is the natural tendency that society has drilled into us: “equal is fair,” especially when the object in question is a windfall. Every once in a while, someone will get cute and offer 4, which is usually accepted, though it leaves the receiver with a bad taste in their mouth about the other person’s character. This tendency to offer 4 or 5 is found regardless of age, gender, or education level, with one exception: people trained in economics or related fields tend to offer less.
The rationale is simple: 1 is better than 0, so a rational person should accept any offer. Therefore, these analytically minded people routinely offer 1, keeping 9 for themselves. When other analytically minded people are paired with them, they accept, again reasoning that 1 is better than 0. And they are content, knowing that when the shoe is on the other foot, they will be the victor.
However, when these people are paired with regular people, they are surprised when the other party rejects their offer. It sends them reeling. “Why would any rational person reject receiving 1, knowing that means they get 0?” It baffles them. They get angry, confused, and blame the other person for being illogical. What they fail to realize is that the rejector is building a reputation. The cost of losing 1 (or even 2 or 3) is worth making the point that you had better treat them fairly in the future. This outcome prevails even when people are told that they will never interact with one another again, because our culture has already ingrained in us this sense of fairness and this willingness to punish others, even when it hurts us as well.
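The gap between the two kinds of responders is easy to make concrete. A minimal sketch of the game’s payoff rule, where the fairness threshold of 4 is an illustrative assumption drawn from the typical offers described above, not an empirical estimate:

```python
# Ultimatum game: the proposer splits 10 units; if the responder
# rejects, both get 0. The threshold of 4 is illustrative.

TOTAL = 10

def play(offer, accepts):
    """Return (proposer_payoff, responder_payoff) for one round."""
    if accepts(offer):
        return TOTAL - offer, offer
    return 0, 0          # rejection wipes out both payoffs

def rational(offer):
    return offer >= 1    # "1 is better than 0": accept anything

def fair_minded(offer):
    return offer >= 4    # punishes lowball offers at a cost to itself

print(play(1, rational))     # (9, 1): the lowball offer is accepted
print(play(1, fair_minded))  # (0, 0): rejected to make a point
print(play(5, fair_minded))  # (5, 5): the default even split
```

Against a fair-minded responder, the “rational” offer of 1 is the worst move available, which is exactly the surprise the post describes.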
This is really an emotional bias, and it has cousins in our minds. We all have deep-rooted ideas that create our default reactions. We cling to the positions we take as a way of signaling to others that our positions should not be discarded in the future. We hurt ourselves in the name of honor, equity, and reputation. Too often we forget that we are human, prone to making mistakes and to standing on principle. I think we should all remember that our emotions matter, even when they shouldn’t. In my recent case, I played a very bad strategy. I thought that reaching a point where my preferred outcome was the logical choice would mean success for my client. I forgot to consider that how we got to that point would continue to influence the people on the other side of the table. They are now unable to assess the situation from a clean slate, and they are willing to punish me for the way we got there, regardless of the value of the decision itself. I’ll learn from this experience. I hope you do as well.