In this mindware strategy tutorial you will learn:
- How to identify System 1 cognitive biases in your decision making
- How to use System 2 expected utility calculations to overcome these biases to make better decisions
Risk-averse for gains, risk-seeking for losses
Which do you choose?
A. Gamble 1: On the toss of a coin, you win $10 if you get heads or win $50 if you get tails.
B. Gamble 2: You will get $30 for certain.
C. Either Gamble 1 or Gamble 2. They are equivalent.
Most people in this situation instinctively (using System 1 thinking) choose Gamble 2 (B).
That is, we avoid the risk of ending up with less ($10) than the $30 we know we can get for certain. This is called being ‘risk-averse for gains’.
But now what about this choice?
A. Gamble 1: On the toss of a coin, you lose $10 if you get heads or lose $50 if you get tails.
B. Gamble 2: You lose $30 for certain.
C. Either Gamble 1 or Gamble 2. They are equivalent.
Most people given this choice instinctively (using System 1 thinking) choose Gamble 1 (A).
That is, we take a risk for the chance of losing less ($10) than the $30 we would otherwise lose for certain. This is called being ‘risk-seeking for losses’.
Overcoming the bias with System 2 thinking
But wait! These two choices are actually identical in terms of expected utilities; the only difference is the sign. So if we choose Gamble 2 in the first choice, we should also choose it in the second: you are contradicting yourself if you choose Gamble 2 in the first and Gamble 1 in the second (or vice versa).
We need a System 2 thinking override here! Let’s use our working memory to apply expected utility theory.
In the first choice above, the expected utilities are:
A. (0.50 x $10) + (0.50 x $50) = $5 + $25 = $30
B. 1.00 x $30 = $30
The same expected utilities for both options! So the correct answer should be C.
But now let’s look at the second choice above.
A. (0.50 x -$10) + (0.50 x -$50) = -$5 + -$25 = -$30
B. 1.00 x -$30 = -$30
Exactly the same expected utilities; the only difference is the sign.
Here again of course, you should choose answer C.
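The expected-utility arithmetic above can be checked with a short script. This is a minimal sketch; the `expected_value` helper is written for illustration, not taken from any library.

```python
def expected_value(outcomes):
    """Sum of probability-weighted payoffs, given (probability, payoff) pairs."""
    return sum(p * payoff for p, payoff in outcomes)

# First choice (gains)
gamble_1_gain = expected_value([(0.50, 10), (0.50, 50)])    # $5 + $25 = $30
gamble_2_gain = expected_value([(1.00, 30)])                # $30

# Second choice (losses): same magnitudes, opposite sign
gamble_1_loss = expected_value([(0.50, -10), (0.50, -50)])  # -$5 + -$25 = -$30
gamble_2_loss = expected_value([(1.00, -30)])               # -$30

print(gamble_1_gain == gamble_2_gain)  # True
print(gamble_1_loss == gamble_2_loss)  # True
```

In both choices the two gambles have identical expected utilities, so the consistent (System 2) answer is C each time.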
To be rational it is critical that we are consistent for both ‘gain’ scenarios and ‘loss’ scenarios.
Outcome bias
When we judge the quality of a past decision by its ultimate outcome instead of based on the procedure used for making the decision, given the information we had at the time, this is called outcome bias. No decision maker ever knows whether or not a calculated risk will turn out for the best, and a good decision is based on calculated risks, not on the outcomes.
In the last two problems (questions 3 and 4), your answer should have been the same for both, irrespective of the outcome that ended up occurring. The eventual outcome (survival or death of the patient) shouldn’t affect how you judge the quality of the decision.