Tuesday, July 14, 2009

Nudge


Nudge: Improving Decisions About Health, Wealth, and Happiness, by Richard Thaler & Cass Sunstein


Excellent book. It should help you think about how you think about complex decisions in a different light, and hopefully in a better and more efficient manner. This book belongs with a bunch of similar books - Predictably Irrational: The Hidden Forces That Shape Our Decisions, Sway: The Irresistible Pull of Irrational Behavior, How We Decide, Influence: The Psychology of Persuasion, Mistakes Were Made (But Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts - that have come out in recent years and speak to basically the same theme: fallacies and shortcomings in the way we think, what drives our decision-making process, and how we can become better aware of these dynamics.

The main point of the book is that small actions (or inactions) can have major consequences down the road, that people underestimate the power of inertia, and that we can be gently 'nudged' into making choices that leave us better off than we would otherwise be. The key here is what Thaler and Sunstein call 'libertarian paternalism', which provides and preserves choice for the end-user (libertarianism) while at the same time helping and nudging the user into making the right choices (paternalism). 'Right' as in choices that leave the person better off, as judged by the user himself.

You do not have to force people into making decisions you want them to make. Nor should you eliminate choices for people. Nor make it hard for them to select something other than what is offered by default. A gentle nudge may be all that is needed. Take the example of the order in which food items are laid out in a school cafeteria. Placing healthier food up front is not eliminating choice, since students can always pick up the Twinkie from the end of the line:
Would anyone object to putting the fruit and salad before the desserts at an elementary school cafeteria if the result were to induce kids to eat more apples and fewer Twinkies? [page 12]
How to make people save more, and more efficiently, for retirement; how to reduce smoking; how to help people avoid bingeing on credit cards; how to improve the rate of organ donation; how to make people eat better; how to improve the US government's prescription-drug program for senior citizens; how to reduce pollution; even how to improve marriage. It turns out all these and more can be improved via nudges.


None of these topics is controversy free. Indeed, people more often than not hold very strong and definite views on them. They do not take kindly to even being 'nudged' towards alternative proposals. It is probably inevitable that suggestions, proposals, and arguments put forth by the authors will make some readers see a hidden political agenda or unwanted insinuations. To the authors' credit, they conspicuously try to avoid taking political stands, and make sure to avoid criticisms that could be interpreted as political. They mostly succeed. I will leave it to you, the reader, to read the book and figure out which way they lean, or seem to lean. That judgment will, I suspect, be based in large part on your own political inclinations.

Paternalism does not mean you do not present the end-user with choice. As others have argued (in The Paradox of Choice: Why More Is Less, for example), a plethora of choices does not always lead to more satisfaction. It is somewhat counter-intuitive but true that too many choices can lead to a sense of frustration and dissatisfaction. However:
To eliminate complexity is to stifle innovation. A better approach is to improve transparency and disclosure. [page 259]
Disclosure is akin to sunlight - just as sunlight disinfects, the light of disclosure can itself weed out many undesirable practices.
The false assumption is that almost all people, almost all of the time, make choices that are in their best interest or at the very least are better than the choices that would be made by someone else. [page 10]
In other words, we argue for self-conscious efforts, by institutions in the private sector and also by government, to steer people's choices in directions that will improve their lives. [Page 5]
.. never underestimate the power of inertia. [page 9]

Heuristics
Although rules of thumb can be very helpful, their use can also lead to systematic biases. This insight, first developed decades ago by two Israeli psychologists, Amos Tversky and Daniel Kahneman (1974), has changed the way psychologists (and eventually economists) think about thinking. Their original work identified three heuristics, or rules of thumb - anchoring, availability, and representativeness - and the biases that are associated with each.
Anchoring
This process is called 'anchoring and adjustment.' You start with some anchor, the number you know, and adjust it in the direction you think is appropriate. So far, so good. The bias occurs because the adjustments are typically insufficient. [page 15, 16]
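To make the 'insufficient adjustment' point concrete, here is a minimal simulation sketch of my own, not anything from the book; the city, population figures, anchors, and adjustment rate are all made-up assumptions:

```python
import random

def anchored_estimate(anchor, true_value, adjustment_rate=0.6, noise=0.1):
    """Start from the anchor and adjust toward the true value, but only
    partially: adjustment_rate < 1.0 models the typically insufficient
    adjustment, and noise adds some individual variation."""
    estimate = anchor + adjustment_rate * (true_value - anchor)
    return estimate * (1 + random.uniform(-noise, noise))

random.seed(1)
true_population = 600_000                      # hypothetical city
low_anchor, high_anchor = 200_000, 5_000_000   # anchors given to two groups

low_group = [anchored_estimate(low_anchor, true_population) for _ in range(1000)]
high_group = [anchored_estimate(high_anchor, true_population) for _ in range(1000)]

print(f"low-anchor group mean estimate : {sum(low_group) / len(low_group):,.0f}")
print(f"high-anchor group mean estimate: {sum(high_group) / len(high_group):,.0f}")
# Both groups move toward the truth but stop short of it, so their answers
# stay biased toward whichever anchor they started from.
```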
Availability
... most people use what is called the availability heuristic. They assess the likelihood of risks by asking how readily examples come to mind. If people can easily think of relevant examples, they are far more likely to be frightened and concerned than if they cannot.  ... Homicides are more available than suicides, and so people tend to believe, wrongly, that more people die from homicide. [page 27]
Accessibility and salience are closely related to availability, and they are important as well. If you have personally experienced a serious earthquake, you're more likely to believe that an earthquake is likely than if you read about it in a weekly magazine. [page 27]
... Such misperceptions can affect policy, because governments are likely to allocate their resources in a way that fits with people's fears rather than in response to the most likely danger. [page 28]
Representativeness
... when asked to judge how likely it is that A belongs to category B, people answer by asking themselves how similar A is to their image or stereotype of B.
.... Again, biases can creep in when similarity and frequency diverge. [page 29]

... loss aversion operates as a kind of cognitive nudge, pressing us not to make changes, even when changes are very much in our interests. [page 37]

Don Norman has written quite eloquently about the design (and misdesign) of everyday objects in his book The Design of Everyday Things, and Nudge also refers to badly designed products, stating:
... they violate a simple psychological principle with a fancy name - stimulus response compatibility. The idea is that you want the signal you receive (the stimulus) to be consistent with the desired action. When there are inconsistencies, performance suffers and people blunder. [page 90]

Therefore, the principles of good (and bad) choice architecture are:
1. Defaults
2. Expect Error
Leaving the gas cap behind (when refueling at a gas station) is a special kind of predictable error psychologists call a 'postcompletion' error. The idea is that when you have finished your main task, you tend to forget things relating to previous steps.
Examples the authors cite are leaving your ATM card behind after collecting your cash, leaving the originals in the photocopier, forgetting to include an attachment when sending an email, and so on.
In the case of ATM cards, we now have machines that beep loudly should you leave the card behind, or that in some cases require you to remove the card before the cash is dispensed. For car fuel caps, the cap is now attached to the fuel tank by a cable, so the chance that you will leave it on the hood of the car is almost zero (unless you happen to break the cable connecting the cap to the tank).

3. Give Feedback
4. Understanding Mappings
5. Structure Complex Choices
When we face a small number of well-understood alternatives, we tend to examine all the attributes of all the alternatives and then make trade-offs whenever necessary.
But when choices become too numerous,
... one strategy to use is what Amos Tversky (1972) called 'elimination by aspects'. Someone using this strategy first decides what aspect is most important, then eliminates all the alternatives that do not come up to the standard. The process is repeated, attribute by attribute, until either a choice is made or the set is narrowed down enough to switch over to a compensatory evaluation of the 'finalists.' [page 104]
A small code sketch of this strategy appears right after the list.
6. Incentives
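Here is the promised minimal sketch of 'elimination by aspects'; it is my own illustration rather than anything from the book, and the product, attribute names, cutoffs, and order of importance are all hypothetical:

```python
# A toy catalogue of options; every attribute value here is invented.
laptops = {
    "A": {"price": 900,  "battery_hours": 10, "weight_kg": 1.2},
    "B": {"price": 1400, "battery_hours": 14, "weight_kg": 1.0},
    "C": {"price": 700,  "battery_hours": 6,  "weight_kg": 2.1},
    "D": {"price": 1100, "battery_hours": 12, "weight_kg": 1.4},
}

# Aspects in decreasing order of importance, each with a pass/fail cutoff.
aspects = [
    ("price",         lambda v: v <= 1200),  # most important aspect first
    ("battery_hours", lambda v: v >= 8),
    ("weight_kg",     lambda v: v <= 1.5),
]

def eliminate_by_aspects(options, aspects, shortlist_size=2):
    """Filter attribute by attribute until either one option is left or the
    set is small enough to hand over to a full, compensatory comparison."""
    remaining = dict(options)
    for attribute, passes in aspects:
        remaining = {name: opt for name, opt in remaining.items()
                     if passes(opt[attribute])}
        if len(remaining) <= shortlist_size:
            break
    return remaining

finalists = eliminate_by_aspects(laptops, aspects)
print("Finalists for the full trade-off comparison:", sorted(finalists))
# The price cutoff eliminates B, the battery cutoff eliminates C; A and D
# survive as the 'finalists' to be compared attribute by attribute.
```

The shortlist_size cutoff is simply my way of modelling the switch to a compensatory evaluation of the 'finalists' that the quote describes.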
As we have seen, people are most likely to need nudges for decisions that are difficult, complex, and infrequent, and when they have poor feedback and few opportunities for learning.
But the potential for beneficial nudging also depends on the ability of the Nudgers to make good guesses about what is best for the Nudgees. [page 247]
Pollution is an example of a topic where decisions taken today have consequences far out in the future. This is one situation where people will find it difficult to make choices that leave them better off. There is also the belief that one's individual act of polluting has too small an impact to have any consequence. Perhaps. But taken in the aggregate, these actions can lead to catastrophic results for the environment in the future.


Another example is smoking, where the action of smoking is in the present, but the consequence, an increased probability of developing cancer, is so far out in the future that self-serving biases spring up, making good decision making almost impossible. The presence of hundreds of millions of smokers worldwide is proof that people have difficulty making the right decision.
 
When it comes to the issue of government employing nudges, one good test of whether it should happen or not is what the authors describe here:
... we endorse what the philosopher John Rawls (1971) called the publicity principle. In its simplest form, the publicity principle bans government from selecting a policy that it would not be able or willing to defend publicly to its own citizens. [page 244]
...
In the abstract, subliminal advertising does seem to run afoul of the publicity principle. People are outraged by such advertising because they are being influenced without being informed of that fact. [page 245]
Pollution
People who celebrate freedom of choice are well aware that when 'transaction costs' (the technical term for the costs of entering into voluntary agreements) are high, there may be no way to avoid some kind of government action, even of the coercive kind. When people are not in a position to make voluntary agreements, most libertarians tend to agree that government might have to intervene.
...
If you engage in environmentally costly behavior next year, through your consumption choices, you will probably pay nothing for the environmental harms that you inflict. This is what is often called a 'tragedy of the commons.'
...
The second problem that contributes to excessive pollution is that people do not get feedback on the environmental consequences of their actions. [page 195]





© 2009, Abhinav Agarwal. All rights reserved.