Monday, March 12, 2012

Recognition-Primed Decision-Making

I'm now going to look at Gary Klein's book Sources of Power: How People Make Decisions. I mentioned Klein a month or two ago when I was talking about Daniel Kahneman, who writes about biases and mistakes in decision-making.

Kahneman thinks in general you can't trust expert judgment. There is little or no evidence stockpickers or forecasters actually deliver any value. But he describes a long series of exchanges he has had with Klein, who thinks that in many cases expertise does count for something. Sometimes experienced firefighters just know when it is time to run out of a burning building, for example.

After reading Klein's book, I am inclined to side with him. He argues that perception matters.
His first main point is that real decision-making in the field - "naturalistic decision-making" - is not the same as lab experiments:
Features that help define a naturalistic decision-making setting are time pressure, high stakes, experienced decision makers, inadequate information (information that is missing, ambiguous, or erroneous), ill-defined goals, poorly defined procedures, cue learning, context (e.g., higher-level goals, stress), dynamic conditions, and team coordination (Orasanu and Connolly 1993). ... In contrast, in most laboratory studies, experience is considered a complicating factor. Subjects who know something about the task may have preconceived notions that could get in the way, or their strategies could distort the results. Therefore, subjects are given totally novel tasks to make sure all of them start with the same level of experience: zero.
Lab experiments usually strip out experience. But it is experience which makes the real difference in the effectiveness of decision-making.
Well, OK, but what is it about "experience" which makes a difference, then? It is a matter of perception. Skilled decision-makers do not generate lists of options and then choose between them, as rational choice models have it. They recognize patterns. They just see what has to be done.
Decision makers recognize the situation as typical and familiar-a typical garage fire, or apartment building fire, or factory fire, or search-and-rescue job-and proceed to take action. They understand what types of goals make sense (so the priorities are set), which cues are important (so there is not an overload of information), what to expect next (so they can prepare themselves and notice surprises), and the typical ways of responding in a given situation. By recognizing a situation as typical, they also recognize a course of action likely to succeed.
So Klein says he developed the Recognition-Primed Decision (RPD) model to try to explain the way experts in the field actually thought. Skilled firefighters, pilots, or army commanders did not think through probability trees.
We began to realize that the force of our findings was in their obviousness. Of course the RPD strategy was the strategy used most frequently.
It was a surprise to academics who studied decision-making. But of course it is what we recognize in real life as intuition.
This is one basis for what we call intuition: recognizing things without knowing how we do the recognizing.
Experts in a field see things that others do not. It is not just a matter of knowing more facts.
... expertise is learning how to perceive. The knowledge and rules are incidental.
The accumulation of experience does not weigh people down; it lightens them up. Experts see the world differently. They see things the rest of us cannot. Often experts do not realize that the rest of us are unable to detect what seems obvious to them.
Like what?
There are many things experts can see that are invisible to everyone else:
• Patterns that novices do not notice.
• Anomalies - events that did not happen and other violations of expectancies.
• The big picture (situation awareness).
• The way things work.
• Opportunities and improvisations.
• Events that either already happened (the past) or are going to happen (the future).
• Differences that are too small for novices to detect.
• Their own limitations.
Experts have a sense of the way things ought to work.
Experts see inside events and objects. They have mental models of how tasks are supposed to be performed, teams are supposed to coordinate, equipment is supposed to function. This model lets them know what to expect and lets them notice when the expectancies are violated. These two aspects of expertise are based, in part, on the experts' mental models.
In fact, economics-style rational decision-making can be disastrous:
Hyperrationality is a mental disturbance in which the victim attempts to handle all decisions and problems on a purely rational basis, relying on only logical and analytical forms of reasoning. In the initial stages, this condition can be mistaken for a healthy development of critical thinking. Only later do we observe an unwillingness to act without a sound, empirically or logically supported basis. The final stages degenerate into paralysis by analysis.
And this is where he disagrees strongly with Kahneman.
Those who favor analytical approaches to decision making believe poor decisions are caused by biases in the way we think. Naturalistic decision-making researchers disagree. We tend to reject the idea of faulty reasoning and try to show that poor decisions are caused by factors such as lack of experience.
But he acknowledges there are limits to the applicability of the RPD model. There are many things in which it is very difficult or impossible to develop genuine intuition or expertise. Klein says
Jim Shanteau (1992) has suggested that we will not build up real expertise when:
• The domain is dynamic.
• We have to predict human behavior.
• We have less chance for feedback.
• The task does not have enough repetition to build a sense of typicality.
• We have fewer trials.
One reason I liked the book so much is it reminds me of other concrete studies of policy decisions in actual situations, most especially Robert Jervis's Perception and Misperception in International Politics (Center for International Affairs, Harvard University). Indeed, I think perception is the key to risk, rather than the sterile measures of volatility which are the core understanding of risk in financial economics.
The key issue about Klein's argument is scope, however. The approach works in some kinds of situations but not others. And that is the most difficult thing for many people to understand, trained as we are in the notion of universal rules.
Firefighters may have genuine expertise. Stockpickers still do not.


Labor Force Participation Has Been Declining As Long As We've Been Tracking The Data

So says Slate, which has a chart showing the male participation rate falling from over 87% in 1948 to 73% now.

The trend waxes and wanes to an extent, for what I imagine are a variety of reasons. But the secular trend is clear and the reason for it seems obvious. There's more to life than work and gross domestic product. America is better at doing stuff in 2012 than it was in 1962. We've used that improved know-how in part to amass more stuff per capita than people had in 1962. But we've also used that improved know-how to enjoy much cleaner air than prevailed in 1962. And we've used that improved know-how to do less work than people did in 1962.