We depend primarily on two hardwired processes for decision making. Our brains assess what's going on using pattern recognition, and we react to that information - or ignore it - because of emotional tags that are stored in our memories. Both of these processes are normally reliable; they are part of our evolutionary advantage. But in certain circumstances, both can let us down.
Pattern recognition is a complex process that integrates information from as many as 30 different parts of the brain. Faced with a new situation, we make assumptions based on prior experiences and judgments. Thus a chess master can assess a game and choose a high-quality move in as little as six seconds by drawing on patterns he or she has seen before. But pattern recognition can also mislead us. When we're dealing with seemingly familiar situations, our brains can cause us to think we understand them when we don't.
What happened to Matthew Broderick during Hurricane Katrina is instructive. Broderick had been involved in operations centers in Vietnam and in other military engagements, and he had led the Homeland Security Operations Center during previous hurricanes. These experiences had taught him that early reports surrounding a major event are often false: It's better to wait for the ground truth from a reliable source before acting. Unfortunately, he had no experience with a hurricane hitting a city built below sea level.
By late on August 29, some 12 hours after Katrina hit New Orleans, Broderick had received 17 reports of major flooding and levee breaches. But he also had gotten conflicting information. The Army Corps of Engineers had reported that it had no evidence of levee breaches, and a late afternoon CNN report from Bourbon Street in the French Quarter had shown city dwellers partying and claiming they had dodged the bullet. Broderick's pattern-recognition process told him that these contrary reports were the ground truth he was looking for. So before going home for the night, he issued a situation report stating that the levees had not been breached, although he did add that further assessment would be needed the next day.
Emotional tagging is the process by which emotional information attaches itself to the thoughts and experiences stored in our memories. This emotional information tells us whether to pay attention to something or not, and it tells us what sort of action we should be contemplating (immediate or postponed, fight or flight). When the parts of our brains controlling emotions are damaged, we can see how important emotional tagging is: Neurological research shows that we become slow and incompetent decision makers even though we can retain the capacity for objective analysis.
Like pattern recognition, emotional tagging helps us reach sensible decisions most of the time. But it, too, can mislead us. Take the case of Wang Laboratories, the top company in the word-processing industry in the early 1980s. Recognizing that his company's future was threatened by the rise of the personal computer, founder An Wang built a machine to compete in this sector. Unfortunately, he chose to create a proprietary operating system despite the fact that the IBM PC was clearly becoming the dominant standard in the industry. This blunder, which contributed to Wang's demise a few years later, was heavily influenced by An Wang's dislike of IBM. He believed he had been cheated by IBM over a new technology he had invented early in his career. These feelings made him reject a software platform linked to an IBM product even though the platform was provided by a third party, Microsoft.
Why doesn't the brain pick up on such errors and correct them? The most obvious reason is that much of the mental work we do is unconscious. This makes it hard to check the data and logic we use when we make a decision. Typically, we spot bugs in our personal software only when we see the results of our errors in judgment. Matthew Broderick found out that his ground-truth rule of thumb was an inappropriate response to Hurricane Katrina only after it was too late. An Wang found out that his preference for proprietary software was flawed only after Wang's personal computer failed in the market.
Compounding the problem of high levels of unconscious thinking is the lack of checks and balances in our decision making. Our brains do not naturally follow the classical textbook model: Lay out the options, define the objectives, and assess each option against each objective. Instead, we analyze the situation using pattern recognition and arrive at a decision to act or not by using emotional tags. The two processes happen almost instantaneously. Indeed, as the research of psychologist Gary Klein shows, our brains leap to conclusions and are reluctant to consider alternatives. Moreover, we are particularly bad at revisiting our initial assessment of a situation - our initial frame.
An exercise we frequently run at Ashridge Business School shows how hard it is to challenge the initial frame. We give students a case that presents a new technology as a good business opportunity. Often, a team works many hours before it challenges this frame and starts, correctly, to see the new technology as a major threat to the company's dominant market position. Even though the financial model consistently calculates negative returns from launching the new technology, some teams never challenge their original frame and end up proposing aggressive investments.