
Think Clearly – Beware the Curse of Sloppy Thinking

What do a dramatic decline in fish exports, 160,000 people left homeless, and €200 billion in damages have in common? They are all consequences of a single unfortunate decision: where to locate backup diesel generators.

On 11 March 2011, Japan experienced a severe earthquake off the coast east of the city of Sendai, with a moment magnitude of 9.0. The quake cut off all external power supplies to the Fukushima Daiichi nuclear power plant. As a result, the essential cooling of the reactor cores was threatened, risking a disastrous meltdown.

However, within 10 seconds the plant’s emergency power system automatically kicked in. With power now supplied by twelve diesel generators, the cooling system pumps continued to operate, and the danger appeared to have been averted.

The earthquake also triggered a massive tsunami that flooded the site of the nuclear power plant, filling the basements – where the diesel generators were installed – with water. The seawater knocked out the unprotected equipment, and the generators ground to a halt. Fukushima Daiichi subsequently suffered a total station blackout. The disaster that followed was unprecedented, with an enormous impact on people, the environment and the economy, both inside and outside Japan.


The Fukushima Daiichi example demonstrates how many far-reaching consequences can be traced back to just a few poor decisions (see box). Despite greatly increased awareness in Japan of the risk of tsunamis in the forty years since Fukushima was commissioned, the parties involved had mainly behaved reactively, relying on directives from the regulators.

The Fukushima disaster, triggered by an earthquake and tsunami on March 11, 2011, affected several nuclear plants in Japan simultaneously. We show that three variables were crucial during early stages of the disaster: plant elevation, sea wall elevation, and location and status of backup generators. Higher elevations for any of these three variables, or watertight protection of backup emergency diesel generators (EDGs) and electrical circuits, would have likely prevented the disaster at Fukushima Daiichi NPP. (Stanford, 2013: The Fukushima Disaster and Japan’s Nuclear Plant Vulnerability in Comparative Perspective).

Sloppy thinking and reactive behavior are clear signs of a fire-fighting culture in organizations. Because issues are addressed only partially or superficially, old problems recur and new problems arise. This has a detrimental, self-reinforcing effect, creating a downward spiral that increasingly absorbs capacity, and keeps the organization in a stranglehold.

Fire-fighting behavior is typically the result of perceived pressure from management to deal with problems quickly. “We don’t have time to tackle issues in a proper, structured manner” is the often-heard argument. For those facing a backlog of overdue work, ambitious deadlines, concerns about uptime and the desire to get everything up and running again, ‘speed is key’. The resulting hasty and inadequate measures frequently prove counter-productive.

With more problems than the problem-solvers have time to address, a state of urgency prevails and problem-solving degenerates to no more than treating symptoms. Before the first problem has been solved, the next is already clamoring for attention, with executives increasingly embroiled in the operation to help fight the escalating fires.

Problem-solving degenerates to no more than treating symptoms

A well-known food manufacturer experienced quality problems while canning one of its products. During inspection, white flakes were found in the product, and it was assumed that these were due to insufficient cooking of the meatballs it contained. Despite various ‘corrective’ measures, the problem resurfaced from time to time, and each time it was temporarily assigned ‘high priority’. Three years later, it had happened so often that it was more or less considered normal, and had ceased to be a cause for rejection of the product.

A closer examination of this situation provided some interesting insights. Firstly, the sampling method introduced major uncertainties: based on a sample of just one in 2,000 products, a decision was taken on the quality of the entire batch, and on whether or not white flakes were present. Secondly, it was discovered that the flakes also appeared in products that did not even contain meatballs, so what had always been assumed to be the cause could not, in fact, have been. Lastly, when asked what exactly these white flakes were, nobody had an answer, even after all those years. In the meantime, the problem had already cost many hundreds of man-hours and needless modifications to the production process.
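To get a feel for how little a one-in-2,000 sample can tell you, consider a minimal sketch (the defect rates below are illustrative assumptions, not figures from the case):

```python
# Illustrative sketch only: the defect rates are assumptions, not data
# from the case. It shows how little one inspected can per batch of
# 2,000 reveals about the batch as a whole.

def detection_probability(defect_rate: float, samples: int) -> float:
    """Chance that at least one inspected can shows the defect."""
    return 1.0 - (1.0 - defect_rate) ** samples

for defect_rate in (0.01, 0.05, 0.20):
    p = detection_probability(defect_rate, samples=1)  # one can per batch
    print(f"{defect_rate:.0%} of cans affected -> {p:.0%} detection chance")

# Even when 20% of a batch is affected, a single sample misses the
# problem 80% of the time - a 'pass' says almost nothing about the batch.
```

Whichever way the single sample turns out, the verdict on the other 1,999 cans rests on almost no evidence.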

Symptoms of Fire-fighting

Causes are not being addressed
Symptoms are being treated, but the underlying causes remain.

Problems keep recurring
Inadequate solutions cause earlier problems to resurface, or lead to new problems.

Urgency supersedes importance
Structural improvement activities are interrupted or postponed because fires have to be fought.

Many problems escalate
Problems smolder until they flare up, often just before a deadline. In the ensuing crisis it’s all hands to the pumps to deal with them.

Too little time to solve every problem
One problem is traded for another. Even before the first is solved, the next is already on your desk.

Problems are often ‘solved’ by trial and error
With no time for analysis, fixes are attempted at random until something appears to work.

The example of the food manufacturer illustrates nicely how problems are often ‘solved’ in fire-fighting mode: by trial and error, because there is no time for analysis. And if this yields no result, the problem ultimately comes to be seen as a normal situation. “That door? Yes, sometimes it’s hard to open”.

Multiplier Effect

The Fukushima Daiichi disaster also had consequences in unexpected areas. In Tokyo, for example, government employees and company salarymen were suddenly permitted to come to work without a jacket and tie, and even to leave the top two buttons of their shirts open. Unheard of! The reason? As a precaution, all nuclear power plants in Japan were shut down after the disaster, and generating capacity fell by 30%. Obliged to take drastic measures, the government launched the Super Cool Biz campaign, jointly with industry partners promoting a voluntary dress code for the summer. This enabled office thermostats to be set considerably higher, reducing demand on the strained power supply; office temperatures were sometimes set as high as 30°C.

The sloppy thinking that often prevails in a fire-fighting culture has a much greater impact than one might at first expect. The consequences, usually indirect and relatively hidden, are many, and are amplified by a knock-on effect: every measure leads to new actions, which in turn trigger other reactions, and so on. This bears a striking resemblance to a well-known economic phenomenon, the multiplier effect, in which government spending leads to a much greater increase in income and consumption than the sum initially invested.
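The economic version of this effect can be made concrete with a short worked example (the figure of 0.8 for the share of each euro that is re-spent is an assumption for illustration, not a number from the article): if a fraction c of every euro of extra income is re-spent, an initial injection I of spending ripples through the economy as a geometric series,

$$\Delta Y = I\,(1 + c + c^2 + \cdots) = \frac{I}{1 - c}, \qquad c = 0.8 \;\Rightarrow\; \Delta Y = 5I.$$

One initial euro thus ends up generating five euros of income; in the same way, a single sloppy decision keeps generating follow-on actions, reactions and costs long after it has been taken.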

Consequences

The above phenomenon could be called the Multiplier Effect of Sloppy Thinking. The unintended consequences fall into six categories:

  • Performance degradation: Increased costs, reduced quality or performance, schedule overruns or increased risk.
  • Unintended changes: Every action provokes a reaction, with unintended changes as a consequence.
  • Recurring problems: Problems recur due to treating symptoms rather than addressing root causes.
  • New problems: Sloppy solutions lead to new problems, bugs and incidents.
  • Inefficiency cost: Loss of time and money due to trial & error, unnecessary actions, etc.
  • Opportunity cost: The cost of missed opportunities because time, money and capacity spent cannot be used elsewhere.

Possibly the most dramatic impact of the Multiplier Effect occurs at the strategic level within organizations. Poor strategic choices lead to an avalanche of ineffective programs, projects and investments that ultimately undermine the future of the entire organization.

A compelling example is BlackBerry (formerly Research In Motion): in the space of just three years, the company fell from being one of the strongest players in the telecom market to a negligible role on the sidelines. A major cause was the initial decision to repel the advance of the iPhone and Android smartphones with its own outdated operating system.

Poor strategic choices lead to an avalanche of ineffective programs, projects and investments

When the company finally realized that this strategy was doomed to fail, it bought in a new system. The downside of this choice was that most applications had to be redeveloped from the ground up. Severely handicapped in a rapidly changing environment, BlackBerry continued to lose ground. The products it finally managed to launch were ‘too little, too late’, and its market share shriveled even further.

New Problems

Computer hardware manufacturer Iomega sold thousands of network drives on which remote-access login security was disabled by default. As a consequence, 30 million GB of business and private data – including sensitive information at Unilever, KLM and ING – was publicly accessible on the Internet.

Performance Degradation

During the construction of the new Berlin airport (BER), many technical challenges had to be overcome within a tight time-frame. The result: an overrun of more than four years, a doubling of costs due to essential re-engineering, and 66,000 open issues to resolve (e.g. lighting software so complex that the lights could not be switched off).

Invisible Sloppy Thinking

If we want to understand why we are apparently blind to the rash decisions we make, then we must turn to psychology.

Our knowledge and experience play a central role in our thought processes. They enable us to assess a situation quickly – and largely subconsciously – and determine what needs to be done. The unconscious mind can make complex deliberations and deliver an answer in a split second, through what we have learned to call intuition.

The answer suggested by our intuition is accompanied by a strong sense of confidence. For example, when we think of ‘2 + 2’, the answer ‘4’ comes naturally and it feels right. However, we cannot see how the answer was reached, and we can only be certain that it really is the right answer if we give it conscious attention.

In more complex cases this is precisely where the danger lies. Self-assured as we are in our logical thinking, we put too much faith in the flawlessness of our intuition. Daniel Kahneman writes in his book ‘Thinking, Fast and Slow’:

The confidence people have in their beliefs is not a measure of the quality of evidence but of the coherence of the story that the mind has managed to construct.

The invisibility of our thinking makes it hard for us to assess its quality. In a somewhat new or complex situation, there are many aspects that are partially or completely unknown to us. What we do not know, however, is ignored by the subconscious. So, being unaware of what we don’t know, we leave it out of our deliberations. The subjective feeling of confidence that we get from our intuition is often unfounded, and it leads to overconfidence.

We put too much faith in the flawlessness of our intuition

So, unfortunately, you can’t always rely on your intuition. Before you know it, you will be leading yourself and others up the garden path. The risk of sloppy thinking therefore lurks whenever specialists and managers – acting under pressure – draw confidence from what their intuition suggests.

Overconfidence

We often interact with professionals who exercise their judgment with evident confidence, sometimes priding themselves on the power of their intuition. In a world rife with illusions of validity and skill, can we trust them?

How do we distinguish the justified confidence of experts from the sincere overconfidence of professionals who do not know they are out of their depth?

We can believe an expert who admits uncertainty but cannot take expressions of high confidence at face value.

Overconfident professionals sincerely believe they have expertise, act as experts and look like experts. You will have to struggle to remind yourself that they may be in the grip of an illusion.
Daniel Kahneman, 2011: Don’t Blink! The Hazards of Confidence

Dunning-Kruger Effect

A study in the late nineties first described this overconfidence phenomenon, now known as the Dunning-Kruger effect:

People tend to hold overly favorable views of their abilities in many social and intellectual domains. This overestimation occurs, in part, because people who are unskilled in these domains suffer a dual burden: Not only do these people reach erroneous conclusions and make unfortunate choices, but their incompetence robs them of the metacognitive ability to realize it.

Paradoxically, improving the skills of participants, and thus increasing their metacognitive competence, helped them recognize the limitations of their abilities. J. Kruger & D. Dunning, 1999: Unskilled and Unaware of It: How Difficulties in Recognizing One’s Own Incompetence Lead to Inflated Self-Assessments.

Thinking Clearly

In order to break out of the downward spiral of fire-fighting – and to get a grip on the multiplier effect – we need to shift our focus to tackling sloppy thinking. To achieve this, and to start thinking more clearly, there are three preconditions: time, attention and structure.

Time
The first priority is to allocate sufficient time to those things that do get done, so they get done right. It is often the details that determine success or failure, and without adequate time, they will be sure to go wrong. As they say: ‘the devil is in the details’.

To accomplish this, it is crucial to ease the pressure exerted by other, less important matters. Failure to say ‘no’ often enough is a persistent threat to our mental breathing space. In their book ‘Scarcity: Why Having Too Little Means So Much’, Sendhil Mullainathan and Eldar Shafir describe how a lack of bandwidth makes us less resourceful, less level-headed and less forward-thinking.

Attention
The second precondition is to pay more conscious attention to what we do, because our conscious mind has the important task of curbing our intuitive impulses, and correcting them when necessary. Only our conscious mind is capable of dealing with ambiguity, uncertainty and doubt, and of critically evaluating the quality of available information.

Daniel Goleman writes in his book ‘Focus’: “Paying full attention seems to boost the mind’s processing speed, strengthen synaptic connections, and expand or create neural networks for what we are practicing.”

The challenge is to avoid landing on the ‘OK plateau’, where people get that ‘good-enough’ feeling, and go through the motions more or less effortlessly. Our attention starts to wander, and we transition to ‘automatic pilot’ mode. To keep ourselves alert, we need to purposely counteract the brain’s urge to automate processes. Conscious attention can be triggered by asking (the right) questions and making our thinking more visible.

Structure
Finally, the third precondition is to add structure to our thinking. The approach we see people follow is often ad hoc and intuition-driven: real structure and coherence are missing in how things are tackled (the ‘how’), while most attention goes to the content (the ‘what’). However, it’s not about what you know, but what you do with what you know.

Saying No

“People think focus means saying yes to the thing you’ve got to focus on. But that’s not what it means at all. It means saying no to the hundred other good ideas that there are. You have to pick carefully.

I’m actually as proud of many of the things we haven’t done as the things we have done. The clearest example was when we were pressured for years to do a PDA, and I realized one day that 90% of the people who use a PDA only take information out of it on the road. They don’t put information into it. Pretty soon cellphones are going to do that, so the PDA market is going to get reduced to a fraction of its current size, and it won’t really be sustainable.

So we decided not to get into it. If we had gotten into it, we wouldn’t have had the resources to do the iPod. We probably wouldn’t have seen it coming.” Carmine Gallo, Forbes, 2011, Steve Jobs: Get Rid of the Crappy Stuff

Adding structure to thinking is like creating a roadmap for the mind. It helps us to navigate the various issues and to focus on the right thing at the right time. Key elements of this mental roadmap are:

Thinking Spaces
These are the landmarks, the must-see places to visit. What do we need to be thinking about? What deserves our attention and time?

A Route
This is the path that guides our mental journey through the Thinking Spaces towards a certain goal. Different situations call for different routes: are we seeking new product ideas, investigating an incident, developing a strategy, or building a rock-solid project plan?

Treating thinking as a process
Both the route and the individual thinking spaces can be treated as (sub)processes. By thinking in a series of steps with explicit inputs and outputs – as sketched below – we can streamline our thinking and raise its quality. Process is your ace in the hole when intuition causes sloppy thinking.
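A minimal sketch of that idea (the route and step names below are hypothetical, loosely modeled on a structured problem analysis, and not an actual KT process definition):

```python
# Hypothetical sketch: a thinking route as a process with explicit
# inputs and outputs. The step names are illustrative only.

from typing import Callable

Step = Callable[[str], str]

route: list[tuple[str, Step]] = [
    ("Describe the problem",      lambda obs: f"described({obs})"),
    ("List possible causes",      lambda desc: f"causes({desc})"),
    ("Test causes against facts", lambda causes: f"root_cause({causes})"),
    ("Choose corrective action",  lambda cause: f"fix({cause})"),
]

output = "white flakes in canned product"  # the initial observation
for name, step in route:
    output = step(output)  # each step consumes the previous step's output
    print(f"{name:27s} -> {output}")
```

Because each step’s input is the previous step’s output, skipping ahead – for instance, choosing a fix before a cause has survived testing – becomes visible immediately.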

Without structure you can easily get ahead of yourself and ‘jump to conclusions’. On the mental roadmap, this means expending your energy in the wrong thinking space at the wrong time. This happens when firm conclusions are formed on the basis of a weak previous thinking step – for example, when solutions are discussed prematurely for a problem that is still unclear (let alone its underlying cause).

By adding more structure, you can get better results by working smarter, not harder. Or, in the words of Ross Brawn, describing the improvement of Formula 1 pit stops: “Reducing time means working more intelligently. Not running faster.”

Needle in a Haystack

At a Siemens microprocessor plant a corrosion problem occasionally developed on aluminum interconnects. As a result, production batches had to be discarded, and customers had to wait until the plant could supply defect-free chips.

Despite resurfacing every few months over a period of ten years, the problem defied resolution. The complexity of the production process and the elusiveness of the problem – occurring infrequently and vanishing during testing – made finding the cause like looking for a needle in a haystack.

A new cross-functional team again tackled the problem. They first developed a clear understanding of it – this time in a structured manner, using KT’s Clear Thinking processes – and at last succeeded in identifying the root cause.

They then carefully chose the appropriate corrective action, which proved to be 100% effective, and production was subsequently converted to the new process.

By eliminating the faulty steps, costs were cut by €108,000. More importantly, production delays were eradicated, customer satisfaction improved, and €2.8 million in losses attributed to defects was avoided.

Conclusion

With time, attention and structure, a culture of sloppy thinking can be transformed and a clear-thinking organization created. However, it does call for strong leadership, a range of new thinking skills and a willingness to swim against the tide. As Albert Einstein said:

The world we have created is a product of our thinking; it cannot be changed without changing our thinking.
