Watch out, our brain is playing games.

Why customer satisfaction depends on critical thinking

Firefighters sometimes lose their lives because of a loss of Situational Awareness: the ability to pick up the clues and cues and see bad things coming in time to change the outcome. In IT support, Situational Awareness can be hard to maintain in high-stress situations, yet some simple changes to a team’s environment and working practices can help drive more successful outcomes.

Customer satisfaction depends strongly on the speed with which incidents are resolved. Yet we must be careful not to rush to an answer when an incident is reported to us: our brain can trick us into jumping to incorrect conclusions. Nobel laureate Daniel Kahneman advocates ‘slow’ thinking under certain circumstances, and exactly those circumstances arise during the lifecycle of many incidents.

How often, while resolving an incident, have you thought: “How could I have taken that wrong turn?” To what extent did your intuition let you down? Incident Management relies on intuitive thinking built from knowledge and experience, but once an issue becomes more complex, a different kind of thought process is needed.

In our consultancy practice we frequently encounter organizations that pay insufficient attention to critical moments in the lifecycle of an incident (the sketch after this list shows one way to record them):

  • An accurate understanding of the priority of an incident (not just “P1”, which on its own is meaningless, but the actual textual description of the incident’s effects)
  • A clear understanding of the thing that is failing and the way in which it is failing
  • A well-thought-out risk analysis of the actions being planned
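
To make these moments concrete, here is a minimal sketch in Python of an incident record that captures all three; the class and field names are illustrative assumptions, not taken from any particular ITSM tool:

    from dataclasses import dataclass, field

    @dataclass
    class IncidentRecord:
        """Illustrative incident record covering the three critical moments."""
        priority_code: str        # e.g. "P1" -- meaningless on its own
        impact_description: str   # the actual textual description of the effects
        failing_component: str    # the thing that is failing
        failure_mode: str         # the way in which it is failing
        planned_actions: list = field(default_factory=list)

        def plan_action(self, action: str, risk: str, rollback: str) -> None:
            """Record each planned action together with its risk and rollback plan."""
            self.planned_actions.append(
                {"action": action, "risk": risk, "rollback": rollback}
            )

The design choice is that the record cannot be created without a textual impact description and an explicit failure mode; a bare “P1” never adds up to a complete picture.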

When a problem is not clearly described, the search for a solution is hindered. Resolution slows down, which is not only expensive but also leaves everyone involved unhappy.

The first is not the best

Daniel Kahneman, who in 2002 became the first psychologist to win the Nobel Prize in Economics, explains why an intuitive reaction is not always the best one. In his groundbreaking book Thinking, Fast and Slow, he discusses intuitive (‘fast’) and rational (‘slow’) thinking, shows how an intuitive reaction can lead to problems, and lays out the limitations of our common sense.

We seem to think we assess problems correctly and have a quick and accurate understanding, so we respond quickly and intuitively. But beware: our brain plays games, warns Kahneman.

The following five examples show that intuition is not always the most trustworthy advisor:

  1. The halo effect: one quality suggests that other qualities are present as well. For example: a child is good at languages and reading, and therefore will probably also do well in other subjects …
  2. WYSIATI (What You See Is All There Is): due to tunnel vision, we are not open to other observations. A familiar example is a video in which a gorilla walks through the frame and no one notices, because viewers were instructed to pay attention to another activity while watching.
  3. Framing: the same information can be perceived positively or negatively, depending on how the message is stated. Which product do you prefer: one that is 1% contaminated, or one that is 99% pure?
  4. The anchoring effect: we make decisions based on a reference point, the anchor. We are strongly influenced by the way facts and figures are presented to us, even when they are not really relevant to the issue: “Just this week, 20% discount on all DVDs.”
  5. The availability bias: we consider an event more likely if we can recall a clear example of it. We suffer from selective memory and remember the impactful, unusual occurrences. For example, when the media report heavily on kidnappings, we believe that more kidnappings have occurred this year.

Does auto-pilot provide the best customer service?

When managing an incident under time pressure, people therefore complete ‘the picture’ quickly by filling in the blanks with data that may not be true. In incident management, speed is an important factor: customer satisfaction depends strongly on how quickly problems are solved. It is therefore important to know when intuition should lead us and when thorough analysis is required. Since we biologically prefer ‘system 1’ thinking, and we lose Situational Awareness of passing time when our working-memory load is too high, we need an external trigger to drive the switch.

Kahneman believes that fast and intuitive thinking (‘system 1’, in his terminology) is safe if:

  • The issue is simple
  • You have seen an issue like this many times before and resolved it successfully
  • The cost of being wrong is low and the consequences are acceptable

Kahneman believes that we should switch to slower, more deliberate thinking (‘system 2’) when (a simple decision-rule sketch follows this list):

  • Issues are complex and the solution is not obvious
  • You have not seen an issue like this before. For example, a new machine falters and existing procedures and protocols do not bring a solution
  • The cost of being wrong is high and the consequences unacceptable. For example, the machine stops, which significantly impairs operations
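
Encoded as a simple decision rule (an illustrative reading of these criteria, not something Kahneman formalized), the choice of thinking mode might look like this:

    def thinking_mode(issue_is_simple: bool,
                      seen_and_solved_before: bool,
                      cost_of_being_wrong_is_high: bool) -> str:
        """Return 'fast' (system 1) only when all safety conditions hold;
        otherwise fall back to 'slow' (system 2)."""
        if (issue_is_simple
                and seen_and_solved_before
                and not cost_of_being_wrong_is_high):
            return "fast"  # intuition from knowledge and experience
        return "slow"      # data gathering, critical thinking, risk management

Note the asymmetry: any single failed condition is enough to demand slow thinking, while fast thinking has to earn all three.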

Switching at the right time

Everyone is able to respond intuitively to a simple issue. It is essential to distinguish between simple and complex issues, and between a fast response and a slow, more thoughtful one.

Effective triggers include whichever of these conditions arrives first (a minimal sketch of such a check follows the list):

  • A pre-agreed number of fix attempts has been made, or
  • A pre-agreed number of people are now involved, or
  • A pre-agreed time has elapsed
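
As a minimal sketch of such an external trigger, assuming threshold values that each team would pre-agree for itself, a check like the following could run on every update to an incident:

    # Hypothetical pre-agreed thresholds; each team tunes these for itself.
    MAX_FIX_ATTEMPTS = 3
    MAX_PEOPLE_INVOLVED = 4
    MAX_ELAPSED_MINUTES = 30

    def should_switch_to_slow_thinking(fix_attempts: int,
                                       people_involved: int,
                                       elapsed_minutes: float) -> bool:
        """True as soon as any one pre-agreed trigger fires,
        whichever condition arrives first."""
        return (fix_attempts >= MAX_FIX_ATTEMPTS
                or people_involved >= MAX_PEOPLE_INVOLVED
                or elapsed_minutes >= MAX_ELAPSED_MINUTES)

Because the conditions are combined with ‘or’, no single person has to notice the moment to switch: whichever counter crosses its threshold first forces the conversation from intuition to analysis.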

“Slow” thinking does not mean that, faced with a complex issue, you slam on the brakes and continue to address it in slow motion. It means that the thinking mode switches from knowledge and experience to thorough data gathering, critical thinking, and evidence-based decision making with an appropriate degree of risk management.

Making the thinking available

Thinking itself is invisible: although technology keeps increasing the resolution of diagnostic tools that help make it more visible, it still goes on inside the skull. The results of clear thinking are a common understanding of the situation and a clear, supported plan for resolution. How do we bridge that gap? By making the results of the thinking visible and available to the stakeholders. With screen-sharing tools now freely available, sharing the most current understanding of the situation has become easy.
