It’s inevitable that people will notice what’s wrong with various systems in the organization. Perhaps decisions are too slow; perhaps the technology is out of date; perhaps headquarters is pushing bad decisions onto the regions. People notice what’s wrong—what could be wrong with that?
The downside of noticing what’s wrong is that most people, most of the time, are not skilled enough at systems thinking, nor careful enough about overcoming cognitive biases, to propose actions that will actually make things better. We should be cautious about assuming that because something appears wrong to us, it really needs to be fixed.
Examples of noticing what’s wrong
Let’s consider some examples of what gets noticed and why acting on it could lead to a poor decision.
What is noticed: A leader notices that people came in late three times this week, so he decides to fix the problem with some intervention.
Why the intervention is problematic: This may well be a blip rather than a recurring issue, so making it a priority is a waste of resources.

What is noticed: Regions dislike poor decisions by corporate and push for decentralization.
Why the intervention is problematic: Lack of systems thinking. Centralization creates problems, but so does decentralization. Noticing only the problems of the current arrangement, without understanding the whole system, will lead to an ineffective intervention.

What is noticed: A leader decides there are two kinds of salespeople, hunters and farmers, and then decides to get rid of the farmers.
Why the intervention is problematic: Humans tend to see populations as divided into two groups with a big gap between them, and they rarely check the data to see whether that is true. Often the data will show that most people are in the middle. If the characterization of the population distribution is wrong, then the intervention will be too.
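The "check the data" step in the last example can be sketched in a few lines. This is a minimal illustration with entirely invented numbers, not a real dataset: it assumes a single hypothetical metric (share of revenue from new accounts) as a proxy for hunter-versus-farmer behaviour, and simply measures how much of the team sits in the middle band.

```python
# Hypothetical check of the "hunters vs. farmers" split. All numbers are
# made up for illustration; in practice you would pull real sales data.
from statistics import mean, stdev

# Share of each salesperson's revenue that comes from new accounts
# (1.0 would be a pure "hunter", 0.0 a pure "farmer").
new_account_share = [0.38, 0.45, 0.52, 0.41, 0.60, 0.48, 0.55, 0.33,
                     0.47, 0.51, 0.44, 0.58, 0.36, 0.49, 0.53]

# If the population really split into two camps, few people would sit
# near the middle of the range.
middle = [x for x in new_account_share if 0.35 <= x <= 0.65]
share_in_middle = len(middle) / len(new_account_share)

print(f"mean={mean(new_account_share):.2f} sd={stdev(new_account_share):.2f}")
print(f"fraction in the middle band: {share_in_middle:.0%}")
```

With these invented numbers, almost the whole team lands in the middle band, suggesting a continuum rather than two distinct groups; "get rid of the farmers" would then cut an arbitrary slice out of a single distribution.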
Mitigating the effects
The research on cognitive biases (most famously Daniel Kahneman’s Thinking, Fast and Slow) is discouraging because there are so many different biases and they are so hard to change. In fact, “hard to change” is overly optimistic: it’s unlikely that we’ll ever get a world where most people, most of the time, do a good job of mitigating their own cognitive biases; we are just not wired that way. However, we can do several things:
Promote humility. Any wise person who understands the limits of their knowledge and the pervasiveness of cognitive biases will be humble. We should encourage this trait in all our employees. This is almost the opposite of the current trend in the US (and probably elsewhere), where people are encouraged to hold strong opinions on topics even if they know little about the subject.
Insist that people seek a devil’s advocate. It should be a cultural practice that when people present an idea, they also present reasonable counterarguments. For example, if someone proposes promoting humility, that proposal should be balanced with commentary on the downsides of humility, such as passivity.
Continuous education on data-informed decisions. Employees need ongoing training in grounding decisions in data, not just gut feel.
Continuous education on alternative views. Employees, and particularly leaders, need constant exposure to alternative views of the world, their industry, and human behaviour so that they don’t overvalue their own limited knowledge.
Subject big decisions to a decision-quality process. Any decision with a wide impact should be informed by an internal or external consulting function that works with the decision makers to uncover cognitive biases that undermine decision quality. This differs a little from the usual due diligence in that it specifically checks for the kinds of blind spots we know humans suffer from.
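As a toy illustration of grounding a decision in data, consider the late-arrivals example from earlier. The baseline lateness rate (10%) and headcount (50) below are assumptions invented for the sketch; the point is that a streak that looks alarming for one person is almost expected to show up somewhere in a group.

```python
# Is "late three times this week" a real pattern or a blip? A quick
# back-of-the-envelope check, assuming (hypothetically) that each employee
# is late on any given workday with probability 0.10, independently.
from math import comb

p_late, days, headcount = 0.10, 5, 50

# Probability that one given employee is late 3+ times in a 5-day week.
p_streak = sum(comb(days, k) * p_late**k * (1 - p_late)**(days - k)
               for k in range(3, days + 1))

# Probability that at least one of 50 employees has such a week.
p_anyone = 1 - (1 - p_streak)**headcount

print(f"one employee: {p_streak:.3%}, anyone of {headcount}: {p_anyone:.1%}")
```

Under these assumed numbers, the streak is rare for any one employee (under 1%) but has roughly a one-in-three chance of happening to someone on a 50-person team in any given week, so the observation alone doesn’t justify an intervention.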
The research on cognitive biases should give us all pause. When we notice something “wrong” we need to be aware that there is a very good chance it’s not wrong at all from an overall systems perspective. We can’t fix human cognitive limitations, but we can mitigate them with a few principles. Humans may never be great at decision making, but we can get better. (For more ideas on better thinking, see Hans Rosling’s “Factfulness: Ten Reasons We're Wrong About the World--and Why Things Are Better Than You Think”.)