Whether you’re a manager of a large business unit, a team leader, or the HR Director of a listed company, you will be called upon to make judgement calls on people and issues almost daily.
Think about how many times per week you have to make a decision, a judgement, or a gut-feel call about something or someone. How many of these will you be required to make in a year?
It is part of the fabric of business and the nature of work that we have to make decisions about a great many things: Will Sarah get that promotion? Is this a sound investment? Will Timothy be able to deliver that project? Is this direction the right one for the team?
Psychological science has revealed a host of cognitive biases that influence our decisions. Being more aware of how these biases could shape your judgements is a good first step in becoming more measured, fair, and objective when making decisions.
Before we delve into three biases that are likely to influence your decisions, it’s worth noting that cognitive biases often, if not mostly, operate completely outside your conscious awareness. That makes them difficult to control and difficult to change.
Scientists agree, however, that awareness is key if you’d like to be less influenced by these unconscious patterns.
- The Fundamental Attribution Error
When someone in your team is late for an appointment, is it because they were delayed in traffic or because they are poor at timekeeping? When you cut someone off in traffic, is it because you were momentarily distracted, or because you’re a bad driver?
Psychological science tells us that when we explain other people’s behaviour to ourselves, we tend to prefer internal reasons or attributions (e.g. “That guy has no sense of timekeeping!”). However, when we explain our own behaviour to ourselves, we tend to prefer external reasons or attributions (e.g. “My kids distracted me, and that’s why I cut off that other motorist”).
This is especially true when the situation is a negative one, like being late for work, or flying off the handle at someone.
This tendency is called the Fundamental Attribution Error. It’s an error because any behaviour can stem from both internal and external causes, yet we systematically weight one explanation over the other depending on whose behaviour we’re judging.
This bias can be destructive when applied to people in the workplace, because their failings are likely to be attributed to stable, internal traits rather than situational reasons that can apply to anyone.
- The Confirmation Bias
Do you constantly seek out sources of information that contradict your beliefs? Do you enjoy the company of people who frequently disagree with your opinions? If you’re like most people, you probably answered “no” to both questions.
This tendency to seek out information that conforms to our beliefs, and to avoid information that contradicts them, is at the heart of the confirmation bias.
The confirmation bias is like having your own personal “yes-man” installed in your brain, always looking for evidence that your preconceptions and assumptions are valid and true, and largely ignoring evidence or information that says any different.
The confirmation bias can seriously compromise our judgements in almost all spheres of work. When applied to people, the confirmation bias can ensure that you never change your opinion about anybody. No matter how much the other person may have changed, the confirmation bias may influence you to see only information that confirms your initial ideas about that person.
In financial or strategic judgement, the confirmation bias can blind us to the risks or dangers of a strategy. Once the judgement has been made, people are likely to look only for positive evidence and ignore information that could contradict the decision.
- In-Group Bias
In psychology, an in-group is defined as a group to which you belong, and an out-group is everyone who doesn’t belong to your in-group.
In-groups and out-groups depend greatly on context. In a large context, such as a society or country, in-groups can form around a single visible attribute, like gender or race.
In small contexts, like a dinner party, in-groups and out-groups may be formed because of individual interests or hobbies.
Whatever the defining principle, one thing is clear from the science: We tend to advantage, often in subtle ways, members of our in-groups and disadvantage members of the out-group. This is known as in-group bias or in-group favoritism.
This bias is particularly insidious because it often operates outside of one’s conscious awareness. People often experience in-group favoritism as a sense of “just liking” the other person, or finding reasons other than in-group membership for their preference (e.g. “I like Jim because he’s a funny guy, not because he’s a guy”).
In-group bias lies at the heart of workplace discrimination based on gender, race, and other social categories. But it can also affect judgements within such groups. For instance, even within an apparently homogenous group (e.g. Black women), in-groups and out-groups will still form and influence decisions (e.g. Mothers vs. Non-Mothers; Old vs. Young; Fans of Science Fiction vs. Not, etc.).
To counter in-group bias (as well as the other biases discussed in this post) consider the following three strategies:
- Test your assumptions: Don’t just go on gut feel. Look at all the evidence (both positive and negative) for your beliefs or assumptions.
- Play devil’s advocate: Even if you have an initial idea or assumption about a decision, speculate about what could make the opposite true. For instance, if you’re convinced that a particular strategy is foolproof, consider what would have to happen to make it fail.
- Apply empathy: Empathy is nothing more than asking yourself: “How would I have reacted?” or “When have I experienced something similar?” For example, when someone is very emotional at the office, ask yourself: “What kind of circumstances would make me feel the same way?” rather than assuming the person is “just emotional”.