At a glance
- Cognitive biases are distortions in judgment that arise from the brain’s mental shortcuts, and they can skew professional decision-making.
- These biases can lead to poor advice, especially under pressure or information overload.
- Common traps for accountants include confirmation bias, status quo bias, and groupthink.
- Combat biases with structured safeguards like checklists, peer reviews, and independent judgments.
Accountants are in the business of judgment calls. Which risks are acceptable? Which numbers deserve more weight? Which path will truly benefit the client?
But beneath the spreadsheets and scenarios, our thinking is shaped by brain mechanisms we may barely notice. One result: “cognitive biases” – unhelpful distortions in the way we see and assess the world.
Cognitive biases are the brain’s way of coping with the sheer volume of information we face every day, says Dr Ryan Jesson, a cognitive scientist at the University of Queensland’s School of Psychology.
“To get by in day-to-day life, we have a lot to contend with cognitively,” he says. “And our mind takes that into account. Instead of breaking down every single piece of data and working through it slowly, we rely on a system of mental shortcuts, or heuristics.”
“These heuristics exist because they help us. They let us navigate the world efficiently. But the cost of those shortcuts is that – if they’re geared too strongly in one direction or shaped by past experiences that don’t apply – they won’t take everything into account. That’s when we end up with biased responses or decisions that can lead us astray.”
Particularly in a fast-changing business environment, relying too heavily on mental shortcuts risks leaving advice out of step with reality.
“We all owe our clients and each other truth and accuracy. We never want to lead people in a direction that’s unhelpful, or create a situation where they end up worse off because of our advice,” he says.
“If we’re not careful with our own biases, the risk is that we provide poor guidance and even make the world a worse place.”
Cognitive bias may be innate, but it doesn’t have to rule our decisions, he says. With the right structures in place, accountants can protect the quality of their judgment and ensure advice remains grounded in evidence, not assumptions.
Common cognitive traps for accountants
Sometimes, the nature of accounting work creates the perfect conditions for cognitive shortcuts to take hold. Think long hours, tight deadlines, information overload, and the pressure to deliver advice in uncertain situations.
“Any situation with a high degree of cognitive load – when we’re under time pressure, juggling lots at once, or even just hungry or tired – reduces our capacity to see all the important things around us,” says Jesson. “And that will increase our chances of making those biased decisions.”
The common traps he identifies in professional decision-making include:
1. Confirmation bias
This is the tendency to favour or seek out information that confirms existing beliefs or assumptions.
“Confirmation bias is probably the mother of all biases,” says Jesson. “We all have a tendency to pay more attention to or seek out information that aligns with what we already believe… And no matter what the belief is, we will always be able to find evidence for it.”
In practice, this often means advisers miss the bigger story by overlooking contradictory evidence or failing to consider alternative scenarios.
2. Status quo bias
This is the preference for keeping things as they are, even when change would deliver a better outcome.
“We have a tendency to stick with processes and protocols that feel familiar,” says Jesson. “That becomes our status quo, and it makes us much less likely to change our behaviour or try something different.”
This bias is particularly obstructive in the current climate of rapid technological development, he adds.
“AI technologies have the potential to revolutionise the way we work, but that potential is meaningless if people remain stuck in their status quo.”
3. Anchoring bias
“Anchoring is a tendency to put too much weight on initial figures,” says Jesson. “For example, if I was buying a used car, and the salesman told me the price was $20,000, my counter-offer isn’t likely to be $5,000 – it’s probably closer to $18,000. That first number has already anchored my thinking, even though I don’t realise it’s happening.”
In an accounting context, this might look like relying too heavily on figures from previous years when a client’s circumstances or the broader market conditions have changed.
4. Groupthink
Groupthink is a tendency to nod along with the prevailing message in the room rather than thinking critically.
“Groupthink can happen when you have a bunch of people in a room trying to make a decision,” says Jesson. “One person says, ‘I’ve got this idea,’ and someone else replies, ‘Yeah, that sounds good.’ And before long you’re spiralling into a place where everybody agrees, and there are no real checks on anyone’s bias.”
Left unchecked, these patterns reduce the quality of advice and can weaken trust. Understanding them is the first step to keeping bias in check. The real challenge is minimising their influence when decisions matter.
Building your defences against bias
Because cognitive biases are unconscious, awareness on its own rarely prevents them. Instead, accountants need structures that make it harder for shortcuts to take over, says Jesson.
Structured safeguards like standardised checklists, formal decision protocols and rigorous peer review processes create deliberate pauses in the workflow. They force advisers to slow down, weigh competing options, and test their assumptions.
“Having systematic protocols and frameworks in place makes a huge difference,” he says. “If I have a concrete checklist to work through of things to check and consider, that becomes a powerful cue. It pushes me to do my due diligence and leads to a much better decision than if I’m just going based on my intuition.”
It can also help to run a ‘pre-mortem’ on any important decisions by deliberately looking for ways a plan could go wrong before committing to it. By playing devil’s advocate with our own ideas, he says, we are forced to surface blind spots we might otherwise miss.
Another effective technique is what Jesson calls the “wisdom of the crowd effect”.
He offers an example: if 50 people each guess the number of jelly beans in a jar, the average of their guesses will usually be far more accurate than any individual estimate. However, independence is crucial: if people hear each other’s guesses first, the accuracy drops because group influence begins to distort the results.
“If you have a high-stakes decision to make as a group, the best you can do is actually ask people to make that judgment independently,” he says.
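The statistical intuition behind independent judgments can be sketched in a toy simulation. All the numbers below (the jar count, the error sizes, the social-anchoring weight) are assumptions for illustration, not figures from the article: independent guesses let individual errors cancel out in the average, while guesses made after hearing the crowd cluster together and lose the diversity that makes averaging work.

```python
import random
import statistics

random.seed(42)

TRUE_COUNT = 500   # hypothetical number of jelly beans in the jar
N_GUESSERS = 50

# Independent guesses: each person estimates with their own random error,
# so over- and under-estimates tend to cancel in the average.
independent = [TRUE_COUNT + random.gauss(0, 150) for _ in range(N_GUESSERS)]

# Influenced guesses: each person hears the running crowd average first
# and drags their own estimate towards it (0.7 is an assumed social
# weight), so later guesses cluster around whoever spoke early.
influenced = []
for _ in range(N_GUESSERS):
    own = TRUE_COUNT + random.gauss(0, 150)
    if influenced:
        crowd = statistics.mean(influenced)
        own = 0.3 * own + 0.7 * crowd
    influenced.append(own)

print(f"True count:                 {TRUE_COUNT}")
print(f"Independent average:        {statistics.mean(independent):.0f}")
print(f"Independent spread (stdev): {statistics.stdev(independent):.0f}")
print(f"Influenced average:         {statistics.mean(influenced):.0f}")
print(f"Influenced spread (stdev):  {statistics.stdev(influenced):.0f}")
```

The independent average lands close to the true count, while the influenced guesses show a much smaller spread – the conformity that makes the group average hostage to whichever estimate was voiced first.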
Strategies like these don’t aim to eliminate cognitive bias, he adds. They work by shaping the decision-making environment so that blind spots are harder to ignore.
“We’d actually be worse off if we could switch these biases off completely. Without them, our thinking would slow down to the point where even simple decisions became difficult.
“Instead, the goal is to enhance our thinking in situations where bias could harm us or others. And the best way to achieve that is by putting concrete strategies and environmental support in place.”
Boost your capabilities and skills across a variety of technical and business management learning options with IFA’s CPD On-Demand Learning.