Bias: the pervasive influence

First published in Accountancy Ireland

Ronan Nolan explores how bias affects professional judgement.

In an article I wrote for the October 2013 edition of this journal, I suggested that the ability to exercise professional judgement with integrity is the hallmark of a Chartered Accountant. I quoted Sir David Tweedie, who has stated that the objective of Financial Reporting Standards is to arrive at an answer which is neutral, which means getting the right answer without bias. This prompted me to think about how we get to an unbiased answer, and whether we understand what bias is and how it affects us.


Professional scepticism

The Auditing Practices Board (APB) addressed some aspects of this question in a paper on Professional Scepticism issued in March 2012. They noted that Auditing Standards define professional scepticism as an attitude that includes a questioning mind, being alert to conditions which may indicate possible misstatement due to error or fraud, and a critical assessment of audit evidence. Key issues addressed in the APB paper include the need for auditors to offer a robust challenge to management assertions, and the nature of the auditors’ mind-set – whether it should be neutral, inquiring or challenging.

The conclusion reached is that a sceptical audit requires a mind-set towards the challenging end of the spectrum, appropriately tailored for the particular circumstances. In an earlier APB paper this was characterised as ‘presumptive doubt’, though this precise term was dropped from the final paper, probably because it was considered too confrontational. In any case, the required mind-set clearly brings with it a requirement to avoid bias.

Thinking, fast and slow

I needed to look elsewhere for more insight into what is involved in having an ‘inquiring mind’, and I recalled reading a fantastic book by Nobel Laureate Daniel Kahneman, a psychologist who won the prize for Economics in 2002. Thinking, fast and slow – one of the best books I have read in recent years – provides brilliant insights into how the brain works and, in particular, how unaware we often are of the pervasive influence of bias in all our thinking.

The author’s basic premise is that our brains have two distinct ways of thinking: System 1 is fast and instinctive, while System 2 is slow and logical. System 1 dominates even in situations where we are doing our best to take a careful, deductive approach to a problem.

Kahneman discusses heuristics which Wikipedia defines as experience-based techniques for problem solving, learning and discovery that give a solution which is not guaranteed to be optimal…examples include using a ‘rule of thumb’, an ‘educated guess’, an intuitive judgement, stereotyping, or ‘common sense’.

While marketing psychologists have been using techniques based on an understanding of heuristics for years, accountants’ education has been somewhat neglected in this area. Kahneman characterises System 2, the logical one, as ‘The Lazy Controller’ and suggests that people generally find cognitive effort at least mildly unpleasant and instinctively try to avoid it. It is easy to get distracted, and logical thinking requires conscious, focussed attention.

System 1 can easily deal with a request to multiply, say, 5 by 10, and we can carry on with whatever else we are doing without interruption. But if we are asked to multiply 37 by 217, we need System 2 – we have to stop everything else, and probably need to sit down with a pencil and paper, maybe for quite a while.

System 1 is important to our survival and uses experience and instinct to enable us to react quickly when required.

For a good example of System 1 in effective action, imagine the scene where a chess grandmaster walks past a chess game being played in a café. At a glance he will see the key patterns in the position, and may be able to say something like, ‘White checkmates in three moves’.

He can do this because he is talented, but also because he has spent thousands of hours studying past games. A note of caution, though: this won’t work in every position, and even grandmasters can make mistakes.

Kahneman says that expert intuition is the result of experience and practice, but not all professionals’ intuition arises from true experience. When faced with a difficult question, we often answer an easier one instead, usually without noticing the substitution.

A less appropriate use of System 1 is quoted by Kahneman, who recalls asking an investment manager how he had arrived at a decision to invest in a large holding in Ford Motor Co: he replied that he had recently attended an automobile show and had been impressed. “Boy, do they know how to make a car!” was his explanation.

Kahneman notes that the investment manager made it very clear that he trusted his gut feeling and was satisfied with himself and his decision. He found it remarkable that the manager had apparently not considered the one question that an economist would call relevant: is Ford stock currently under-priced?

The irony of this is that we can all remember from our own exam experience being told over and over to make sure we understood the question being asked, and not to fall into the trap of answering the question we would have preferred. It would seem that we are not always very good at carrying that lesson into real life.

Kahneman gives many examples of biases and common illusions that interfere with our ability to think as rationally as we might like, and it is not difficult to make the connection to financial reporting issues:

  • Illusion of validity – the question is not whether these experts are well-trained but rather whether their world is predictable.
  • Planning fallacy – we are assuming a best case scenario, but there are many different ways for the plan to fail, and we cannot foresee them all.
  • Sunk cost fallacy – we are making an additional investment because we do not want to admit failure (and won’t want to acknowledge that this is our motivation).
  • Illusion of control – through excessive optimism, we seriously underestimate obstacles.
  • Confirmation bias – System 1 favours uncritical acceptance of suggestions, so that will be our instinctive bias. System 2 is in charge of doubting and disbelief, but is sometimes busy, and often lazy.


Kahneman offers some helpful suggestions including:

  • We should conduct a pre-mortem. Someone may come up with a threat we have neglected;
  • We should not focus on a single scenario since that will lead us to overestimate its probability. Instead, we should set up specific alternatives and make the probabilities add up to 100%;
  • Each of our executives is loss averse in his or her domain, which is natural, but the result is that the organisation is not taking enough risk;
  • Let’s reframe the problem by changing the reference point. Imagine we did not own it; how much would we think it is worth?

 

Conclusion

The point is that we need to think about how we develop and apply our ‘inquiring mind’ and make a real effort to recognise our biases explicitly before considering a decision. As far as possible, we should put them to one side, start with a blank sheet of paper and an open mind, and encourage System 2 to get to work!
