Avoiding your own Deloitte AI moment

Deloitte's latest AI blunder earned it global embarrassment. For accountants using AI, it's a stark warning on how quickly things can go wrong without proper oversight.

2 Dec, 2025


At a glance

  • A major firm’s AI error highlights the need for responsible use and disclosure.
  • Disclose to clients why and how AI tools were used in their work.
  • Establish clear internal rules for using AI to manage quality and security risks.
  • Adopt the “Pareto Flip”: 20% AI creation, 80% human curation and review.

In October this year, accounting giant Deloitte’s Australian arm made an embarrassing admission: a report it had produced for an Australian government department contained AI-generated errors, including references to material that did not exist.

The contract value was relatively modest by industry standards: A$440,000, about £217,300. But to Deloitte’s undoubted dismay, the incident attracted not just the attention of Australian media but snowballing global coverage as well.

For other professional firms, in accounting and beyond, the fiasco illustrates how quickly AI-related errors can escalate if they are not caught early.

And Deloitte’s strife has amplified a broader discussion across the professional services sector: if AI is now embedded in everyday workflows, what should firms be disclosing to clients?

And after they’ve disclosed, what capabilities do they need in order to truly deliver the responsible AI use that clients expect?

Disclosure becomes a new necessity

AI is no longer an abstract concept or an experiment for accountants; it has become part of the profession’s standard toolkit. Recent research shows 91% of UK accountants are either using AI now or intend to soon.

Despite this uptake, many firms are still working out how to integrate AI responsibly, says Professor Marek Kowalkiewicz, chair in digital economy at QUT Business School.

“We’re still in this world where generative AI is seen as a magical tool that’s applied to everything and anything that professional services firms do,” he says. As he points out, some of the use cases are “perfectly valid”. But he adds that the Deloitte incident and others like it reveal critical gaps in capability and human oversight.

“AI fluency is unevenly distributed within organisations,” he says. “There are pockets of expertise where people know what is best practice and how to work with those systems. And then there is what I would call sloppiness.”

Clients are increasingly aware of this variation, and many are wary of unverified AI-generated content making its way into final deliverables. As a result, disclosure is shifting from a “nice to have” to a routine expectation.

“You might also give clients an option of saying, ‘Please do not use any generative AI for this.’”

Professor Marek Kowalkiewicz, QUT Business School

Kowalkiewicz notes that disclosure should not function as a way to shift liability onto the technology itself, but instead reinforce that responsibility for the final work remains with the firm.

Designing effective disclosure statements

An official disclosure statement gives firms a structured way to demonstrate that AI has been used responsibly and within established parameters.

At a minimum, says Kowalkiewicz, a disclosure should explain why AI was used, what tools were involved, and where in the process they contributed. 

This does not need to be overly technical. A clear description – for example, “AI was used to summarise transcripts and generate initial theme groupings” – is often enough.

“It should also recognise that there are some potential challenges with the use of generative AI in the workflow, and [state] what you have done to address them,” he says.

“You might also give clients an option of saying, ‘Please do not use any generative AI for this.’ This might impact the pricing of the offering, but I’m already seeing [companies] give clients the chance to opt out.”

Internal communication is equally important, he adds. If employees are unclear about which systems are approved or what data can be used with them, they might turn to unapproved tools or personal devices, creating security and quality risks. 

A clear set of guidelines on permissions, boundaries, and expected review steps will help ensure consistency and alignment with the firm’s disclosure statement.

Applying the ‘Pareto Flip’ to AI use

To minimise risk and strengthen the quality of AI-assisted work, Kowalkiewicz recommends applying the “Pareto Flip” – a practical shift in how time is distributed across a task.

Traditionally, professionals have spent roughly 80% of their time creating content and 20% checking it, he explains. Since the arrival of generative AI, many teams have cut down on creation time, but spend the same amount of time reviewing.

The Pareto Flip proposes maintaining the same time and effort investment, but moving towards 20% creation and 80% curation, recognising that AI speeds up drafting but demands far more time in review.

Curation is far more than proofreading and fact-checking, says Kowalkiewicz. It involves working with AI tools to refine the output by challenging assumptions, uncovering alternative perspectives, testing whether arguments are well supported, and identifying any gaps or contradictions.

“If we spend that time, we actually get to engage with the content much more and have a sparring partner in the form of generative AI,” he says.

“We might end up spending the same amount of time, so there’s seemingly no efficiency gain, but what we get out is much higher quality.”

The approach also encourages teams to bring human judgement to the forefront and to treat AI as a supporting tool rather than a shortcut to a finished deliverable, he says.

This rebalanced approach also reflects a broader shift in how clients perceive value.

“In a world where AI creation is cheap or even free, it’s human curation that becomes priceless,” says Kowalkiewicz.


