Artificial intelligence looks likely to improve business, but ethics, data quality and skills shortages remain barriers to take-up
By Stephen Pritchard
Published: 08 Aug 2024 11:45
Artificial intelligence (AI) is viewed by many as a threat to white collar jobs, yet research from a UK-based think tank suggests that, used well, AI technologies could bolster both the economy and employment.
The research was conducted by Demos, together with the Association of Chartered Certified Accountants (ACCA) and software supplier Sage.
Although the study focused on accountancy and bookkeeping – Sage’s main markets – it suggests that AI technologies, including large language models and generative AI (GenAI), are a positive development for the professional services market.
The accountancy industry is already a significant contributor to the UK economy, worth £33.3bn and employing 323,000 people. The survey consulted just over 1,100 accountants and bookkeepers, and the research suggests that “widespread adoption of AI” could add a further £2bn to UK GDP, create 20,000 new jobs and add £238m in exports.
The Demos-Sage projections do require accountancy firms, and other finance professionals, to introduce AI to their operations. According to the research, 61% of accountants and bookkeepers think AI will create “more opportunities than risks”. More than two-thirds (68%) are confident they can make use of AI in their business.
Moreover, firms that deploy AI expect to grow three times as fast as those that do not, and potentially employ 10 times as many staff. This might seem ambitious. “At this stage, putting a particular number on the economic growth potential and job creation is challenging due to several variables, including macroeconomic factors and the recent UK elections,” said Charles Aladesuru, research manager for European enterprise solutions at analyst IDC.
But Sage cited some specific improvements from its customers’ existing use of AI, including a two- to three-times improvement in productivity from AI-powered invoice processing, for example, as well as significant improvements in how long it takes to close financial periods.
Barriers to AI-powered business growth
This potential, though, needs several elements to fall into place. The predictions assume that the firms not currently using AI – some 51% – will come on board with the technology. This is not guaranteed.
Then there is the question of whether all firms will grow equally as AI take-up expands. Although UK firms could pick up more overseas business – as the researchers suggest – growth among the UK’s early adopters of AI could come at the expense of their less technologically focused competitors.
Research earlier this year by the Institute for Public Policy Research (IPPR) suggested that, in at least one scenario, as many as 7.9 million UK jobs could be lost due to AI. Any productivity gains from the technology would be wiped out by job losses from a “second wave” of AI, and the resulting drop in GDP, the IPPR found. And there are other barriers to the take-up of AI, especially as accountancy depends so heavily on trust.
Questions of trust
Firms will need a strong ethical foundation to make the most of AI, and they need to ensure technologies such as large language models do not undermine trust through erroneous results or hallucinations. Accountants will also need to establish who is liable for poor advice or inaccurate reporting by AI tools.
Skills, too, are a concern. Not all firms in sectors such as accountancy, especially smaller firms, have the knowledge to deploy AI tools. Nor do they have the experience to deal with the data privacy and security issues that AI can bring. Data quality could also make AI less useful than its proponents suggest. Accountancy firms may be able to tap into external AI models, and suppliers already offer this to an extent, for example through Sage’s AI-powered General Ledger Outlier Detection tool in Sage Intacct.
But accountants will remain highly dependent on the quality of their client data. Professional services firms will need guardrails to ensure that poor customer data does not cause AI models to produce errors, and firms will likely need to work closely with their client base to help them build up “AI-ready” datasets.
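Sage does not disclose how its outlier detection or hallucination safeguards work, but the general idea of a data-quality guardrail can be illustrated with a short sketch. The Python below is a hypothetical example, not Sage’s method: it screens client ledger entries for missing fields, duplicate invoice numbers and amounts far out of line with the rest of the ledger before they are handed to any AI model. The LedgerEntry fields, the 10x-median rule and the “AI-ready” label are all illustrative assumptions.

```python
# Illustrative only: a minimal "AI-ready" guardrail for client ledger data.
# Field names, thresholds and the is_ai_ready test are hypothetical.
from dataclasses import dataclass, field
from statistics import median


@dataclass
class LedgerEntry:
    invoice_id: str
    account: str
    amount: float  # in pounds


@dataclass
class ValidationReport:
    errors: list = field(default_factory=list)
    outliers: list = field(default_factory=list)

    @property
    def is_ai_ready(self) -> bool:
        # Only structural errors block downstream AI use; outliers are flagged for review.
        return not self.errors


def validate_entries(entries: list[LedgerEntry]) -> ValidationReport:
    """Flag structural problems and unusual amounts before entries reach an AI model."""
    report = ValidationReport()

    # Structural checks: missing fields and duplicate invoice IDs are rejected outright.
    seen_ids = set()
    for e in entries:
        if not e.invoice_id or not e.account:
            report.errors.append(f"Missing field on entry {e!r}")
        elif e.invoice_id in seen_ids:
            report.errors.append(f"Duplicate invoice_id {e.invoice_id}")
        seen_ids.add(e.invoice_id)

    # Statistical check: flag amounts wildly out of line with the rest of the ledger,
    # a crude stand-in for the kind of outlier detection an AI-assisted tool might do.
    nonzero = [abs(e.amount) for e in entries if e.amount]
    if nonzero:
        typical = median(nonzero)
        for e in entries:
            if typical and abs(e.amount) > 10 * typical:
                report.outliers.append(
                    f"{e.invoice_id}: amount {e.amount} is far above the typical value {typical}"
                )

    return report


if __name__ == "__main__":
    sample = [
        LedgerEntry("INV-001", "sales", 1200.00),
        LedgerEntry("INV-002", "sales", 1150.00),
        LedgerEntry("INV-003", "sales", 1190.00),
        LedgerEntry("INV-003", "sales", 98000.00),  # duplicate ID and unusual amount
    ]
    result = validate_entries(sample)
    print("AI-ready:", result.is_ai_ready)
    print("Errors:", result.errors)
    print("Outliers:", result.outliers)
```

In practice, checks like these would sit between the client’s bookkeeping data and any AI model, so that obvious data-quality problems are caught and fed back to the client rather than silently degrading the model’s output.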
Sage, for its part, has recently filed patents focused on “detecting and preventing AI hallucinations”, according to its chief technology officer, Aaron Harris.
Although not specific to the professional services or finance industries, a study by industry analyst Forrester found that data quality is now the biggest limit to the effective use of AI.
At Sage, Harris said his customer base was in fact well placed to exploit AI, as the industry is already used to dealing with data quality issues. “This is perhaps why AI has such promise to accountants and why accounting is so appealing to the data scientists building AI solutions,” he said.
“The nature of accounting requires strong discipline around data consistency and reliability. After all, we count on accounting teams to report business performance to markets, to provide reliable information for credit and investment decisions, and to report compliance with government regulations. Versus other areas, we’re starting from a very strong foundation.”
And this links directly to the question of trust. Accountants’ clients need to trust the accuracy of any work carried out with AI, just as investors and other stakeholders need to trust businesses’ accounting statements.
The way AI systems work – with complex algorithms that are often likened to a “black box” – can make this trust hard to maintain. This is why professional bodies such as ACCA are promoting AI literacy among their members, as well as creating principles and frameworks for the use of AI in the industry.
“We are very conscious that systems are continually developing, and so for providers, we’re emphasising transparency and explainability as far as possible,” said Alistair Brisbourne, the ACCA’s head of technology. “We recognise that trust in financial information is paramount, and a big part of that trust is founded on interpretable and auditable outcomes. This includes supporting the evolution of new assurance techniques specifically for algorithmic systems.”
Accountants, he said, will continue to apply their ethical standards to any AI tools they use. “The principles of integrity, objectivity, professional competence and confidentiality serve as the foundation for all of our members’ work and approach to new technologies,” said Brisbourne.
Quiet optimism
Whether this all adds up to support some of the more optimistic claims for AI remains to be seen. Certainly, analysts report strong levels of interest in AI.
IDC, for example, reports that 77% of leaders in finance, HR and operations see GenAI capabilities as moderately or highly important when it comes to selecting new business applications. This suggests that Sage, and others, are right to prioritise the technology.
“There is certainly significant potential for AI to be a driver of both these in the short-term to long-term future,” said Aladesuru. “Conventional or ‘traditional’ artificial intelligence, as you would imagine, is further ahead in maturity than generative AI. The two will coexist and complement each other, with the convergence of both expected to deliver even more powerful tools for the end user.”