An AI virtual assistant leverages various artificial intelligence technologies to perform general-purpose tasks for its users. Today, over 18 million people in the UK have tried generative AI.
Some of the most well-known AI virtual assistants include ChatGPT, Google Gemini and Microsoft Copilot.
Of course, using AI virtual assistant technology for financial data presents a unique set of challenges.
Financial data represents the type of ‘high stakes domain’ that OpenAI cautions users against in their usage policy. Their policy instructs users not to use OpenAI’s products – including ChatGPT – ‘in domains that affect an individual’s safety, rights or well-being’.
Financial data clearly qualifies. It reveals sensitive information about an individual’s or company’s assets, transactions, credit history and financial behaviour, and unauthorised access can lead to identity theft, fraud, regulatory violations and substantial reputational damage.
Yet, besides being high-stakes, financial data embodies several other qualities that AI virtual assistants like ChatGPT must accommodate.
ACCA Global suggests that the qualitative characteristics of financial data should include relevance, faithful representation, comparability, verifiability, timeliness and understandability.
An AI virtual assistant will be successful if it quickly presents relevant, comprehensible data that can be compared with the (original) source.
With this framework in mind, let’s examine three benefits and drawbacks of using AI virtual assistants for financial data.
This benefit doesn’t relate specifically to financial data and, thus, sits slightly outside the framework. Still, a non-existent price tag is an undeniable advantage for fledgling financial firms. Note, however, that free tiers often impose strict limits on tasks that involve uploading and downloading data.
Timeliness is a key characteristic of financial data. In 2025, AI virtual assistants can analyse and output financial data quickly. ChatGPT, for instance, can produce responses to complex queries in under eight seconds.
Accordingly, one of the benefits of using an AI virtual assistant – instead of completing financial data tasks manually – is that it will often be quicker. For example, using AI to calculate financial ratios in bulk is faster than any manual alternative.
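To give a sense of the work being automated, here is a minimal sketch of bulk ratio calculation in Python. The company names, column names and figures are illustrative assumptions, not real data:

```python
# Sketch: computing two common financial ratios across many companies at once.
# All names and figures below are hypothetical, for illustration only.

companies = [
    {"name": "Alpha Ltd", "current_assets": 500_000, "current_liabilities": 250_000,
     "net_income": 80_000, "revenue": 1_200_000},
    {"name": "Beta plc", "current_assets": 300_000, "current_liabilities": 400_000,
     "net_income": 45_000, "revenue": 900_000},
]

def ratios(company):
    """Return the current ratio and net profit margin for one set of figures."""
    return {
        "name": company["name"],
        "current_ratio": round(company["current_assets"] / company["current_liabilities"], 2),
        "net_margin": round(company["net_income"] / company["revenue"], 4),
    }

results = [ratios(c) for c in companies]
for r in results:
    print(r)
```

An AI virtual assistant can apply the same arithmetic to hundreds of rows from a single prompt, which is where the speed advantage over manual spreadsheet work comes from.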
As we’ll discuss, AI virtual assistants continue to improve their accuracy (alongside their speed and accessibility), so some of the disadvantages listed below may become obsolete within the next few years.
Perhaps the most striking disadvantage of an AI virtual assistant is that it can hallucinate. Producing fictitious yet plausible data undermines both the verifiability and the faithful representation of financial data.
Additionally, manually verifying AI-generated output eats into timeliness, compromising one of AI virtual assistants’ primary advantages.
‘AI tools can sometimes hallucinate by presenting misleading information, often extracting data from isolated statements or breakdowns rather than drawing insights from consolidated financial statements. This can lead to inaccurate conclusions, and it may take considerable time to investigate where the mistake originates.’
– Irina Staneva, former auditor at PwC
Currently, the hallucination rate of most AI virtual assistants falls between 0.7% and 29.3%. At those rates, users exploring large financial datasets will almost certainly encounter at least one hallucination. If the hallucination is innocuous enough, or the (human) reviewer doesn’t check carefully, it can pollute a financial dataset, compromising decision-making and potentially causing compliance issues and expensive fines.
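To see why encountering a hallucination is so likely at scale, treat each response as an independent chance of error (a simplifying assumption): with a per-response hallucination rate p, the probability of at least one hallucination across n responses is 1 − (1 − p)^n. A quick sketch:

```python
# Sketch: probability of at least one hallucination across many queries,
# assuming (simplistically) independent errors at a fixed per-response rate.

def p_at_least_one(rate: float, n_queries: int) -> float:
    """P(at least one hallucination) = 1 - (1 - rate)^n."""
    return 1 - (1 - rate) ** n_queries

# Even at the low end of the quoted range (a 0.7% rate), the chance of at
# least one hallucination in 100 responses already exceeds 50%.
print(round(p_at_least_one(0.007, 100), 3))
```

The independence assumption is generous to the model; correlated errors over a single dataset could make matters worse.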
The security of AI virtual assistants has been debated since their introduction. Horror stories abound about training data being misused, and there are ongoing concerns about AI models retaining user interactions.
Some AI systems may also inadvertently store fragments of confidential data within their training framework. If mismanaged by the AI developers, sensitive financial details, personal information or proprietary business insights could be exposed in unintended ways.
Though AI virtual assistants take certain measures to safeguard financial data (e.g. user authentication and the classification of sensitive data), the core challenge remains preventing unauthorised access and data breaches.
In the coming years, we may start seeing AI-specific compliance and regulations focused on how users and AI handle high-stakes data. Until then, it’s essential not to expose sensitive financial data to AI virtual assistants, lest it get misused and cause regulatory issues, fines and reputational damage.
AI virtual assistants may struggle with multi-step prompts (e.g. ‘Extract this data from this PDF, analyse it and return it in an Excel format’). They can often provide inadequate responses when faced with ambiguity.
The consequences?
Incomplete and inaccurate responses compromise the understandability of financial data. Moreover, teasing the desired response out of an AI virtual assistant may take as long as completing a simple task manually, again undermining timeliness.
Viewed objectively, an AI virtual assistant is worthwhile for low-stakes financial data tasks that the user has time to review manually. If you want to check trends in a spreadsheet or calculate financial ratios, an AI virtual assistant can save time and surface insights. For complex, sensitive tasks, think carefully before assuming an AI virtual assistant is the right tool.
If you’re looking for a specialised tool for financial data, try Financial Statements AI. Financial Statements AI is our dedicated product for extracting and computing data from financials. It avoids certain pitfalls associated with AI virtual assistants insofar as:
Early reviews have been extremely positive (‘I use it every day’, noted one financial data analyst). If you’d like to trial Financial Statements AI, book a demo with our financial data project team or email hello@evolution.ai.