UK Watchdog Sounds Alarm Over AI Chatbots and Faulty Financial Guidance

AI-powered chatbots are increasingly being used to answer everyday financial questions, but British consumers may be unknowingly relying on flawed or misleading advice. A recent investigation by consumer group Which? reveals that some of the most widely used AI assistants are providing inaccurate information on vital topics such as investment rules, tax refunds, and insurance requirements.


With the rise of generative AI tools like ChatGPT, Microsoft Copilot, and Google Gemini, many UK users are turning to these platforms for personal finance queries. But Which? warns that their study, based on 40 test questions posed to each platform, exposes troubling gaps in reliability and accuracy, raising concerns about potential financial consequences for users.

AI Tools Gave Incorrect Advice on Key Money Matters

The Which? study examined six of the most popular AI assistants — ChatGPT, Copilot, Meta AI, Google Gemini, Gemini AI Overview, and Perplexity — and assessed their responses based on accuracy, clarity, ethical responsibility, and usefulness.

According to the findings, nearly all the tools returned incorrect or misleading information on several occasions. Notably, ChatGPT and Copilot failed to spot a deliberately false figure in a question about ISA allowances, advising users on how to invest £25,000 even though the annual legal limit is £20,000. Such advice, if followed, could put users in breach of HMRC rules.

Meta AI returned the lowest performance score, achieving just 55% in the evaluation, while Perplexity performed best with 71%. Meanwhile, Google’s Gemini advised withholding payment from a builder if work was unsatisfactory, a suggestion that Which? warned could expose users to a legal claim for breach of contract.

In another test, both ChatGPT and Perplexity recommended services from premium tax refund companies when asked how to claim a refund from HMRC, rather than directing users to the free government platform. These firms are widely known for charging significant fees, sometimes hidden or unjustified, raising further concern about the trustworthiness of AI recommendations in sensitive financial areas.

Millions Using AI for Money Tips, but Regulation Lags Behind

While generative AI is not intended to replace financial advisers, its widespread use for money-related queries suggests that many people are already leaning on it in that role. A survey conducted alongside the Which? investigation found that one in six UK adults have used AI tools to get financial advice. Common uses include searching for low-fee investment options, choosing the best credit cards for travel, and comparing household appliance deals.

Yet as the study shows, the guidance provided can be flawed or outdated. Kathryn Boyd, a self-employed business owner in Ireland, told Which? that she received outdated tax code advice from ChatGPT: “It just gave me all the wrong information. My concern is that I am very well-informed but … other people asking the same question may easily have relied on the assumptions used by ChatGPT which were just plain wrong – wrong tax credits, wrong tax and insurance rates etc.”

The UK’s Financial Conduct Authority (FCA) has made it clear that AI-generated advice is not regulated and falls outside the protections offered by the Financial Ombudsman Service and Financial Services Compensation Scheme. That means if a consumer loses money after following AI-generated advice, they have no formal recourse.

In response, the companies involved stressed the limitations of AI. Microsoft and Google said they encourage users to verify information independently and to consult professionals on legal or financial matters. OpenAI noted that improving accuracy remains an industry-wide goal, saying its latest model, GPT-5, shows progress in that direction.
