Don’t Believe Everything You Read!

Artificial intelligence (AI) apps like ChatGPT and Copilot have become very popular in recent years. AI apps are versatile and can help with a wide range of tasks, from writing reports to language translation. AI can also be helpful when it comes to tax; for example, it can be a useful tax research tool. However, it is important to be aware of its current limitations and not to trust its responses to requests for information on tax matters blindly.

You Could Not Make It Up

For example, in Harber v Revenue and Customs [2023] UKFTT 1007 (TC), the taxpayer’s dispute with HM Revenue and Customs (HMRC) reached the First-tier Tribunal (FTT). The taxpayer cited case law (i.e., nine cases) at the appeal hearing in support of her appeal. However, none of those case authorities were genuine; they had apparently been generated by AI. The FTT accepted that the taxpayer had been unaware that the AI cases were not genuine and that she did not know how to check their validity. The FTT did not take the AI cases into account. The taxpayer’s case was ultimately unsuccessful.

Inaccurate Cases

Even if AI identifies real (as opposed to fictitious) cases, that does not necessarily mean that the cases selected by AI are accurate, relevant or instructive. In Zzaman v Revenue and Customs [2025] UKFTT 539 (TC), the taxpayer appealed against HMRC’s discovery assessment of the high-income child benefit charge (HICBC) for 2018/19. He contended that he should not suffer the HICBC tax charge and put forward a variety of reasons. Before the FTT, the taxpayer readily accepted that he had used AI to assist him in finding cases to support his arguments because he did not have the skills to look for them. The FTT considered that it was “logical and reasonable” for the taxpayer to use AI to assist with his case preparation. Some of the case citations in the taxpayer’s statement were inaccurate, but the use of AI did not appear to have led to the citing of fictitious cases. However, the FTT concluded that the taxpayer’s statement of case, written with the assistance of AI, did not provide grounds for allowing his appeal. The cases cited by the taxpayer did not provide authority for the propositions advanced. The taxpayer’s appeal was dismissed.

Tips from the FTT

The FTT in Harber and Zzaman offered some advice for taxpayers intending to use AI to present their case, to reduce the potential dangers. The tips broadly include:

  • Checking case law authorities by using the FTT website, BAILII or other legal websites.
  • Using clear AI prompts.
  • Asking the AI to cite specific paragraphs of authorities (to check whether those paragraphs support the argument advanced).
  • Checking that the AI has access to live internet data.
  • Requesting that the AI tool does not provide an answer if it is unsure.
  • Asking the AI tool for information on the shortcomings of the case being advanced.

Practical Tip

It may be tempting for taxpayers to save on professional fees by using AI instead. However, if doing so, proceed with caution. The FTT in Harber and Zzaman accepted that the taxpayers had used AI in good faith. However, if taxpayers fail to learn the lessons from such cases, the FTT may not be so understanding in the future.
