Government use of AI may face greater transparency requirements following a recent tribunal decision affecting HMRC
HMRC Ordered to Reveal Use of AI in Handling R&D Claims
The First-tier Tribunal (Information Rights) has ordered HM Revenue and Customs (HMRC) to confirm whether it holds information about its use of artificial intelligence (AI) in handling research and development (R&D) claims.
The order was issued following an appeal by Thomas Elsbury, the owner of a software-based tax claims business, against the Information Commissioner's decision. Elsbury argued before the tribunal that HMRC's use of AI was producing nonsensical responses to R&D claims, potentially deterring small businesses from making such claims.
Initially, HMRC stated that it held the requested information but would not disclose it under section 31(1)(d) of the Freedom of Information Act (FOI Act), citing potential prejudice to the assessment or collection of tax or duty. However, after Elsbury complained to the Information Commissioner, HMRC changed its position and said it would neither confirm nor deny whether it held the information, relying on section 31(3) of the Act.
The tribunal overturned the Information Commissioner's decision and ordered HMRC to either supply the requested information or serve a refusal notice under section 17 of the FOI Act. The decision was welcomed by tax experts, who expressed concerns about the transparency and accountability of HMRC's use of AI.
Jake Landman, a tax expert at Pinsent Masons, said the ruling shows the dangers for HMRC of using AI without being transparent about that use. Landman added that the use of AI by taxpayers in preparing submissions for appeals has led to hallucinated case law and wasted time and resources.
Cerys Wyn Davies, an expert in AI and intellectual property law at Pinsent Masons, emphasized the importance of transparency in the use of AI by public bodies, adding that this is particularly important when handling the confidential intellectual property information involved in R&D claims.
The tribunal found that HMRC's refusal to confirm or deny whether it held the requested information reinforced the belief that AI is being used by HMRC officers, potentially in an unauthorized manner. The tribunal warned that HMRC's shifting basis for avoiding disclosure was "beyond uncomfortable".
Wyn Davies also noted that concerns about the information being fed into large language models (LLMs), being used to train them and thereby becoming accessible to others, create risks to R&D in the UK. Wyn Davies added that Information Commissioner's Office (ICO) guidance requires accountability and transparency in AI deployment, including clarity over the use of taxpayer data.
The tribunal's decision comes at a time when building public confidence in the responsible use of AI is essential to continued innovation in the public interest. HMRC now has 35 days to confirm whether it holds the information and either provide it to Elsbury or serve a refusal notice explaining why it will not.