A woman cited cases generated by an AI system in an attempt to overturn a £3,265 penalty for failing to pay capital gains tax (CGT) on a property she sold.
Felicity Harber appealed against the penalty on the basis that she had a reasonable excuse, because of her mental health condition “and/or because it was reasonable for her to be ignorant of the law.”
She supported her argument by referring to nine previous cases in which people had apparently succeeded in establishing that this reasonable excuse existed.
But a judgement published this week said that none of the cases cited was genuine and that they had been “generated by an AI system such as ChatGPT.”
The First-tier Tribunal (Tax Chamber) accepted that Mrs Harber had been “unaware” that the AI cases, which she claimed were found by a friend helping her research her case, were not genuine and that she “did not know how to check their validity by using the tribunal website or other legal websites.”
Her appeal was dismissed and the penalty upheld, although the judgement said the tribunal “did not take into account her reliance on the AI cases.”
It added: “Nevertheless, providing authorities which are not genuine and asking a court or tribunal to rely on them is a serious and important issue.”
The nine “cases” included five which concerned mental health and four which dealt with ignorance of the law. Mrs Harber said these, which included “Smith v HMRC (2021)” and “Baker v HMRC (2020)”, had been provided to her by “a friend in a solicitor’s office” whom she had asked to assist with her appeal.
The tribunal said it had been unable to find any of the cases on legal websites.
It added: “We asked Mrs Harber if the cases had been generated by an AI system, such as ChatGPT. Mrs Harber said this was ‘possible’, but moved quickly on to say that she couldn’t see that it made any difference, as there must have been other tribunal cases in which it had decided that a person’s ignorance of the law and/or mental health condition provided a reasonable excuse.”
The Solicitors Regulation Authority recently said that AI models such as ChatGPT can have what is known as a “hallucination”, where a system “produces highly plausible but incorrect results.”
Mrs Harber bought a property for £135,000 in 2006 and then sold it in 2018 for £252,000, a gain of £117,000.
She told HMRC that she “did realise that after paying off the mortgage and loans taken against the house there would possibly be some tax to pay” and her evidence before the tribunal was that she had been aware at the time of the sale that HMRC “would want money”. Having estimated the CGT payable, Mrs Harber put £20,000 into government bonds.
At this stage, Mrs Harber took no professional advice from an accountant or solicitor about the CGT position, because this would have been “expensive,” the tribunal judgement said.
She also did not call or otherwise contact HMRC, and she did not look on the HMRC website to find out whether she had to notify her CGT liability. She knew the sale would be reported on the Government’s Land Registry site in due course, and thought that she would hear from HMRC “after the sale was lodged”. When asked if she would have contacted HMRC to report the gain if she hadn’t heard from them, she said “if time had gone on I might have done”.
Mrs Harber put forward two bases on which she had a reasonable excuse: her mental health condition, and that it was reasonable for her to be ignorant of the law. She said she suffered from anxiety and panic attacks, which were exacerbated by the break-up of a long-term relationship that occurred around the time she sold the property.
But the tribunal concluded: “Mrs Harber does not have a reasonable excuse for her failure to notify liability, and we confirm the penalty. The appeal is therefore dismissed.”
Tax campaigner Dan Neidle, the founder of Tax Policy Associates, said the first public version of GPT – which stands for generative pre-trained transformer – was “rubbish” at legal research and “would make things up in a heartbeat.”
He added: “The newest version, GPT-4 [which is a paid-only version], is really good – for many questions it’s equivalent to a trainee at a law firm, if the trainee had a really good memory but also was occasionally a psychopathic liar. So it’s a useful tool, but anyone who doesn’t check its output very carefully will get themselves into trouble.”