
Can you sue AI for defamation?

We use it more and more, but what happens if AI generates fake information about you?


From information seeking and data tracking to copywriting and photo editing, we are seeing AI become more integral to our personal and business lives. This, we already know, is happening at a speed that at times feels hard to keep up with.

We train our children to fact-check Google searches (even though we routinely use Google as our own first port of call for everything). Inevitably, how far we can rely on the information we receive from AI is a question that will become more and more pertinent.

As a specialist in defamation law, the common-sense advice I would always give to clients and colleagues is to check before passing on, repeating or relying on views and content that could be the product of AI. Treat AI output with the same fact-checking and research into its veracity as you would any other online source.

However, as we have recently learnt from a case in Norway, it is not yet clear what proactive steps (if indeed there are any) can be taken to guard against tools such as ChatGPT and Microsoft's Copilot generating incorrect information about you and presenting it to users worldwide.

What happens when AI gets it wrong?

We may soon be closer to finding out what steps can be taken to challenge such products and seek compensation from their makers. In the headline-hitting Norwegian case, Mr Arve Hjalmar Holmen reported ChatGPT to the Norwegian Data Protection Authority after it falsely stated that he had been jailed for 21 years for killing two of his sons. In addition, Mr Holmen has demanded that ChatGPT's creator, OpenAI, be fined.

It will be interesting to see how the Norwegian Data Protection Authority addresses his case, and what steps may be implemented more generally to prevent such mistakes occurring in the future, or to protect individuals who suffer from them.

Can I bring a defamation claim against AI?

In the meantime, it is possible that a claim for defamation could be brought against the company that operates an AI user interface. It will be important to see how the common law and statute react to the ever-changing world of AI, especially given the speed at which false information can be generated and spread before it is found to be unreliable.

This can pose a huge risk to individuals and businesses putting information into a public forum, as has already been evidenced on X and Facebook. Even if we cannot be proactive, it is not clear whether there are any reactive steps that can be taken to limit the reach of incorrect information, or to correct it, once it is in the public domain. OpenAI has described the false information posted about Mr Arve Hjalmar Holmen as a 'hallucination', which is when an AI system invents information but then proceeds to present it as fact. It is concerning that there currently appears to be no way to stop this happening in the future.

Find out more information about defamation from our experts here, or contact our Dispute Resolution team for help on 020 8944 5290.

This article was written by Daniel Bolster

Please note the contents of this article are for general guidance only and reflect the position at the time of posting. Legal advice should be sought before taking action in relation to specific matters.
