It was the polish that gave it away. In the District Court in Christchurch, a woman who had pleaded guilty to arson submitted apology letters to her victims and to the court. They were articulate, balanced, and emotionally precise. Judge Tom Gilbert said they were well written. But something about them felt formulaic.
So he tried an experiment.
Out of curiosity, he typed a prompt into two artificial intelligence tools asking for a letter to a judge expressing remorse for offending. What came back, he said in court, looked strikingly similar to the letters in front of him, with only small tweaks. To him, it was obvious they had been generated by AI, the New York Times reported.
That discovery left the judge in a difficult spot. In sentencing, remorse matters. Courts routinely consider whether a defendant has accepted responsibility and shown genuine understanding of the harm caused. A heartfelt apology can reduce a sentence. But what if the words are technically yours, yet not written by you?
Remorse is not just grammar
Judge Gilbert was careful not to condemn the use of AI outright. He did not suggest the defendant was forbidden from using it. But he drew a line between eloquence and authenticity. A computer-generated letter, he said, in essence, does not tell him much about what is happening inside a person.
In the end, he gave the defendant some credit for remorse, but not as much as her lawyer had sought. Instead of a 10 percent reduction in sentence, he allowed 5 percent. She was sentenced to 27 months in prison.
The decision was modest, but the implications are not.
We are now in a moment where anyone can generate a moving apology in seconds. The language can be compassionate, reflective and carefully structured. It can say all the right things. But the very ease of producing those words may undercut their weight.
The cost of outsourcing sincerity
Researchers have begun to study what some call the “outsourcing penalty.” Studies published in the journal Computers in Human Behavior suggest that when people know AI was used for personal communication, they often view the effort as less meaningful. The tool may save time, but it also signals reduced personal investment.
The reaction is strongest when the task is intimate. Using AI to debug code feels practical. Using it to write wedding vows or an apology feels different. In those cases, effort itself carries moral meaning.
An apology is not just about content. It is about the struggle to find the words, the discomfort of confronting what you did and the act of choosing language that reflects that reckoning. When that process is replaced by a prompt, something essential can seem missing, even if the final text sounds flawless.
A courtroom test of a cultural shift
The Christchurch case is one of the first clear examples of a judge openly questioning AI-assisted remorse. It will not be the last. As generative tools become woven into daily life, courts, employers and families will face similar dilemmas.
If a machine can produce the perfect apology, how do we measure sincerity? Is it in the phrasing, or in the effort behind it?
Judge Gilbert looked past the smooth sentences and made his own assessment. But the larger tension remains. In a world where words can be generated instantly, proving they came from you may require more than sounding convincing.