Artificial Intelligence (AI) has infiltrated almost every nook and cranny of modern life. Most people have used AI to check the weather, send a text or add something to their virtual cart. A simple “Hey Siri” or “Alexa” fires up your virtual assistant, who can even tell you a good joke if you need a laugh. AI is increasingly used in the legal field as well.
In March 2025, a New York appeals court allowed plaintiff Jerome Dewald to present his argument via video. When the video started to play, the judges were shocked to see the argument delivered by an avatar Dewald had created. The court shut down the video but allowed him to continue arguing his case.
Dewald apologized to the court and explained that he thought the virtual character would do a better job presenting his position – after all, an avatar isn’t going to be nervous or intimidated by the legal process. In my opinion, licensed attorneys should not be allowed to substitute AI for themselves in court, but self-represented individuals are often afforded more leniency. Arguments for an artificial advocate could be made, especially if the litigant has a disability or learning disorder. Even so, the litigant should still be present to answer questions or be cross-examined, and there’s always the risk that the technology could glitch (think about how many times we’ve told people they’re frozen on Zoom).
Interestingly, Arizona’s Supreme Court created AI Reporters Daniel and Victoria to deliver information about every ruling. In videos posted online, the pleasant, well-spoken avatars explain the court’s reasoning – the court wants to ensure timely access to accurate decisions, presented in an understandable way. The scripts are written by court employees, and generating an AI video takes only about 30 minutes, significantly shortening the time it takes to “get the news out.”
AI certainly has its downfalls – lawyers have been fined for using it to create legal arguments that turned out to be supported by non-existent case law, or “AI hallucinations.” Shortcuts are not always the best route, and fact-checking should be a must for anyone using AI in court. After all, you shouldn’t believe everything you read on the internet.