Lawyer or AI?
- Adam Reed

In mid-July, news broke that a federal judge had ordered two attorneys representing Mike Lindell, the MyPillow CEO, to pay monetary sanctions for filing Motions with false information. It seems the attorneys had used an AI tool to write the Motion and to cite cases. The problem is that the AI tool cited cases that did not exist.
That’s right. AI made it up.
But this isn’t the only example of AI making up information for legal matters. Damien Charlotin, a senior research fellow at the HEC Paris Business School, catalogs legal cases in which lawyers have faced discipline due to AI “hallucinations” (the term for information that AI makes up). He has identified at least 220 examples, including 25 just in the first half of July 2025. You can read more about this trend here: AI Hallucination Cases Database – Damien Charlotin
Using AI for legal work is a risky proposition. There is always the danger of hallucinations and false information. Or the AI simply gets the law wrong. Or it fails to fully understand the user’s needs or intentions.
The inherent problem with AI is that it is “artificial,” meaning it cannot create new thoughts and is dependent on the information available to it. If that information is not large enough, or if it contains biases or inaccuracies, the results AI generates can be unreliable.
The other day I read about a person who was hurt in an accident and said he did not need to hire a lawyer; he would use AI instead. He said it would do just as good a job as a lawyer.
Wrong.
AI does not understand nuance. It has difficulty analyzing uncertain legal areas (and there are a lot of those) and crafting new legal arguments. It sees the world in black and white, and legal issues are seldom black and white.
Further, AI cannot express or understand emotion or pain, except as numbers that have been awarded in other cases. But every case is unique. Every person has a unique pain experience. Your pain might be more or less than that of a person who had a very similar accident on the same day. The difference is that, to put it simply, every person is different. AI cannot take that into account.
So, a person who uses AI to handle his personal injury matter risks missing legal claims he could have asserted, risks undervaluing non-economic damages such as pain and mental anguish, and overall risks resolving his case for far less than it is worth.
That is why you should always hire a human lawyer to handle your legal matter. Only a human can understand nuance, create new or modified legal claims, and comprehend your unique qualities and experiences. Even more importantly, a human can feel empathy. AI cannot put your mind at ease, tell your family that they have someone who will fight for them, or do everything in its power to make sure justice is done for those who have been injured because someone else did wrong. AI cannot have the passion to make sure a company does not put profits over safety, or to make sure that a careless person learns to be safer.
Only a human can do that.
Don’t get me wrong: AI has its place. There are some AI tools that can help your lawyer be more efficient and move your matter more rapidly to resolution. But there are no AI tools that can be creative, passionate, empathetic, and your zealous advocate. There are no AI tools that can give you analysis and advice based on judgment, intuition, and experience. There are no AI tools that can understand and interpret human behavior in the negotiating room or in the courtroom.
Only a human lawyer can do those things.
And that is why you should always hire a human to be Your Lawyer.