What Self-Reps Need to Know
With the rapid advancement of artificial intelligence, a question on many minds—especially those navigating the legal system alone—is this: Will AI replace lawyers? And more importantly, as a self-represented litigant, can I use AI to help me fight my own legal battle?
The short answer is: AI is already changing how legal work gets done. But using it effectively—and safely—requires understanding both its power and its limits.
How AI Is Entering the Legal Field
It’s no longer science fiction. Many law firms are already integrating AI into their workflows. Tools powered by large language models can assist with legal research, document review, contract analysis, and even drafting basic submissions. The goal is often to reduce time and cost—though whether those savings get passed on to clients, or simply increase billable efficiency, is another question entirely.
For self-represented litigants, this shift is worth paying attention to. If lawyers are using AI to streamline their work, it stands to reason that self-reps might benefit from similar tools. But the key is knowing how to use them responsibly.
My Experience with AI
When I went through my own self-represented battle, AI wasn’t advanced enough to be of much help. But over the past year, I’ve been exploring what these tools can do. And I’ll admit: the potential is exciting.
That said, I’ve also learned that AI is not a magic bullet. It’s a tool—and like any tool, it works best when you understand how to use it.
Getting to Know Your AI
If you’re considering using AI to help with legal research, drafting submissions, or organizing your thoughts, my first piece of advice is this: take the time to get to know the technology. Every AI model has its own “personality,” strengths, and weaknesses. You’ll only discover them through practice and experimentation.
For example, OpenAI reported that GPT-4, the model behind ChatGPT, passed the Uniform Bar Exam in the United States, scoring around the 90th percentile (a figure later research has disputed). On the surface, that sounds impressive, and it does demonstrate the model's ability to process and apply legal concepts. But passing a standardized exam, which mixes multiple-choice questions with canned essay prompts, is not the same as drafting a persuasive legal argument tailored to your specific facts and jurisdiction.
The Risk of Hallucinations
One of the most important things to understand about generative AI is that it can hallucinate. That is, it can confidently produce information that sounds plausible but is completely wrong—including citing cases that don’t exist, misstating legal principles, or inventing statutes.
For a self-represented litigant, this is dangerous. If you file a submission that contains a hallucinated case or an incorrect legal statement, you risk not only losing credibility with the court but also harming your case. Judges expect accuracy. They rely on you to present the law correctly, even if you’re not a lawyer.
How to Use AI Responsibly
So how can a self-rep use AI without falling into these traps? Here are a few practical guidelines:
- Use AI as a starting point, not an authority. Let it help you brainstorm arguments, organize your thoughts, or summarize complex concepts. But always verify everything against primary sources: actual legislation, court rules, and decided cases.
- Treat AI like a junior researcher. It can point you in the right direction, but you wouldn’t rely on a first-year law student without checking their work. The same applies here.
- Double-check citations. If AI gives you a case name or a statutory reference, look it up yourself. Make sure it exists and stands for what the AI claims.
- Understand your jurisdiction. AI models are often trained on broad datasets that include laws from multiple countries and provinces. What’s true in one jurisdiction may not be true in yours. Always confirm that the information applies where your case is being heard.
- Never paste confidential information. Many AI tools, especially free ones, store your inputs and may use them to train future models. Avoid entering details about your case that could identify you, your opponent, or sensitive facts.
The Bottom Line
AI is a powerful tool, and it’s only going to become more integrated into the legal landscape. For self-represented litigants, it offers exciting possibilities—from demystifying legal language to helping structure arguments. But it is not a replacement for your own judgment, diligence, or the need to understand the law that applies to your case.
Used wisely, AI can be an ally. Used carelessly, it can become a liability.
So by all means, explore what AI can do. Just remember: in the end, you are the one responsible for what you file with the court. Make sure it’s right.
And that’s why I wrote Condozilla: to pull back the curtain on the legal system and guide readers through a realistic legal battle—so they know what to expect, what to watch out for, and that it’s possible to win.