I’m a mid-career lawyer who’s seen many technology hype cycles. AI in legal services is actually making a difference, but not always in the ways the press claims. I want to share how AI is transforming day-to-day work at law firms in the United States and Canada, from Big Law to solo practitioners, what’s working versus what’s overhyped, and how it’s impacting the cost of legal services.
Introduction: Hype vs Reality in Legal AI
A few years ago, “robot lawyers” were the hot topic, and many were skeptical. Fast-forward to today, and AI has gone from buzzword to daily use for lawyers like me. In 2023-24, the adoption of AI in law surged. One industry report showed that AI usage among lawyers jumped from 19% to 79% in just one year. Everyone started trying out software like ChatGPT for work. I went from casually experimenting with artificial intelligence to relying on it for tedious tasks.
That said, not all the hype panned out. We’ve seen grand promises (like a chatbot “lawyer” that can beat any parking ticket) meet harsh reality: regulatory pushback and hilariously bad legal citations. In this post, I’ll explain how AI is used in legal practice: the good, the bad, and the game-changing. I’ll cover concrete examples from both the United States and Canada, as I’ve worked in both and noticed many similarities (and a few differences) in adoption trends.
AI for Legal Research and Writing – My New “Associate”
One of the most exciting areas has been legal research and drafting. I joke that AI has become my new junior associate, minus the coffee runs. Three standout examples are Casetext’s CoCounsel, Harvey, and Caseway:
CoCounsel (Casetext) – Launched in 2023, it’s an artificial intelligence legal assistant built on GPT-4 that can research case law, summarize documents, and even draft briefs or memos. Big law firms jumped on this early. For instance, 600-lawyer firm McGuireWoods and global giant DLA Piper announced they’re using CoCounsel to help attorneys with tasks like reviewing contracts and prepping depositions. I’ve used CoCounsel for quick research. You can ask in plain English, such as “Find five cases that support a negligence claim for icy sidewalks in Illinois,” and it provides results with summaries. It’s imperfect, but it often surfaces relevant cases faster than I could manually review them. Importantly, it keeps data secure and private, which is enormous for client confidentiality.
Harvey – Another generative AI provider (backed by the OpenAI Startup Fund) that made headlines when Allen & Overy (a Magic Circle firm) rolled it out to 3,500+ lawyers. Harvey uses OpenAI’s models to assist in document drafting and research. A partner at A&O stated that not utilizing AI would soon become a “serious competitive disadvantage.” They also said that Harvey was saving their lawyers “a couple of hours a week” on routine work. In my practice, saving a few hours on research or first-draft writing is gold. That time gets redirected to higher-value strategy or to talking with the client. As more law firms follow this lead, they integrate AI into the research and drafting workflow. (Even the makers of Westlaw and Lexis are adding GPT-powered features so they don’t get left behind.)
Caseway
Caseway is a legal research assistant that uses machine learning to speed up tasks such as researching case law, reviewing contracts, and drafting documents. Its chatbot-style interface can sift through hundreds of millions of court decisions in seconds, then deliver answers to legal questions along with relevant case citations for verification.
Caseway has gained traction quickly, with over 2,000 lawyers and non-lawyers adopting it since its launch in September 2024. Partners at major law firms have praised its ability to answer nuanced queries clearly and succinctly, backed by strong citations. This leads to significant time and cost savings in case law research. By automating tedious research, Caseway enables lawyers to focus on higher-value work and client service, while safeguarding confidentiality with enterprise-grade data encryption and robust privacy compliance measures.
But there are still issues that need to be addressed with law firms using machine learning software…
Remember the lawyer who got busted for submitting a brief full of fake case citations courtesy of ChatGPT? (Yep, ChatGPT invented cases that didn’t exist.) That happened in mid-2023, and the court was not amused. A New York attorney faced sanctions because he trusted an AI’s output without verifying it. Following that cautionary tale, firms (including mine) established rules: AI can assist in drafting, but a human lawyer must verify every quote and citation.
Modern Systems Reshaping Legal Work
The Federal Court of Canada now explicitly requires lawyers to disclose whether they used generative AI in a court filing and confirm that a human has reviewed the content. Courts in the United States haven’t yet universally mandated disclosure, but judges are issuing similar warnings. So, while AI is like a super-smart assistant, it occasionally makes confident but incorrect claims. Our law firm has learned to double-check everything it produces (“trust, but verify,” with a heavy emphasis on the “verify”).
It’s not just fancy new GPT, either. We’ve had earlier artificial intelligence in legal research. Before it shut down, ROSS Intelligence was a top AI legal research platform. It launched around 2014 using IBM Watson technology. Some colleagues tried it for finding case law by asking natural questions. ROSS had success with a few law firms, but it shut down in 2020 after Thomson Reuters, the owner of Westlaw, sued for allegedly scraping proprietary data from its platform. (Ironically, that lawsuit might’ve slowed down AI progress for a bit, and highlighted how valuable legal data is, but it didn’t stop the wave. Thomson Reuters later bought Casetext/CoCounsel in 2023, as if to say, “if you can’t beat ‘em, join ‘em”.) Or maybe TR was just trying to buy itself time to come up with its own AI product, who knows?
I now use machine learning software as a starting point daily. It’s like having a very eager first-year associate. The software drafts a decent “first cut” of a memo or contract, but I still have to review, edit, and ensure it’s accurate. Having AI generate a quick outline or summarize a 50-page case is a lifesaver when I’m swamped. Basically, it’s good for first drafts.
Smarter Document Review and Contract Analysis
Another area AI truly shines (and is already pretty mature) is document review and contract analysis. This is about handling large volumes of text. It can involve thousands of emails in discovery or stacks of contracts in a deal. In the past, we’d staff armies of junior lawyers or paralegals to slog through due diligence. Now, we have artificial intelligence software that can perform the initial heavy lifting much faster.
For example, my legal practice started using Kira Systems a few years back for M&A due diligence. Kira, a Toronto-based startup now part of Litera, utilizes machine learning to identify and extract clauses in contracts. This includes things such as “change of control” and “termination,” among others. Instead of manually reading 100 leases for assignment clauses, Kira flags them in minutes.
Similarly, another company, Luminance (from the UK), is used by many BigLaw firms worldwide; it can identify over 1,000 concepts in contracts and even spot anomalies that deviate from standard language. According to Luminance’s data, it’s trusted by over 400 organizations and a quarter of the world’s largest law firms for contract review tasks. That’s a considerable adoption. Essentially, contract review done by machines is becoming standard practice for significant transactions.
Digital Transformation of Law Firms
I’ve used different AI products on due diligence projects, and they’re a massive time-saver. What might take a team of associates two weeks can be accomplished in just a day. There is a case study where a legal team using Luminance reviewed an 80-page contract in five minutes by having the AI identify key points. That feels almost like cheating, but in a good way. Of course, lawyers still need to interpret the findings and conduct nuanced analysis. But AI handles the tedious “find the needle in the haystack” work at scale. It also reduces human error (like overlooking a rogue clause on page 87 of a document at 2 am).
Spellbook, a Canadian AI company
Even contract drafting has gotten an AI upgrade. Instead of just reading documents, AI can assist in writing them. I’ve been playing with Spellbook, a Canadian AI add-in for Microsoft Word that suggests contract language as you draft (it’s powered by GPT-4, right inside Word). It can, for example, suggest a confidentiality clause based on context or flag that “Hey, you mentioned arbitration here, but no clause is provided.” The company claims it can let lawyers draft and review contracts up to four times faster. While 4× might be optimistic, I find it handy for not missing standard provisions. It’s like Grammarly but for legal clauses.
Chatbots, Virtual Lawyers, and Legal Aid: AI Frontlines for Clients
Not all AI in law is for lawyers behind the scenes; some of it faces the public (for better or worse). AI chatbots have emerged to assist people with basic legal issues directly. The poster child here is DoNotPay, an app that brands itself as “the world’s first robot lawyer.” DoNotPay began by helping people fight parking tickets through a simple Q&A chatbot, then expanded to drafting form letters for refunds, cancellations, small claims, and more. The idea is accessible legal self-help: fight bureaucracy without needing a lawyer.
The flashy marketing of “robot lawyer” apps, such as DoNotPay, highlights the hype surrounding AI in law. In practice, these kinds of software have had mixed results. They can help automate simple legal tasks, but they’re far from replacing attorneys, and regulators have cracked down on overblown claims.
DoNotPay’s CEO even planned to have an AI argue in an actual court. In early 2023, he announced a stunt: an accused traffic violator would wear smart glasses with an earpiece, and a chatbot (via ChatGPT) would feed him arguments in real time during traffic court. This “AI lawyer in your ear” scheme made headlines, and then got shut down fast. State bar officials threatened that unauthorized practice of law charges and even jail could follow if they went through with it.
Needless to say, they pulled the plug on that experiment. It was a good reality check: courts and regulators aren’t ready to let an AI replace a licensed attorney in a courtroom. And honestly, the technology isn’t ready for that either (imagine trusting the defence of a drug charge entirely to a bot).
Integrating Technology into Legal Services
DoNotPay also got in trouble with the FTC in the United States for its bold marketing. It had claimed things like “fight corporations, beat bureaucracy, sue anyone at the press of a button,” basically implying its AI was as good as a lawyer. The FTC found those claims misleading. In late 2024, DoNotPay settled and paid $193,000, agreeing to stop calling itself the “world’s first robot lawyer” and to warn users of its limitations. (Fun tidbit: the FTC noted DoNotPay never even hired a lawyer to check its AI’s outputs. So it was truly running without a human in the loop, a big no-no for legal advice.)
That doesn’t mean AI chatbots are useless, as they can help with access to justice in many ways. For straightforward, low-stakes matters (such as contesting a $50 parking ticket or obtaining a refund for a poor airline experience), a simple automated software can guide individuals through the process at minimal to no cost. Even courts and nonprofits are getting into this game.
In British Columbia, Canada, an online dispute resolution system (the Civil Resolution Tribunal) exists for small claims and condominium disputes. It uses a question-and-answer interface to help parties settle their disputes – not exactly an AI lawyer, but a step toward automating dispute resolution.
Meanwhile, in the United States, groups like Code for America are working on AI to help clear criminal records. One pilot uses AI to scan California court data and automatically identify people eligible for expungement of old convictions, thereby eliminating the need for them to file paperwork. That’s huge for access to justice. There are thousands of people with old records who could get relief, even if they can’t afford an attorney, because the AI flags their case for the court.
The Rise of Smart Law Firms
Even simple legal aid chatbots can make a difference. Some state courts and legal aid websites offer virtual assistants that answer common questions, such as “How do I file for divorce? Where can I find information on tenant rights?” and assist with form completion. These are often rule-based, not super fancy GPT, but I suspect we’ll see more GenAI-powered versions soon that can give more conversational guidance (within ethical limits).
AI is empowering consumers of legal services, not just lawyers. It hasn’t replaced lawyers (and won’t anytime soon for complex matters), but it’s automating the tedious paperwork and guidance for simpler issues. As a lawyer, I think this is great. It frees us up to focus on the tough cases and clients who truly need counsel, while routine stuff can be handled or at least triaged by an app.
The caution is to ensure that the information is accurate and that people understand a chatbot is not a substitute for a qualified lawyer. We don’t want folks relying on bad legal advice from a glorified FAQ machine. So far, the approach in Canada and the United States has been to keep a human in the loop or at least in oversight for AI-driven legal assistance.
What’s Working, What’s Overhyped (Keeping It Real)
From my experience and watching others in the industry, here’s a quick breakdown of where AI is delivering and where it’s mostly hot air…
What’s actually working:
Document Review & Diligence: AI excels at efficiently reviewing documents. It identifies key clauses and flags issues more quickly than paralegals conducting a manual review. Many large firms utilize software like Kira and Luminance, and they’ve become trusted aids, with over 25% of the world’s largest firms using Luminance. In e-discovery, AI-driven review has been proven to reduce costs and time; it’s now a standard practice in significant cases to use predictive coding.
Legal Research & Analytics: Attorneys are becoming increasingly comfortable with utilizing AI software to search for cases or analyze litigation data. A 2024 survey revealed that approximately 68% of law firm professionals now utilize legal analytics software, such as Lexis’s Lex Machina, to analyze judges, outcomes, and other relevant factors, up from 36% in 2018. Clients even expect this: 80% of lawyers said clients assume the firm is using these analytics to be strategic. On my end, AI research assistants (CoCounsel, etc.) are consistently reliable at summarizing the law and providing a starting point, which is a significant efficiency boost.
Generating First Drafts
For routine tasks, such as drafting contracts, sending demand letters, and composing memos, generative AI is a significant time-saver. It saves me hours of work by handling boilerplate and basic explanations. One study estimated AI can save lawyers around 4 hours a week, adding up to $100,000 in extra billable capacity per lawyer per year. That tracks with my experience. My hours are now dedicated to strategic work, while AI handles the routine tasks.
AI isn’t making decisions for us, but it’s offering insights. Prediction software can estimate win rates for specific motions before Judge X or indicate the duration of cases in Court Y.
Blue J Legal, a Canadian company, utilizes this functionality for tax and employment law. It predicts likely outcomes based on past rulings. It’s not perfect, but it helps guide decisions. I’ve used it to frame conversations with clients, like when there’s an 80% chance a worker will be seen as an employee, not a contractor, and it makes more sense to settle.
What’s Overhyped or Still Problematic
You’ve seen the claims that AI is replacing lawyers. It’s hype. AI isn’t arguing motions or negotiating deals on its own. Just look at DoNotPay. It attempted to enter the courtroom and failed miserably: it ran into legal restrictions and wasn’t sufficiently competent. AI lacks judgment, ethics, and a license. The “robot lawyer” dream is fiction, for now. AI supports lawyers and, in some cases, self-represented litigants. But it’s not a lawyer.
Some people thought AI was ready to be trusted blindly. Then came the fake citations. That myth died fast. AI can “hallucinate.” This is a nice way of saying it makes stuff up when it doesn’t know the answer. Lawyers who forget that get burned. The ABA stepped in with ethics guidance in 2024, recommending that AI be treated like a junior staffer. You’re responsible for reviewing its work and ensuring the safekeeping of client information.
Some believe AI can solve complex legal questions with precision. Not quite. AI works best with pattern-heavy tasks, such as spotting trends across dozens of similar contracts or estimating the typical length of sentences. But on novel legal questions or complex litigation, AI often stumbles. It either gets vague or confidently wrong. I’ve tested it on edge cases and usually get generic advice or outdated logic. When the data’s thin, AI guesses. This is fine when you ask about recipes, but law isn’t a field where guessing works.
The Future of Technology in Law Firms
Some pitch AI as objective, claiming it removes human bias. Not true. If you train AI on biased data, like sentencing trends that historically punished certain groups, it can repeat or even amplify that bias. We’ve already seen this with “risk assessment” software in United States bail and sentencing hearings. These systems produced racially biased results. Canada and the United States both closely monitor this issue, particularly in government use. Any claims that “AI = fairer law” need a serious reality check.
AI works best when it stays in its designated helper role. It boosts productivity, automates tedious tasks, and provides data to support informed decisions. But it’s not a lawyer. It doesn’t reason like we do. The best uses are the ones that respect that line—and keep the judgment calls where they belong.
Impact on Access, Costs, and the Lawyer’s Life (BigLaw vs Small Firm)
One thing I love about this AI wave is how it’s forcing the legal industry to rethink our business models and improve accessibility. A few changes stand out.
AI makes lawyers faster. That should mean lower costs for clients, unless law firms keep the savings as profit. There’s growing pressure to move away from the billable hour. If AI cuts 30% of the time needed for a task, clients will expect to pay less, or at least a flat fee.
We’re already seeing this shift. Firms adopting AI are leaning into fixed-fee models and still staying profitable thanks to time saved. In my practice, I’ve started offering flat-rate packages for services such as contract drafting and compliance reviews. With AI, I can complete them quickly, and smaller clients can finally afford legal help.
Access to Justice
We previously touched on chatbots, but AI is also making legal help more accessible. It won’t close the justice gap on its own. Some problems still need a human lawyer, but it’s a solid step forward.
Courts in Canada and the United States are experimenting with plain-language AI summaries of decisions and software that guide self-represented litigants through the form-filling process. If a tenant can generate a decent letter to a landlord or figure out how to file a complaint using a free AI assistant, that’s a win.
The Canadian Judicial Council has even published AI guidelines. They see a role for AI in assisting with tasks such as translation, transcript generation, and evidence summaries. All of these can expedite the process and enhance court access.
Small Firms and Solos Are Levelling Up
In the past, advanced technology was a hallmark of BigLaw. Huge software budgets. Dedicated IT teams. Now, AI is cloud-based, cheap, or even free. That’s changed everything.
I know solo lawyers using ChatGPT or Bing Chat for fast research. They use software like Spellbook or Clio’s AI to automate document drafting and handle client intake. It enables them to serve more clients faster, without the need to hire.
A survey in Canada found that by the end of 2024, 25% of law firm professionals were already using generative AI. Another 24% planned to start in the following year. Solos led the charge, increasing tech spending by 56%, compared to about 20% for firms overall. They view AI as a means to compete without increasing headcount.
BigLaw’s Edge Still Exists
That said, BigLaw hasn’t been sleeping. They’re throwing serious money at AI. Some have exclusive deals, like Harvey’s partnership with Allen & Overy, or are building their software using years of internal data. That gives them robust in-house systems that a small firm can’t easily match.
They also utilize AI across the board. This often includes billing, client service, and so on. And they’ve got protocols. Risk committees. Bias audits. Security reviews. If they deploy AI, it’s often through locked-down enterprise systems.
Compare that to a solo lawyer pasting client info into ChatGPT on a whim, without realizing the data could be retained or used for training. That’s a real problem. I’ve seen it happen. Big law firms mitigate those risks with closed systems. Small law firms need to stay sharp or risk exposure.
In-House Legal Departments
Quick note. Corporate legal teams are getting into AI too, just at a slower pace than law firms. A late 2024 Canadian survey found only 15% of in-house teams had adopted general AI, compared to 25% of law firms. Budget limits and tricky IT setups often slow down in-house adoption.
Still, interest is high. Many in-house lawyers see big potential in using AI for contract management, compliance, and risk triage. I recently spoke with a GC who’s looking to use AI to scan the flood of NDAs their company receives. The idea is to flag the risky ones and skip the rest. That kind of workflow can cut legal costs significantly.
Of course, these changes raise deeper questions. If AI handles first drafts, how do junior lawyers learn? Are we leaning too much on automation and opening the door to avoidable mistakes? These are live debates. Bar associations across the United States and Canada are publishing guidance. The ABA now says that understanding AI is part of a lawyer’s duty of competence. Canadian regulators are saying the same: lawyers must stay current with tech or risk falling behind.
Some are pushing law schools and CLE programs to teach AI literacy. Not coding, but how to prompt effectively, spot errors, and use AI ethically. I had to learn all that by trial and error. It’s now as essential as knowing Westlaw or Excel.
United States vs Canada: Similar Journey, Different Pace?
I get asked a lot which country is ahead on legal AI. Having worked in both, I’d say the trends are almost parallel, but the pace and culture differ.
United States firms, especially in BigLaw, embraced AI quickly once GPT-powered software hit. That stat, 79% of lawyers expected to use AI by 2024, comes mostly from the United States. Canada took a more measured approach. Smaller budgets and a “wait-and-see” culture may explain it. The Appara survey from mid-2024 showed about 25% of Canadian legal professionals using gen AI, which puts Canada slightly behind.
That said, the gap is closing. Interest has exploded. Even conservative Canadian firms are now piloting GPT-based software for research and automation.
Ironically, Canada has been a legal AI innovation hub. Kira Systems, Caseway, ROSS Intelligence, Blue J Legal, and Lexum, each started in Canada. Canadian law schools and incubators were early movers in the AI space. For its size, Canada punches well above its weight.
The United States dominates in scale, with major vendors like Lexis and Thomson Reuters and a flood of startups. But per capita, Canada holds its own. Blue J, for example, trained its system on decades of tax rulings and can predict case outcomes with over 90% accuracy in some scenarios. Tools like that are now used by lawyers in both countries. The technology flows freely across the border.
Court System Involvement
Canadian courts and government have been proactive in examining AI. The Federal Court of Canada’s 2023 initiative, which includes issuing guidelines for the use of AI in court and exploring the court’s internal use of AI, is a notable step. The guidance requires lawyers who use AI in drafting to disclose it, and the court itself is exploring AI to assist with tasks such as scheduling and administrative efficiency. In the United States, there is no single, unified approach due to the fragmented state and federal system.
However, we do see individual actions: for example, the California court system established a task force in 2024 to evaluate the use of generative AI in the court system and draft rules surrounding it. Some states have proposed or implemented rules that require lawyers to review and verify AI outputs to ensure that no confidential information is disclosed. Both countries are aware of and working on the governance of AI in the justice system, with Canada possibly a half-step ahead in terms of formal guidance, due to the Federal Court notice.
Regulation of the AI Industry
Outside of legal practice, Canada is moving toward regulating AI through legislation, specifically the proposed Artificial Intelligence and Data Act under Bill C-27, as noted on whitecase.com. It’s a broad law that will impose requirements on AI systems, particularly those with high impact, to ensure transparency and mitigate risk.
If that passes, any AI used in high-risk legal services might require specific compliance measures (for example, if an AI is used to make decisions affecting legal rights, it could be classified as high-impact). The United States doesn’t have a federal AI law yet, although there are sectoral guidelines, and the EU’s AI Act will likely influence everyone. For lawyers, it means we may soon have more formal rules governing which AI software we use and how. We’re keeping an eye on it – no one wants to accidentally adopt an AI tool that is later deemed non-compliant.
Privacy and Data
Canada takes privacy seriously. Laws like PIPEDA, and even stricter provincial rules, often outpace United States standards. That matters when using cloud-based AI. Canadian firms typically need to store data in Canada or have robust agreements in place if it crosses borders.
In contrast, many United States lawyers face fewer complex restrictions. Their duties regarding confidentiality are fundamental, but state-by-state rules often leave more room for flexibility. So while a United States firm might jump into a new AI software, a Canadian law firm is more likely to pause and check compliance, or confirm that the servers are local.
I’ve seen this play out firsthand. In cross-border cases, Canadian co-counsel have refused to use AI software unless the provider hosted everything in Canada or signed a strict data agreement.
Still, the core AI use cases are the same. Lawyers in Toronto and Vancouver are using AI for research, document review, and drafting, just like their counterparts in New York and Los Angeles. Everyone’s wrestling with the same ethical questions. Everyone is trying to serve more clients more efficiently.
We’re also seeing real cross-border adoption. Toronto-based Alexsei, which generates AI-driven legal memos, is now expanding into parts of the United States, where lawyers use it for rapid state-law summaries. Meanwhile, Canadian lawyers use US-built platforms like Casetext to handle Canadian case law, just by switching databases. The tech is flexible. The flow goes both ways. We’re building a shared legal AI ecosystem.
The Road Ahead With Technology in Law Firms
I never thought I’d say this, but it’s a fun time to be a lawyer. AI is dragging the profession, kicking and screaming, into the future. And it’s mostly a good thing. I’m more productive. I spend more time thinking about strategy and client advice, and less time on drudgery like document tagging or checking boilerplate for the fifth time.
So, where is this going?
AI will become invisible. Like Westlaw or Word, it’ll just be part of the toolkit. A few years from now, we won’t say, “I’m using AI.” We’ll use it, embedded in billing software, practice management tools, and even our inboxes.
We’ll also see new rules. Courts and bar associations will require more disclosures. For example, Canada’s Federal Court already expects you to say if you used AI in filings. Others will follow. CLE programs will start treating AI literacy as a core skill.
And law school? It’ll change, too. Today’s graduates are expected to be familiar with e-filing and online research. Tomorrow, they will need to know how to prompt an AI to draft a memo or analyze a judge’s ruling history. We might even see AI certifications, “prompt fluency,” become a real thing.
That said, we can’t forget the basics. Knowing how to use an AI doesn’t mean you know how to argue a motion. Future lawyers still need the fundamentals. AI should enhance legal thinking, not replace it.
Legal Operations and Automation Trends
Clients stand to benefit the most if we get this right. Routine legal services could become far more affordable and accessible. As cost and friction drop, more people might seek legal help for problems they’d usually ignore. That could mean hiring a lawyer or using a legal app. Either way, access to justice improves. This is not because AI replaces lawyers, but because it enables us to serve more clients better and at a lower cost. It can also support the public directly with simpler legal needs.
The legal justice system itself might become more efficient. Picture AI triaging cases, drafting basic decisions for judges to review, or guiding self-represented litigants through forms. Courts would certainly welcome fewer errors and delays.
But we need to stay sharp. Security matters more than ever when we use cloud-based AI. We also need to be aware of potential bias in the results that AI suggests. And no matter how advanced the tech gets, we can’t let it override our judgment. I like how the ABA puts it, treat AI like a junior colleague who means well but isn’t always right. You’re still the one responsible for what it does. That mindset will save us a lot of grief.
Technology in Law Firms
From where I sit, I’m optimistic. AI in law can feel like having a superpower, like getting a polished contract draft in seconds or using an AI insight to win a motion because we strategized better. It’s made practicing law more enjoyable (less grunt work!) and, in many cases, improved outcomes for clients.
Sure, there are headaches. Learning new systems, double-checking AI’s accuracy, and convincing old-school partners to give it a chance can be tough. But the direction we’re heading is obvious.
In both the United States and Canada, AI isn’t some distant future. It’s already in our documents, our courtrooms, and even on our clients’ phones. The lawyers who embrace it, carefully, will gain an edge. Those who don’t risk falling behind. One innovation partner said it might soon be malpractice not to use AI, thanks to the improvement and insight it brings. We’re not there yet, but the fact that bar regulators and courts are already talking about it shows how fast things are changing.
Thanks for sticking with this brain-dump! I’d love to hear how others are using (or avoiding) AI in their legal work. Got any great stories, tools, or disasters to share? One thing’s certain, the future’s wide open, and it’s a fascinating time to be a lawyer in the middle of all this change.
Sources
ABA Legal Technology Survey Report 2024 (ABA News) – lawyers leveraging AI for research; 85% using e-filing; growing use of AI in discovery.
Canadian Lawyer survey (Appara, 2024) – 25% of law firm professionals in Canada have invested in gen AI (15% in-house); law firms outpacing in-house in AI adoption.
Reuters (Feb 2023) – Allen & Overy’s 3,500 lawyers using Harvey AI; partner says not adopting AI will be a competitive disadvantage; a couple hours saved per lawyer per week.
Luminance Press Release (2022) – AI contract review tool used by 400+ orgs, including 1/4 of world’s largest law firms.
Gizmodo (Sep 2024) – FTC settlement with DoNotPay over “robot lawyer” claims; $193k fine; company never tested AI with actual lawyers.
NPR (Jan 2023) – DoNotPay’s attempt to have an AI “lawyer” argue a traffic ticket case halted by threats of prosecution; plan involved smart glasses feeding ChatGPT answers to defendant.
Reuters (Dec 2020) – ROSS Intelligence (AI legal research startup) shut down, citing Westlaw’s lawsuit over data usage.
Legal Dive (Feb 2024) – Nearly 70% of law firm professionals now use legal analytics tools (up from 36% in 2018); clients expect data-driven insights.
Federal Court of Canada Notice (2023) – Requires disclosure of AI-generated content in court filings and emphasizes “human in the loop” verification.
California Courts (2024) – Chief Justice created a task force on generative AI to explore uses in court administration and potential new rules.
ABA Formal Ethics Opinion 512 (2024) – Lawyers using generative AI must ensure competence, confidentiality, and accuracy; warns of risks like fabricated citations.
Image – DoNotPay “World’s First Robot Lawyer” ad – illustrates the bold claims of AI legal tech (source: Gizmodo, via DoNotPay website).
Thomson Reuters 2024 Report (blog) – 77% of professionals say AI will have high impact on work; AI could save 4 hours/week and $100k in billables per lawyer annually.
Code for America project – Using AI to identify cases eligible for expungement in California, automating record clearance for greater access to justice.
Spellbook (LawNext blog) – AI contract drafting assistant using GPT-4, claims to speed up drafting 4× within Word.
Author: Jordan Mitchell, JD Jordan Mitchell is a litigation attorney with over a decade of experience in both private practice and legal operations consulting. After earning a law degree from the USC Gould School of Law, Jordan spent eight years in civil litigation before transitioning into a role in legal tech strategy.