Do CEOs Dream of Electric Lawyers? Problems with Automating the Legal Profession, Part I
Can AI really replace lawyers in 12 months? A practicing attorney breaks down the accountability gap.
So a few weeks ago, Microsoft AI CEO Mustafa Suleyman made a bold claim about his company's bread and butter, Artificial Intelligence (AI). Specifically, he stated:
"I think that we're going to have a human-level performance on most, if not all, professional tasks," Suleyman said in the interview that was published Wednesday. "So white-collar work, where you're sitting down at a computer, either being a lawyer or an accountant or a project manager or a marketing person — most of those tasks will be fully automated by an AI within the next 12 to 18 months."
I'll say this plainly: I use AI tools in my own practice. I serve as fractional general counsel to AI companies. I'm not writing this as a skeptic on the sidelines. I'm writing it as a practitioner who works inside the technology every day, and who has spent over two decades advising businesses on exactly the kinds of decisions Suleyman is now claiming a language model can handle.
AI is already embedded in legal workflows such as document review, contract analysis, legal research, billing, and scheduling. That's real, it's useful, and it's accelerating. Any attorney who tells you otherwise isn't paying attention. The question isn't whether AI will change legal practice. It already has. The question Suleyman is actually asking — whether AI can replace the judgment, accountability, and advocacy a human attorney provides — is a different question entirely. And the answer matters enormously to anyone building a company right now.
This series addresses that question directly, across three specific dimensions: professional accountability and ethics, client intent and judgment, and the hallucination problem. Each one represents a real risk gap that founders and operators need to understand. Understanding these risks doesn't mean avoiding AI; it means being able to use it without getting hurt.
Problem One: Professional Accountability Has No AI Equivalent
While many lawyer jokes suggest otherwise, the truth is that lawyers are bound by a strict code of professional ethics. That code imposes duties to clients, including confidentiality, competence, and communication. Penalties for violating it can be severe, up to suspension and disbarment. The code exists because lawyers regularly handle life-altering situations for their clients, and without enforceable ethical obligations, those clients would suffer substantial harm.
So how is AI going to handle this? It has no personality and no real intellect, only a simulation of both. It therefore has no morality, just programmed instructions interpreted without human awareness or experience. And simulation isn't good enough here. Effective, reliable legal work requires human moral judgment, and AI can't supply it.
And That's Really Bad...
The stakes become clear when you look at what happens when AI operates without that accountability framework. Consider a recent simulation in which AI systems were given access to nuclear decision-making: they chose to use nuclear weapons in 95% of scenarios. Not because they wanted to cause harm, but because their objective was to "win," and they had no framework for weighing the human cost of that outcome against the goal.
That optimization logic, pursuing the objective regardless of collateral cost, is exactly what makes AI dangerous as a substitute for legal judgment. In legal practice, the unweighted pursuit of a "win" produces disasters: escalating a contract dispute into litigation when a negotiated resolution was available; disclosing confidential information to resolve a dispute; or the increasingly documented tendency to simply invent citations to "win" a case. As an attorney who has practiced since 2002, I can tell you: judges do not respond kindly to fabricated citations. Courts have sanctioned lawyers, real, licensed attorneys, for submitting AI-generated briefs with hallucinated case law. The accountability gap is not theoretical. It is already showing up in courtrooms.
And Controls and Remedies are Problematic...
When a human attorney commits malpractice, the accountability structure is clear: they can be sued, lose their license, face bar discipline, and suffer reputational consequences that end careers. Those consequences exist because they create incentives for care and judgment. They're also why clients have a meaningful remedy when something goes wrong.
AI has no license to lose, no bar to answer to, and no professional reputation at stake. If an AI tool gives your company bad legal advice that costs you a deal, a regulatory fine, or a lawsuit, your remedy is a contract claim against a vendor whose terms of service almost certainly disclaim liability for exactly that outcome. (This is one of the 13 provisions covered in the AI Contract Red Flags checklist linked at the end of this article.) The accountability asymmetry isn't a detail. It's a structural gap that every company using AI for legal work needs to understand before something goes wrong.
What This Means for Your Business
The point isn't that AI is useless in a legal context. It isn't. The point is that AI operates without the professional accountability structure that makes legal advice reliable, and that gap has real consequences for companies that treat AI output as a substitute for counsel rather than a starting point for it.
Part II of this series examines a related problem: even if you set accountability aside, AI lacks the client intent and judgment that defines effective legal counsel. We'll look at what that actually means for founders navigating disputes, deals, and the increasingly complex legal questions that come with building an AI company.
In the meantime: if your company is signing AI vendor agreements, using AI-assisted contractors, or deploying AI in a product — the contracts governing those relationships are where the legal risk becomes concrete. The AI Contract Red Flags checklist identifies 13 specific provisions across vendor agreements, employment contracts, and customer-facing terms. Download it at vidarlaw.com.
(This post is for informational purposes only and is not legal advice. Specific outcomes depend on facts and jurisdiction.)
