State-by-State Guide to AI Disclosure Requirements for Lawyers (2026)

Fifty-three percent of law firms have no AI policy — yet 79% of legal professionals are already using AI tools daily. That gap between adoption and governance is where malpractice claims, bar discipline, and client trust problems live.
If you’re a solo or small firm lawyer using AI for contract review, legal research, or document drafting, you face a practical question that no single source answers well: What exactly do I have to disclose, to whom, and when? The answer depends on your state, your court, and the type of work you’re doing. This guide consolidates every major disclosure requirement into one reference.
Whether you use Clause Labs, ChatGPT, or any other AI tool, this article gives you the compliance roadmap. Start free with 3 contract reviews per month — no credit card required.
The AI Disclosure Landscape in 2026
There is no federal standard for AI disclosure in legal practice. What exists is a patchwork: state bar ethics opinions, individual court standing orders, and the ABA’s Formal Opinion 512, issued July 29, 2024, which provides a national framework but isn’t binding on any state.
The trend line is clear. According to the ABA’s 2024 TechReport, AI adoption among lawyers nearly tripled from 11% in 2023 to 30.2% in 2024. The Clio 2025 Legal Trends Report puts that number at 79% when you count all AI-adjacent tools. State bars are responding with guidance at an accelerating pace — more than 30 states have now issued ethics opinions, practical guides, or formal rules addressing AI in legal practice.
The disclosure obligations fall into three categories: what you must tell your clients, what you must tell courts, and what your state bar recommends or requires as a matter of professional responsibility.
The Master Reference: Key States and Their Requirements
No article can cover all 50 states plus DC in granular detail and remain current for more than a few weeks. What follows is the most consequential guidance from the states where most transactional lawyers practice, organized by the type of obligation imposed.
States with Mandatory or Strongly Recommended Client Disclosure
Florida — Opinion 24-1 (2024)
Florida’s Opinion 24-1 is one of the most detailed state bar pronouncements on AI. Key requirements:
- Lawyers may use AI but must prioritize client confidentiality
- Disclosure is mandatory when AI use impacts billing or costs
- Lawyers remain responsible for the accuracy and competence of work product that relies on AI outputs
- AI-generated work must be reviewed and verified before delivery
Practical impact: If you’re a Florida lawyer using AI for contract review and billing fewer hours as a result, you need to address that in your fee arrangement. If you’re uploading client documents to a third-party AI tool, your confidentiality obligations under Rule 4-1.6 are triggered.
California — Practical Guide on AI (2024–2025)
The State Bar of California published a practical guide emphasizing that attorney competence under Rule 1.1 requires an understanding of large language models before using them — including hallucination risks and data privacy implications. While California hasn’t issued a formal opinion mandating disclosure in all cases, the competence standard effectively requires:
- Understanding how any AI tool you use works
- Evaluating data privacy implications before uploading client data
- Maintaining supervisory control over all AI outputs
Texas — Opinion 705 (February 2025)
Texas Opinion 705 clarifies that human oversight of AI-generated work product is mandatory. The opinion specifically addresses the risk of fabricated citations (the Mata v. Avianca problem) and requires:
- Independent verification of all AI-generated content
- Human supervision of AI as a non-lawyer assistant under Rule 5.03
- Competence in understanding the AI tool’s limitations
New York
New York has been aggressive on AI governance. The state requires at least two annual CLE credits in practical AI competency, with enforcement beginning in 2025. Multiple court systems within New York have adopted AI disclosure rules for court filings, and the NYC Bar Association has published detailed guidance on ethical AI use.
States with Formal Guidance (Advisory but Influential)
Oregon — Formal Opinion 2025-205
Oregon’s Formal Opinion 2025-205 is a thorough treatment of AI ethics obligations. It addresses competence, confidentiality, supervision, and disclosure, closely tracking ABA Formal Opinion 512.
North Carolina
The North Carolina Bar Association published a 2026 guide arguing that law firms need realistic AI policies rather than outright bans. The guidance emphasizes documentation and policy-based governance.
Pennsylvania
Pennsylvania mandates explicit disclosure of AI use in all court submissions. Transparency is a filing requirement in state courts.
Illinois, Massachusetts, Colorado, Georgia, Washington
Each of these states has addressed AI use through bar opinions, CLE requirements, or court rules. The details vary but converge on three themes: competence, confidentiality, and verification.
States with No Guidance (As of February 2026)
Roughly 15-20 states have not yet issued formal AI guidance. If you practice in one of these states, the ABA Model Rules and Formal Opinion 512 are your best framework. The Justia 50-State Survey maintains a current tracker — bookmark it.
For a comprehensive and regularly updated listing of every state’s position, the Clio AI Ethics Opinions guide provides state-by-state detail with links to primary sources.
Federal Court AI Disclosure Requirements
Federal courts have moved faster than state bars. Since Judge Brantley Starr of the Northern District of Texas issued the first standing order requiring AI disclosure in court filings in 2023, over 300 federal judges have adopted similar orders.
These standing orders typically require one or more of:
- Disclosure of AI tool use in drafting or researching any filing
- Certification that all citations have been independently verified
- Identification of which specific AI tools were used
The requirements are not uniform. Some judges require a standalone certification. Others require a footnote. Some apply only to generative AI (ChatGPT, Claude) while others cover all AI-assisted research tools.
Practical advice: Before filing in any federal court, check the assigned judge’s standing orders. Bloomberg Law’s tracker and Law360’s AI tracker maintain current databases.
Note: contract review work rarely involves court filings directly. But if your contract review feeds into litigation — a breach of contract claim, for example — the disclosure requirement kicks in when the AI-assisted analysis becomes part of a filing.
The ABA Framework: Formal Opinion 512
ABA Formal Opinion 512, released July 29, 2024, provides the most comprehensive national framework. It addresses six Model Rules and their application to generative AI.
Rule 1.1 (Competence): Lawyers must understand the capabilities and limitations of any AI tool they use. You don’t need to be a technologist, but you need a “reasonable understanding” — enough to evaluate whether the tool is appropriate for the task. For a deeper analysis, see our guide on ABA guidelines for AI in legal practice.
Rule 1.4 (Communication): Inform clients about AI use when it’s relevant to their representation. Notably, Formal Opinion 512 states that boilerplate consent in engagement letters is not adequate for confidentiality purposes — you need informed, specific consent when uploading client data to third-party AI tools.
Rule 1.5 (Fees): You may not bill clients for time spent learning to use AI tools generally. If a client specifically requests a particular AI tool, learning costs may be billable. The bigger implication: if AI reduces your review time from 3 hours to 30 minutes, your fee arrangement should reflect that.
Rule 1.6 (Confidentiality): Before uploading client data to any AI tool, evaluate the tool’s data handling practices. This includes data retention, training policies, encryption, and sub-processor arrangements. For detailed guidance on this issue, see our article on confidentiality and AI contract tools.
Rule 5.1/5.3 (Supervision): Supervise AI output the same way you’d supervise a junior associate. Review everything. Verify everything. For a practical framework on exactly how to do this, see our guide on supervising AI legal outputs.
Types of Disclosure: Client, Court, and Bar
Client Disclosure
Client disclosure addresses what you tell your clients about using AI in their matters.
When it’s required:
- When uploading client data to a third-party AI tool (confidentiality trigger)
- When AI use materially affects your fees or billing (fee disclosure trigger)
- When your state bar has issued specific guidance requiring disclosure
When it’s recommended but not strictly required:
- For all AI-assisted contract review (best practice regardless of state)
- When clients are likely to have concerns about AI use
- When the matter involves sensitive or confidential business information
Where to disclose:
- Engagement letter (standard practice — add an AI use section)
- Separate AI disclosure addendum (for sensitive matters)
- Ongoing client communication (for new tools or changed practices)
Court Disclosure
Court disclosure is more straightforward: check the standing orders of the court and judge where you’re filing. If a standing order requires AI disclosure, comply. If no order exists, Rule 11 certification already requires you to verify the accuracy of everything in your filing — AI-assisted or not.
Bar Compliance
Your state bar’s guidance governs your ongoing professional responsibility. Even where no formal disclosure rule exists, the underlying Model Rules (competence, confidentiality, communication, supervision) apply to AI use. Document your compliance.
Engagement Letter AI Disclosure Templates
Three templates, scaled to your jurisdiction’s requirements.
Minimal Disclosure (States with No Specific Requirements)
Our firm may use AI-assisted tools to enhance the efficiency of legal services, including contract analysis, legal research, and document review. All AI-generated analysis is reviewed and verified by a licensed attorney before inclusion in any client deliverable. Our firm remains fully responsible for all work product.
Standard Disclosure (States with Recommended Disclosure)
Our firm uses AI-powered contract review and analysis tools as part of our quality assurance process. These tools assist with clause identification, risk analysis, and missing provision detection. All AI-generated analysis is independently reviewed, verified, and supplemented by attorney judgment before delivery. Our AI tools employ encryption for data in transit and at rest, do not retain client documents after analysis, and do not use client data to train AI models. Attorney [Name] maintains supervisory responsibility for all work product.
Comprehensive Disclosure (States with Mandatory Disclosure)
Our firm uses the following AI tools in providing legal services: [Tool Names]. These tools are used for: [specific tasks — e.g., contract clause identification, risk scoring, missing provision detection]. Data handling: client documents are processed via encrypted connections, are not retained after analysis, and are not used to train AI models. [Tool Name] maintains [SOC 2 / relevant certification] compliance. Human review: all AI-generated analysis is independently reviewed and verified by [Attorney Name], who exercises professional judgment on all findings before inclusion in client deliverables. You have the right to request that we not use AI tools on your matter. If you choose to opt out of AI-assisted review, please notify us in writing, and we will adjust our review process accordingly. This may affect the timeline and cost of services.
The Penalty Landscape: What Happens If You Don’t Disclose
The most prominent sanction case remains Mata v. Avianca, Inc., No. 22-cv-1461 (S.D.N.Y. 2023), where attorneys Steven Schwartz and Peter LoDuca were fined $5,000 for submitting AI-fabricated citations. But Mata involved affirmative misrepresentation to the court — not mere failure to disclose AI use. For more on the Mata case and its implications, see our analysis of AI hallucination risks in legal practice.
As of early 2026, no lawyer has been disciplined solely for failing to disclose AI use in transactional contract review. But the trajectory is clear: 300+ federal judges have standing orders, state bars are issuing guidance at an accelerating rate, and over 700 documented incidents of AI-fabricated content in court filings have made courts and bars aggressive about enforcement.
The risks of non-disclosure include:
- Court sanctions for non-compliance with standing orders
- Bar discipline for violating competence, confidentiality, or communication rules
- Malpractice exposure if AI errors cause client harm and your use wasn’t disclosed
- Client trust damage that’s harder to repair than any formal sanction
The calculus is simple: disclosure costs you nothing. Non-disclosure can cost you your practice.
The Universal Compliance Framework: 6 Steps That Work Everywhere
Regardless of your state, these six practices keep you compliant with current and likely future requirements.
1. Add AI disclosure to your standard engagement letter. Use the templates above. Update annually or when your toolset changes.
2. Maintain an AI tool inventory. List every AI tool your firm uses, what it’s used for, what data it accesses, and its security certifications. Review quarterly.
3. Verify all AI output before use. This isn’t optional anywhere. Review every clause identification, risk assessment, and suggested edit against the source document. Our VERIFY framework for supervising AI outputs gives you a repeatable protocol.
4. Document your AI use and human review process. Date, tool, matter, what was reviewed, what was changed. This is your audit trail for any bar inquiry or malpractice claim.
5. Stay current on your state’s requirements. The Justia 50-State Survey and Clio’s ethics opinions guide are the best free trackers. Check quarterly.
6. When in doubt, disclose. Overcompliance beats undercompliance every time. No lawyer has ever been disciplined for disclosing too much about their technology use.
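The tool inventory and the review audit trail are easiest to keep current as structured records rather than ad hoc notes. Below is a minimal sketch in Python showing one possible shape for those records; every tool name, matter number, and field choice here is hypothetical, not a prescribed format, and any practice-management system that captures the same fields works just as well.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class AIToolRecord:
    """One entry in the firm's AI tool inventory (hypothetical fields)."""
    name: str
    used_for: list          # tasks the tool is approved for
    data_accessed: str      # what client data it touches, and how
    certifications: list    # e.g. SOC 2, ISO 27001
    last_reviewed: str      # ISO date of last quarterly review

@dataclass
class ReviewLogEntry:
    """Audit-trail entry documenting human review of AI output."""
    date: str
    tool: str
    matter_id: str
    reviewed: str           # what the attorney checked
    changes_made: str       # what was corrected or overridden
    reviewer: str

# Example records -- all names and IDs are illustrative only.
inventory = [
    AIToolRecord(
        name="Example Contract Reviewer",
        used_for=["clause identification", "risk scoring"],
        data_accessed="client contracts (encrypted in transit, not retained)",
        certifications=["SOC 2 Type II"],
        last_reviewed="2026-01-15",
    )
]

review_log = [
    ReviewLogEntry(
        date="2026-02-03",
        tool="Example Contract Reviewer",
        matter_id="2026-014",
        reviewed="indemnification and limitation-of-liability clauses",
        changes_made="corrected one mis-flagged carve-out",
        reviewer="J. Attorney",
    )
]

# Serialize to JSON so the records are portable for any bar inquiry or audit.
report = json.dumps(
    {"inventory": [asdict(t) for t in inventory],
     "review_log": [asdict(e) for e in review_log]},
    indent=2,
)
print(report)
```

The point is not the code itself but the discipline: a dated, structured record of what tool touched what matter, and who verified the output, is exactly the audit trail a bar inquiry or malpractice defense will ask for.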
Clause Labs is built for compliant AI use: structured output that’s easy to verify, no data retention after analysis, and encryption for all document processing. Start free with 3 reviews per month — no credit card required.
Over 500 lawyers already use Clause Labs for AI-assisted contract review with ABA-compliant data handling. Join them — start free today.
Frequently Asked Questions
Do I need to disclose if I just use ChatGPT to brainstorm contract language?
It depends on your jurisdiction and what you do with the output. If you use ChatGPT to brainstorm and then independently draft the language yourself, most jurisdictions wouldn’t require disclosure. But if AI-generated language appears substantially in a client deliverable, disclosure is prudent. Under ABA Formal Opinion 512, you must also consider whether you’ve uploaded any client confidential information in the process — even pasting a client’s contract clause into ChatGPT may trigger Rule 1.6 obligations.
Do I need to disclose AI use to opposing counsel?
Generally, no. No state currently requires disclosure to opposing counsel in transactional practice. The exceptions are narrow: collaborative law settings, some mediation contexts, and situations where a specific court order applies. In litigation, some federal standing orders require disclosure in filings — which opposing counsel will see.
Can my client refuse to let me use AI?
Yes. If a client requests that you not use AI tools, you must honor that request. Include an opt-out provision in your engagement letter (see the comprehensive template above). Be transparent about how opting out may affect timelines and costs.
Is disclosure required for contract review, or only litigation?
The ABA Model Rules and most state guidance apply to all areas of practice, not just litigation. Rule 1.6 (confidentiality) applies whenever you share client information with a third-party tool — whether you’re reviewing a contract or drafting a brief. The court-specific standing orders only apply to litigation filings, but your ethical obligations to clients are practice-area agnostic.
How often should I update my disclosure language?
Review and update annually at minimum. Update immediately when you adopt new AI tools, when your state bar issues new guidance, or when there’s a material change in how your existing tools handle data.
This article is for informational purposes only and does not constitute legal advice. Consult a qualified attorney for advice specific to your situation. AI disclosure requirements are evolving rapidly — verify current requirements in your jurisdiction before relying on this guide.