
Should lawyers embrace ChatGPT, or stick to legal industry AI tools?


Issue 44 

General-purpose AI tools like ChatGPT are currently among the most popular AI solutions used by lawyers. Today I’m examining why lawyers may be choosing general-purpose AI tools over other options, and why they may want to consider giving legal industry AI tools a chance. 

Why Are General-Purpose AI Tools Popular with Lawyers?  

Last year, Clio reported in its Legal Trends Report that general-purpose AI tools like ChatGPT were among the top three AI solutions being adopted in the legal industry.[i] There are a few likely explanations for the popularity of general-purpose AI tools among lawyers. General-purpose AI tools like ChatGPT and Claude have become well known, especially compared to niche legal industry AI tools that lawyers may have to actively seek out to discover. General-purpose AI tools also offer relatively affordable consumer-grade plans, and because they are general-purpose, they can be prompted to perform a wide variety of tasks. Essentially, general-purpose AI tools have served as a gateway AI tool for many lawyers. For those lawyers who are now ready to go deeper, this newsletter explains some reasons why legal industry AI solutions may be preferable to general-purpose solutions. 

Should Lawyers Trust General-Purpose AI Tools Like ChatGPT? 

Lawyers should be aware that general-purpose AI tools may not always be aligned with their professional responsibilities. Last year, the American Bar Association (“ABA”) published Formal Opinion 512, which outlined the risk assessment efforts lawyers should undertake when selecting an AI tool that will receive data related to a lawyer’s representation of a client, including confirming that the AI tool will preserve the security and confidentiality of the information disclosed.[ii]  

Some general-purpose AI tools default to using your data to train their AI models. For example, on August 28, 2025, Anthropic announced that it was updating its Consumer Terms and Privacy Policy for its Claude Free, Pro, and Max plans, which as of September 28, 2025, will default to permitting Anthropic to train on users’ chats unless the user explicitly opts out.[iii] Even when you can opt out of permitting a general-purpose AI tool to train on your data, some legal industry AI tools still offer greater data privacy protections than general-purpose AI tools. This is because some legal industry AI tools access general-purpose AI models, like OpenAI’s GPT models and Anthropic’s Claude models, to provide their legal-specific AI solutions through a technical solution called an application programming interface (“API”) with zero data retention endpoints, an arrangement intended to ensure that customer data stays private.[iv]

Here’s a real-world example of how legal industry AI tools may offer greater data privacy protections: in 2025, a U.S. federal court issued an order directing OpenAI “to preserve and segregate all output log data that would otherwise be deleted on a going forward basis until further order of the Court (in essence, the output log data that OpenAI has been destroying), whether such data might be deleted at a user’s request or because of ‘numerous privacy laws and regulations’ that might require OpenAI to do so.”[v] OpenAI publicly addressed the court order, explaining that the data of ChatGPT Free, Pro, Plus, and Team subscription users is impacted by the order.[vi] Further, OpenAI confirmed that the court order does not impact the data of its API customers who are using zero data retention endpoints pursuant to OpenAI’s zero data retention agreement.[vii] This means that the legal industry AI tool companies that use OpenAI’s models pursuant to a zero data retention agreement are not impacted by the data preservation order, while some direct users of ChatGPT are. You can read more about the OpenAI data preservation order here.

Legal industry AI tools are generally built with the goal of serving lawyers better than general-purpose tools do. As such, most legal industry AI tools aim to improve on the security, privacy, and accuracy of general-purpose tools, because these are issues that matter to lawyers. However, lawyers should not blindly trust any AI tool, even one created for the legal industry. Lawyers should always perform a risk assessment of an AI tool before using it for legal work. 

The People Problem with AI Tools  

Some law firms have opted to permit the use of general-purpose AI tools like ChatGPT or Claude while prohibiting users from inputting any confidential client information into them. Lawyers who are considering this option should know that in a 2025 survey of 1,600 Chief Information Security Officers, human error was the top cybersecurity vulnerability concern.[viii] If you’re considering using a general-purpose AI tool in your law practice, “What do I want users to do with the AI tool?” isn’t the only relevant question. A more important question is, “How could the AI tool be used, possibly in ways I do not intend?” Selecting a legal industry AI tool that offers greater data protection and security than a general-purpose AI tool is one way to mitigate the risk of an employee inadvertently inputting confidential client or organization information into a general-purpose AI tool.   

What Other AI Tool Options Exist for Lawyers? 

There are now hundreds of legal industry AI tools on the market, which collectively offer over 60 ways lawyers can use AI. In addition to potentially offering greater data protection, some legal industry AI tools have automated legal prompting, or do not require prompting at all, while general-purpose AI tools can require an investment of time to learn how to improve output through legal prompt engineering. While general-purpose AI tools have understandably served as an AI entry point for many lawyers, many other legal industry AI tool opportunities await the lawyers who take the time to find them. 

AnnouncementI’m currently preparing to record a CLE called “How to Pick the Best AI Tool for Your Law Practice”. Once I release the CLE, I’ll provide my newsletter subscribers with an exclusive discount code. If you already subscribe to my newsletter, thank you! If you know someone who might like access to this discount code for my newsletter subscribers, please share this issue of the newsletter with them, and encourage them to sign up for my newsletter here before the CLE is released. Additionally, if you would like me to prioritize applying for CLE accreditation in your state, please send me an email at [email protected].

Thanks for being here. 

Jennifer Ballard
Good Journey Consulting


[i] Legal Trends Report at 32, Clio (2024), https://www.clio.com/wp-content/uploads/2024/10/2024-Legal-Trends-Report-Full-Publication.pdf. 

[ii] ABA, Formal Op. 512, at 11 (2024), https://www.americanbar.org/content/dam/aba/administrative/professional_responsibility/ethics-opinions/aba-formal-opinion-512.pdf. 

[iii] Connie Loizos, Anthropic users face a new choice – opt out or share your chats for AI training, TechCrunch (Aug. 28, 2025 13:43 PDT), https://techcrunch.com/2025/08/28/anthropic-users-face-a-new-choice-opt-out-or-share-your-data-for-ai-training/.  

[iv] Security & Encryption, Spellbook, https://www.spellbook.legal/security (last visited Sept. 23, 2025).  

[v] Order dated May 13, 2025, at 2, In re: OpenAI, Inc., Copyright Infringement Litigation, No. 25-md-3143 (S.D.N.Y. centralized Apr. 3, 2025), document in relation to The New York Times Company v. Microsoft Corporation et al., No. 1:23-cv-11195 (S.D.N.Y. filed Dec. 27, 2023).  

[vi] Brad Lightcap, How we’re responding to The New York Times’ data demands in order to protect user privacy, OpenAI (Jun. 5, 2025), https://openai.com/index/response-to-nyt-data-demands/. 

[vii] Id. 

[viii] Proofpoint’s 2025 Voice of the CISO Report Reveals Heightened AI Risk, Record CISO Burnout, and the Persistent People Problem in Cybersecurity, Proofpoint (Aug. 26, 2025), https://www.proofpoint.com/us/newsroom/press-releases/proofpoint-2025-voice-ciso-report. 
