The ABA issues new guidance in relation to AI tools used for jury selection

Issue 41
The American Bar Association (“ABA”) has issued a new formal opinion containing AI-related considerations for lawyers. Formal Opinion 517 addresses discrimination in jury selection, advising that a lawyer violates ABA Model Rule 8.4(g) by exercising a peremptory challenge that the lawyer knows, or reasonably should know, would constitute unlawful discrimination.[i] ABA Model Rule 8.4(g) provides that a lawyer commits professional misconduct by engaging in harassment or discrimination on an unlawful basis.[ii]
In relation to AI tools, Formal Opinion 517 advises that a lawyer may not follow an AI tool’s recommendation on a peremptory challenge if the lawyer knows or reasonably should know that acting on it would unlawfully discriminate against the juror.[iii] The opinion further explains that a lawyer’s culpability for discrimination can be harder to assess when the lawyer relies on an AI tool for jury selection.[iv] It offers an example: a lawyer could unintentionally strike a juror for an unlawfully discriminatory reason, such as race or gender, if the AI tool ranks jurors in a biased manner while giving the lawyer seemingly nondiscriminatory explanations for its rankings.[v]
Bias is a well-known AI problem, one that predates the current generation of models and that no model released to date has solved. Biased training data is widely believed to be the most common source, but bias can also arise in the algorithm itself and in the predictions the algorithm produces.[vi]
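To make the training-data pathway concrete, here is a minimal, hypothetical sketch (all data and names invented, not drawn from any real jury selection tool): a scorer “trained” on historical strike decisions that were themselves discriminatory never sees a protected attribute, yet reproduces the biased pattern through a proxy feature, while its scores look like neutral statistics.

```python
from collections import defaultdict

# Hypothetical illustration of training-data bias (invented data).
# Past strike decisions correlated with a protected attribute via a
# proxy feature ("zip"); label 1 means the juror was struck.
history = [
    {"zip": "10001", "struck": 1},
    {"zip": "10001", "struck": 1},
    {"zip": "10001", "struck": 0},
    {"zip": "20002", "struck": 0},
    {"zip": "20002", "struck": 0},
    {"zip": "20002", "struck": 1},
]

def strike_rate_by_zip(records):
    """Learn a per-zip 'strike score' = historical strike frequency."""
    totals, strikes = defaultdict(int), defaultdict(int)
    for r in records:
        totals[r["zip"]] += 1
        strikes[r["zip"]] += r["struck"]
    return {z: strikes[z] / totals[z] for z in totals}

def rank_jurors(jurors, scores):
    """Rank prospective jurors by learned strike score, highest first."""
    return sorted(jurors, key=lambda j: scores.get(j["zip"], 0.0), reverse=True)

scores = strike_rate_by_zip(history)
panel = [{"name": "Juror 1", "zip": "20002"},
         {"name": "Juror 2", "zip": "10001"}]
ranking = rank_jurors(panel, scores)

# Juror 2 is ranked first purely through the zip-code proxy, even though
# no protected attribute appears anywhere in the data or the model.
print([j["name"] for j in ranking])
```

The point of the sketch is the one Formal Opinion 517 warns about: the tool can hand the lawyer a facially neutral justification (a “strike score”) for a recommendation that in fact encodes past discrimination.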
Formal Opinion 517 acknowledges that whether a lawyer reasonably should know a peremptory challenge is unlawfully discriminatory will depend on the facts of the situation.[vii] The opinion advises lawyers to conduct due diligence to gain a general understanding of an AI-powered jury selection tool’s methodology, and it refers them to ABA Formal Opinion 512 for additional guidance on lawyers’ professional responsibilities when selecting AI tools for practice.[viii]
The potential for bias is one of many considerations lawyers should evaluate when selecting a new AI tool for legal practice. Chapter 5 of A Lawyer’s Practical Guide to AI offers a step-by-step guide to conducting due diligence on an AI tool’s risks before you select and implement it. You can get instant access to the guide here.
Thanks for being here.
Jennifer Ballard
[i] ABA, Formal Op. 517, at 1 (2025), https://www.americanbar.org/content/dam/aba/administrative/professional_responsibility/ethics-opinions/aba-formal-opinion-517.pdf.
[ii] Model Rules of Pro. Conduct r. 8.4(g).
[iii] Formal Op. 517, supra at 7.
[iv] Id. at 5.
[v] Id. at 5-6.
[vi] Jake Silberg and James Manyika, Tackling bias in artificial intelligence (and in humans), McKinsey Global Institute (Jun. 6, 2019), https://www.mckinsey.com/featured-insights/artificial-intelligence/tackling-bias-in-artificial-intelligence-and-in-humans; Shedding light on AI bias with real world examples, IBM (Oct. 16, 2023), https://www.ibm.com/blog/shedding-light-on-ai-bias-with-real-world-examples/.
[vii] Formal Op. 517, supra at 6.
[viii] Id.