AI’s Promising Role With Expert Testimony
One area where artificial intelligence has gained a strong foothold in litigation practice is in the preparation and cross-examination of expert witnesses. This shouldn’t be surprising. Expert opinions are frequently case-determinative in litigation. For this reason, the litigator’s role in supporting or challenging expert testimony in depositions is a “leave no stone unturned” endeavor. Enter artificial intelligence.
Artificial intelligence tools do a good job summarizing and extracting insights from large amounts of data. They are adept at surfacing inconsistencies in the expert’s report. These might be internal inconsistencies, or inconsistencies among opinions expressed in testimony given in other cases. Artificial intelligence tools can also formulate alternative conclusions that the expert might have rejected or failed altogether to consider.
One example: William and Mary Law School’s Center for Legal and Court Technology published a January 25 article detailing how one law firm used artificial intelligence to review an expert witness’s 63 prior deposition transcripts. The AI review uncovered past statements by the expert that were inconsistent with the report in the current case.
This example shows how AI can be enormously useful when directed to expert testimony. For the proponent, AI review can uncover weaknesses in the expert’s views. For parties challenging the expert’s views, AI review can be a useful source of damaging inquiries on cross-examination.
What else can AI do to help prepare expert witnesses?
- Using a technique known as the “Nordstrom method,” AI can analyze an expert’s recorded rehearsal testimony, check it against prior testimony, spot inconsistencies, and offer suggestions for making it more persuasive.
- AI can translate an expert witness’s technical vocabulary into terms that might be more useful during direct or cross-examination, and more understandable to the factfinder.
- Using a witness-preparation technique called “AI shadowboxing,” AI can simulate opposing counsel’s questioning, helping experts refine their responses before trial or deposition.
- AI can verify (or attempt to verify) the scientific claims made in an expert’s report or testimony.
- AI can quickly synthesize available scientific literature and other relevant information in a way that assists the selection, preparation, and examination of expert witnesses.
Litigators will want to approach these tools with care. They are by now familiar with Kohls v. Ellison, No. 24-cv-3754 (D. Minn. Jan. 10, 2025), where the trial judge rejected an expert report that contained several nonexistent, AI-hallucinated citations. The Ellison case won’t soon be forgotten because, ironically, the subject of the expert’s report was AI’s capacity to mislead.
With both experts and litigators using artificial intelligence tools, the Ellison scenario is likely to happen again (and again) if AI is not used with care and an awareness of the consequences of carelessness. In fact, the parties in LeDoux v. Outliers, Inc., No. 3:24-cv-05808 (W.D. Wash.), were advised Jan. 9 to expect a ruling on the defendant’s motion to strike an expert’s report because, allegedly, it contained dozens of false statements and citations due to AI-generated hallucinations.
The prevalence of AI use among experts isn’t precisely known, although we do know that, according to a 2025 Pew Research survey, 76% of researchers view artificial intelligence positively. Artificial intelligence is also used by large consulting firms to generate government reports. It stands to reason that artificial intelligence outputs are present in many expert reports offered in litigation today.
In the United Kingdom, the November 2025 Bond Solon Expert Witness Survey revealed that a significant number of expert witnesses are using artificial intelligence in court-related engagements. According to the survey, 20% of responding expert witnesses said they had used artificial intelligence tools. That number, while not terribly high, is nonetheless double the figure from the prior year’s survey.
The obvious practice pointer here was noted in the survey report: “If an expert chooses to use AI technology in the construction of their report, it is vital that they double-check the material produced by AI to determine the accuracy of the information. Incorrect information will open the expert up to criticism and potentially impact the outcome of a case.”
Careful litigators might consider providing expert witnesses with guidance that specifically addresses AI-related issues. For example:
- Supply the expert with the relevant jurisdiction’s ethical guidance, along with judicial opinions spelling out the consequences of offering evidence containing AI-generated errors.
- Remind the expert that the client’s confidential information must not be supplied as inputs to AI tools.
- Demand that experts document their use of artificial intelligence tools.
- Insist that experts explicitly confirm that their opinions were formed independently of whatever assistance they may have obtained from artificial intelligence tools.
- Research local disclosure obligations, if any, regarding use of AI in court submissions and take measures to ensure compliance.
Summing up, the most important takeaway is one litigators are by now familiar with: Artificial intelligence is a weapon, a compelling weapon, in the hands of an experienced and capable litigator. That is the message of all the ethics opinions and practice guidance issued to date. Artificial intelligence is not — not yet, and perhaps not ever — a replacement for the lawyer in litigation.