Expert Matters - The Podcast

Each month, CEO of EWI, Simon Berney-Edwards, and Policy Manager, Sean Mosby, will take an informed look at developments in the world of expert witnesses and expert evidence. There will also be updates on what's happening at EWI, as well as longer form content including interviews and in-depth discussion of key issues for the expert witness community.


Did you know you can get CPD hours for listening to our Podcast?

Anyone who is a registered user on our website can record their time listening to the Podcast in their CPD Log via their My EWI page.



How not to use AI in expert evidence

by Sean Mosby


Summary

In this US case, an expert in fiduciary services used Microsoft’s Copilot to cross-check calculations in his expert evidence. He was unable to recall the prompts he had used, state the sources Copilot relied on, or explain how the tool worked and arrived at its outputs. The judge provided some useful insight into the challenges of using AI in expert evidence.

Learning points

Learning points for experts:

  • Do not use an AI tool unless you fully understand it and can explain how the tool works, how you have used it, how it generated the results, and what the results mean.

  • If an AI tool is not generally utilised in your field, consider whether it is appropriate to use it for expert evidence.

  • Do not rely on an AI tool if the results are not fully replicable.

  • Check the AI tool to see if its developers consider it sufficiently reliable to be used in court proceedings.

  • Be aware that AI tools are often trained primarily on data from non-UK jurisdictions.

  • If you wish to use an AI tool, tell your instructing party as soon as possible which tool you intend to use, explaining why you require it and how you intend to use it.

  • Disclose your use of the AI tool to all parties and the court, clearly setting out the prompts and configurations you have used.

  • Make sure you use the AI tool thoughtfully and be extremely careful before entering any confidential or personal information. Remember that AI tools will generally retain any information you enter, and the organisation managing the tool may use that information to provide services to third parties.

  • If you use, or are considering using, AI tools in your expert witness work, make sure you understand any guidance on the use of AI in court proceedings, and especially any guidance or case law on the use of AI in expert evidence.

  • Be appropriately cautious with documents generated by others using AI tools.

  • Lastly, it is worth considering how AI tools might impact your practice. Developments in AI are advancing rapidly and may have an impact in the future on the work of many expert witnesses. Even if you don’t choose to use AI yourself, the opposing expert might.

  • You can watch Sir Keith Lindblom’s keynote address to the 2024 EWI Conference to hear the judiciary’s view on the role of AI in expert evidence. Also of interest is Sir Geoffrey Vos’s speech on AI – Transforming the work of lawyers and judges and the Judicial Guidance on the use of AI.

Learning points for instructing parties:

  • Ask the experts you instruct whether they intend to use any AI tools.

  • If they intend to use an AI tool, make them aware of any guidance or case law on the use of AI tools in expert evidence.

  • Ensure that any use of AI tools is disclosed to the court and the other party.

  • Ideally, you should include the request to use AI in your application for permission to rely on expert evidence, explaining why the tool is needed, how it will be used, why it is reliable, and how any confidential or personal information will be protected.  

The case

The petitioner was seeking an order for settlement of the accounts of the estate she had been administering. The objectant, who was a beneficiary of the trust, claimed the petitioner had breached her fiduciary duty by deciding to retain a property in the Bahamas when selling it would have been a better financial option. He also objected to her travel to the island, which he claimed amounted to a conflict of interest or self-dealing.

The expert evidence

The objectant relied on the expert evidence of Charles Ransom to prove the damages caused by the retention of the property in the Bahamas. Mr Ransom provided two reports. The court found numerous deficiencies in Mr Ransom’s evidence, concluding that his “calculations and specifically those with regards to damages are inherently unreliable, are based on speculation, hypothetical market performance, and are unsupported or outright contradicted by facts in the record”.

The expert’s use of AI

The court went on to examine Mr Ransom’s use of Microsoft Copilot, a generative AI tool, in cross-checking his calculations. Mr Ransom could not recall the input or prompt he used, could not state which sources Copilot relied upon, and was unable to explain how Copilot works and how it arrives at a given output.

The judge noted that the court did not have an objective understanding of how Copilot works and, when it attempted to replicate Mr Ransom’s results, Copilot provided different outputs. The judge noted that “[w]hile these resulting variations are not large, the fact there are variations at all calls into question the reliability and accuracy of Copilot to generate evidence to be relied upon in a court proceeding.”

The judge then queried Copilot about its reliability and accuracy. The tool responded that “my accuracy is only as good as my sources so for critical matters, it's always wise to verify” and “I do my best to be as reliable as possible. However, I'm also programmed to advise checking with experts for critical issues.” When asked whether its calculations were reliable enough to be used in court, Copilot replied “[w]hen it comes to legal matters, any calculations or data need to meet strict standards. I can provide accurate info, but it should always be verified by experts and accompanied by professional evaluations before being used in court”.

The judge concluded that “it would seem that even Copilot itself self-checks and relies on human oversight and analysis. It is clear from these responses that the developers of the Copilot program recognize the need for its supervision by a trained human operator to verify the accuracy of the submitted information as well as the output.”

Although Mr Ransom argued that the use of AI tools for drafting expert reports was generally accepted in the field of fiduciary services, he was unable to name any publications regarding its use or any sources confirming that its use was generally accepted.

The court found that, due to the rapid evolution of artificial intelligence and its inherent reliability issues, counsel has an affirmative duty to disclose the use of artificial intelligence before evidence generated by an artificial intelligence product or system is introduced. Permission to use such evidence should properly be the subject of a pre-trial hearing, with the scope of the evidence determined by the court.
