Expert Witness Used Generative AI to Prepare His Report. It Didn’t Go Well–In re Weber
This case involves real property on Cat Island in the Bahamas. After the death of Michael S. Weber, the property passed into his trust. Susan, his sister, is the trustee, and his son Owen is a trust beneficiary. Owen may have difficulty managing his money. When asked what happened to a $319k trust disbursement, he said “that he attempted to start a business, had drug addiction, and ‘I blew — blew that money.'”
Owen nevertheless thinks Susan should have sold the Cat Island property and invested the money elsewhere. He’s also not thrilled that Susan retained the property partially for sentimental reasons and mixed business and vacation time on visits to the property. The trust finally sold the property in 2022.
To advance his case, Owen retained an “expert,” Charles W. Ranson, who advertises himself as providing “Trusts & Estates Litigation Consulting & Expert Witness Testimony.” It’s fair to say that this expert engagement doesn’t go well. The court says it found “his testimony and opinion not credible,” something no expert witness ever wants to see published in a court opinion. That can’t be good for business.
Owen claimed that if the trust had sold the real estate, it could have invested the cash into the stock market. In support of this, Ranson’s report showed how much money the trust would have had if it had sold the property twenty years ago and invested the proceeds in a Vanguard mutual fund. There are a number of problems with this, including (1) the argument rests on hindsight bias, because it faults a trustee for not putting money into an asset whose returns are only known after the fact, and (2) it is logical and defensible for a trust to diversify across asset classes.
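To make the hindsight problem concrete, here is a minimal sketch (in Python, using placeholder figures that are purely illustrative and not drawn from the case or from Ranson’s report) of the kind of backward-looking calculation at issue: take hypothetical sale proceeds and compound them through a fund’s realized historical returns.

```python
# Illustrative sketch only: the proceeds figure and return series below are
# placeholders, not numbers from the case or from Ranson's report.

def hypothetical_value(proceeds, annual_returns):
    """Compound a lump sum through a sequence of yearly returns."""
    value = proceeds
    for r in annual_returns:
        value *= (1 + r)
    return value

# Assume $500,000 of sale proceeds and twenty years of returns that are
# only knowable after the fact (flattened here to 7% per year).
proceeds = 500_000
realized_returns = [0.07] * 20

print(f"Ex post value: ${hypothetical_value(proceeds, realized_returns):,.0f}")
# The flaw: a trustee deciding at the start of the period sees none of these
# returns, so judging the decision by this ex post total is hindsight bias.
```

The arithmetic itself is trivial; the dispute is over whether it proves anything about what a prudent trustee should have done at the time.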
The court is NOT IMPRESSED by Ranson’s use of generative AI here. The court reran Ranson’s queries in Copilot and got slightly different numbers each time. The “fact there are variations at all calls into question the reliability and accuracy of Copilot to generate evidence to be relied upon in a court proceeding.” The court then asked Copilot if it was reliable and accurate, and Copilot self-reported that it’s always a good idea to double-check its work. (This was a bit of a parlor trick by the court; Copilot isn’t actually writing these responses in a reasoned fashion.)
It sounds like this isn’t the first time that Ranson used generative AI as part of his expert work. The court recounts:
Mr. Ranson was adamant in his testimony that the use of Copilot or other artificial intelligence tools, for drafting expert reports is generally accepted in the field of fiduciary services and represents the future of analysis of fiduciary decisions; however, he could not name any publications regarding its use or any other sources to confirm that it is a generally accepted methodology.
Say what? The whole point of retaining an “expert” is that they have specialized knowledge that they can bring to bear on the facts at issue in the case. If the same knowledge is readily available from an AI chatbot, then the lawyers can consult the AI directly and skip retaining an “expert” at all. In other words, it seems like Ranson is undermining demand for his expert witness services…
(Note also that submitting queries to a chatbot could disclose confidential case information to the AI vendor, though that probably wasn’t a concern with this specific query.)
Overall, the court isn’t ready to hand over expert work to the machines:
The mere fact that artificial intelligence has played a role, which continues to expand in our everyday lives, does not make the results generated by artificial intelligence admissible in Court….this Court holds that due to the nature of the rapid evolution of artificial intelligence and its inherent reliability issues that prior to evidence being introduced which has been generated by an artificial intelligence product or system, counsel has an affirmative duty to disclose the use of artificial intelligence and the evidence sought to be admitted should properly be subject to a Frye hearing prior to its admission
I don’t love the AI exceptionalism in this approach. Instead, I see the court’s expectation as one specific application of the standard duty of expert witnesses to explain their methodology. If an expert witness is using an Excel spreadsheet to compute investment returns, the expert probably needs to disclose the spreadsheet so that the other side can see the assumptions and check the numbers. If the expert is skipping that step and asking a chatbot to spit out a valuation, then the queries and the chatbot’s involvement do need to be disclosed so that the other side can test the output’s credibility. So I think a general reminder to lawyers and expert witnesses to “show their work” was all that was needed, without any AI-specific admonishments.
Case Citation: Matter of Weber, 2024 NY Slip Op 24258 (N.Y. Sur. Ct. Oct. 10, 2024).