An attorney defending artificial-intelligence company Anthropic in a copyright lawsuit over music lyrics told a California federal judge on Thursday that her law firm Latham & Watkins was responsible for an incorrect footnote in an expert report caused by an AI "hallucination."
Ivana Dukanovic said in a court filing that the expert had relied on a legitimate academic journal article, but Dukanovic created a citation for it using Anthropic's chatbot Claude, which made up a fake title and authors in what the attorney called "an embarrassing and unintentional mistake."
"Unfortunately, although providing the correct publication title, publication year, and link to the provided source, the returned citation included an inaccurate title and incorrect authors," Dukanovic said.
The lawsuit from music publishers Universal Music Group, Concord and ABKCO over Anthropic's alleged misuse of their lyrics to train Claude is one of several high-stakes disputes between copyright owners and tech companies over the use of their work to train AI systems.
The publishers' attorney Matt Oppenheim of Oppenheim + Zebrak told the court during a hearing on Tuesday that Anthropic data scientist Olivia Chen may have used an AI-fabricated source to bolster the company's argument in a dispute over evidence.
U.S. Magistrate Judge Susan van Keulen said at the hearing that the allegation raised "a very serious and grave issue," and that there was "a world of difference between a missed citation and a hallucination generated by AI."
Dukanovic responded on Thursday that Chen had cited a real article from the journal The American Statistician that supported her argument, but the attorneys had missed that Claude introduced an incorrect title and authors.
A spokesperson for the plaintiffs declined to comment on the new filing. Dukanovic and a spokesperson for Anthropic did not immediately respond to requests for comment.
Several attorneys have been criticized or sanctioned by courts in recent months for mistakenly citing nonexistent cases and other incorrect information hallucinated by AI in their filings.
Dukanovic said in Thursday's court filing that Latham had implemented "multiple levels of additional review to work to ensure that this does not occur again."
The case is Concord Music Group Inc v. Anthropic PBC, U.S. District Court for the Northern District of California, No. 5:24-cv-03811.
For the music publishers: Matt Oppenheim of Oppenheim + Zebrak
For Anthropic: Sy Damle of Latham & Watkins