Stanford University professor Jeff Hancock accused of using AI to cite fake study
A Stanford University “misinformation expert” has been accused of using artificial intelligence (AI) to craft testimony later used by Minnesota Attorney General Keith Ellison in a politically charged case.
Jeff Hancock, a professor of communications and founder of the vaunted school’s Social Media Lab, provided an expert declaration in a case involving a satirical conservative YouTuber named Christopher Kohls. The court case is about Minnesota’s recent ban on political deepfakes, which the plaintiffs argue is an attack on free speech.
Hancock’s testimony was submitted to the court by Ellison, who is arguing in favor of the law. Hancock is “well-known for his research on how people use deception with technology, from sending texts and emails to detecting fake online reviews,” according to Stanford’s website.
But the plaintiffs’ lawyers have asked the Minnesota federal judge hearing the case to throw out the testimony, charging that Hancock cited a fake study.
“[The] Declaration of Prof. Jeff Hancock cites a study that does not exist,” lawyers argued in a recent 36-page memo. “No article by the title exists.”
The “study” was called “The Influence of Deepfake Videos on Political Attitudes and Behavior” and was purportedly published in the Journal of Information Technology & Politics. The Nov. 16 filing notes that the journal is real but has never published a study by that name.
“The publication exists, but the cited pages belong to unrelated articles,” the lawyers argued. “Likely, the study was a ‘hallucination’ generated by an AI large language model like ChatGPT.”
“Plaintiffs do not know how this hallucination wound up in Hancock’s declaration, but it calls the entire document into question, especially when much of the commentary contains no methodology or analytic logic whatsoever.”
The document also calls out Ellison, arguing that “the conclusions that Ellison most relies on have no methodology behind them and consist entirely of expert say-so.”
“Hancock could have cited a real study similar to the proposition in paragraph 21,” the memo states. “But the existence of a fictional citation Hancock (or his assistants) didn’t even bother to click calls into question the quality and veracity of the entire declaration.”
The memorandum also doubles down on the claim that the citation is bogus, noting the multiple searches lawyers went through to try to locate the study.
“The title of the alleged article, and even a snippet of it, does not appear anywhere on the internet as indexed by Google and Bing, the most commonly used search engines,” the document states. “Searching Google Scholar, a specialized search engine for academic papers and patent publications, reveals no articles matching the description of the citation authored by ‘Hwang’ [the purported author] that includes the term ‘deepfake.’”
“Perhaps this was simply a copy-paste error? It’s not,” the filing later flatly states. “The article doesn’t exist.”
The attorneys concluded that because the declaration is at least partially fabricated, it is unreliable in its entirety and should be excluded from the court’s consideration.
“The declaration of Prof. Hancock should be excluded in its entirety because at least some of it is based on fabricated material likely generated by an AI model, which calls into question its conclusory assertions,” the document concluded. “The court may inquire into the source of the fabrication and additional action may be warranted.”
Fox News Digital reached out to Ellison, Hancock and Stanford University for comment.