HELENA, Mont. (AP) — Quotes attributed to Wyoming’s governor and a local prosecutor were what first caught the attention of Powell Tribune reporter CJ Baker. The phrasing in the stories seemed unusual; some of it even read as robotic.
Upon further investigation, Baker discovered that a reporter at a rival news outlet had been using generative artificial intelligence to help write his stories. The clearest tell came in an article about Larry the Cable Guy being chosen as grand marshal of the Cody Stampede Parade.
The Cody Enterprise reported, “The 2024 Cody Stampede Parade promises to be an unforgettable celebration of American independence, led by one of comedy’s most beloved figures.” Baker, a veteran reporter, later met with Aaron Pelczar, a rookie journalist who confessed to using AI in his stories before resigning from the Enterprise.
The publisher and editor of the Enterprise, which was founded by Buffalo Bill Cody in 1899, apologized and promised to take steps to prevent such incidents in the future. In an editorial, Enterprise Editor Chris Bacon acknowledged that he had failed to catch the AI-generated copy and false quotes before they were published.
Long before AI, journalists faced career-ending scandals for fabricating quotes and facts. This incident highlights the risks the technology poses to journalism: it can produce seemingly plausible but misleading content with minimal input.
While AI has been used in journalism to automate certain tasks, newsrooms such as The Associated Press do not allow generative AI to create publishable content. The AP has used the technology for financial reports and some sports stories, adding a disclaimer at the end of each such story to explain the role technology played in producing it.
Transparency about the use of AI is crucial, as shown by the backlash Sports Illustrated faced for publishing AI-generated content presented as original reporting. The episode tarnished the publication’s reputation and led it to sever ties with the company responsible for the articles.
After Baker revealed Pelczar’s use of AI, the Enterprise conducted an internal review. Several stories were found to contain AI-generated quotes attributed to people who had never spoken with Pelczar.
Pelczar has declined to discuss the matter publicly, but he admitted to unintentionally misquoting people and pledged to help correct the errors. The Enterprise is reviewing all of Pelczar’s articles to identify any additional AI-generated content.
The publisher stressed the importance of learning to recognize AI-generated stories and announced plans to adopt an AI policy addressing the ethical questions the technology raises for journalism.
AI has become a tool of modern journalism, but the Cody Enterprise’s experience serves as a cautionary tale about the risks and ethical considerations of using it in the industry.