A quote from Wyoming’s governor and a local prosecutor caught the attention of Powell Tribune reporter CJ Baker. What raised suspicion for Baker were the phrases in the articles that seemed a bit robotic.
The confirmation that a reporter at a rival news outlet was using generative artificial intelligence to help write his stories came in a June 26 article about comedian Larry the Cable Guy being chosen as grand marshal of the Cody Stampede Parade.
“The 2024 Cody Stampede Parade guarantees an unforgettable celebration of American independence, led by one of comedy’s most adored figures,” reported the Cody Enterprise. “This format ensures that crucial information is presented first, facilitating readers in quickly grasping the main points.”
After investigating, Baker, a seasoned reporter of over 15 years, met with Aaron Pelczar, a 40-year-old novice in journalism who confessed to using AI in his articles before resigning from the Enterprise.
The editor and publisher of the Enterprise, founded in 1899 by Buffalo Bill Cody, have since apologized and pledged measures to prevent a recurrence. In an editorial published on Monday, Enterprise Editor Chris Bacon admitted failing to detect the AI-generated content and false quotes.
“It doesn’t matter that the false quotes were due to the oversight of a hasty inexperienced reporter who relied on AI. It was my responsibility,” wrote Bacon, expressing regret that “AI was allowed to insert words that were never uttered into the stories.”
In a separate editorial, publisher Megan Barton noted the implementation of a system to identify AI-generated content at the newspaper.
“We take great pride in the quality of our content for our community and we trust that the individuals responsible for crafting these stories do so accurately,” wrote Barton. “Hence, you can understand our shock upon discovering otherwise.”
Journalists have derailed their careers by fabricating quotes or facts in articles long before AI existed. But this latest scandal illustrates the risks AI poses to many industries, journalism included, as chatbots can produce deceptive yet somewhat plausible articles with minimal input.
AI has found utility in journalism, particularly in automating certain tasks. Some newsrooms, such as The Associated Press, utilize AI to streamline operations for reporters to focus on more impactful work, though generative AI is mostly prohibited for creating publishable content by AP staff.
The AP has incorporated technology for assisting in articles on financial earnings reports since 2014 and more recently in some sports stories. They are also testing an AI tool for translating select articles from English to Spanish, with a disclosure at the conclusion of each such article regarding the role of technology in its creation.
Transparency about the use of AI is crucial. Sports Illustrated was criticized last year for publishing AI-generated online product reviews presented as the work of reporters who didn't actually exist. After the revelation, SI said it was severing ties with the company that produced the articles, but the episode tarnished the publication's once-strong reputation.
The full impact of AI on the job market remains to be seen, though one forecast predicted that generative AI tools such as ChatGPT could replace 4.8 million American jobs.
“They’re very believable quotes”
In his Powell Tribune expose on Pelczar's use of AI, Baker recounted an uneasy but civil meeting with Pelczar and Bacon. During the meeting, Pelczar said, "I've never intentionally misquoted anyone," and promised to issue corrections and apologies, Baker wrote, noting that Pelczar insisted his errors shouldn't reflect on his Cody Enterprise editors.
Following the meeting, the Enterprise launched a comprehensive review of all the articles Pelczar authored during his two-month tenure. Bacon disclosed the discovery of seven articles containing AI-generated quotes from six individuals. He is currently scrutinizing additional stories.
“They’re very believable quotes,” Bacon said, noting that the people he contacted during the review of Pelczar’s pieces said the quotes sounded like something they would say, but that they had never actually spoken with Pelczar.
Seven individuals informed Baker that they were quoted in Pelczar’s articles without actually speaking to him.
Pelczar did not return an AP phone message left at a listed number seeking comment. Bacon said Pelczar also declined to discuss the matter with another Wyoming newspaper that reached out.
Baker, who regularly reads the Enterprise because it is a competitor, said a combination of phrases and quotes in Pelczar's stories raised his suspicions.
Pelczar’s account of a Yellowstone National Park shooting included the line: “This incident serves as a stark reminder of the unpredictable nature of human behavior, even in the most serene settings.”
Baker remarked that the phrase resembled summaries generated by certain chatbots, appending a kind of moral lesson at the conclusion.
Another story, about a poaching conviction, included quotes from a wildlife official and a prosecutor that sounded as if they came from news releases, Baker said. But no releases had been issued, and the agencies involved did not know where the quotes came from, he said.
Two articles contained fabricated quotes attributed to Wyoming Gov. Mark Gordon, which his staff only became aware of following Baker’s inquiry.
“In one instance, (Pelczar) authored a piece about a new OSHA regulation that included a completely fictional quote from the Governor,” wrote Michael Pearlman, the governor’s spokesperson in an email. “In a separate instance, he appeared to fabricate a segment of a quote and blended it with a segment from a quote in a news release introducing the new director of our Wyoming Game and Fish Department.”
The most apparent instance of AI-generated content was in the Larry the Cable Guy article that concluded with an explanation of the inverted pyramid, a fundamental technique for composing breaking news stories.
The process of creating AI-generated stories is relatively straightforward. Users can input a criminal affidavit into an AI system and request an article on the case featuring quotes from local officials, as explained by Alex Mahadevan, director of a digital media literacy project at the Poynter Institute, a prominent journalism think tank.
“These generative AI chatbots are designed to provide an answer, regardless of whether it’s accurate or not,” Mahadevan stated.
The Enterprise had no AI policy, in part because it seemed obvious that journalists shouldn't use it to write articles, Bacon acknowledged. Poynter offers a template for news outlets to formulate their own AI policies.
Bacon intends to establish a policy by week’s end.
“This will be a topic of discussion during the hiring process,” he affirmed.