Associated Press Limits How Journalists Can Use Generative AI - Decrypt

Aiming to head off potential harm to its reporting, the Associated Press issued new guidelines on Wednesday limiting staff journalists’ use of generative artificial intelligence tools in news reporting.

Amanda Barrett, AP’s Vice President for Standards and Inclusion, laid out restrictions that establish how AP will deal with artificial intelligence moving forward. First and foremost, journalists are not to use ChatGPT to create publishable content.

“Any output from a generative AI tool should be treated as unvetted source material,” Barrett wrote, adding that staff should use their editorial judgment and the outlet’s sourcing standards when considering any information for publication.

Furthermore, the AP will not allow the use of generative AI to add or subtract elements from photos, videos, or audio. Nor will it transmit AI-generated images suspected of being a “false depiction,” better known as a deepfake, unless the image is the subject of a story and is clearly labeled as such.

Warning staff about how easily generative AI can spread misinformation, Barrett advised AP journalists to exercise the same caution and skepticism they normally would, including attempting to identify the source of the original content.

“If journalists have any doubt at all about the authenticity of the material,” she wrote, “they should not use it.”

While the post highlights ways in which AP journalists are limited in their ability to use generative AI, it does strike an optimistic tone in parts, suggesting that AI tools may also benefit journalists in their reporting.

“Accuracy, fairness and speed are the guiding values for AP’s news report, and we believe the mindful use of artificial intelligence can serve these values and over time improve how we work,” Barrett wrote.

Additionally, she clarified that the 177-year-old news agency does not see AI as a replacement for journalists, adding that AP journalists are responsible for the accuracy and fairness of the information they share.

Barrett pointed to the license agreement the AP signed with OpenAI last month that gives the ChatGPT creator access to the AP’s archive of news stories going back to 1985. In exchange, the agreement provides the media outlet access to OpenAI’s suite of products and technologies.

The news of the deal with OpenAI came just days before the AI startup committed $5 million to the American Journalism Project. That same month, OpenAI signed a six-year contract with the stock media platform Shutterstock to access its vast library of images and media.

Amid the hype over generative AI and the ability to find information conversationally via chatbots, there are substantial and growing concerns about the accuracy of the information these tools ultimately provide to users.

While AI chatbots can produce responses that appear factual, they also have a well-documented tendency to generate responses that are simply not true. This phenomenon, known as AI hallucination, can yield false content, news, or claims about people, events, or facts.

The Associated Press did not immediately respond to Decrypt’s request for comment.
