
I used ChatGPT as a reporting assistant. It didn’t go well.

Mar 20, 2024 - niemanlab.org
The article discusses the use of AI tools, specifically ChatGPT, in journalism. The author experimented with using ChatGPT to assist in reporting a story about a train derailment. While the tool helped with tasks such as summarizing long documents and generating programming code, it struggled to source information accurately and required constant direction. The author expressed concern about the use of AI in journalism, particularly in smaller newsrooms that might over-rely on these tools, potentially leading to inaccuracies.

The author also highlighted the ethical considerations of using AI in journalism. Many newsrooms are beginning to draft AI policies to guide their use of these tools. The Markup, for instance, updated its ethics policy to include rules for the use of AI, such as not publishing stories created by AI, always disclosing its use, and rigorously checking any work generated by AI. The author concluded by advising readers to always double-check information provided by AI tools.

Key takeaways:

  • The author experimented with using the AI tool ChatGPT for data journalism, but found it often provided poorly sourced information and required very specific instructions to produce accurate results.
  • Despite its shortcomings, ChatGPT was found to be useful in generating and debugging programming code, saving time on tasks such as decoding railroad car numbers.
  • The author expressed concerns about the use of AI tools in journalism, particularly in small, understaffed newsrooms, due to the potential for errors and inaccuracies.
  • Many newsrooms are addressing these concerns by drafting AI policies, with The Markup recently updating its ethics policy to include rules for the use of AI in its work.
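The summary notes that ChatGPT's most reliable contribution was generating and debugging code for mundane tasks like decoding railroad car numbers. The article does not show the author's actual code, but as a rough illustration of the kind of parsing involved, a minimal sketch might split a North American car ID into its reporting mark (letters identifying the owner) and car number (the sample ID below is hypothetical):

```python
import re

def parse_car_id(car_id: str):
    """Split a railcar ID like 'TILX 402025' into its reporting mark
    (1-4 letters) and car number (digits). Returns a (mark, number)
    tuple, or None if the string doesn't match the expected pattern."""
    m = re.fullmatch(r"\s*([A-Z]{1,4})\s*(\d{1,6})\s*", car_id.upper())
    if not m:
        return None
    return m.group(1), int(m.group(2))

print(parse_car_id("TILX 402025"))  # ('TILX', 402025)
print(parse_car_id("not a car"))    # None
```

This is exactly the sort of repetitive lookup-and-parse chore the author found the tool genuinely saved time on, even as it stumbled on sourcing facts.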
