The study emphasizes verifiability as a foundational content policy for Wikipedia and the need for better tools to help humans maintain the quality of references on the platform. SIDE identifies Wikipedia citations that may not robustly substantiate their claims and proposes more reliable alternatives from the web. For the top 10% of claims that SIDE flagged as most likely unverifiable, its first citation suggestion was preferred twice as often as the existing Wikipedia citation.
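To make that workflow concrete, below is a minimal sketch of a verify-and-suggest loop of the kind the study describes. It is a toy illustration only: the keyword-overlap scorer, the 0.5 threshold, and all names are hypothetical stand-ins for SIDE's neural retrieval and verification components.

```python
# Toy sketch of a SIDE-style "verify and suggest" loop.
# All names, the 0.5 threshold, and the keyword-overlap scorer are
# hypothetical stand-ins; the actual system uses learned retrieval
# and verification models over a large web index.
from __future__ import annotations

from dataclasses import dataclass


@dataclass
class Citation:
    url: str
    text: str


def verification_score(claim: str, source_text: str) -> float:
    """Fraction of the claim's words that appear in the source.

    A crude proxy for "does this source support this claim?"; a real
    verifier would be a trained claim-evidence model.
    """
    claim_words = set(claim.lower().split())
    source_words = set(source_text.lower().split())
    return len(claim_words & source_words) / max(len(claim_words), 1)


def suggest_citation(claim: str, current: Citation,
                     web_candidates: list[Citation],
                     threshold: float = 0.5) -> Citation | None:
    """Flag a weak citation and propose a better-scoring alternative.

    Returns None when the existing citation already looks adequate
    or no web candidate beats it.
    """
    current_score = verification_score(claim, current.text)
    if current_score >= threshold:
        return None  # existing citation appears to substantiate the claim
    if not web_candidates:
        return None
    best = max(web_candidates,
               key=lambda c: verification_score(claim, c.text))
    if verification_score(claim, best.text) > current_score:
        return best
    return None


if __name__ == "__main__":
    claim = "The Eiffel Tower opened to the public in 1889."
    current = Citation("https://example.org/a",
                       "A page about Paris landmarks.")
    candidates = [Citation("https://example.org/b",
                           "The Eiffel Tower opened to the public in 1889.")]
    print(suggest_citation(claim, current, candidates))
```

Note that in this framing the human editor stays in the loop: the system only surfaces a ranked suggestion, and the editor decides whether to replace the citation.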
Key takeaways:
- A new study suggests that AI can enhance the reliability of Wikipedia, addressing long-standing concerns about its open editing system and the potential for inaccurate or misleading information.
- London-based AI company Samaya AI has developed SIDE, an AI model that scrutinizes the sources behind Wikipedia citations and suggests better ones, improving the reliability of Wikipedia's reference system.
- When SIDE classified a Wikipedia citation as unverifiable and recommended an alternative, users preferred SIDE's suggestion approximately 70% of the time. In about half of all cases, SIDE's top suggestion was the source Wikipedia already cited.
- The study highlights the promise of an AI-powered system working alongside humans to improve Wikipedia's credibility, with future research aimed at assessing references that go beyond online text, such as images, videos, and printed publications.