
The Download: what social media can teach us about AI

Mar 13, 2024 - technologyreview.com
The Download newsletter discusses the potential risks and benefits of artificial intelligence (AI), drawing parallels with the unregulated evolution of social media. It highlights the need to learn from past mistakes to prevent AI from causing significant societal damage. The newsletter also covers various technology-related topics, including Google's restrictions on its Gemini chatbot, the rise of Kate Middleton conspiracy theories, the pressure on TikTok to find new owners, the resurgence of Bitcoin, the high cost of AI computing, Donald Trump's interest in buying Truth Social, and the use of Apple's Vision Pro headset in a spinal surgery operation.

In other news, the newsletter discusses the transformative potential of sonification technology for blind and visually impaired individuals. Sarah Kane, a legally blind researcher, is working with Astronify to present astronomical information in audio form, potentially opening access to education and careers in science. The newsletter also includes a quote from Amazon instructing its workers to practice mindfulness and a section on comforting and fun distractions.

Key takeaways:

  • Nathan E. Sanders and Bruce Schneier argue that we should learn from the unregulated evolution of social media to avoid making the same mistakes with AI, which has the potential to do both good and harm to society.
  • Google is restricting its Gemini chatbot from answering election queries out of an abundance of caution, recommending that users turn to Google Search for election questions instead.
  • There is increasing pressure on TikTok to find new owners to avoid an outright ban in the United States.
  • A technology known as sonification, which transforms information into sound, is opening access to education and careers for millions of blind and visually impaired people.
