AI could cause ‘social ruptures’ between people who disagree on its sentience

Nov 17, 2024 - theguardian.com
Leading philosopher Jonathan Birch warns of potential societal rifts over the perception of artificial intelligence (AI) systems' consciousness. Governments are preparing to discuss the risks of AI, with some academics predicting that AI systems could achieve consciousness by 2035, leading to debates about their rights and welfare. Birch, along with other academics and AI experts, has called for tech firms to assess the sentience of their AI systems, considering their potential for happiness, suffering, and moral significance.

The issue of AI consciousness could cause divisions similar to those seen in views on animal sentience, potentially differing between countries, religions, and even within families. The debate could also affect AI firms, which may be accused either of exploiting AI or of creating conscious beings. Some suggest that AI consciousness could be assessed in the same way as animal sentience, by considering a system's capacity for emotions and its ability to be harmed or benefited. However, not all experts agree that AI consciousness is imminent; some argue it remains a distant possibility.

Key takeaways:

  • Jonathan Birch, a professor of philosophy at the London School of Economics, warns of potential social ruptures between people who believe AI systems are conscious and those who think they are not.
  • Some academics predict that consciousness in AI systems could emerge by 2035, which could lead to societal divisions over whether AI systems should have welfare rights similar to those of humans or animals.
  • AI safety bodies from various countries are meeting this week to develop stronger safety frameworks as the technology rapidly advances.
  • Experts call for tech firms to assess the sentience of their AI systems to determine if they are capable of happiness and suffering, and whether they can be benefited or harmed.