AI that’s smarter than humans? Americans say a firm "no thank you."

Sep 19, 2023 - vox.com
A new poll conducted by YouGov for the AI Policy Institute reveals that 63% of American voters believe regulation should actively prevent the development of AI superintelligence. The survey of 1,118 Americans across various demographics also found that the majority rejected the argument that the US should lead in AI development to maintain an edge over China. The results suggest a disconnect between public sentiment and the goals of major AI companies like OpenAI, which are actively pursuing the development of superintelligent AI, also known as artificial general intelligence (AGI).

The article argues that the development of AGI is a deeply political issue and criticizes the lack of public debate and democratic input in its development. It suggests that the potential risks of AGI, including mass unemployment, economic disruption, and even the extinction of humanity, warrant a more cautious and democratic approach. The article also highlights the growing distrust of tech executives and skepticism towards the idea that tech progress is inherently positive.

Key takeaways:

  • A new poll reveals that 63 percent of American voters believe regulation should aim to actively prevent the development of AI superintelligence.
  • Despite this, major AI companies like OpenAI are actively working to build superintelligent AI, or artificial general intelligence (AGI), which they believe should exist for the benefit of humanity.
  • However, there is increasing distrust among Americans towards tech executives and the notion that tech progress is inherently positive, especially given the potential risks associated with AGI, such as mass unemployment and drastic changes to the world order.
  • The development of AGI is viewed as a deeply political act, prompting calls for more democratic input and oversight in its creation and deployment.