
AI and Trust - Schneier on Security

Dec 04, 2023 - schneier.com
The article discusses the concept of trust in society and how it extends to artificial intelligence (AI). The author argues that trust is essential for societal functioning, and that we often confuse interpersonal trust (trust in individuals) with social trust (trust in systems or organizations). This confusion, the author warns, will deepen with AI, as we may mistake AI systems for friends when they are merely services controlled by corporations. These corporations, the author cautions, will exploit our confusion for their own benefit, as they are not inherently trustworthy.

The author argues that the role of government is to create trust in society, and that it should therefore regulate the organizations that control and use AI. The author calls for AI transparency laws, AI safety regulations, and mechanisms to enforce AI trustworthiness. The author also proposes public AI models, built by the public for the public, to counterbalance corporate-owned AI, and concludes by emphasizing the government's role in creating social trust and constraining the behavior of corporations and their AI systems.

Key takeaways:

  • The author argues that there are two types of trust - interpersonal trust and social trust - and that we often confuse them, especially when it comes to artificial intelligence (AI).
  • AI systems, controlled by profit-maximizing corporations, are likely to exploit our trust and manipulate us, as they are designed to appear as friends rather than services.
  • Government regulation is necessary to ensure the trustworthiness of AI, including transparency laws, safety regulations, and penalties for untrustworthy behavior.
  • The author suggests the concept of "public AI models" - systems built by academia, non-profit groups, or government that can be owned and run by individuals - as a counterbalance to corporate-owned AI.
