AI use rising in influence campaigns online, but impact limited - US cyber firm

Aug 18, 2023 - reuters.com
Google-owned cybersecurity firm Mandiant has reported an increase in the use of artificial intelligence (AI) in manipulative information campaigns online since 2019. The Virginia-based company found that AI-generated content, such as fabricated profile pictures, has been used in politically motivated online influence campaigns by groups aligned with the governments of Russia, China, Iran, and other countries. So far, however, the impact of these campaigns has been limited, and AI has not yet played a significant role in digital intrusions.

Mandiant also highlighted the recent boom in generative AI models such as ChatGPT, which can create convincing fake videos, images, text, and computer code. While these models could enable groups with limited resources to produce higher-quality content for influence campaigns, the company has not yet seen AI play a key role in threats from major actors such as Russia, Iran, China, or North Korea. It expects this to become a growing problem over time, however.

Key takeaways:

  • Mandiant has tracked a rise in the use of AI-generated content, such as fabricated profile pictures, in manipulative online information campaigns since 2019.
  • These politically motivated influence campaigns were run by groups aligned with the governments of Russia, China, Iran, Ethiopia, Indonesia, Cuba, Argentina, Mexico, Ecuador, and El Salvador.
  • Despite the rise in AI use, Mandiant has not seen AI play a significant role in threats from Russia, Iran, China, or North Korea, and it expects AI use in digital intrusions to remain limited in the near term.
  • Mandiant nonetheless warns that the use of AI in manipulative information campaigns is likely to grow over time.