Meta Says No to Political Ads Using Its Generative AI Ads Tools

Nov 07, 2023 - techtimes.com
Meta has announced that it will ban political advertisers from using its new generative AI advertising tools, which can rapidly create and alter ad materials. This decision, one of Meta's first public policies regarding safety measures for its AI ad tools, comes amid concerns that the technology could be used to spread election misinformation. The tools, which were recently made available to a select group of advertisers, are set to be launched for all marketers worldwide next year.

In contrast, Google will reportedly allow political advertisers to use AI-generated ads with altered content, provided they do not distort factual scenes or events. The company also plans to release similar image-customizing generative AI ad technologies, but will prohibit the use of "political keywords" as prompts. Other platforms such as Snapchat and TikTok ban political ads, while X (formerly Twitter) has not released any generative AI ad tools.

Key takeaways:

  • Meta has announced that it will prohibit political advertisers from using its new generative AI advertising products, although this decision has not yet been disclosed in its advertising standards or AI-specific rules.
  • The AI tools in question can rapidly build backgrounds, alter images, and produce multiple versions of advertisement materials based on simple written instructions.
  • Concerns have been raised about the potential use of AI to spread election misinformation, with a survey suggesting that 58% of US adults believe AI technologies could increase the spread of false and misleading material during elections.
  • Other platforms such as Google and Snapchat have also taken steps regarding AI-generated political advertisements, with Google requiring clear disclosure of AI use and Snapchat prohibiting political advertisements altogether.