Meta's move comes as lawmakers and regulators prepare to address AI-generated content in political ads ahead of the 2024 presidential election. Earlier this year, bills were introduced in Congress that would require campaigns to disclose when ads include AI-generated content, and the Federal Election Commission is expected to decide on a rule requiring similar disclosure.
Key takeaways:
- Meta has announced that it will require advertisers to disclose when AI-generated or altered content is used in political, electoral, or social issue ads on Facebook and Instagram.
- The new rule, which is expected to go into effect next year, applies to ads containing "realistic" images, videos, or audio that falsely depict someone doing something they never did, or that depict a real event unfolding differently than it actually did.
- Meta will flag ads containing digitally altered content to users and log the disclosures in its ads database. Edits that are inconsequential or immaterial to the claim, assertion, or issue raised in the ad, such as cropping or color correction, do not need to be disclosed.
- Lawmakers and regulators, including Rep. Yvette Clarke (D-NY), Sen. Amy Klobuchar (D-MN), and the Federal Election Commission, are preparing to tackle AI-generated content in political ads ahead of the 2024 presidential election.