The guidelines emphasize the need for local expertise in content moderation and risk analysis, and recommend setting up dedicated internal teams for each election event. Platforms are also advised to provide users with access to official information on electoral processes and to run media literacy campaigns. The guidelines are not mandatory, but platforms that opt for a different approach must demonstrate that their alternative meets the EU's standards. Failure to do so could result in penalties of up to 6% of global annual turnover under the DSA. Formal adoption of the guidelines is expected in April.
Key takeaways:
- The European Union has published draft election security guidelines aimed at large platforms with 45M+ regional monthly active users, including Facebook, Google Search, Instagram, LinkedIn, TikTok, YouTube and X, that are regulated under the Digital Services Act (DSA).
- The guidelines expect these platforms to step up their efforts to protect democratic votes and to deploy effective content moderation resources in the bloc's multiple official languages.
- The EU also recommends that platforms give their users a meaningful choice over algorithmic and AI-powered recommender systems, and put measures in place to downrank disinformation targeting elections.
- Platforms are also advised to prepare for incoming transparency rules on political advertising, to have systems in place for demonetizing disinformation, and to provide free data access to third parties studying election risks.