The EU AI Act, currently under negotiation by the European Commission, Council, and Parliament, takes a risk-based approach, classifying AI systems according to the risk they pose to users. The coalition is concerned that the Act's requirements could inadvertently impose obligations on open-source developers that are better suited to commercial companies or other well-resourced actors. The group is optimistic that giving policymakers clear information about how open-source development works will be beneficial as discussions continue.
Key takeaways:
- A coalition of open-source AI stakeholders, including Hugging Face, GitHub, EleutherAI, Creative Commons, LAION, and Open Future, has released a policy paper calling on EU policymakers to protect open-source AI innovation as they finalize the EU AI Act.
- The coalition argues that the current draft of the EU AI Act favors closed and proprietary AI development, which could disadvantage the open AI ecosystem.
- The EU AI Act classifies AI systems according to the risk they pose to users, with higher-risk systems subject to stricter requirements.
- The EU's influence in tech regulation, known as the "Brussels Effect", means that the outcome of the EU AI Act could impact global AI regulation, including in the U.S.