Figma places a high priority on data privacy and security, encrypting all data at rest and in transit and enforcing strict access controls. Figma does not permit third-party model providers to use data uploaded or created on the Figma platform to train their own models, and it takes steps to de-identify and aggregate the data it uses to train AI models. Admins control their team’s AI use and content training through new settings. Content training will not begin until August 15, 2024, giving users time to decide on their preferences.
Key takeaways:
- Figma AI is a collection of features designed to help users work more efficiently and creatively, with a focus on data privacy and security.
- All of Figma's AI features are powered by third-party AI models that were not trained on private Figma files or customer data.
- Admins control their team’s AI use and can adjust settings or opt out of content training.
- Figma protects user privacy by de-identifying and aggregating the data used to train AI models (see the sketch below), and does not use data from Figma for Education or Figma for Government accounts for model training.
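
To make "de-identify and aggregate" concrete, here is a minimal illustrative sketch (not Figma's actual pipeline; the field names, salt handling, and events are hypothetical) of how usage records might be stripped of direct identifiers and rolled up into counts before any model training:

```python
import hashlib
from collections import Counter

# Hypothetical example records; field names are illustrative, not Figma's schema.
RAW_EVENTS = [
    {"user_id": "u-123", "team_id": "t-9", "feature": "rename-layers", "file_name": "Q3 roadmap"},
    {"user_id": "u-456", "team_id": "t-9", "feature": "rename-layers", "file_name": "Onboarding flow"},
    {"user_id": "u-123", "team_id": "t-9", "feature": "visual-search", "file_name": "Q3 roadmap"},
]

SALT = b"rotate-me-regularly"  # a real pipeline would manage salts/keys securely


def de_identify(event: dict) -> dict:
    """Replace the direct identifier with a salted hash and drop free-text fields."""
    return {
        "user_hash": hashlib.sha256(SALT + event["user_id"].encode()).hexdigest()[:12],
        "feature": event["feature"],
        # team_id and file_name (potentially identifying or free text) are dropped
    }


def aggregate(events: list[dict]) -> Counter:
    """Roll de-identified events up into per-feature counts, so no single user or file is exposed."""
    return Counter(e["feature"] for e in events)


if __name__ == "__main__":
    cleaned = [de_identify(e) for e in RAW_EVENTS]
    print(aggregate(cleaned))  # Counter({'rename-layers': 2, 'visual-search': 1})
```

Only the aggregated output would ever feed into training in a setup like this; the raw, identifiable records stay behind access controls.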