
Google’s Gmail Upgrade—Do Not Leave Your Account At Risk

Feb 02, 2025 - forbes.com
Gmail has integrated AI features, specifically Gemini, into its Workspace apps, raising concerns about data privacy and control. Users and admins are struggling to manage these AI settings: the ability to disable them is often hidden or unavailable by default, requiring interaction with Google support. This has caused frustration, especially since only certain account tiers, such as Enterprise Standard and Plus, have straightforward access to these controls. The situation is exacerbated by fears of data leakage, highlighted by incidents like the DeepSeek scare, in which user data was reportedly sent to China by default.

The broader issue reflects growing concern over data security with generative AI tools, as organizations fear either losing sensitive data or falling behind in technology adoption. The AI ecosystem's rapid evolution lacks consistent controls and policies, leaving users, both individual and enterprise, uncertain about how to protect their data. Many users are unhappy about being automatically opted into these AI features and find disabling them challenging. The article advises users to carefully weigh their comfort with these risks and adjust their settings accordingly, avoiding decisions driven by hype or fear of missing out.

Key takeaways:

  • Google's recent upgrade to add default AI settings to Gmail and other Workspace apps has raised concerns about data privacy and control.
  • Disabling the new AI features, such as Gemini, is challenging for many users, as the option is often hidden and requires contacting Google support.
  • There are significant risks related to data security with generative AI tools, leading to hesitation in their adoption by organizations.
  • Users are advised to carefully consider their settings and not be pressured into using AI features they are uncomfortable with due to hype or fear of missing out.
