However, the introduction of this technology also raises security concerns, particularly around deepfakes and impersonation scams. Deepfakes have already been used to spread disinformation and defraud individuals, causing significant financial losses. Despite these risks, Microsoft's Interpreter in Teams is a relatively narrow application of voice cloning technology, and the company is expected to reveal more details about its safeguards in the coming months.
Key takeaways:
- Microsoft is planning to introduce a feature in Teams that allows users to clone their voices for real-time interpretation in up to nine languages.
- The feature, called Interpreter in Teams, will be available to Microsoft 365 subscribers from early 2025 and will not store any biometric data.
- Despite their potential benefits, AI translation and voice cloning technologies pose significant security challenges, including the risk of deepfakes and impersonation scams.
- Microsoft has not yet provided detailed information about the safeguards it will implement to prevent misuse of the Interpreter in Teams feature.