Additionally, the FBI and security experts recommend creating a secret code or phrase known only to close family and friends. This code can be used to confirm the identity of someone claiming to be in trouble, since deepfake tools can convincingly clone a voice from publicly available audio clips. The public is urged to stay cautious and adopt these measures to guard against increasingly sophisticated AI-enabled scams.
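To make the idea concrete, here is a minimal Python sketch of the shared-secret principle behind the family code phrase: the only credential is something agreed on in person, so a scammer working from scraped public audio has nothing to present. The phrase, function names, and comparison logic here are purely illustrative assumptions, not anything prescribed by the FBI.

```python
import hmac
import unicodedata

# Hypothetical phrase agreed on in person and never shared online.
FAMILY_PHRASE = "purple giraffe picnic"

def normalize(phrase: str) -> str:
    """Lowercase, strip accents, and collapse whitespace so minor
    variations in how the phrase is spoken or typed still match."""
    cleaned = unicodedata.normalize("NFKD", phrase).casefold().strip()
    return " ".join(cleaned.split())

def verify_caller(spoken_response: str) -> bool:
    """Return True only if the caller produces the agreed phrase.

    hmac.compare_digest is a constant-time comparison; overkill for a
    family code word, but it mirrors how shared-secret checks are
    implemented in real authentication systems."""
    return hmac.compare_digest(
        normalize(FAMILY_PHRASE).encode(),
        normalize(spoken_response).encode(),
    )

# A convincing cloned voice still fails without the secret phrase.
print(verify_caller("It's me, I need money fast!"))  # False
print(verify_caller("Purple Giraffe Picnic"))        # True
```

The point of the sketch is the design choice: verification rests entirely on a secret exchanged out of band, which is exactly what voice cloning cannot reproduce.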
Key takeaways:
- The FBI and security experts warn of AI-powered deepfake attacks targeting smartphone users, particularly through voice cloning.
- These attacks involve scammers using deepfake audio to impersonate family members and extort money by simulating emergencies.
- The FBI advises hanging up immediately on such a call and verifying the caller's identity directly, for example by calling the person back on a number you already know.
- Creating a secret code known only to close family and friends is recommended to help distinguish genuine calls from impersonated ones.