Bug bounty hunters load up to stalk AI and fancy bagging big bucks

Oct 27, 2023 - theregister.com
Google has expanded its bug bounty program to include its AI products, offering rewards to ethical hackers who can identify both traditional infosec flaws and problematic bot behaviour. The company is particularly interested in five categories of attacks: prompt injection, training data extraction, model manipulation, adversarial perturbation, and data theft targeting confidential or proprietary model-training data. Google may also pay out for other flaws that meet the qualifications listed on its vulnerability rewards program page.

The expansion of the bug bounty program comes as HackerOne's latest report reveals that over half of the ethical hackers in its community believe generative-AI tools will become a significant target in the near future, and 61% plan to use and develop AI-powered tooling to find vulnerabilities. The bug-bounty-as-a-service platform has already seen some of its vulnerability hunters specialize in areas such as prompt injection, bias detection, and training-data poisoning.

Key takeaways:

  • Google has expanded its bug bounty program to include its AI products, paying ethical hackers to find both conventional infosec flaws and bad bot behaviour.
  • The company is looking for five categories of attacks: prompt injection, training data extraction, model manipulation, adversarial perturbation, and data theft targeting confidential or proprietary model-training data.
  • Google's newest bug bounty comes as HackerOne's latest annual report finds more than half of the ethical hackers in its community say generative-AI tools will become a "major target" for them in the near future.
  • The bug-bounty-as-a-service platform is already seeing some of its vulnerability hunters specialize in areas such as prompt injection, bias detection, and training-data poisoning.