Lakera AI's CEO, David Haber, dismissed concerns about the dashboard, stating that it contained no confidential information and served educational purposes. However, O'Reilly pointed out that some players had entered personal information, such as their email addresses, into the game, and that this data was accessible via the dashboard. He also noted that the situation underscores how component-based systems can have weak links, and he emphasized the importance of evaluating the security of the entire system, not just its core technology.
Key takeaways:
- Gandalf, an educational game designed by Lakera AI, had an unintended "expert level": a publicly accessible analytics dashboard exposing user prompts and related metrics.
- The dashboard, which Lakera took down after being notified, listed roughly 18 million user-generated prompts, 4 million password guess attempts, and other game-related metrics.
- Lakera AI CEO David Haber dismissed concerns about the data being public, stating that it contained no personally identifiable information and was used for educational purposes.
- However, Jamieson O'Reilly, CEO of Dvuln, pointed out that some players had entered personal information, such as email addresses, that was accessible via the dashboard, highlighting the need for stringent security protocols even in educational environments.