Google responded by stating that the response was “not intended” and that the issue occurred only “in some instances and on certain devices.” The company said it had taken immediate action to fix the bug. The incident adds to earlier criticism of Google and its parent company, Alphabet, for developing products accused of pushing social justice absolutism, such as its AI platform Gemini, which was mocked for generating politically correct images.
Key takeaways:
- Google's Nest virtual assistant has been criticized for refusing to answer basic questions about the Holocaust while providing a detailed description of the Nakba.
- The device repeatedly insisted it did not understand questions related to Jews and the Holocaust, yet had no issue answering questions about other World War II events and the Rwandan genocide.
- A video demonstrating the discrepancy has been widely shared and condemned, drawing millions of views across various platforms.
- Google has responded by saying the response was not intended and that it has taken immediate action to fix the bug.