The racially inaccurate depictions appear to result from the model designers' attempts to avoid racist stereotypes, but they have led to accusations that the AI is rewriting history. Adobe, unlike some other tech companies, has tried to do everything by the book, training its model on stock images, openly licensed content, and public domain material. Even so, the results show that this issue is not exclusive to one company or one type of model.
Key takeaways:
- Adobe’s AI image generation tool, Firefly, has been found to make controversial mistakes similar to those of Google’s Gemini, inaccurately depicting racial and ethnic groups in historical contexts.
- Google had previously paused Gemini’s ability to generate images of people after criticism over historically inaccurate output, such as depicting America’s Founding Fathers as Black.
- Both Adobe’s and Google’s tools use similar techniques to generate images from text prompts, but they are trained on different datasets; Adobe trains only on stock images and content it licenses.
- The issue of AI inaccurately depicting racial and ethnic groups is not exclusive to one company or one type of model, highlighting the challenges tech companies face in this area.