The petitioners are calling for new legislation to regulate how harmful and dangerous technology work is outsourced in Kenya, and for existing laws to recognize exposure to harmful content as an occupational hazard. They are also calling for an investigation into the Ministry of Labour's failure to protect Kenyan youth from exploitation by outsourcing companies. The case is being supported by Foxglove, a legal non-profit, which argues that tech companies like OpenAI bear significant responsibility for the working conditions of content moderators.
Key takeaways:
- Former content moderators who worked on OpenAI's ChatGPT in Nairobi, Kenya, have filed a petition with the Kenyan government alleging exploitative working conditions, psychological trauma, low pay, and abrupt dismissal.
- The moderators were tasked with reviewing text and images, many depicting graphic scenes of violence and sexual abuse. They say they were not adequately warned about the nature of the content and received insufficient psychological support.
- The petition relates to a contract between OpenAI and Sama, a California-headquartered data annotation services company that employs content moderators around the world. The contract was terminated eight months early, leaving the moderators without income while still coping with the trauma of the work.
- The former moderators are calling for new legislation to regulate how harmful and dangerous technology work is outsourced in Kenya, and for existing laws to recognize exposure to harmful content as an occupational hazard.