OpenAI, the US company behind ChatGPT, has been paying Kenyan employees less than $2 an hour to review highly problematic content. This is the conclusion of an investigation published on Wednesday by the renowned news magazine ‘Time’.
Despite their key role in building the AI chatbot, these workers have faced harsh conditions and low wages.
The low-paid workers were employed as so-called “data labellers”. That meant sifting through tens of thousands of problematic text passages to make the AI chatbot safe for public use.
They were confronted with descriptions of child sexual abuse, bestiality, murder, suicide, torture, self-harm and incest, the report said.
The problem is reminiscent of the questionable working conditions of content moderators who work for major social media platforms like Facebook.
ChatGPT was not always this eloquent, as the online outlet “Vice” writes. Since the start of the public testing phase in November 2022, there has been considerable hype around the chatbot’s language skills.
The previous software, based on a language model called GPT-3, often produced sexist, violent and racist text. This is because the model had been trained on a dataset scraped from billions of web pages; in other words, the AI internalized all manner of text written by human authors.
Before those responsible could launch ChatGPT, they needed a way to quickly filter all toxic language out of this huge dataset.
To this end, OpenAI teamed up with the San Francisco-based data-labelling company Sama, which promises to provide “ethical” and “dignified digital work” to people in developing countries. OpenAI’s assignment was to identify and flag toxic content, which could then be fed as training data into a filtering tool for ChatGPT.
To do this, Sama recruited the data labellers in Kenya, who would play a key role in making the AI chatbot safe for public use.
The company also employs staff in Uganda and India to label large volumes of data for Silicon Valley clients such as Google, Meta and Microsoft.
A company spokesperson explained:
“Classifying and filtering harmful [texts and images] is a necessary step to minimize the amount of violent and sexual content in training data and to create tools that can detect harmful content.”
It also said staff were entitled to one-on-one and group sessions with “professionally trained and licensed mental health therapists”.
Incidentally, Sama ended its work for OpenAI in February 2022, eight months earlier than contractually agreed. In the same month, Time magazine published an article critical of Sama’s work for Mark Zuckerberg’s Meta corporation: it concerned content moderators who were traumatized after viewing images and videos of executions, rape and child abuse for $1.50 an hour.
According to “Time”, it is not clear whether OpenAI has also collaborated with other data-labelling companies; the American AI company keeps a low profile in this regard.
The online outlet Motherboard, which belongs to the media group “Vice”, reported critically last December that OpenAI’s acclaimed AI innovation is driven by underpaid workers abroad.
Despite the widespread belief that AI software like ChatGPT works quasi-magically on its own, it takes a great deal of human labour to prevent the AI from generating inappropriate or even illegal content.
AI software is a billion-dollar business and can disrupt entire sectors of the economy. As expected, companies around the world are competing for the most powerful technologies.
Outsourcing routine, potentially traumatizing work benefits big tech companies in many ways, notes Vice: companies “can save money by using cheap labor, avoiding strict labor laws, and keeping their ‘innovative’ tools at a distance from the employees behind them”.
Time magazine states:
OpenAI is reportedly in talks with investors to raise funds at a valuation of $29 billion, including a possible $10 billion investment by Microsoft. This would make OpenAI one of the most valuable AI companies in the world. It is already the undisputed market leader.
Source: Watson
I’m Ella Sammie, author specializing in the Technology sector. I have been writing for 24 Instant News since 2020, and am passionate about staying up to date with the latest developments in this ever-changing industry.