
Google still hasn’t fixed its big controversy with Gemini and historical images

The fundamental issue is that AI inherits racial bias from its training data, and fixing that is difficult.

Chema Carvajal Sarabia

In February, Google paused the ability of its chatbot Gemini to generate images of people after users complained about historical inaccuracies.


When asked to portray “a Roman legion,” for example, Gemini showed an anachronistically diverse group of soldiers, while its “Zulu warriors” were stereotypically Black.

Google CEO Sundar Pichai apologized, and Demis Hassabis, co-founder of Google’s AI research division DeepMind, said a fix should arrive “very soon,” within the next two weeks. But it is already May, and the promised fix has still not arrived.

A problem that is almost impossible to solve, and racial bias is the reason

Google presented many other Gemini features at its annual I/O developer conference this week, from personalized chatbots to a vacation itinerary planner and integrations with Google Calendar, Keep, and YouTube Music.

But image generation of people remains disabled in the Gemini web and mobile apps, as a Google spokesperson confirmed.

What is the reason for the delay? The problem is probably more complex than Hassabis suggested.

The datasets used to train image generators like Gemini typically contain far more images of white people than of people of other races and ethnicities, and the images of non-white people they do contain often reinforce negative stereotypes.

In an apparent effort to correct these biases, Google implemented clumsy hard-coded adjustments under the hood. Now it is struggling to find a reasonable middle ground that avoids repeating the mistake.


Will Google pull it off? Maybe. Maybe not. Either way, this drawn-out saga is a reminder that fixing a misbehaving AI is not easy, especially when bias is the root of the misbehavior.

Chema Carvajal Sarabia


Journalist specializing in technology, entertainment, and video games. Writing about what I'm passionate about (gadgets, games, and movies) allows me to stay sane and wake up with a smile on my face when the alarm clock goes off. PS: this is not true 100% of the time.
