Google to fix AI picture bot after 'woke' criticism

Image caption: Google and parent company Alphabet Inc's headquarters in Mountain View, California (Getty Images)

Google is racing to fix its new AI-powered tool for creating pictures, after claims it was over-correcting against the risk of being racist.

Users said the firm's Gemini bot supplied images depicting a variety of genders and ethnicities even when doing so was historically inaccurate.

For example, a prompt seeking images of America's founding fathers turned up women and people of colour.

The company said its tool was "missing the mark".

"Gemini's AI image generation does generate a wide range of people. And that's generally a good thing because people around the world use it. But it's missing the mark here," Jack Krawczyk, senior director for Gemini Experiences said on Wednesday.

"We're working to improve these kinds of depictions immediately," he added.


Google later said it would suspend the tool's ability to generate images of people while it worked on the fix.

It is not the first time AI has stumbled over real-world questions about diversity.

For example, Google infamously had to apologise almost a decade ago after its photos app labelled a photo of a black couple as "gorillas".

Rival AI firm OpenAI was also accused of perpetuating harmful stereotypes, after users found its Dall-E image generator responded to queries for chief executive, for example, with results dominated by pictures of white men.

Google, which is under pressure to prove it is not falling behind in AI developments, released its latest version of Gemini last week.

The bot creates pictures in response to written queries.

It quickly drew critics, who accused the company of training the bot to be "laughably woke".


"It's embarrassingly hard to get Google Gemini to acknowledge that white people exist," computer scientist Debarghya Das, wrote, external.

"Come on," Frank J Fleming, an author and humourist who writes for outlets including the right-wing PJ Media, in response to the results he received asking for an image of a Viking.

The claims picked up speed in right-wing circles in the US, where many big tech platforms are already facing backlash for alleged liberal bias.

Mr Krawczyk said the company took representation and bias seriously and wanted its results to reflect its global user base.

"Historical contexts have more nuance to them and we will further tune to accommodate that," he wrote on X, formerly Twitter, where users were sharing the dubious results they had received.

"This is part of the alignment process - iteration on feedback. Thank you and keep it coming!"