AI-generated images of dark-skinned Swedish women, and pictures of black and Asian Nazi soldiers, have caused strong reactions.
When CNN tested the generator, the founding fathers of the United States were depicted as black, and popes as female.
Critics said Gemini, Google's image-generation tool, was good at diversity but bad at historical accuracy.
Many conservatives in the United States have reacted to it, and accusations are spreading in far-right circles.
– It's embarrassingly difficult to get Google Gemini to acknowledge the existence of white people, writes data scientist and former Google employee Debarghya Das on X.
Commentator Douglas Murray criticized the company in the New York Post, and Elon Musk called Google's programming insane, racist, and anti-civilizational.
On Thursday, the company paused the AI tool to make changes.
– Even though these large companies have substantial resources and experience, much of what they launch is something they are doing for the first time, says AI expert Torgeir Waterhouse.
– This does not mean that there is a crisis
The tech site The Verge asked Gemini for a picture of German soldiers in 1943. They got pictures of dark-skinned soldiers.
The image tool was launched this year but is not yet available in Europe; in Norway, only the chat service can be used. NRK has therefore not been able to test it.
Waterhouse says you will never get a completely neutral AI generator.
– Even if we think something is misrepresented, it doesn't mean there are bad intentions behind it. It means something isn't perfect, but not that there is a crisis.
– We must also be willing to give the systems we use the tolerance we want.
He believes one should be patient with AI companies during this learning process.
As of Thursday, it is no longer possible to generate images of people with Gemini.
– Gemini's image generator creates a wide range of people. That is generally a good thing, because people all over the world use it. But here it misses the mark, writes Jack Krawczyk, product lead for Gemini, on X.
They are now working on fixing the issue and will release a new version.
Has received the opposite criticism
Until now, this has not been the problem. Most AI tools have been criticized for disproportionately producing images of white people.
In November, The Washington Post published an article mapping how artificial intelligence sees the world.
Attractive people were depicted as white and young. Muslims were men with head coverings. The person receiving social assistance was depicted as dark-skinned, while the productive person was a white man.
– No matter what you do, there will be those who think it is completely wrong. It's an impossible balancing act, says Waterhouse.
Google was one of the few companies that took steps to prevent its AI tool from creating discriminatory and stereotypical images of people. But this time, it seems, they went too far.
– As humans, we have thoughts and ideas that we know we cannot say out loud, and we adjust ourselves accordingly. We must therefore also remember that these images are the product of a service we have made a request to.
– If it doesn't work the way you want, ask again, says Waterhouse.
– The world is the world, not right and left
Marija Slavkovik is a professor of artificial intelligence at the University of Bergen. She believes the problem is not easy to solve, but far from impossible.
– You can't solve a century-old social problem by moving a few numbers around in a model.
Technology companies try to please people, but they are bad at diversity, says Slavkovik. It is a well-known problem, she says, that one tries to weight in human moral codes and intentions, but does not succeed.
– First the model was tuned for the right, then for how the left sees the world. But the world is the world, not right and left.
– Is it possible to create a neutral image generator?
– Yes, but it isn't cheap. It means making sure that every data set is culturally balanced in the way one wants.