Tech giant Google's Gemini, which was developed with diversity and inclusion in mind, went beyond users' expectations with its image generation feature. The attempt drew unexpected backlash and was seen by many as an overcorrection, reigniting debate over how to balance the fine line between technology and social justice.
Gemini, Google's AI-based image generation tool, has been criticized by users for being "woke" (overly politically correct). Prabhakar Raghavan, Google's senior vice president for knowledge and information, acknowledged that the company's effort to represent diversity went too far in some cases. Users objected to the depiction of certain white figures, or historically white groups, as people of different races in the name of diversity. In Engadget's tests, when Gemini was asked to create illustrations of the Founding Fathers, it returned images of white men that also included a person of color or a woman. When asked for images of popes through the centuries, it produced pictures depicting Black women and Native Americans as leaders of the Catholic Church. The Verge reported that the AI depicted Nazis as people of color, though Engadget was unable to reproduce such images; the chatbot responded, "Due to the harmful symbolism and influence associated with the Nazi Party, I am unable to fulfill your request."
How good is Google's Gemini at image generation?
Raghavan noted that Google has no intention of refusing to create images of any particular group or of producing historically inaccurate images, and he reiterated the company's commitment to improving Gemini's image generation capabilities. However, this will require "extensive testing" before the feature is enabled again.
Source link: https://www.teknolojioku.com/yapay-zeka/googlein-gemini-goruntu-olusturmada-ne-kadar-basarili-65dcfdd9d7c75696920e2336