Techno Blender
Digitally Yours.

Image Generator Stable Diffusion tainted with racial, gendered stereotypes, finds study

0 30


An in-depth study of how Stable Diffusion, the AI bot that generates images based on textual cues, has revealed that the programme is heavily biased in favour of light-skinned people, particularly Europeans, and tends to sexualise women, especially from Latin America and India

The Stable Diffusion artificial intelligence (AI) image generator has come under scrutiny from US scientists at the University of Washington (UW) for perpetuating harmful racial and gender stereotypes.

The researchers found that when prompted to generate images of individuals from specific regions, such as “a person from Oceania,” the generator failed to represent Indigenous peoples equitably.

Notably, it tended to sexualize images of women from certain Latin American countries (Colombia, Venezuela, Peru), as well as Mexico, India, and Egypt.

Related Articles

Black

Black magic: New AI tool from Runway’s Gen-2 model can make short videos with just a single image

Black

Why are the polls in Argentina being called the ‘first ever AI election?’

These findings, available on the pre-print server arXiv, are scheduled for presentation at the 2023 Conference on Empirical Methods in Natural Language Processing in Singapore from December 6-10.

Sourojit Ghosh, a UW doctoral student in the human-centred design and engineering department, emphasized the potential harm caused by systems like Stable Diffusion and highlighted the near-complete erasure of nonbinary and Indigenous identities.

“It’s important to recognise that systems like Stable Diffusion produce results that can cause harm,” said Ghosh.

“For instance, an Indigenous person looking at Stable Diffusion’s representation of people from Australia is not going to see their identity represented-that can be harmful and perpetuate stereotypes of the settler-colonial white people being more ‘Australian’ than Indigenous, darker-skinned people, whose land it originally was and continues to remain,” Ghosh said.

The researchers conducted a study by instructing Stable Diffusion to generate 50 images of a “front-facing photo of a person” and varying prompts to represent different continents and countries.

Computational analysis, along with manual confirmation, revealed that images of a “person” correlated most with men, individuals from Europe, and North America, while showing the least correspondence with nonbinary individuals and people from Africa and Asia.

Additionally, the generator was found to sexualize certain women of colour, particularly those from Latin American countries. The team used a Not Safe for Work (NSFW) Detector to assess sexualization, with women from Venezuela receiving higher “sexy” scores compared to women from Japan and the UK.

(With inputs from agencies)


Racist AI: Image Generator Stable Diffusion laced with racial, gendered stereotypes, finds study

An in-depth study of how Stable Diffusion, the AI bot that generates images based on textual cues, has revealed that the programme is heavily biased in favour of light-skinned people, particularly Europeans, and tends to sexualise women, especially from Latin America and India

The Stable Diffusion artificial intelligence (AI) image generator has come under scrutiny from US scientists at the University of Washington (UW) for perpetuating harmful racial and gender stereotypes.

The researchers found that when prompted to generate images of individuals from specific regions, such as “a person from Oceania,” the generator failed to represent Indigenous peoples equitably.

Notably, it tended to sexualize images of women from certain Latin American countries (Colombia, Venezuela, Peru), as well as Mexico, India, and Egypt.

Related Articles

Black

Black magic: New AI tool from Runway’s Gen-2 model can make short videos with just a single image

Black

Why are the polls in Argentina being called the ‘first ever AI election?’

These findings, available on the pre-print server arXiv, are scheduled for presentation at the 2023 Conference on Empirical Methods in Natural Language Processing in Singapore from December 6-10.

Sourojit Ghosh, a UW doctoral student in the human-centred design and engineering department, emphasized the potential harm caused by systems like Stable Diffusion and highlighted the near-complete erasure of nonbinary and Indigenous identities.

“It’s important to recognise that systems like Stable Diffusion produce results that can cause harm,” said Ghosh.

“For instance, an Indigenous person looking at Stable Diffusion’s representation of people from Australia is not going to see their identity represented-that can be harmful and perpetuate stereotypes of the settler-colonial white people being more ‘Australian’ than Indigenous, darker-skinned people, whose land it originally was and continues to remain,” Ghosh said.

The researchers conducted a study by instructing Stable Diffusion to generate 50 images of a “front-facing photo of a person” and varying prompts to represent different continents and countries.

Computational analysis, along with manual confirmation, revealed that images of a “person” correlated most with men, individuals from Europe, and North America, while showing the least correspondence with nonbinary individuals and people from Africa and Asia.

Additionally, the generator was found to sexualize certain women of colour, particularly those from Latin American countries. The team used a Not Safe for Work (NSFW) Detector to assess sexualization, with women from Venezuela receiving higher “sexy” scores compared to women from Japan and the UK.

(With inputs from agencies)

FOLLOW US ON GOOGLE NEWS

Read original article here

Denial of responsibility! Techno Blender is an automatic aggregator of the all world’s media. In each content, the hyperlink to the primary source is specified. All trademarks belong to their rightful owners, all materials to their authors. If you are the owner of the content and do not want us to publish your materials, please contact us by email – [email protected]. The content will be deleted within 24 hours.

Leave a comment