December 4, 2022

This free web tool allows you to examine biases in AI-generated images

Ever since AI became widespread and found its way into every possible app, we’ve seen both its pros and its cons. One of those downsides is racial and gender bias, and the increasingly popular text-to-image generators are not free of it.

Enter Stable Diffusion Bias Explorer, a tool you can use to examine AI-generated images and the biases they contain. It’s free, accessible to everyone, and quickly shows that both humans and artificial intelligence still have a long way to go before they shed ingrained stereotypes.

“Research has shown that certain words are considered more masculine or feminine coded depending on how attractive job descriptions containing those words appeared to male and female research participants,” the explorer’s description reads. The coding is also based on “the extent to which the participants felt ‘belonging’ to this profession.”

Talking about the project, its leader, Dr. Sasha Luccioni, a research scientist at HuggingFace, told Motherboard:

“When Stable Diffusion was released on HuggingFace about a month ago, we were like, oh crap. There were no existing methods for detecting text-to-image errors, [so] we started playing around with Stable Diffusion and figuring out what it represents and what latent, subconscious representations it has.”

The tool is easy to use: you get two groups to compare with each other. For each group you choose an adjective (which you can also leave blank), a profession, and a random seed, then compare the results. Dr. Luccioni demonstrated how the tool works and what results it gives with different combinations of adjectives and occupations, tweeting:

“What is the difference between these two groups of people? Well, according to Stable Diffusion, the first group represents an ‘ambitious CEO’ and the second a ‘supportive CEO.’ I developed a simple tool to examine the biases ingrained in this model: https://t.co/xYKA8w3N8N”
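
If you are curious what a comparison like this involves under the hood, here is a rough sketch of the idea in Python. This is my own illustration, not the explorer’s actual code: it assumes the diffusers library and the public Stable Diffusion v1.4 checkpoint, and the prompt template and file names are mine.

import torch
from diffusers import StableDiffusionPipeline

# Load the public Stable Diffusion v1.4 checkpoint (an assumption;
# the explorer may pin a different version).
pipe = StableDiffusionPipeline.from_pretrained(
    "CompVis/stable-diffusion-v1-4", torch_dtype=torch.float16
).to("cuda")

SEED = 42  # the same seed for both groups, so only the wording differs

for phrase in ("an ambitious CEO", "a supportive CEO"):
    # Re-seeding before each prompt keeps the starting noise identical,
    # so any difference between the two images comes from the words alone.
    generator = torch.Generator("cuda").manual_seed(SEED)
    image = pipe(f"Photo portrait of {phrase}", generator=generator).images[0]
    image.save(phrase.replace(" ", "_") + ".png")

Holding the seed fixed while swapping only the adjective is what makes the side-by-side comparison meaningful: the model starts from the same random noise both times, so any shift in gender or skin tone you see is driven purely by the prompt.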

I’ve been playing with the Stable Diffusion Bias Explorer myself to see what I would get. I used adjectives and professions that are perceived as exclusively male or exclusively female… and I got pretty much what I expected: heavily biased results. In some cases I used both adjectives and professions; in others I left the “adjective” fields blank.

With this research in mind, I wanted to check what I would get for “photographer” and “model.” However, I made a mistake when selecting “photographer,” and “model” was not among the jobs offered. Still, here are some screenshots of the results I got.

To be fair, artificial intelligence was built, developed, and trained by humans. It uses human knowledge and input – and therefore inherits human bias. It would be irrational to expect AI to be more aware and less biased than its creators. And if we really want to have unbiased AI generators, we have to break down prejudices and stereotypes ourselves.
