See how biased AI image models are for yourself with these new tools
One theory as to why that might be is that nonbinary brown people may have had more visibility in the press recently, meaning their images end up in the data sets the AI models use for training, says Jernite. OpenAI and Stability.AI, the company that built Stable Diffusion, say they have introduced fixes to mitigate the biases ingrained in their systems, such as blocking certain prompts that seem likely to generate offensive images. However, these new tools from Hugging Face show how limited those fixes are. A…