Artificial Intelligence and gender equality | UN Women – Headquarters

A study by the Berkeley Haas Center for Equity, Gender and Leadership analyzed 133 artificial intelligence systems across different industries and found that about 44 percent of them showed gender bias, and 25 percent exhibited both gender and racial bias.

Beyza Doğuç, an artist from Ankara, Turkey, encountered gender bias in generative AI while researching for a novel, when she prompted it to write a story about a doctor and a nurse. Generative AI creates new content (text, images, video, etc.) inspired by similar content and data it was trained on, often in response to a user's questions or prompts.

The AI made the doctor a man and the nurse a woman. Doğuç continued to give it more prompts, and the AI always chose stereotypical gender roles for the characters, associating certain qualities and skills with male or female characters. When she asked the AI about the gender bias it exhibited, it explained that the bias came from the data it had been trained on and, specifically, from "word embedding": the way words are encoded in machine learning to reflect their meaning and their associations with other words, which is how machines learn and work with human language. If AI is trained on data that associates women and men with different, specific skills or interests, it will generate content that reflects that bias.
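The mechanism the AI described can be illustrated with a minimal sketch. The vectors below are invented toy values, not real embeddings; actual models such as word2vec or GloVe learn hundreds of dimensions from large text corpora, and in doing so can absorb the corpus's stereotypical associations. The sketch shows how a system that picks the "closest" gendered word by cosine similarity reproduces whatever skew the vectors encode.

```python
import math

# Toy 4-dimensional "embeddings" invented purely for illustration.
# The doctor/nurse vectors are deliberately skewed toward the
# man/woman vectors, mimicking bias learned from skewed training text.
embeddings = {
    "man":    [0.9, 0.1, 0.3, 0.2],
    "woman":  [0.1, 0.9, 0.3, 0.2],
    "doctor": [0.8, 0.2, 0.9, 0.1],  # skewed toward "man"
    "nurse":  [0.2, 0.8, 0.9, 0.1],  # skewed toward "woman"
}

def cosine_similarity(a, b):
    """Standard cosine similarity: closer to 1.0 means more similar direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# A system that picks the nearest gendered word reproduces the skew:
for role in ("doctor", "nurse"):
    closest = max(
        ("man", "woman"),
        key=lambda g: cosine_similarity(embeddings[role], embeddings[g]),
    )
    print(f"{role} -> {closest}")  # doctor -> man, nurse -> woman
```

The point of the sketch is that no rule anywhere says "doctors are men"; the stereotype emerges purely from the geometry of the learned vectors, which is why biased training data produces biased outputs.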

"Artificial intelligence reflects the biases that are present in our society and that are manifested in AI training data," Doğuç said in a recent interview with UN Women.

Who develops AI, and what data it is trained on, have gender implications for AI-powered solutions.

Sola Mahfouz, a quantum computing researcher at Tufts University, is excited about AI, but also worried. "Is it equitable? To what extent does it reflect the patriarchal structures of our society and the inherent prejudices of its predominantly male creators?" she reflected.

Mahfouz was born in Afghanistan, where she was forced to leave school when the Taliban came to her home and threatened her family. She eventually escaped Afghanistan and immigrated to the United States in 2016 to attend college.

As companies scramble for more data to feed AI systems, Epoch researchers say tech companies could run out of high-quality data used by AI by 2026.

Natacha Sangwa is a student from Rwanda who participated in the first coding camp organized under the African Girls Can Code Initiative last year. "I have noticed that [AI] is developed primarily by men and trained on data sets that are primarily based on men," said Sangwa, who saw firsthand how that affects women's experience with technology. "When women use some AI-powered systems to diagnose diseases, they often receive inaccurate answers, because the AI is not aware of symptoms that may present differently in women."

If current trends continue, AI-powered technology and services will continue to lack diverse racial and gender perspectives, and that gap will result in lower-quality services and biased decisions about jobs, credit, healthcare, and more.
