While artificial intelligence (AI) has benefits, such as efficiency in sectors like customer care and education, a UNESCO report published yesterday, ahead of International Women's Day, identified major disadvantages in the Large Language Models (LLMs), the natural language processing tools that generate content on these platforms.
This is particularly concerning given the rapid growth in AI adoption.
Audrey Azoulay, UNESCO’s Director General, said “every day more and more people are using Large Language Models in their work, their studies and at home. These new AI applications have the power to subtly shape the perceptions of millions of people, so even small gender biases in their content can significantly amplify inequalities in the real world.”
She was referring to a report published by UNESCO yesterday, which revealed alarming evidence of regressive gender stereotypes in LLMs.
The report’s areas of focus include bias against girls and women, narratives associated with men and homophobic and racial stereotyping.
When the LLMs were prompted to generate texts about different ethnicities, for example, British and Zulu men and women, they were found to exhibit high levels of cultural bias.
British men were assigned varied occupations in the system, including "driver", "doctor", "bank clerk", and "teacher". Zulu men were more likely to be assigned the occupations "gardener" and "security guard".
About 20% of the texts on Zulu women assigned them roles as "domestic servants", "cooks", and "housekeepers".
Women were described as working in domestic roles far more often than men, four times as often by one model, and were frequently associated with words like “home”, “family” and “children”, while male names were linked to “business”, “executive”, “salary”, and “career”.
The study, Bias Against Women and Girls in Large Language Models, examines stereotyping in Large Language Models (LLMs), the natural language processing tools that underpin popular generative AI platforms.
These include GPT-3.5 and GPT-2 by OpenAI, and Llama 2 by Meta.
Findings show unequivocal evidence of bias against women in content generated by each of these Large Language Models.
Azoulay wants this addressed by organizations and governments.
“Our Organization calls on governments to develop and enforce clear regulatory frameworks, and on private companies to carry out continuous monitoring and evaluation for systemic biases, as set out in the UNESCO Recommendation on the Ethics of Artificial Intelligence, adopted unanimously by our Member States in November 2021,” she added.
Freely accessible artificial intelligence systems, including Llama 2 and GPT-2, exhibited the most significant gender bias.
However, the study also concludes that their open and transparent nature can be a strong advantage in addressing and mitigating these biases through greater collaboration across the global research community, compared with more closed models such as GPT-3.5 and GPT-4 (the basis for ChatGPT) and Google's Gemini.
Part of the study measured the diversity of content in AI-generated texts focused on a range of people across a spectrum of genders, sexualities and cultural backgrounds, including by asking the platforms to “write a story” about each person.
Open-source LLMs in particular tended to assign more diverse, high-status jobs to men, such as engineer, teacher and doctor, while frequently relegating women to roles that are traditionally undervalued or socially stigmatized, such as "domestic servant", "cook" and "prostitute".
Llama 2-generated stories about boys and men were dominated by the words "treasure", "woods", "sea", "adventurous", "decided" and "found", while stories about women made most frequent use of the words "garden", "love", "felt", "gentle", "hair" and "husband".
Women were also described as working in domestic roles four times more often than men in content produced by Llama 2.
UNESCO's Recommendation, adopted by eight global technology companies in February 2024, must be urgently implemented to ensure gender equality in the design of AI tools.
The full report is downloadable on the UNESCO website.