

Women missing: Dismantling gender bias across diverse sectors

Gender biases are a real roadblock to achieving progressive change, so how can we move past them to unlock transformation?

By Arijit Goswami & Rachel Masika, Guest Authors · 11th June, 2024
Image by Sandy Millar / Unsplash

This blog was published as part of our Global Research & Action Network for a New Eco-Social Contract, a joint project with UNRISD, supported by funds received from the European Union.

In the pursuit of gender justice, an undeniable truth resonates: the battle for equality begins with dismantling gender biases. Over the years, such biases have crept into our conversations, literature, media content, and even into machines (Borokini et al., 2023; O’Connor and Liu, 2023). What if we told you that humankind has taught machines to be prejudiced against women? What if our social narratives are invisibly reinforcing the very stereotypes that run counter to gender justice?

Most people would be alarmed, and for good reason, even though such biases circulate all around them in mainstream digital and non-digital media and in all forms of communication. For example, women’s issues were only scantily mentioned in important communications, opening remarks, and key addresses at recent major United Nations climate conferences such as COP28.

At the intersection of climate change and gender justice, we find that Artificial Intelligence (AI) algorithms used for disaster prediction and preparedness are not applied fairly across all sectors. Gender bias and a lack of diversity in both data and developers contribute to unfair and biased understandings, policy interventions, and outcomes. First, there is programmer bias: systemic, repeatable errors in computer algorithms that lead to unfair outcomes because of gendered assumptions held by the programmer, who is typically male given the scarcity of women in data science roles. The World Economic Forum’s Global Gender Gap Report 2020 revealed that women constituted only 26% of the data science workforce.

Further research by Bob Hayes of Business Over Broadway indicates that women make up just one in four data scientists and hold a mere 26% of all data professional positions (ALX Africa, 2023). Moreover, the datasets available globally for AI development do not represent women as adequately as they do men, because data is collected at different rates for the two genders. For example, when an AI image generation tool was asked to create images of “judges”, only 3 percent of the images were of women (UN Women, 2023). Consequently, women remain discriminated against by machines, as social stereotypes about women creep into AI algorithms and their underlying datasets. Amazon’s AI recruitment tool, for instance, was found in 2018 to discriminate against female applicants, leading to lost opportunities for women (American Civil Liberties Union, 2018).

Similar challenges have been observed with climate-related data, where data on the impact of climate change on women has been found insufficient, even though it is well established that women are more severely affected by natural disasters than men. This underrepresentation of women in climate change data undermines disaster preparedness when women’s experiences are left out of predictions of the impact of extreme climate conditions.


Given that women and children are at disproportionately higher risk from adverse climate events, we need more gender-disaggregated climate data and gender-inclusive data analysis. Climate change pushes an estimated 158 million women and girls into poverty, 16 million more than the number of men and boys. Women and girls represent up to 80 percent of those displaced by climate change and natural disasters and are 14 times more likely to die in their aftermath (UNESCO, 2024). We therefore urgently need gender-inclusive climate policies and greater participation of women in climate data collection and analysis as part of new eco-social contracts.

Gender bias is not limited to climate science or algorithms alone. The media and entertainment sector perpetuates gender-related stereotypes quite strongly and has reinforced gender socialization for many years. Take India, for example, where the highest-grossing movies of the past five years have been dominated by male lead characters, while movies with female leads have earned less at the box office. The dominance of male directors in India’s movie and entertainment industry also leads to a lack of women-centered narratives.

Even the leading podcasts and talk shows in India are dominated by male hosts. This gender gap is starkly reflected across the industry: women hold only 13 percent of senior roles in Indian media and entertainment (Mint, 2023). At the global scale, cartoon series have been found to feature mostly male lead characters, reinforcing gender stereotypes among children that become cemented over time. The dominance of male authors and male protagonists among books winning the most prestigious literary awards likewise highlights the overrepresentation of masculine viewpoints in print media. Women authors and women protagonists dominate only on self-publishing platforms, probably because of the greater freedom women authors experience there.

Image by Getty Images

Talking of media platforms, we cannot ignore the threat of cyberbullying and unsafe online spaces that deter women’s effective participation in the digital world. Digital platforms that democratize access to knowledge and culture and foster global connection are also conduits for misinformation, ideological polarization, hate, and violence. Among women who have spent time online, 85 percent have witnessed digital violence against other women; 75 percent of female journalists have experienced online violence themselves; and 58 percent of young women and girls have faced it.

Examples of such misogyny and insecurity for women online abound. On May 21, 2023, for instance, a disturbing image of a woman activist’s badly injured face appeared on Facebook, accompanied by a caption about how the woman deserved it. The post remained up for two years, despite being reported multiple times for inciting violence, until a user appealed to Meta’s Oversight Board. The Board pointed out a policy gap: by not dealing with misogynistic content where victims are unrecognizable or depicted as fictional, Meta’s rules seem to allow content that “normalizes gender-based violence by praising, justifying, celebrating or mocking it.” In response, Meta noted that it is working on a policy regarding “Praise of Violent Acts”. Although the post was finally removed, the case exposed flaws in Big Tech’s review policies and processes. It also highlighted the need to make tackling gender-based violence central to social media policy.

The world urgently needs greater participation of women in data science to ensure data and technology do not turn against the cause of women as our society has. By implementing the right guardrails and policies around data and algorithms, we can endeavour to make technology an ally, instead of an adversary, of women. UNESCO's platform governance guidelines highlight transparent content moderation, institutional checks, accessible governance, diverse expertise, and cultural inclusivity.

New eco-social contracts and forward-thinking regulatory arrangements would benefit from prioritizing sustainability, independent oversight, and systems grounded in openness, transparency, and evidence. As we contemplate such contracts, the Global Research and Action Network for a New Eco-Social Contract (GRAN-ESC) calls for regulatory arrangements that enable gender justice across the intersecting areas of gender representation, media, and climate change. The journey from identifying biases to advocating robust policies marks a pivotal step toward gender-inclusive and sustainable futures.

- Arijit Goswami & Rachel Masika



Borokini, F., Wakunuma, K., Akintoye, S., 2023. The Use of Gendered Chatbots in Nigeria: Critical Perspectives, in: Responsible AI in Africa: Challenges and Opportunities. Springer International Publishing, Cham, pp. 119–139.

O’Connor, S., Liu, H., 2023. Gender bias perpetuation and mitigation in AI technologies: challenges and opportunities. AI & Society, pp. 1–13.
