
Gender Bias: Real, Not Artificial

Jaswant Kaur
10 Mar 2025

Artificial intelligence (AI) is the buzzword of the day. For some, it means little more than ChatGPT or perhaps Gemini, but it is much more than that. What started on e-commerce platforms has now spread its wings across several domains - from education to healthcare, robotics, GPS technology, automobiles, agriculture, recruitment, social media, and gaming. The list is endless. Indeed, AI has changed the world in ways we could never have imagined until a few years ago. And this is not the end of it. The evolution continues.

If you can think of a task, you will find an AI tool for it. In fact, a web portal, TheresAnAIForThat.com, lists numerous such tools; visit the page and you will find apps launched merely hours ago. From an app that can shape your resume, to a parenting app, to an AI poem generator, to tools that create music, podcasts, or your favourite graphic design, to an app you can pour your heart out to - everything is now possible with AI. The website lists AI tools launched since 2015. Yes, "AI" is a decade old now.

In this AI-packed world, things are changing rapidly, and people are adapting to these changes quickly, too. However, one thing that remains the same is the bias against women, and that bias is now AI-led.

AI does promise efficiency and objectivity. However, it mirrors the biases of society that create it. It reinforces the social inequities and perpetuates stereotypes that have been around for ages, perhaps in a more rigorous manner than ever. The issue is not just about faulty technology; it is about how AI amplifies the flaws in the human systems that shape it.

If you ever use an AI tool to turn an old or generic photograph into a professional portrait, it tends to generate an image that aligns with a pre-defined mindset, shaped by people's general likes and dislikes. It favours a particular complexion, body build, hairstyle, and look. The pictures such tools produce may impress you more than any professional photographer you could hire.

Now, the question is: how did this bias sneak into AI? It stems, in part, from the fact that women are not as active on the Internet as men. The pandemic certainly accelerated mobile Internet adoption, but the change has not been significant. A study shows that women are less likely to use mobile Internet across low- and middle-income countries.

The Mobile Gender Gap Report 2024 shows that only 66 per cent of women worldwide use mobile Internet, compared to 78 per cent of men. The gap has undoubtedly narrowed since 2017, when the report was first published. The report says, "The underlying gender gap in mobile ownership across low- and middle-income countries has changed very little since 2017. Women across these countries are 8 per cent less likely than men to own a mobile phone."

It also shows that around 405 million women worldwide still do not have a mobile phone of their own due to various barriers, including affordability, lack of digital literacy, and socio-cultural biases. These facts underline the gender disparity in women's access to even basic digital devices, let alone opportunities to develop software or algorithms.

Among AI developers, the gap is starker than one might have thought. The Gender Gap Report 2023 reveals that women constitute only 30 per cent of professionals in the global technology industry. The figure drops to 22 per cent in AI. This imbalance narrows the perspectives and experiences that shape AI technologies, leading to algorithms that fail to consider the needs and challenges faced by women and that perpetuate existing biases.

AI uses data fed by humans to generate the results we seek. Since fewer women than men use or have access to mobile phones and the Internet, the data available to these technologies is highly skewed. And once the data is generated, humans collect and curate it according to their own choices and perceptions.

A classic example is healthcare. Since medical trials began, women have seldom been included in them, so the information the trials generate pertains largely to men. AI and machine learning have been used to improve detection and diagnosis by reducing cost and time, enabling early detection and intervention, and enhancing patient care and satisfaction. However, a research paper shows that the technology has not delivered satisfactory results for diseases affecting women, such as cervical cancer and breast cancer.

When it comes to the workforce, AI-driven recruitment tools have gone a step further in promoting gender discrimination. A striking example is Amazon's AI hiring tool. The system, trained on past resumes submitted predominantly by male applicants, developed a bias against resumes containing words like "women's," as in "women's chess club captain." Consequently, the AI downgraded applications from female candidates, reducing their chances of being hired - a demonstration of how AI can inadvertently perpetuate discriminatory hiring practices. Amazon eventually scrapped the tool.
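To see how such a bias can arise mechanically, consider a toy screening model - a hypothetical sketch, not Amazon's actual system. When historical hiring decisions are skewed, even a naive token scorer learns to penalise a word like "women's", because it appears mostly on resumes that were rejected:

```python
from collections import Counter

# Hypothetical historical data: (resume tokens, was the candidate hired?)
# The past decisions are skewed: resumes mentioning "women's" were rejected.
history = [
    (["software", "engineer", "chess"], True),
    (["software", "developer", "lead"], True),
    (["software", "engineer", "women's", "chess"], False),
    (["developer", "women's", "coding", "club"], False),
    (["engineer", "lead", "chess"], True),
]

def token_scores(data):
    """Score each token by how often it co-occurs with hiring vs rejection."""
    hired, rejected = Counter(), Counter()
    for tokens, was_hired in data:
        (hired if was_hired else rejected).update(set(tokens))
    vocab = set(hired) | set(rejected)
    return {t: hired[t] - rejected[t] for t in vocab}

scores = token_scores(history)

def screen(tokens):
    """Rank a new resume by summing its token scores."""
    return sum(scores.get(t, 0) for t in tokens)

# Two otherwise identical resumes: the one containing "women's" is
# ranked lower, purely because of the skew in the training data.
print(screen(["software", "engineer", "chess"]))             # higher score
print(screen(["software", "engineer", "women's", "chess"]))  # lower score
```

The scorer never sees the applicant's gender; the discrimination enters entirely through proxies in the biased training data, which is precisely why such effects are hard to spot without a deliberate audit.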

Research from Harvard Business Review shows that AI-based hiring platforms are 30 per cent more likely to favour male applicants over female ones, even when their qualifications, experience, and skill sets are identical. The bias is reinforced in job advertising as well: AI-driven talent acquisition platforms have been found to steer certain jobs to men and others to women. Such platforms not only exacerbate gender stereotypes but also violate anti-discrimination laws and policies.

Another area where AI reinforces gender disparities is facial recognition. Numerous studies have shown that these systems exhibit significant biases against women, particularly women of colour. A well-known study by researcher Joy Buolamwini revealed that facial recognition systems misidentify dark-skinned women at error rates of up to 34 per cent, compared to less than 1 per cent for light-skinned men, increasing their exposure to harassment at various levels, including arrests for crimes they never committed.
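Audits of this kind come down to a simple idea: compute the error rate per demographic subgroup rather than a single overall figure, which can hide large disparities. A minimal sketch with made-up audit records (the group labels and numbers here are illustrative, not Buolamwini's data):

```python
from collections import defaultdict

# Hypothetical audit log: (subgroup, was the system's prediction correct?)
audit = [
    ("lighter-skinned men", True), ("lighter-skinned men", True),
    ("lighter-skinned men", True), ("lighter-skinned men", True),
    ("darker-skinned women", False), ("darker-skinned women", True),
    ("darker-skinned women", False), ("darker-skinned women", True),
]

def error_rates(records):
    """Misclassification rate broken down by subgroup."""
    totals, errors = defaultdict(int), defaultdict(int)
    for group, correct in records:
        totals[group] += 1
        if not correct:
            errors[group] += 1
    return {g: errors[g] / totals[g] for g in totals}

for group, rate in error_rates(audit).items():
    print(f"{group}: {rate:.0%} error rate")
```

An overall accuracy number over this log would read 75 per cent and look respectable; the per-group breakdown is what exposes that one group bears all the errors.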

Besides, AI-powered digital assistants like Siri, Alexa, and other voice assistants are predominantly programmed to have female voices, reinforcing the age-old thought that women are meant to play subservient and auxiliary roles. These assistants often respond submissively to gendered insults and verbal abuse. This subtle but pervasive bias influences how users, particularly young children, perceive gender roles and expectations, thereby affecting their mindset.

The adverse effects of AI-driven gender bias extend far beyond individual cases of discrimination. They contribute to a wider gender gap in economic opportunity, limiting career prospects for women and reinforcing harmful stereotypes. Unknowingly, technologies created to make life easier are erecting systemic barriers that prevent women from fulfilling their dreams.

Unless these inherent biases, which stem from the lack of gender diversity in AI and machine learning as much as from selective data use, are addressed, we will not achieve a just and equal world for all, irrespective of the targets set for the Sustainable Development Goals 2030.

To #AccelerateAction in the true sense, it is essential to accept this gap and develop algorithms to detect such biases at the development or testing stage before these tools are rolled out for use by the general public. Besides, countries should have stricter regulations to prevent AI-driven discrimination at all levels. AI has the potential to drive enormous progress. It should be used to serve the community, not perpetuate discrimination and biases.
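One concrete form such a pre-release check can take is the "four-fifths rule" used in US employment-discrimination guidance: compare selection rates across groups and flag any ratio below roughly 0.8. A sketch with hypothetical audit data (the numbers are invented for illustration):

```python
def selection_rate(decisions):
    """Fraction of applicants in a group who received a positive outcome."""
    return sum(decisions) / len(decisions)

def disparate_impact(group_a, group_b):
    """Ratio of selection rates between two groups. Values below ~0.8
    are a common red flag under the four-fifths rule."""
    return selection_rate(group_a) / selection_rate(group_b)

# Hypothetical shortlisting decisions from a tool under test:
# 1 = shortlisted, 0 = rejected
women = [1, 0, 0, 0, 1, 0, 0, 0, 0, 0]  # 20% shortlisted
men   = [1, 1, 0, 1, 0, 1, 0, 1, 0, 0]  # 50% shortlisted

ratio = disparate_impact(women, men)
print(f"disparate impact ratio: {ratio:.2f}")  # 0.40, well below 0.8
```

Running a check like this at the testing stage, before a tool reaches the public, is exactly the kind of safeguard the paragraph above calls for; dedicated fairness-auditing libraries offer more sophisticated versions of the same idea.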
