Smriti Irani highlights AI’s role in gender equality

Smriti Irani discussed AI’s role in gender equality and its economic and societal impacts.

New Delhi: Speaking at NXT Conclave 2025, Smriti Irani discussed AI’s significance for gender equality, arguing that technology is not neutral to its users. She talked about how artificial intelligence (AI) can help close the gender gap and create a more inclusive digital future.

While speaking about AI at the event, Irani shed light on what AI means for gender, saying, “I think one of the greatest fallacies that have been sold is that technology is agnostic to bias. As AI grows, one recognises, realises, and many people monetise the fact that technology has created certain schisms and gaps which have become opportunities for those who have lived as thought leaders or as policy makers dedicated to equity. Their decadal presumption was that equality will be assured. That is still a challenge when you talk about the age of AI. Many people compute how much it will add to a nation’s GDP.”

“Not much psychological computation has been done about how it will affect the psychology of consumerism. From a gender perspective, if you look at global research, you will find that of everything women earn, close to 80 to 90% of their own earnings go into the health and education of their families. Now, as technology enables consumerism, you will find that today, research says 72% of purchasing decisions are affected by algorithms. One would presume that those algorithms would deepen your engagement as a woman on issues of education and health because you are genetically predisposed to them. But what happens is that technology, as the Journal of Psychology especially points out, leads to women spending 22% more on perishable consumables and being 30% less likely to receive algorithmically driven data sets or content that encourages them to invest.”

The former Union Minister went on to cite research conducted at Berkeley by Genevieve Smith.

“A small research study was conducted by a lady named Genevieve Smith. They studied 133 AI systems and found that there was a gender bias in 44% of those systems. The reason this research came about is also unique. Genevieve Smith, along with her husband, applied for a credit card in the US. She had the same money in the bank, in fact, they had a joint account. Her credit score was better than her husband’s and her tax returns were as transparent as his. Yet, she received 10% to 20% less credit than her husband. This drove her to start studying the technological effects on how women’s creditworthiness decreases over time. When this happened, her husband was outraged and went on Twitter to express that this was happening to people. He called it a ‘comedy of errors’ because he worked in technology, helping to code web-based applications. He made a public call on Twitter, saying this had happened to their family. Steve Wozniak, I believe, who was part of the team behind one of the first devices, responded and said, ‘Guess what, this… I don’t know, I like this.’”

Irani also shed light on biased facial recognition systems, saying, “They use a male body or the dynamics of a male body in AI systems. They do not consider how a female body might be impacted, for example, when driving. They do not account for a pregnant woman driving. Research now shows that in the same scenario, women, due to being excluded from the research phase, are 44% more likely to be injured in the same kind of accident a man might have. They are 17% more likely to die compared to men.”

“If it perpetuates bias, I mentioned left and right hands earlier, and you ask an AI imaging tool to create an image of a left-handed person writing, they can’t. This bias exists because all the images they are trained on are of right-handed people. There can also be racial bias in data scraping, which we have seen in facial recognition systems, where 80% of them are based on either white skin or male models, even in fingerprint scanners,” she added.

Irani raised the point that gender equity has a significant economic impact, which is often overlooked in rhetoric.

“Can you imagine the economic loss to that community or system on those two verticals of spend?” Irani asked.

“I think one of the greatest tragedies regarding gender is that we’ve always had a rights-based conversation about it. If there had been an equal amount of conversation about the economic impact of not ensuring equity, we would not have seen the kind of bias that’s now hidden in a lot of rhetoric.”

She added, “We would not have seen it if we had made an evidence-based economic case for it, maybe a decade or a decade and a half ago. Now, huge efforts are being made to reach the goal of artificial general intelligence, and we are going to devote hundreds of billions of dollars to the compute and hardware needed to make this happen.”
