What it takes to make equitable AI

Participants of the Girls in ICT Day 2019, organised by the ITU at the African Union in Addis Ababa, Ethiopia. (© ITU/M.Tewelde)

The subject of inclusion and ethics in the field of Artificial Intelligence (AI) has been raised often in recent months. At an “AI for Good” online seminar organised yesterday by the Geneva-based International Telecommunication Union (ITU), panellists representing women and girls from several regions of the world discussed what it would take to make AI equitable.

Evidence shows that systemic gender biases in AI will continue to be perpetuated unless girls and women are involved in the design of AI. But this is easier said than done. As Patty Alleman of UNICEF, who moderated the session, reminded the participants, while 35 per cent of men studying science, technology, engineering and maths (STEM) in tertiary institutions go on to pursue STEM careers, the proportion is only 18 per cent for women. And the seeds of these differences are sown in childhood, remarked Alisha Arora, another speaker in the session.

Arora is a high-school student who is interested in the use of AI in suicide prevention. She is also the founder of The HopeSisters, a Canadian non-profit working to improve mental health. In eighth grade, when she led her robotics team to a win despite being the only girl on the team, one of the boys asked her why she bothered programming when she was destined to be a housewife.

“It made me question my potential,” Arora said. “But also I learnt from that experience that I had to challenge the status quo.”

It is important to challenge these notions early. “We can make the changes when you’re about to go into industry, about to go get a job,” she continued, “but when you’re young is where it all starts. It’s where the stigmas and societal norms really start to [come into] play.”

Her experience was echoed by Ecem Yılmazhaliloğlu, a high-school student from Turkey, who is a council member at the World Economic Forum’s AI Youth Council. “The turning point for me was when I got into my coding club, and there were 20 students but only two girls.” The only other girl, she recalled, had to drop out of the club because she felt lonely, unwanted and out of place. “[Girls] get discouraged, hide their interests, because society didn’t deem the field of tech and AI as reasonable for girls.”

If she faced discrimination in her youth, Dr Hoda Alkhzaimi didn’t let it hold her back. Today she is an assistant professor at New York University Abu Dhabi with a PhD in cryptography. According to Alkhzaimi, AI affects nearly every field and every industry because of the efficiencies it brings. Yet there are deep biases in how AI is deployed:

“We’ve seen in the past several incidents where AI has been utilised for recruitment purposes and people who were the subjects of the recruitment AI tool reported incidents of discrimination against their portfolio. Because they were not white men, and because the designers [of the AI] bring in their biases.”

To avoid perpetuating these bias-driven gaps, she said, we as a society have to ensure that developers of AI do their due diligence so that the design of AI is inclusive of everyone.

The use of AI, especially machine learning, has exploded in the past decade and the data that the machines learn from are problematic, noted Caitlin Kraft-Buchman, CEO and founder of the Geneva-based Women at The Table. “The data that [machines] use is very strange because it’s default-male data. When the machines take the data and learn from the data, they pattern the invisibility of the girls and the women and other historically marginalised communities that haven’t been counted and put into this data system, and this invisibility becomes permanent, almost wired into the system. The machines make it the rule.”

Biases in AI replicate biases found in earlier, non-automated approaches, but with AI they become more structural and scalable, noted Alleman. Addressing them requires looking beyond technologists, be they men or women, said Kraft-Buchman. Women and girls need a seat at the table, she said, to explain their problems and the solutions even if they cannot yet write the code. “They need to be included in the decision-making process, which still suffers mostly from being technical. We need economists, sociologists, teachers, social workers. We need people who understand the lived realities of the world together with the people who can do the technology solving.”

The statement resonated with Arora: “Youth, especially young women, are one of the potential stakeholders in the AI conversation that are mostly overlooked. We need to engage, as youth, into this conversation on the development and deployment of AI.”

“We can’t just be two per cent of humans sitting at the table trying to build the world’s solutions,” Alkhzaimi added emphatically. “We want to provide technology for the welfare of human beings. An AI that has not been built by a diverse team should be a questionable AI.”

The lively session was closed by Belém Guede from Chile, co-founder of STEM Academy. She noted that, in her experience, the more women the girls in her community saw in leadership roles, the more empowered they felt to join. “Action towards equality in the field [of AI] needs to be intentionally made and it needs to be ambitious. If we do AI right,” she added, “we can change the digital gender gap instead of deepening it.”

You can watch the seminar in full on YouTube.
