Only one in five people doing science in artificial intelligence is a woman


From the National Center for Artificial Intelligence, researchers Claudia López and Gabriela Arriagada explain that the weak representation of women in laboratories is only part of a deeper problem: the invisibility of diversity in technology.

One out of every five people working in artificial intelligence worldwide is a woman. However, these gaps in gender representation and access are only symptoms of a deeper problem: the bias generated by the lack of diversity in the development of solutions based on the technologies at the heart of the so-called Industry 4.0 revolution. So argue two researchers at the National Center for Artificial Intelligence (Cenia), an entity supported by the National Agency for Research and Development (ANID): academics Claudia López and Gabriela Arriagada, who represent Universidad Técnica Federico Santa María and Pontificia Universidad Católica, respectively, in the organization.

The Chilean specialists, who lead a working group on AI ethics within Cenia, warn that, currently, the development of artificial intelligence solutions lacks the breadth and quality of data on phenomena that affect women, which has motivated networks of women scientists globally to promote a “feminist artificial intelligence” movement.

“Incorporating feminism into artificial intelligence presents a transformative look at how we develop artificial intelligence, and how we talk about biases, prejudices and systematically unequal structures,” said Professor Arriagada, a Cenia associate researcher and assistant professor at UC’s Institute of Applied Ethics and Institute of Mathematical and Computational Engineering.

Algorithms and non-discrimination

The commemoration of International Women’s Day this year is strongly influenced by the need to promote women’s equal access to technology. According to UN Women, 37% of women do not have access to digital devices, a scenario that is even more complex in developing, and middle and low-income countries.

Women account for 22% of researchers in artificial intelligence, below the 30% they represent across the so-called STEM areas (science, technology, engineering and mathematics) as a whole. By 2050, three out of four jobs (75%) will depend on STEM skills. For the Cenia researchers, the deficit of women scientists in their field is only the tip of the iceberg, and its consequences will affect not only women but any group whose data is poorly represented in the research and development of solutions.

“We start from a systematic gap in which women are not represented in the data. Women are invisible, because the biases arise at the methodological level,” says researcher Arriagada. A clear example is the standard clinical profile of a heart attack, one of the leading causes of mortality in global public health, which is based on male symptoms. Diversities are hidden because they are harder to encode, and that is a serious problem: most of the studies used to train diagnostic-support algorithms are based on male realities and needs.

The issue is particularly sensitive because the training of the algorithms on which AI is based depends on the quality and diversity of the data. If the data reflect only part of the population (typically white populations in developed, high-income countries), those experiences end up being treated as representative of humanity, rendering other human groups invisible and potentially producing discriminatory results. This has already led to serious human rights problems in the U.S. justice system, where AI tools for predicting criminal recidivism have been accused of lacking sufficient data on women’s experiences, and therefore of working worse for them, in addition to being biased against racial minorities such as African Americans, Latin Americans and Asians, as well as other gender minorities.

Groups of scientists around the world are already raising alarms about the ethical dilemma of data: a report by Princeton researchers warned of a reproducibility crisis in artificial intelligence, especially in the health field, in the context of the pandemic.

Currently, networks of researchers grouped in feminist scientific collectives are addressing gaps in the availability of data that better describe how problems of different kinds affect women. For Dr. Claudia López, no artificial intelligence solution can work well when built on poor-quality data. “We don’t have good data on many women’s behaviors. There are practices of ours, because of the way we are socialized and the way we navigate the world, that are different from those of men. For example, our behavior in the financial world, or in mobility in the city; also in entrepreneurship, where technologies take on a male approach, which is competitive and demands exclusive dedication, very different from the one that characterizes women, who embark on their own business out of necessity, with less time, and who collaborate in networks.”

The fallacy of neutrality

According to Unesco, emerging technologies have demonstrated their immense capacity “to do good”. However, the entity itself warns of the need to control their negative impacts, which are exacerbating the reality of a world that already operates on the basis of division and inequality. The objective, according to the UN agency, is to contribute to better policies and actions, with special emphasis on inclusion and gender equality.

The Cenia researchers point to the urgency of problematizing the discussion on AI, beyond its novelty and the efficiency it promises, through multidisciplinary work: better understanding the contexts in which data is generated and those in which AI is deployed, and identifying the multiple technical, ethical, and business decisions that influence its results and impact. In response, multiple international movements are emerging, such as the A+ Alliance, with its proposal for a feminist AI network, or the Data Género initiative. The A+ Alliance, for example, promotes a feminist approach anchored in human rights, as Dr. López explains, “because both approaches aim at equality for all people.”

Researcher Arriagada agrees, and reflects that “more than pretty speeches, we need tools that help different female developers and researchers to participate in this process and incorporate these principles and values in their work methodologies. This is a project that we at Cenia are taking very seriously. We are committed to playing an important role in the education of scientific communities and that this is transferred to society.”

For both academics, it is no surprise that AI is replicating patterns of inequality that have historically underpinned technological development. That is why, they say, many researchers in Chile and Latin America are joining forces to break the trend, challenging the assumption many refer to as “the fallacy of neutrality” in technology.

“Much of the distancing of women from ICTs over the last 30 or 40 years has been caused by the idea of computers and video games as something masculine. It worries us that the problem is framed as women not having wanted to be part of engineering or science. Making space for ourselves cannot be a burden on our shoulders alone. Institutions and disciplines must also change, and generate these spaces. In this regard, feminist AI invites us to question assumptions such as the idea that technology is neutral and serves all people equally. Historically this has not been the case,” said Dr. López.

For Professor Arriagada, it is not a simple problem to fix because, in the case of AI, the prediction of everyday situations tends to be reductionist. “Solutions also require the participation of governments, companies and other institutions that have individual and shared responsibilities. We can no longer sweep this under the rug. But we must not settle for easy solutions. Feminism teaches us that from complexity we can achieve better things and put technology at the service of people.”

At the academic level, locally, ANID has been consolidating a gender snapshot of science, technology, knowledge and innovation, showing that more women than men are enrolled in higher education (54% versus 46%). However, this proportion decreases at each successive academic degree: the gap widens for women as they move up the academic hierarchy, a pattern repeated in all universities.


By: Luis Francisco Sandoval. Agencia Inés Llambías Comunicaciones
