3 Questions To… Aude Bernheim & Dr. Flora Vincent, researchers at the Weizmann Institute of Science who recently published « L'intelligence artificielle, pas sans elles », a book on discrimination and sexism in artificial intelligence. Flora Vincent holds a PhD in Marine Microbiology, while Aude Bernheim is a PhD candidate specializing in Molecular Genetics.
In March 2013, the pair also co-founded WAX Science, an association that promotes stereotype-free science and gender equality in science. Its students and volunteers collaborate to develop and spread innovative tools and publish awareness-raising articles on the platform www.wax-science.fr.
Coup Data asked for their perspective on these matters, and on the position of women in the field of artificial intelligence.
1.
Are algorithms macho?
Algorithms have no intentions per se; they reflect society and the views of their creators. But our society is sexist. When translation algorithms go from a non-gendered language (such as Turkish) to a gendered one, they introduce associations: a single person becomes a single man, while a married person becomes a married woman.
These stereotypes become particularly unfair when it comes to sorting CVs (systematically discarding women's for technical positions) or proposing salaries (automatically lower for women). AI can reproduce the gendered prejudices of our society, then propagate and amplify them.
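To make the Turkish example concrete, here is a minimal sketch of how such gendered defaults can be probed. The interview names no particular system, so the library (Hugging Face transformers) and the public Helsinki-NLP Turkish-to-English model are assumptions chosen purely for illustration.

```python
# A minimal, illustrative probe of gendered translation defaults.
# Assumptions (not from the interview): the Hugging Face `transformers`
# library and the public Helsinki-NLP Turkish-to-English model.
from transformers import pipeline

translator = pipeline("translation", model="Helsinki-NLP/opus-mt-tr-en")

# Turkish uses the gender-neutral pronoun "o"; the English output has to pick a gender.
sentences = [
    "O bir doktor.",   # "He/She is a doctor."
    "O bir hemşire.",  # "He/She is a nurse."
    "O bekar.",        # "He/She is single."
    "O evli.",         # "He/She is married."
]

for sentence in sentences:
    english = translator(sentence)[0]["translation_text"]
    print(f"{sentence!r} -> {english!r}")

# A systematic pattern (doctors rendered as "he", nurses as "she", etc.)
# exposes the gendered associations learned from the training data.
```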
2.
So are we condemned to reproduce and propagate gender inequalities?
No! The mechanisms that generate these biases are now identified, understood and dissected. It is possible to detect, tame and reduce the biases of algorithms. From the design stage, at the level of the code itself, through the selection, construction and analysis of the databases, to the evaluation of the automated solutions, the range of remedies matches the scope of the problem. Many researchers and industry players are working on it.
Some applications are already operational, in particular to measure and close the gender gap (the statistical gap between women and men) in various fields. In addition, institutional demands for egalitarian, explainable and fair algorithms are taking clearer shape in Europe and in France. In short, algorithms must be « educated ».
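As an illustration of what "measuring the gender gap" can look like in code, here is a minimal sketch that compares selection rates by gender in the output of a hypothetical CV-screening model; the data and column names are invented for the example, and the interview refers to no specific tool.

```python
# A minimal sketch: measuring the gender gap in the output of a hypothetical
# CV-screening model. Column names and data are invented for illustration.
import pandas as pd

decisions = pd.DataFrame({
    "gender":   ["F", "F", "F", "F", "M", "M", "M", "M"],
    "selected": [0,    1,   0,   0,   1,   1,   0,   1],
})

# Selection rate per gender, and the gap between them.
rates = decisions.groupby("gender")["selected"].mean()
gap = rates["M"] - rates["F"]

print(rates)
print(f"Selection-rate gap (M - F): {gap:.2f}")
# A gap far from zero is a warning sign that the system should be audited
# before (or instead of) being deployed.
```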
3.
In your opinion, what can work at the interface of feminism and technology contribute?
The fight to get more women involved in scientific fields is as essential as ever. Beyond this oft-mentioned question, feminism today must also question cutting-edge technologies, and this is exactly our approach and that of the Equality Laboratory. Feminist thought has transformed many social sciences; the example of AI shows to what extent it can also transform the so-called « hard sciences ».
What is a fair algorithm? Mathematical, computational and ethical answers exist and must be implemented. But technology can also serve equality. Artificial intelligence can quantify and reveal previously hidden biases. It is perhaps easier to change lines of code than mentalities. Let us seize this opportunity to encode equality in algorithms.
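As one illustration of "encoding equality", the sketch below reweights an imbalanced training set so that each gender counts equally during learning, a standard pre-processing mitigation; it is an assumption-laden example with invented data, not a method taken from the book.

```python
# A minimal sketch of one way to "encode equality": reweight training examples
# so that each gender contributes equally to a model's loss. The data are
# invented; this is a generic pre-processing technique (in the spirit of
# reweighing methods), not the authors' own procedure.
from collections import Counter

genders = ["F", "F", "M", "M", "M", "M", "M", "M"]  # imbalanced training set

counts = Counter(genders)          # {"F": 2, "M": 6}
n_total = len(genders)
n_groups = len(counts)

# Give each group the same total weight (n_total / n_groups),
# split evenly among its members.
weights = [n_total / (n_groups * counts[g]) for g in genders]

for gender, weight in zip(genders, weights):
    print(gender, round(weight, 2))

# Most learning libraries accept such weights (e.g. as `sample_weight`),
# so the minority group is no longer drowned out during training.
```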
TO GO FURTHER:
L'intelligence artificielle, pas sans elles ! - Aude Bernheim and Flora Vincent, Belin Editions, March 3rd 2019.
The New York Times - Biased Algorithms Are Easier to Fix Than Biased People, Dec. 9th 2019.