Christiane Féral-Schuhl, of French and Canadian nationality, is a lawyer specialising in new technologies and in particular computer law. Christiane Féral-Schuhl, the current President of the French National Bar Council, contributed this article to the 《瑞中法律评论》 (Sino-Swiss Law Review); all rights reserved.



In the aftermath of International Women’s Rights Day, the French National Bar Council brought together fifteen women lawyers, engineers, professors, researchers, programmers and entrepreneurs to seek solutions to the problem of gender bias in artificial intelligence algorithms and to promote female role models who encourage women to take their place in this field. While machines and algorithms are today designed mainly by men, it should be remembered that the first programmer was a woman. Yet in 2020, the digital sector, at only 17% women, is the second-least pursued by women, ahead only of aeronautics.



Women today stand at the end of the line, sometimes held back by stereotypes depicting the “geek” as a necessarily male figure wallowing in front of a computer screen and eating cold pizza. Not only is this figure no longer suited to the current ecosystem, it also excludes the female figure from the digital world. It is therefore important to expose girls to code at a very early age, because stereotypes are not yet fully formed in young children. Coding is no more complicated than writing, and code is not the exclusive property of engineers. It simply needs to be taught at school, to everyone. Nor is age a barrier: women of today, like those of tomorrow, can take their place in the digital world given the tools at their disposal. While education is a long-term solution, training is a short-term one for establishing the place of women and trying to remedy gender bias.



Artificial intelligence (AI) is often personified, regarded as both responsible and guilty. Yet AI is an analyser of our own biases: it expresses nothing more than the encapsulated opinions of those who conceive it. It can thus reproduce the sexist tendencies of a human-resources policy by retaining only the CVs of male candidates. A keyword search on the notion of “head of a company” will return the image of a man wearing a tie; conversely, a search on the notion of “household staff” will show women in aprons. Every bias is induced, voluntarily or not, by the designers of the algorithm, and such biases do not have only sexist consequences. Several types of bias can be encountered.


There are data biases, which can have ethnically discriminatory consequences and which are expressed, for example, through the recognition of morphological criteria of the face or of skin colour. They follow from the fact that AI training data cannot be representative if learning is based on a single, European standard. Algorithmic biases, when the algorithms are predictive, “predict” from past data, which are thus perpetuated. Applied to the professions of lawyer and judge, human freedom and initiative are frozen. Economic biases, more discreet and pernicious, are no less common. Algorithms that assist in the design of advertisements take the predictable cost into account, which leads to targeting one population less than another for purely economic reasons and may work against women.
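The mechanism by which a predictive algorithm “predicts” from past data, and thereby perpetuates it, can be sketched in a few lines. This is a deliberately naive illustration, not any real system: the data, the skew and the function names are all invented. A screener trained on biased hiring history simply replays that history.

```python
# Hypothetical sketch: a toy "predictive" hiring screener.
# The historical data is deliberately skewed (men were hired far more
# often), and the model reproduces that skew unchanged.

# Past decisions as (gender, hired?) pairs -- an invented, biased history.
history = [("M", True)] * 80 + [("M", False)] * 20 \
        + [("F", True)] * 20 + [("F", False)] * 80

def hire_rate(data, gender):
    """Observed hiring rate for one gender in the historical data."""
    outcomes = [hired for g, hired in data if g == gender]
    return sum(outcomes) / len(outcomes)

def predict_hire(gender, data=history, threshold=0.5):
    """'Predict' by replaying the past: recommend hiring whenever the
    group's historical rate exceeds the threshold."""
    return hire_rate(data, gender) >= threshold

print(predict_hire("M"))  # True  -- the past is perpetuated
print(predict_hire("F"))  # False -- the bias survives intact
```

Nothing in the toy model is malicious; the discrimination comes entirely from the data it was given, which is exactly the point made above.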


AI ignores and mistreats differences. But it is neither responsible nor guilty. Representativeness is not enough. AI amplifies biases without any real possibility of rectifying them afterwards. A suspicion of discriminatory bias is not only difficult to identify but even more difficult to repair. This is why we need more women, and more broadly minorities, involved in the design of these tools, so that their parameterization integrates the richness of our opinions.


Digital transforms society and, as long as electricity exists, it will continue to consume it. The entire economy rests on digital. Is it the fate of every emerging sector to become masculinized? Competitiveness and success should not be gendered. A sector of activity that is not mixed is disturbing, because it is from diversity that richness is born. Digital knowledge is masculine. This is true in France, but not everywhere else. In some places, digital is a promising sector for women because it allows them to work from home and adapts better than others to their constraints. Another surprising, even frightening, observation: the higher a country’s general level of equality, the less women engage in digital studies. Conversely, the more a country rests on unequal systems, and the less girls and boys live together, the more women move into digital channels. Is co-education a catalyst for inhibitions?



Rectifying gender bias requires finding ways to integrate women more fully into business, analysis and research. One way could be to intervene from the training stage onwards, by making courses more attractive to women. Companies, too, have internal organizational means to integrate women into decision-making processes; they must implement them. Recruiting for diversity is a simple measure to apply and contributes to a fair representation within the organization. Consideration must be given to the question of representation and to the still little-known digital professions, design and sociology. Is it necessary to impose or to convince, to set up quotas or to set up incentive mechanisms? For a long time, the word quota, understood as the possibility of admitting weaker categories, has set people’s teeth on edge. Yet introducing a quota does not prevent performance. Though it may not be the only way, it can achieve quick results.



The best devices in the world, if they are blind to gender bias, will reproduce the same shortcomings. Hence the need to train the trainers to implement a genuine pedagogy of equality. Gender bias has an impact on girls’ academic performance: it has been observed that they lose self-confidence from secondary school onwards, despite good results in primary school. An exercise presented as a mathematics problem produces poorer results for girls; when this bias is deconstructed, it is the boys who then perform less well. Changing course descriptions has an obvious impact. Teacher training is a topic that needs to be addressed. Teachers have the noble task of accompanying children in their education; they must therefore be the first line of defence against inequality. They are not the only ones. Parents are their children’s greatest influence, but they are more difficult to train. Yet they should be informed about the great possibilities of digital, in order to pass on the desire for it.


From this debate, difficulties, dangers and warnings have emerged, but also a growing optimism in the face of awareness and the search for solutions at company, State and international level, where a reflection on ethics in AI is currently under way.



