Originally published by the WeChat public account of the Sino-Swiss Legal Association (瑞中法协): Women and the Digital Age


Christiane Féral-Schuhl, of French and Canadian nationality, is a lawyer specializing in new technologies and in particular computer law. She is the current president of the French National Bar Council and contributed this article to the Sino-Swiss Law Review (《瑞中法律评论》). All rights reserved.


WOMEN AND DIGITAL

 

In the aftermath of International Women’s Rights Day, the French National Bar Council brought together fifteen women lawyers, engineers, professors, researchers, programmers and entrepreneurs to seek solutions to the problem of gender bias in artificial intelligence algorithms and to promote female role models who encourage women to take their place in this field. While machines and algorithms today are designed mainly by men, it should be remembered that the first programmer was a woman. Yet in 2020, digital, at only 17% female, was the sector second-least pursued by women, after aeronautics.

 



Women of today are at the end of the line, sometimes held back by stereotypes depicting the “geek” as a necessarily male figure wallowing in front of a computer screen and eating cold pizza. Not only is this figure no longer suited to the current ecosystem, it also excludes women from the digital world. It is therefore important to expose girls to code at a very early age, because not all stereotypes are yet formed in young children. Coding is no more complicated than writing, and code is not the exclusive property of engineers. It simply needs to be taught at school, to everyone. Nor is age a barrier: the women of today, as well as those of tomorrow, can take their place in the digital world given the tools at their disposal. While education is a long-term solution, training is a short-term one for establishing the place of women and trying to remedy gender bias.



 

Artificial intelligence (AI) is often personified, regarded as both responsible and guilty. Yet AI is an analyser of our own biases: it expresses nothing more than the encapsulated opinions of those who conceive it. It can thus reproduce the sexist tendencies of a Human Resources policy by retaining only the CVs of male candidates. A keyword search on the notion of “head of a company” will return the image of a man wearing a tie; conversely, a search on “household staff” will show women in aprons. Every bias is induced, deliberately or not, by the designers of the algorithm, and the consequences are not only sexist. Several types of bias can be encountered.


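The point that an algorithm merely encapsulates the opinions embedded in its inputs can be illustrated with a deliberately minimal sketch. All the numbers and the tiny "screening model" below are invented for this illustration; they are not drawn from the article.

```python
# Toy illustration: a "screening model" fit on past hiring decisions
# simply reproduces whatever pattern those decisions contain.
# The historical data is invented and deliberately biased.

# Historical decisions: (gender, retained?).
history = [("M", True)] * 80 + [("M", False)] * 20 \
        + [("F", True)] * 30 + [("F", False)] * 70

# "Training": estimate the retention rate per group from past data.
rates = {}
for group in ("M", "F"):
    outcomes = [kept for g, kept in history if g == group]
    rates[group] = sum(outcomes) / len(outcomes)

# "Prediction": recommend retaining a CV whenever the historical
# retention rate for that group exceeds 50%.
def recommend(gender):
    return rates[gender] > 0.5

print(rates)           # {'M': 0.8, 'F': 0.3}
print(recommend("M"))  # True  — male CV retained
print(recommend("F"))  # False — the past bias is reproduced as a rule
```

Nothing in this code mentions sexism; the discrimination lives entirely in the historical data, which is exactly the mechanism the paragraph above describes.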


There are data biases, which can have ethnically discriminatory consequences and which are expressed, for example, through the recognition of morphological criteria of the face or of skin colour. They follow from the fact that AI training data cannot be representative if learning is based on a single, European standard. The biases of algorithms, when these are predictive, “predict” from past data, which are thereby perpetuated; applied to the professions of lawyer and judge, human freedom and initiative are frozen. Economic biases, more discreet and pernicious, are no less common. Algorithms that assist in the design of advertisements consider the predictable cost, which results in targeting one population less than another for purely economic reasons and may disadvantage women.

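The claim that training on a single standard cannot be representative can also be sketched in a few lines. The "detector" and every number below are hypothetical, chosen only to show the failure mode.

```python
import statistics

# Toy sketch of a data bias: a detector calibrated on a single,
# homogeneous training sample generalises poorly outside it.
# All feature values are invented for illustration.

# Feature values for the only population present in the training data.
train = [0.72, 0.75, 0.70, 0.74, 0.73]

# "Learning": accept anything within 2 standard deviations of the mean.
mu = statistics.mean(train)
sigma = statistics.pstdev(train)

def detected(x):
    return abs(x - mu) <= 2 * sigma

# Members of the trained-on group are recognised...
print(detected(0.71))   # True
# ...while an equally valid sample from an unrepresented group is not.
print(detected(0.35))   # False
```

The model is not malicious; it is simply blind to anyone who was absent from its training sample, which is the "single standard" problem described above.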


AI ignores and mistreats differences, but it is neither responsible nor guilty. Representativeness is not enough: AI amplifies biases without any real possibility of rectifying them afterwards. A suspected discriminatory bias is not only difficult to identify but even more difficult to repair. This is why we need more women, and more broadly minorities, involved in the design of these tools, so that their parameterization integrates the richness of our opinions.



Digital technology transforms society and, as long as electricity exists, it will continue to consume it. The entire economy is supported by digital. Is it the fate of every sector to become masculinized as it emerges? Competitiveness and success should not be gendered. A sector of activity that is not gender-mixed is disturbing, because richness is born of diversity. Digital knowledge is masculine. This is true in France, but not everywhere else. In some places, digital is a promising sector for women because it allows them to work from home and adapts better than others to their constraints. Another surprising, even frightening, observation: the higher a country’s general level of equality, the fewer women engage in digital studies. Conversely, the more a country rests on an unequal system, and the less girls and boys live side by side, the more women move into digital careers. Is co-education a catalyst for inhibitions?



 

Rectifying gender bias requires finding ways to integrate women more fully into business, analysis and research. One way could be to intervene right from the training stage by making courses more attractive to women. Companies, too, have internal organizational means of integrating women into decision-making processes; they must implement them. Recruiting for diversity is a simple measure to apply and contributes to a fair representation within the organization. Consideration must be given to the question of representation and to the still little-known digital professions, design and sociology. Is it necessary to impose or to convince, to set quotas or to set up incentive mechanisms? For a long time the word quota, understood as the possibility of admitting weaker categories, has made people grind their teeth. Yet introducing a quota does not prevent performance. Though it may not be the only way, it can achieve quick results.



 

The best devices in the world, if they are blind to gender bias, will reproduce the same shortcomings. Hence the need to train the trainers to implement a genuine pedagogy of equality. Gender bias affects girls’ academic performance: it has been observed that they lose self-confidence from secondary school onwards despite good results in primary school. An exercise presented as a mathematics problem generates poorer results for girls; when this framing is deconstructed, it is the boys who then perform less well. Changing course descriptions has an obvious impact. Teacher training is a topic that needs to be addressed. Teachers have the noble task of accompanying children through their education and must therefore be the first line of defence against inequality. They are not the only ones: parents are their children’s greatest influence, but they are more difficult to train. Yet they should be informed about the great possibilities of digital in order to pass on the desire for it.



From this debate, difficulties, dangers and warnings have emerged, but also a growing optimism in the face of rising awareness and the search for solutions at the level of companies, the State and the international community, where a reflection on ethics in AI is currently under way.

