Is Artificial Intelligence Sexist?

[Image: three young people and a sketch of a robot]
Amazon’s AI Recruiting Tool Didn’t Like Women

Amazon recently scrapped an experimental artificial intelligence (AI) recruiting tool that was found to be biased against women. At this point, I hope you might have a few questions, such as: What is an AI recruiting tool and how does it work? Why was it biased against women? I’ll try to answer them for you in the following.

The AI recruiting tool

You have certainly heard of human recruiters. They are matchmakers between employers and potential employees. They travel, send cold emails, and “network” at conferences and job fairs. When a recruiter makes a successful match, they get paid, sometimes by one party, sometimes by both. As you can see, this matchmaking dance is often expensive and time-consuming. Surely technology can help, right? A human recruiter can review at most a few dozen applicants a day before fatigue sets in. By contrast, an AI can “read” thousands of applications in seconds and rank them against the desired criteria, showing the most promising candidates at the top. Understandably, then, compared to a human recruiter, an AI recruiter would be more time- and cost-efficient. And now that the human recruiter doesn’t need to sift through and rank candidates, she can spend her time reaching out to the best candidates and persuading them to take the job. What nice teamwork between the human and AI recruiters!

Unfortunately, things are never so simple. How can we ensure that the AI recruiter is being fair to all candidates? Can it offer explanations for why it didn’t suggest any women for a certain job opening? To answer these new questions, we need to understand how the AI tool “learns” to do its job.

It all starts with a big “training” set of job applications. For years, companies have been asking job candidates to submit all their materials online. For example, if you have been on the academic job market, you were probably asked to upload your resume, cover letter, and letters of recommendation to a website like AcademicJobsOnline.org. Unlike universities, big companies like Amazon and Google run their own job-application websites. Thus, over time, they have amassed thousands and thousands of application materials, all in electronic form. Additionally, they have recorded which applicants were successful in their job hunts. Therefore, they have examples of materials submitted both by applicants who were hired and by applicants who were rejected. This information is then fed to the AI tool to “learn” the features that characterize successful candidates. In the case of Amazon’s tool, the AI “learned” that words like “executed” and “captured” in a resume were associated with success. Meanwhile, it also “learned” that the presence of phrases like “women’s” (as in “captain of the women’s chess team”) was associated with rejection, and so the corresponding resume was downgraded.

Artificial intelligence, despite all the hype (it will save the planet) and all the fear (it will kill humanity), is not, actually, intelligent. It doesn’t know what a word like “women’s” means, or how it corresponds to entities in the real world. This kind of AI is only good at finding patterns and relationships in the data we feed it. So the data we give the AI, and what we tell it to do with that data, are what matter most.
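To make this concrete, here is a minimal sketch of the kind of pattern-finding described above. This is not Amazon’s actual system; it is a toy perceptron trained on a handful of invented resume snippets, constructed so that the word “women’s” co-occurs with rejection, the way a skewed historical dataset would make it. The model, which knows nothing about gender, still ends up penalizing that word:

```python
# A minimal, hypothetical sketch of how a resume-ranking model can absorb
# bias from its training data. The toy data is invented for illustration.

def tokenize(text):
    # "women's" -> tokens "women" and "s"
    return text.lower().replace("'", " ").split()

# Toy "historical" training set: (resume text, 1 = hired, 0 = rejected).
# Because past hiring skewed male, "women's" co-occurs with rejection here.
training = [
    ("executed project captured market chess club", 1),
    ("executed plan captured data robotics team", 1),
    ("captain women's chess team executed project", 0),
    ("women's coding club captured data analysis", 0),
]

vocab = sorted({w for text, _ in training for w in tokenize(text)})
weights = {w: 0.0 for w in vocab}
bias = 0.0

# Simple perceptron training loop over bag-of-words features.
for _ in range(20):
    for text, label in training:
        words = set(tokenize(text))
        score = bias + sum(weights[w] for w in words)
        pred = 1 if score > 0 else 0
        if pred != label:
            for w in words:
                weights[w] += (label - pred)
            bias += (label - pred)

# The model has "learned" to penalize a word that merely marks gender.
print(weights["women"], weights["executed"])  # → -2.0 1.0
```

The model never saw a “gender” field; it simply found that down-weighting “women’s” reduced its training error, which is exactly the mechanism by which real-world bias leaks into a trained ranker.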

Why was the AI tool biased against women?

Amazon employees who talked to Reuters anonymously said that the AI tool downgraded the applications of graduates of two women’s colleges, without specifying which colleges. This detail is what compelled me to write about the tool.

I am a female computer science professor who teaches artificial intelligence at Wellesley College, which is a women’s college. As is typical at a liberal arts college, my students not only take computer science and mathematics courses for their major, but also courses in the social sciences, arts, and humanities, courses with titles such as “Introduction to Women’s and Gender Studies,” “Almost Touching the Sky: Women’s Coming of Age Stories,” or “From Mumbet to Michelle Obama: Black Women’s History.” They are more likely than other students to use the word “women’s” in their job application materials. Some of these students may even have been on the list of applicants that Amazon’s AI tool deemed “not worth hiring.”

Every day, I stand in front of classrooms full of intelligent women, eager to learn about the beauty and power of algorithms. It pains me to find out that a major player like Amazon created and used algorithms that, ultimately, could crush their dreams of making a mark on the world by denying them the chance to join the teams of engineers who design and build our present and future technologies.

Why did the AI tool downgrade women’s resumes? Two reasons: data and values. The job for which the AI tool wasn’t recommending women was software development. Software development is studied in computer science, a discipline whose enrollments have seen many ups and downs over the past two decades. For example, in 2008, when I joined Wellesley, the department graduated only 6 students with a CS degree. Compare that to 55 graduates in 2018, a nine-fold increase. Amazon fed its AI tool historical application data collected over 10 years. Those years most likely corresponded to the drought years in CS. Nationally, women have received around 18% of all computer science degrees for more than a decade. The underrepresentation of women in tech is a well-known phenomenon that people have been writing about since the early 2000s. The data Amazon used to train its AI reflected this years-long gender gap: few women studied computer science in the 2000s, and even fewer were hired by tech companies. At the same time, women were also abandoning the field, which is infamous for its awful treatment of women.

All other things being equal (e.g., the list of courses in CS and math taken by female and male candidates, or the projects they worked on), if women were not hired for a job at Amazon, the AI “learned” that the presence of phrases like “women’s” could account for the difference between candidates. Therefore, during the testing phase, it penalized applicants who had that phrase in their resumes. The AI tool became biased because it was fed data from the real world, which encapsulated the existing bias against women. Additionally, it’s worth pointing out that Amazon is the only one of the five big tech companies (the others are Apple, Facebook, Google, and Microsoft) that hasn’t revealed the percentage of women working in technical positions. This lack of public disclosure only compounds Amazon’s entrenched bias against women.

Could the Amazon team have predicted this? Here is where values come into play. Silicon Valley companies are famous for their neoliberal views of the world. Gender, race, and socioeconomic status are irrelevant to their hiring and retention practices; only talent and demonstrable success matter. So, if women or people of color are underrepresented, it’s because they are perhaps too biologically limited to be successful in the tech industry. Sexist cultural norms, or the lack of successful role models, which keep women and people of color away from the field, are not to blame, according to this worldview.

Recognizing these structural inequalities requires a commitment to fairness and equity as fundamental values driving decision-making. If you reduce humans to a list of words containing coursework, school projects, and descriptions of extracurricular activities, you are subscribing to a very naive view of what it means to be “talented” or “successful.” Gender, race, and socioeconomic status are communicated through the words in a resume. Or, to use a technical term, they are the hidden variables generating the resume content.
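The “hidden variable” point can be shown with a small worked calculation. The numbers below are invented for illustration: they describe a hypothetical hiring history in which resumes containing a gender-marking phrase were hired at a lower rate. Any model optimized to imitate “what past hires look like” will pick up such a word as a signal, even though no gender field exists anywhere in the data:

```python
# Toy calculation (invented numbers) of why one word can act as a proxy,
# i.e., a visible trace of a hidden variable like gender.

def hire_rate(records, word_present):
    """Fraction hired among records where the word is/isn't present."""
    rows = [hired for has_word, hired in records if has_word == word_present]
    return sum(rows) / len(rows)

# (resume contains the phrase "women's"?, was hired? 1/0)
history = [
    (True, 0), (True, 0), (True, 1), (True, 0),
    (False, 1), (False, 1), (False, 0), (False, 1),
]

print(hire_rate(history, True))   # → 0.25, among resumes with the phrase
print(hire_rate(history, False))  # → 0.75, among resumes without it
```

With a gap like this in the training data, penalizing the phrase is the statistically “correct” thing for the model to do, which is precisely why fairness cannot be left to the optimization alone.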

Most likely, the AI tool was biased not only against women, but against other underprivileged groups as well. Imagine that you have to work three jobs to finance your education. Do you have time to contribute to open-source software (unpaid work that some people do for fun) or to attend a different hackathon every weekend? Probably not. But these are exactly the activities you would need in order for words like “executed” and “captured” to appear in your resume, which the AI tool “learned” to see as signs of a desirable candidate.

Let’s not forget that Bill Gates and Mark Zuckerberg were able to drop out of Harvard to pursue their dreams of building tech empires because they had been learning to code since middle school, effectively training for a career in tech. The list of founders and CEOs of tech companies is composed exclusively of men, most of them white and raised in wealthy families. Privilege, across several different axes, fueled their success.

Artificial intelligence is not at fault here. The distorted values about what it means to be successful in the tech industry are to blame. We need to expose these values and hold companies like Amazon accountable for continuing to abide by them. They must take responsibility for their fundamentally unfair practices, which reduce people to “bags of words” in a resume, instead of nurturing and advancing their human potential.

 

Image credit: pathdoc, “Cartoon robot standing in line with job applicants for an interview.” Web. October 30, 2018.
