萨菲亚·诺布尔 互联网研究和数字媒体学者

[复制链接]
跳转到指定楼层
1
发表于 2022-2-22 18:31:45 | 只看该作者 回帖奖励 |倒序浏览 |阅读模式

马上注册 与译者交流

您需要 登录 才可以下载或查看,没有帐号?立即注册

x
Safiya Noble
Internet Studies and Digital Media Scholar | Class of 2021
Highlighting the ways digital technologies and internet architectures magnify racism, sexism, and harmful stereotypes.


Portrait of Safiya Noble

Title
Internet Studies and Digital Media Scholar
Affiliation
Department of Gender Studies and African American Studies, University of California, Los Angeles
Location
Los Angeles, California
Age
51 at time of award
Area of Focus
Media Studies, Civil Rights and Civil Liberties
Website
safiyaunoble.com
Center for Critical Internet Inquiry
UCLA: Safiya Noble
Social
Twitter
Instagram
Published September 28, 2021
ABOUT SAFIYA'S WORK
Safiya Noble is an internet studies and digital media scholar transforming our understanding of the ways digital technologies and internet architectures replicate and magnify discriminatory racial, gender, and power dynamics. Drawing on training in information science and a deep understanding of the intersections among culture, race, and gender, she is revealing how the artificial intelligence and algorithms underpinning technologies we use daily have both real and negative impacts on the lives of vulnerable people, particularly women and girls of color.

In her book Algorithms of Oppression: How Search Engines Reinforce Racism (2018), Noble demonstrates that search engines are not sources of neutral and objective information. Rather, economic incentives (primarily advertising revenue) and the social values assigned to ideas, objects, or people shape search engine results. For example, the first page of results of a 2011 keyword search for “Black girls” in Google yielded mostly pornographic and hypersexualized content, exacerbating racist and sexist stereotypes about Black women. The same stereotyping was true for other racialized categories of women like Asian and Latina girls. Noble explains that the classification and ranking methods used by Google’s proprietary search algorithms (at that time, PageRank) are based on traditional frameworks for organizing information, namely, the Library of Congress classification system created in the late nineteenth century, which has a deep history of exclusion and misrepresentation. Commercial search engines are now ubiquitous—used daily by educators, students, parents, and the public to understand the world around us and make crucial, life-altering decisions—and Noble points to the need for greater accountability and regulation of tech companies that have an outsized share of power over how we understand the world. She details how bias embedded within search algorithms promotes disinformation, reduces the political and social agency of marginalized people, and can lead to real-world violence. For example, internet propaganda blaming Asians for the COVID-19 pandemic played a role in escalating violence against Asian Americans, and Dylann Roof massacred nine African American churchgoers after reading White nationalist websites that were highly ranked in web searches on race and crime.
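Noble's point about ranking can be made concrete with a small illustration. The sketch below is a toy, link-based ranking routine in the spirit of PageRank, written in Python; it is not Google's proprietary algorithm, and the four-page "web" and function name are invented for illustration. It shows how rank flows along hyperlinks, so pages that are already heavily linked rise to the top of results regardless of what they actually contain.

    # Toy link-based ranking in the spirit of PageRank (illustrative only,
    # not Google's proprietary implementation). A page's score depends on
    # who links to it, so already well-linked pages dominate the results
    # whatever their content.
    def pagerank(links, damping=0.85, iterations=50):
        """links: dict mapping each page to the list of pages it links to."""
        pages = list(links)
        n = len(pages)
        rank = {p: 1.0 / n for p in pages}
        for _ in range(iterations):
            new_rank = {p: (1.0 - damping) / n for p in pages}
            for page, outlinks in links.items():
                if not outlinks:  # dangling page: spread its rank evenly
                    for p in pages:
                        new_rank[p] += damping * rank[page] / n
                else:
                    for target in outlinks:
                        new_rank[target] += damping * rank[page] / len(outlinks)
            rank = new_rank
        return rank

    # Hypothetical four-page web: page "c" is heavily linked and therefore
    # dominates the ranking, independent of what "c" actually says.
    toy_web = {"a": ["c"], "b": ["c"], "c": ["a"], "d": ["c"]}
    for page, score in sorted(pagerank(toy_web).items(), key=lambda kv: -kv[1]):
        print(page, round(score, 3))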

In addition to her research, Noble works with engineers, executives, artists, and policymakers to think through the broader ramifications of how technology is built, deployed, and used in unfair ways. She challenges them to examine the harms algorithmic architectures cause and shows the necessity of addressing the civil and human rights that are violated through their technologies. Noble is also co-founder of the newly established University of California at Los Angeles Center for Critical Internet Inquiry, an interdisciplinary research center focused on the intersection of human rights, social justice, democracy, and technology. Noble’s work deepens our understanding of the technologies that shape the modern world and facilitates critical conversations regarding their potential harms.

BIOGRAPHY
Safiya Noble received a BA (1995) from California State University at Fresno and an MS (2009) and PhD (2012) from the University of Illinois at Urbana-Champaign. Noble has been affiliated with the University of California at Los Angeles since 2014, where she is an associate professor in the Department of Gender Studies and Department of African American Studies and holds affiliate appointments in the School of Education and Information Studies. She has co-edited two additional books, The Intersectional Internet and Emotions, Technology, and Design, and is co-editor of the “Commentary and Criticism” section of Feminist Media Studies. Her research has been published in The Scholar and Feminist Online, the Journal of Education for Library and Information Science, and InVisible Culture. Her non-profit community work to foster civil and human rights, the expansion of democracy, and intersectional racial justice is developing at The Equity Engine.

IN SAFIYA'S WORDS


We have more data and technology than ever, and we have more economic and social injustice to go with it. What if we thought of data in more complex ways that reveal how it is used for racial profiling, digital redlining, and practices that undermine civil and human rights? What if we questioned unlimited access to personal identifiable information being bought and sold in a 24/7 marketplace? How will we undo the making of autonomous AI systems that threaten life on the planet? How do unregulated technologies concentrate wealth in the hands of a few at the expense of the 99 percent? These are some of the big questions we must grapple with immediately if we want to see more freedom, more democracy, and more humanity for the generations to come.



萨菲亚·诺布尔
互联网研究和数字媒体学者 | 2021届
着重揭示数字技术和互联网架构如何放大种族主义、性别歧视和有害的刻板印象。


萨菲亚·诺布尔的肖像

头衔
互联网研究和数字媒体学者
工作单位
加州大学洛杉矶分校性别研究系与非裔美国人研究系
工作地点
加利福尼亚州洛杉矶
年龄
获奖时为51岁
重点领域
媒体研究、公民权利与公民自由
网站
safiyaunoble.com
批判性互联网研究中心(Center for Critical Internet Inquiry)
加州大学洛杉矶分校:萨菲亚·诺布尔
社交媒体
推特
Instagram
发表于2021年9月28日
关于萨菲亚的工作
萨菲亚·诺布尔是一位互联网研究和数字媒体学者,她正在改变我们对数字技术和互联网架构如何复制并放大歧视性的种族、性别和权力关系的理解。凭借信息科学方面的训练,以及对文化、种族与性别之间交叉关系的深刻理解,她揭示了我们日常使用的技术背后的人工智能和算法,如何对弱势群体,尤其是有色人种妇女和女孩的生活,造成切实的负面影响。

在《压迫的算法:搜索引擎如何强化种族主义》(2018年)一书中,诺布尔证明了搜索引擎并非中立、客观信息的来源。相反,经济激励(主要是广告收入)以及赋予观念、事物或人群的社会价值塑造了搜索引擎的结果。例如,2011年在谷歌以“黑人女孩”为关键词搜索,第一页结果大多是色情和高度性化的内容,加剧了针对黑人女性的种族主义和性别歧视刻板印象。亚裔、拉丁裔女孩等其他被种族化的女性群体也遭遇了同样的刻板印象。诺布尔解释说,谷歌专有搜索算法(当时为PageRank)所采用的分类和排序方法,建立在传统的信息组织框架之上,即创建于19世纪末的美国国会图书馆分类法,而该体系有着长期排斥和歪曲呈现的历史。商业搜索引擎如今无处不在——教育工作者、学生、家长和公众每天都用它来了解周围的世界,并做出关键的、足以改变人生的决定——诺布尔因此指出,需要加强对科技公司的问责和监管,因为这些公司在我们如何理解世界这件事上掌握着过大的权力。她详细说明了嵌入搜索算法的偏见如何助长虚假信息、削弱边缘化人群的政治和社会能动性,并可能导致现实世界中的暴力。例如,将COVID-19大流行归咎于亚洲人的网络宣传助推了针对亚裔美国人的暴力升级;迪伦·鲁夫则是在阅读了在种族与犯罪相关网络搜索中排名靠前的白人民族主义网站后,枪杀了九名正在教堂做礼拜的非裔美国人。

除研究之外,诺布尔还与工程师、企业高管、艺术家和政策制定者合作,思考技术以不公平方式被构建、部署和使用所带来的更广泛影响。她促使他们审视算法架构造成的伤害,并说明必须正视这些技术所侵犯的公民权利与人权。诺布尔也是新近成立的加州大学洛杉矶分校批判性互联网研究中心(Center for Critical Internet Inquiry)的联合创始人,这是一个专注于人权、社会正义、民主与技术交叉领域的跨学科研究中心。诺布尔的工作加深了我们对塑造现代世界的技术的理解,并推动了关于其潜在危害的批判性讨论。

个人简历
萨菲亚·诺布尔于1995年获加利福尼亚州立大学弗雷斯诺分校学士学位,2009年和2012年先后获伊利诺伊大学厄巴纳-香槟分校硕士和博士学位。自2014年起,诺布尔任职于加州大学洛杉矶分校,担任性别研究系和非裔美国人研究系副教授,并在教育与信息研究学院兼任教职。她还与人合编了另外两本书:《交叉性互联网》(The Intersectional Internet)和《情感、技术与设计》(Emotions, Technology, and Design),并担任《女性主义媒体研究》(Feminist Media Studies)“评论与批评”栏目的联合主编。她的研究发表于《学者与女性主义者在线》(The Scholar and Feminist Online)、《图书馆与信息科学教育杂志》(Journal of Education for Library and Information Science)以及《不可见的文化》(InVisible Culture)。她旨在促进公民权利与人权、扩展民主以及实现交叉性种族正义的非营利社区工作,正通过The Equity Engine展开。

萨菲亚的话


我们拥有比以往任何时候都多的数据和技术,随之而来的却是更多的经济和社会不公。如果我们以更复杂的方式看待数据,揭示它如何被用于种族定性、数字红线以及其他损害公民权利和人权的做法,会怎样?如果我们质疑个人身份信息在全天候运转的市场中被随意买卖,又会怎样?我们要如何扭转那些威胁地球生命的自主人工智能系统的建造?不受监管的技术又是如何以牺牲99%的人的利益为代价,把财富集中在少数人手中?如果我们希望子孙后代拥有更多的自由、更多的民主和更多的人性,这些就是我们必须立即着手应对的重大问题。