November 14, 2018
Race After Technology
Ruha Benjamin
Associate Professor of African American Studies, Princeton University
Minutes of the Tenth Meeting of the 77th Year
President Coale called the 10th meeting of the 77th year of the Old Guard of Princeton to order at 10:15 a.m. on November 14, 2018. Charlie Clark led the invocation, and Bernie Miller read the minutes of the meeting held on November 7, 2018, that had been prepared by David Vilkomerson. Jules Richter introduced a guest, his wife Marsha.
President Coale described arrangements for handicapped parking and encouraged members to volunteer as minute-takers by contacting Ruth Scott. President Coale also announced that the next meeting of the 77th year will be held at the Friend Center at 10:15 on November 28, 2018. The speaker will be Fred Wherry, Professor of Sociology at Princeton University. The title of his talk is "Money Talks." She then introduced Shirley Satterfield, who introduced the speaker.
Dr. Ruha Benjamin is an associate professor in the Department of African American Studies at Princeton University. She is also a faculty associate in several other centers and departments at the University. Born in India to a Persian-Indian mother and an African American father, she grew up in an ethnically mixed family and lived in the South Pacific, South Africa, and several regions of the United States. Professor Benjamin holds degrees from Waterford Kamhlaba United World College in South Africa, Spelman College, and the University of California at Berkeley. She is a member of the Institute for Advanced Study. Her first book was titled People's Science: Bodies and Rights on the Stem Cell Frontier. She has lectured widely, including at a Nobel Conference, and has given a TED Talk. Her talk was titled "Race After Technology."
Dr. Benjamin drew attention to a slide showing the title of her next book, "Race After Technology." Read one way, "race" refers to the speed at which we pursue technology, and the question becomes whether technology will save us or devour us. Instead, she argued, we should be thinking about our relationship to technology. Her goal is to explore how society shapes technology as well as how we are affected by it.
She then focused on what she termed Exhibit A: two words, "underserved" and "overserved," the latter of which she was not sure was actually a word. Sociologists have a principle called relationality: if there is an up, there is a down. We can change technology in this small way by using the term "overserved" until, eventually, it is recognized in any Google search.
Exhibit B looked at facial recognition. She noted that such systems have difficulty recognizing people with darker skin; similarly, cameras have asked Asian users whether someone was blinking. Systems developed in Asia did not have this problem. The output depends on who develops the technology.
Exhibit C involved a beauty contest run by a Hong Kong/Australia company in which the judges were algorithms. More than 6,000 selfies were submitted, and the robot judges selected 32 winners, only three of whom were non-white. The programmers who designed the contest had used characteristics of celebrities and models as a universal standard of beauty.
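The logic behind Exhibits B and C can be illustrated with a short sketch. The Python fragment below is not anything Dr. Benjamin presented; the data, the 95/5 training split, and the simple nearest-centroid "detector" are invented solely to show how a system trained mostly on one group fails far more often on another.

    # Illustrative only: a toy "face detector" trained on a skewed sample.
    # Nothing here comes from the talk; the numbers and the nearest-centroid
    # rule are invented to show how training-data composition drives error rates.
    import numpy as np

    rng = np.random.default_rng(0)

    # Pretend each face is a 2-D feature vector; the two groups cluster differently.
    group_a = rng.normal(loc=[0.0, 0.0], scale=1.0, size=(1000, 2))
    group_b = rng.normal(loc=[3.0, 3.0], scale=1.0, size=(1000, 2))

    # Training set: 95% group A, 5% group B -- the skew is the point.
    train = np.vstack([group_a[:950], group_b[:50]])

    # "Train" a nearest-centroid detector: a face counts as recognized if it lies
    # within the 95th-percentile distance of the training faces from their centroid.
    centroid = train.mean(axis=0)
    threshold = np.quantile(np.linalg.norm(train - centroid, axis=1), 0.95)

    def miss_rate(faces: np.ndarray) -> float:
        """Fraction of genuine faces the detector fails to recognize."""
        distances = np.linalg.norm(faces - centroid, axis=1)
        return float(np.mean(distances > threshold))

    print(f"miss rate, group A: {miss_rate(group_a[950:]):.0%}")
    print(f"miss rate, group B: {miss_rate(group_b[50:]):.0%}")
    # Typical output: group A is missed only about 1% of the time; group B, most of the time.

The same dynamic applies whether the task is recognizing a face or scoring "beauty": whoever dominates the training data defines what the system treats as normal.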
Dr. Benjamin then recalled a conversation she overheard at Newark Airport in which one speaker said he wanted to have power over other people. Technology can be enrolled to achieve that goal. Her next slide showed an advertisement from 1957 depicting robot slaves. "We'll all have personal slaves again," it promised. The text of the ad made clear who was meant to have the power.
She continued by observing that the digital gap between rich and poor children is not what we expected. Parents of poor children are pushing to get as much technology as possible into their schools; the prevailing narrative has been about filling that gap. Well-to-do parents, on the other hand, are trying to limit access, even asking their nannies to sign no-phone contracts. Working-class children are catching on: a few days earlier, some Brooklyn students had walked out of school to protest how much of their learning had been moved onto online platforms. They know this is not real teaching.
The conversation is becoming as much about who designs technology as about who gets to use it. Dr. Benjamin gave the example of Ahmed Mohamed, the Sudanese-American boy who brought a homemade clock in a pencil case to school and was arrested on suspicion of bringing a bomb. If Ahmed had been a white kid, the result might have been different.
Dr. Benjamin next turned to the application of technology to the life sciences. We are learning about experiments that regenerate tissue from a patient's own cells, rather than relying on a donor, so that transplants avoid rejection. Who will have access to this approach? Our imagination grows limp when we consider applying technology to the social good; we need to recalibrate. Our social imagination seems anemic next to our technological prowess.
Racism is not the only issue; the problem is threefold, involving racism, militarism, and materialism. A while ago, the U.S. Army asked for proposals for "biodegradable" bullets, since so many bullets used in training litter the earth. That approach never asks why we need so many bullets in the first place.
She next looked at the Marshall Islands, a former nuclear testing site where women are still giving birth to deformed babies. The native population has been pushed onto a single island that has become a slum. Children at play act out being dead and cannot see their lives extending beyond age 18. Inequality has been engineered; this, she suggested, is a metaphor for the majority.
Dr. Benjamin then showed us examples of benches. First was a slide of a park bench in Berkeley with armrests positioned to deter homeless people from lying down; homelessness in the Bay Area has escalated. How do we build the material world as a fix for our social ills? She found single-occupancy benches in Helsinki and caged benches in France. Her favorite example, however, was a spiked bench with a coin box: insert coins and the spikes retract for a period of time. It was designed by a German artist, but municipalities in China have inquired about it.
We are creating technological fixes for wide-ranging social problems. How do we address racist and sexist robots? How do we embed more egalitarian values in our technology? Several books have been written on the subject; Dr. Benjamin calls the phenomenon "the New Jim Code," innovation that enables containment.
She gave us a few tools for dealing with technological inequity. The acronym for the toolkit is S.H.A.L.T. Thou shalt use:
• Social Literacy: train students to see the wider patterns. Outsourced criminal risk assessment is now in use, and white parolees tend to receive low risk scores because the underlying variables were shaped by inequality;
• Historical Literacy: Facebook's targeted advertising lets advertisers decide who should see their ads, and realtors have used it to exclude groups; tech companies cannot be left to police themselves;
• Algorithmic Accountability: who should be regulating technology? Too much design is based on what is rather than on what could be;
• Linguistic Awareness: language online perpetuates certain biases. Screening software makes more positive associations with white-sounding names, producing roughly a 50 percent higher callback rate for white job candidates whose resumes are processed by software (a hypothetical audit of this kind of disparity is sketched after this list); and
• Technological Humility: we must share technology that reflects our highest values; Europeans have developed ten principles of data rights.
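The kind of audit implied by the Linguistic Awareness item above can be sketched in a few lines of Python. Everything below is hypothetical; the names of the groups and the callback counts are invented only so that the disparity matches the roughly 50 percent figure cited in the talk.

    # Hypothetical audit sketch for the "Linguistic Awareness" point above.
    # The resumes, name groups, and callback counts are invented; only the rough
    # 50% disparity echoes the figure cited in the talk.
    from collections import Counter

    # Each record: (name_group, callback) -- imagine thousands of otherwise
    # identical resumes sent through an automated screening system.
    screening_log = (
        [("white-sounding", True)] * 150 + [("white-sounding", False)] * 850 +
        [("black-sounding", True)] * 100 + [("black-sounding", False)] * 900
    )

    callbacks = Counter()
    totals = Counter()
    for group, called_back in screening_log:
        totals[group] += 1
        callbacks[group] += called_back

    rates = {g: callbacks[g] / totals[g] for g in totals}
    for group, rate in rates.items():
        print(f"{group}: {rate:.1%} callback rate")

    # 15% versus 10% is a 50% higher callback rate for the white-sounding names,
    # the kind of disparity an audit like this is meant to surface.
    print(f"ratio: {rates['white-sounding'] / rates['black-sounding']:.2f}x")

The design point is simply that disparities of this sort only become visible when someone is accountable for measuring them, which is what the toolkit asks of us.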
Dr. Benjamin underscored the value of imagination. Derrick Bell put it this way: "To see things as they really are, you must imagine them for what they might be." We are pattern makers, and we must change the content of our existing patterns.
In response to a question, Dr. Benjamin said, "People are governing through technology without a mandate."
Respectfully submitted,
Robert S. Fraser