Let’s pull back the curtain for a moment on our office here at My TechDecisions: hi, I’m Adam, and I sit near a colleague named Craig. Craig writes for our sister publication, Commercial Integrator, and recently published a blog about robot racism.
What you need to know about Craig: he purposefully uses a tone of slight alarmism to talk about technology and how he doesn’t care for all of its uses. It’s sort of his deal, and I appreciate his signature style — it’s a breath of… well, I’m not going to call a curmudgeon’s ramblings around the tech of today “fresh air,” but it sure breaks up the work week!
Seriously, he’s a good writer, and one you should follow. But I feel his latest blog didn’t use quite enough alarmism.
Robot racism is a sign of the human element behind tech… for the worse
His blog covers a new study from the Human Interface Technology Laboratory in New Zealand, which found that people apply racial biases to robots:
The study “suggests people perceive physically human-like robots to have a race and therefore apply racial stereotypes to white and black robots,” according to a recent CNN report.
“These colors have been found to trigger social cues that determine how humans react to and behave toward other people and also, apparently, robots.”
“The bias against black robots is a result of bias against African-Americans,” lead researcher Christoph Bartneck said in the CNN report. “It is amazing to see how people who had no prior interaction with robots show racial bias towards them.”
The researchers think this is an issue that needs to be addressed… and I completely agree.
How does this even happen?
The robots used in the study are definitely robots, but have human-like limbs and a head, with complexions that are white or black. In the “shooter bias” test, black and white people and robots appeared on a screen for less than a second, and participants were told to “shoot” those holding a weapon.
Black robots that were not holding weapons were shot more than the white ones not carrying them. The researchers also saw over-representation of white robots in Google Images searches for “robots.”
What’s the point?
In his response to the study, Craig questioned who would want this research, and who would actually believe people could be racist towards machines, no matter how lifelike they are.
He did his due diligence: he even admits in his blog, “I’m a middle-aged white man who is generally immune from the ills of racism personally…” I don’t really blame him for reading this study and wondering, “What’s the point?” and “Can people really be racist to machines?!”
My response? In short, they absolutely can, and more importantly, the fact that they are in some cases doesn’t reflect well on us as a species.
“If robots are supposed to function as teachers, friends or carers, for instance, then it will be a serious problem if all of these roles are only ever occupied by robots that are racialized as White,” says the robot racism study.
“Human-shaped robots should represent the diversity of humans. Imagine a world in which all Barbie dolls are white. Imagine a world in which all the robots working in Africa or India are white. Further imagine that these robots take over roles that involve authority.”
Furthermore, diversity and racial prejudice shouldn’t be viewed only as a social/political issue… it’s a serious business issue, too. Diverse teams provide plenty of business advantages, not the least of which is that additional viewpoints and backgrounds can help solve challenges in different ways.
If robots are to play an increasingly important role in many business functions, it’s important that we chronicle cases in which humans behave with prejudice toward certain robot appearances.
In a second robot racism study, the HIT Lab NZ team added lighter brown robots, finding that as they increased the racial diversity, participants’ racial bias toward the robots disappeared altogether.
This “potentially means that diversification of robots might lead to a reduction in racial bias towards them,” according to that study.