2018-2019 NHSEB Case 8: Data Violence

According to the case, software sometimes commits some pretty cringeworthy errors, such as Google tagging photos of African-Americans as gorillas (yikes…), or “airport body scanners flagging transgender bodies as threats” (double yikes…), or translation software replacing intentionally gender-neutral terms with gender-specific terms.

[Image: body scan machine. Credit: AOL]

One pragmatic reason for programmers to correct and prevent such mistakes is that if customers can’t trust their software, they’ll buy and use software that they can.

But apart from the financial incentive, there’s also a moral obligation to correct such errors to the extent they cause vulnerable groups undeserved, avoidable harm.

Tagging African-Americans as gorillas triggers our moral radar not only because it’s a gross program error, but because African-Americans are sometimes demeaned as sub-human, and the program mistake exposes and deepens that wound. While one person might laugh it off, another might find it devastating.

In the case of the body scanner flagging a transgender person as a threat, I can only imagine the embarrassment this could cause, especially if the machine set off alarms or otherwise caused a scene. Whether or not we fully understand why a person would want to modify their body in a gender-altering way, basic decency suggests tweaks to the scanners and discreet handling of alarms. Hopefully a transgender person being screened in the name of passenger safety could overlook the inconvenience if the screening is carried out tactfully, and especially if software improvements are underway.

And in the case of translation software replacing gender-neutral terms with gender-specific terms: if an author has gone to the trouble to gender-neutralize their writing, translation software that misses that nuance would seem not only bad (translations are useful only to the extent they precisely convey the author’s intent), but callous to the plight of people who reject gender assignment. Some people’s dignity turns on not being labeled he or she, and respecting that request seems easy and harmless enough. Even easier and more harmless: respecting the intent of authors who go to the trouble to use gender-neutral language.
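To make that error concrete: languages like Turkish have genuinely gender-neutral pronouns (“o” can mean he, she, or they), and translation systems have been known to pick “he” or “she” based on statistical associations in their training data. Below is a minimal sketch in Python, not any real translation service’s API, of the kind of post-translation check a programmer could add; the function name and the idea of a “source is neutral” flag are my own illustrative assumptions.

```python
import re

# Gendered English pronouns that a deliberately gender-neutral source
# should not produce in translation.
GENDERED_PRONOUNS = re.compile(r"\b(he|she|him|her|his|hers)\b", re.IGNORECASE)

def flag_gendered_output(source_is_neutral: bool, translation: str) -> list[str]:
    """Return any gendered pronouns the translation introduced.

    Hypothetical helper: if the author used gender-neutral language,
    any gendered pronoun in the output is a candidate error worth
    human review (e.g., substituting singular "they").
    """
    if not source_is_neutral:
        return []
    return GENDERED_PRONOUNS.findall(translation)

# Turkish "O bir doktor" is gender-neutral, but a system might output:
machine_output = "He is a doctor."  # hypothetical translation output
issues = flag_gendered_output(source_is_neutral=True, translation=machine_output)
if issues:
    print(f"Review needed: translation introduced gendered terms {issues}")
```

A check like this wouldn’t fix the underlying model, but it shows how cheaply a system could at least notice when it is about to override an author’s deliberate choice.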

Ultimately, appreciating the argument above requires some degree of sensitivity to and empathy with the plight of the impacted groups. As a straight, white male, I can only imagine how these errors could ruin a person’s week. But when I do imagine, I see a transgender teenager, or an elderly black man, or any already vulnerable person suffering an unnecessary, avoidable harm. I think about how a person could feel alienated and discounted already, and how these errors could compound their suffering. If the happiness of persons matters, it seems pretty clear programmers should go to the trouble to root out errors like those mentioned in the case, and to take steps to prevent them in the first place.
