July 30, 2018 | Super Science Trends


B.I: Biased Intelligence
CW: Discussion of racism and homophobia.

On August 16th 2017, Chukwuemeka Afigbo, a worker in tech from Nigeria, posted a video to Twitter of an automatic soap dispenser in a public bathroom.

The video begins with a white man’s hand under an automatic soap dispenser. Dutifully, the machine dispenses a jet of soap into his open palm. Afigbo then asks his friend Noel, a black man, to put his hand underneath the dispenser. He does, and the dispenser does nothing. Noel waves his hand up and down to try and get the dispenser to “notice” him, a typical ritual to appease the sensors, but the dispenser still does nothing.
Both men conclude that Noel’s hand is “too black” for the machine, whose sensor is evidently calibrated to respond to only a narrow range of skin tones. A simple design flaw turned a modern convenience into an alienating experience for both of them. Just to hammer the point home, Noel then grabs a (white) paper towel and holds it under the dispenser. Sure enough, the machine dispenses soap onto it.
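The failure mode is easy to sketch. Dispensers like this typically fire when enough emitted infrared light bounces back off a nearby object, and darker skin reflects less of that light, so a trigger threshold tuned only on light-skinned testers can silently exclude everyone else. Here is a minimal illustration; the reflectance values and the threshold are invented for the example, not taken from any real device:

```python
# Hypothetical sketch of a reflectance-threshold sensor.
# All numbers are invented for illustration; a real dispenser's
# calibration and optics would differ.

TRIGGER_THRESHOLD = 0.5  # fraction of emitted IR that must bounce back


def dispenser_fires(reflectance: float) -> bool:
    """Return True if the sensor sees enough reflected light to trigger."""
    return reflectance >= TRIGGER_THRESHOLD


# If the threshold was only ever tested against lighter skin...
light_skin = 0.7    # reflects most of the IR: triggers
dark_skin = 0.35    # reflects less: falls below the cutoff
paper_towel = 0.9   # highly reflective: triggers easily

print(dispenser_fires(light_skin))   # True
print(dispenser_fires(dark_skin))    # False — the flaw in the video
print(dispenser_fires(paper_towel))  # True
```

Nothing in that code is malicious; the bias comes entirely from who was (and wasn’t) in the room when the threshold was chosen.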
Afigbo shared the video to highlight “the importance of diversity in tech and its impact on society”, and it quickly went viral, sparking conversations about bias in tech development. We assume that machines are incapable of prejudice, but there is always a value judgement made by the people who build them about how they should work and for whom they’re intended. Where this gets more complicated is our current use of computer algorithms, which use existing data to make broad assumptions about people based on select information. You don’t have to work in technology to know that they can effectively become a recipe for perpetuating biases.
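One way the recipe works: if the training data over-represents one group, a model can score well on headline accuracy while quietly failing everyone else. The numbers below are invented purely to illustrate the arithmetic:

```python
# Toy illustration (invented numbers): a model optimised for overall
# accuracy on skewed data can look "good" while failing a minority group.

# Suppose 90 of 100 training faces come from group A and 10 from group B,
# and the model detects group A reliably but group B poorly.
group_a = {"count": 90, "detected": 88}
group_b = {"count": 10, "detected": 3}

overall = (group_a["detected"] + group_b["detected"]) / (
    group_a["count"] + group_b["count"]
)
per_group_b = group_b["detected"] / group_b["count"]

print(f"Overall accuracy: {overall:.0%}")      # 91% — looks fine on paper
print(f"Group B accuracy: {per_group_b:.0%}")  # 30% — the hidden failure
```

A single headline number hides the disparity; you only see it if you think to check each group separately, which is exactly the kind of question a homogeneous development team may never ask.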

In a tone-deaf study from Stanford University last year, researchers ran facial recognition software over 14,000 profile pictures from an American dating website, paired with an algorithm that recorded the sexual orientation listed on each profile.
The resulting software built composite images of “average” straight and queer faces, and claimed to distinguish queer from heterosexual faces with 81% accuracy for men and 71% accuracy for women, which outlets dubbed a “gaydar AI”.
The Human Rights Campaign and GLAAD both heavily criticised the study for effectively creating a persecution search engine. To quote Ian Malcolm, the researchers were so preoccupied with whether or not they could, they didn’t stop to think if they should. Even at best, the data is only really useful for benign sociological purposes, like learning about dating cultures. The Stanford researchers dismissed these concerns as “lacking scientific training”, with one of them doubling down to suggest the study could go on to show whether facial recognition software can predict political orientation or potential criminality. Which… uuurgh, can we please not bring phrenology back into vogue within my lifetime? The only thing that my furrowed brow will tell you about my character is that I have no tolerance for pseudoscience. I can’t imagine any queer researcher is terribly interested in determining the objective “gay face” either.
As our societies become more automated and dependent on algorithms, these issues will only become more commonplace, unless they’re addressed in the design stage. Tech companies and researchers should either anticipate a diverse user base or, better yet, hire a diverse staff who can inform development with their own life experiences. It’s important to be critical of who makes this technology and for what purpose; you don’t want to wonder what will happen when you’re left up to your own devices.



