Facial Recognition is “biased by design”

Google’s image recognition system labels a picture of a pair of black friends as gorillas; an airport system refuses to recognise a transgender person as female. Facial recognition systems are inherently biased because they are often not designed to match a world of different kinds of people, Rose Eveleth writes in a Motherboard column based on interviews with designers of facial recognition tech and experts in ethics and technology. The problem, she states, is that “…a homogenous team produces homogenous products for a very heterogeneous world”.

One key problem is that the designers of these systems do not acknowledge that the bias exists. But it does: a computer only recognises the faces it has been trained on, so a system trained on a homogenous set of human faces will categorise accordingly. One solution is therefore to feed the system a more varied set of training data; another is an outside ethical audit of the algorithms.
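
As a rough illustration of the training-data point, here is a minimal sketch in Python with scikit-learn. None of it comes from Eveleth’s column: the two “groups”, their labelling rules and all numbers are invented. It shows a classifier trained almost entirely on one group scoring near chance on the other, while balanced training data evens the results out.

```python
# A minimal, hypothetical sketch (not from the article): toy data and
# invented "groups" to show how a skewed training set skews accuracy.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_group(n, rule):
    """Sample 2-D features; `rule` turns them into a binary label."""
    X = rng.standard_normal((n, 2))
    return X, rule(X)

# Hypothetical groups whose labels depend on different features,
# a stand-in for faces that vary in ways the model has to learn.
rule_a = lambda X: (X[:, 0] > 0).astype(int)
rule_b = lambda X: (X[:, 1] > 0).astype(int)

def per_group_accuracy(n_a, n_b):
    """Train on n_a + n_b samples, return test accuracy per group."""
    Xa, ya = make_group(n_a, rule_a)
    Xb, yb = make_group(n_b, rule_b)
    model = LogisticRegression().fit(
        np.vstack([Xa, Xb]), np.concatenate([ya, yb]))
    Ta, ta = make_group(2000, rule_a)   # fresh test samples
    Tb, tb = make_group(2000, rule_b)
    return model.score(Ta, ta), model.score(Tb, tb)

print("skewed 95/5:", per_group_accuracy(1900, 100))
print("balanced   :", per_group_accuracy(1000, 1000))
```

On a typical run the skewed model scores well above 0.9 on group A but barely above chance on group B, while balanced training brings both groups to roughly 0.75: the model is only as even-handed as the data it was fed.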

But the real question is fundamentally ethical: do we want the crowd to be recognised, compared and categorised (accurately or not)?

Read “The Inherent Bias of Facial Recognition”

Image by: Sheila Scarborough