News

A new book reveals the hidden biases that can creep into Data Science models

Published: 7 September 2016

Yesterday (September 6, 2016) saw the launch of a new book by Cathy O'Neil with the provocative title Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. O'Neil holds a Ph.D. in mathematics from Harvard and was a tenure-track math professor until 2007, when she quit academia to join Wall Street. That fledgling second career came to an end just a year later with the Financial Crisis, after which O'Neil changed careers again and became a data scientist. She runs a blog in which she identifies and debunks misuses of mathematical modelling.

O'Neil identifies three characteristics of the misused models that she calls "Weapons of Math Destruction":

  1. The model's reasoning is opaque, even to the organisations that use it.
  2. The model's outputs can harm the people whose behaviour is being modelled, for example by getting them fired, or refused a job or a university place.
  3. The model is deployed on a large scale.

O'Neil's book provides many concrete examples of WMDs. Here are just two: 

  • Credit scores used to be based on people's actual history of taking out loans and paying them back. People can view their credit histories and request corrections when they spot a mistake. But some institutions are starting to judge a person's credit-worthiness using so-called "e-scores", which use data elements such as postal codes, telephone area codes and web-surfing history to predict who is likely to repay a loan. Such a model will notice that people who live in affluent areas tend to pay back loans more often than people from poor areas do, hardly a surprising conclusion. But it means that credit-worthiness is judged not on what people have themselves done in the past but on the neighbourhoods in which they live. This kind of profiling keeps disadvantaged people at a disadvantage: they have a harder time getting credit and pay higher interest when they do. (A toy sketch of this proxy effect appears after this list.)
  • Employers are now basing hiring decisions on models that look at the past history of their own employees and identify which traits best predict success. If there are already hidden biases in a company's workplace, such as a preponderance of males in a particular job, this kind of modelling will only perpetuate them. A female candidate will not "look like" these previously successful employees, so she will not be hired. The workforce will remain predominantly male, which reinforces the model's gender bias going forward, making it ever less likely that a female candidate will be hired. (A toy sketch of this feedback loop follows the list.)
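
To make the proxy effect concrete, here is a minimal, purely illustrative Python sketch. It is not taken from the book; the postcodes, default rates and scoring rules are invented. It shows how an "e-score" built on neighbourhood-level data penalises an applicant with a spotless personal repayment record simply because of where she lives:

```python
# Toy illustration: an "e-score" that judges applicants by neighbourhood-level
# data rather than by their own repayment history. All values are invented.

# Hypothetical average historical default rate per postal code:
# the e-score only "knows" where an applicant lives.
DEFAULT_RATE_BY_POSTCODE = {
    "A1": 0.03,   # affluent area
    "B2": 0.25,   # poorer area
}

def e_score(postcode):
    """Proxy-based score: high when the *area* rarely defaults."""
    return round(100 * (1 - DEFAULT_RATE_BY_POSTCODE[postcode]))

def history_score(loans_taken, loans_repaid):
    """Score based on the person's own record of repaying loans."""
    if loans_taken == 0:
        return 50  # no personal history: neutral score
    return round(100 * loans_repaid / loans_taken)

# Two applicants with identical personal records, different neighbourhoods.
applicants = [
    {"name": "Applicant 1", "postcode": "A1", "taken": 4, "repaid": 4},
    {"name": "Applicant 2", "postcode": "B2", "taken": 4, "repaid": 4},
]

for a in applicants:
    print(a["name"],
          "history score:", history_score(a["taken"], a["repaid"]),
          "| e-score:", e_score(a["postcode"]))
# Both applicants have perfect repayment records, yet the e-score penalises
# the one from the poorer postcode -- the profiling effect described above.
```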

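The hiring feedback loop can be illustrated with a similarly small simulation. Again, this is a hypothetical sketch rather than the book's own example; the workforce numbers and the scoring rule are invented. A model that scores candidates by how closely they resemble past hires keeps choosing male candidates, and each new hire makes the imbalance the model learned from even worse:

```python
# Toy illustration: a naive "hire people who look like our past hires" model,
# and the feedback loop it creates. All data and numbers are invented.

from collections import Counter

# Current workforce in the role, already skewed: 9 men, 1 woman.
workforce = ["M"] * 9 + ["F"] * 1

def candidate_score(candidate_gender, workforce):
    """Score a candidate by how much they 'look like' past successful hires:
    here, simply the share of the workforce with the same attribute."""
    counts = Counter(workforce)
    return counts[candidate_gender] / len(workforce)

# Simulate several hiring rounds: one male and one female candidate apply,
# and the higher-scoring candidate is hired each time.
for round_number in range(1, 6):
    scores = {"M": candidate_score("M", workforce),
              "F": candidate_score("F", workforce)}
    hired = max(scores, key=scores.get)
    workforce.append(hired)
    print(f"Round {round_number}: scores {scores} -> hired {hired}, "
          f"workforce now {dict(Counter(workforce))}")

# The male candidate always scores higher, so every new hire is male and the
# imbalance the model learned from only grows -- the feedback loop above.
```
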
O'Neil's book has drawn a great deal of attention from mainstream media and several high-profile blogs. Here are just a few places where you can read more about it:

  • The UK newspaper The Guardian.
  • Time magazine.
  • New York magazine.