How one scientist is cleaning up the world’s dirty data

Centre Director Professor Shazia Sadiq was featured in this piece by The Brilliant, which examines how the data underlying algorithms is collected and the biases embedded in it. The way to disrupt this cycle of bias is information resilience: understanding how information is collected and identifying every opportunity for bias to be introduced.

When Amazon developed an automated recruitment tool, the hope was that an unbiased, logical algorithm could read a CV and identify the best candidates. Instead, the algorithm turned out to be an engine of sexism: it not only favoured CVs from male candidates, but actively downgraded applicants who had attended one of two women's universities in the USA. The problem was that the tool sought applicants whose CVs resembled those of previously successful job seekers; as most of these had been men, the algorithm learned to reject women. It was stunning proof that algorithms are not neutral. They reflect the biases of the data they are trained on and of the people who build them – a problem computer scientist Professor Shazia Sadiq is acutely conscious of.

Read the full feature article from The Brilliant here.