There have been multiple instances in which a machine learning model was found to discriminate against a particular section of society, whether by rejecting female candidates during hiring, systematically denying loans to working women, or disproportionately rejecting candidates with darker skin. Recently, open-source facial recognition algorithms were found to have lower accuracy on the faces of darker-skinned women than on those of lighter-skinned men. In another instance, research at CMU showed that Google served ads for high-income jobs to men more often than to women. And when Publicis Sapient used credit risk data to predict the probability of someone defaulting on a loan, they were able to shortlist features that were discriminatory in nature.
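One common way to flag a feature as potentially discriminatory is to compare favorable-outcome rates across the groups it defines. The sketch below applies the "four-fifths" disparate-impact rule of thumb to loan decisions; the data, threshold, and function are illustrative assumptions, not the actual Publicis Sapient dataset or method.

```python
def disparate_impact(outcomes, groups, privileged):
    """Ratio of favorable-outcome rates: unprivileged group / privileged group."""
    priv = [o for o, g in zip(outcomes, groups) if g == privileged]
    unpriv = [o for o, g in zip(outcomes, groups) if g != privileged]
    return (sum(unpriv) / len(unpriv)) / (sum(priv) / len(priv))

# Hypothetical loan decisions: 1 = approved, 0 = rejected.
approved = [1, 1, 1, 0, 1, 0, 0, 1, 0, 0]
gender   = ["M", "M", "M", "M", "M", "F", "F", "F", "F", "F"]

ratio = disparate_impact(approved, gender, privileged="M")
print(f"disparate impact ratio: {ratio:.2f}")  # 0.25 on this toy data
if ratio < 0.8:  # four-fifths rule of thumb
    print("gender may be acting as a discriminatory feature")
```

A ratio well below 0.8 suggests the model's outcomes depend on group membership; in practice the same check is also run against proxy features (such as zip code) that correlate with a protected attribute.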