DjangoCon 2018 - It's not a bug, it's a bias
Product makers have a biased view of the world, and this translates to biased algorithms.
How can we take this into account, and create a fairer world through fairer algorithms?
Even though Apple's Siri shipped with a built-in response for where to hide a body, it was incapable of pointing a user to an abortion clinic.
How did an artificial assistant get iffy about abortion?
And how can I stop my own biases from seeping into the products and services I create?
In this talk, I explore our preconceptions about the nature of algorithms, and how both users and makers shape them to model the world from their own perspective.
I conclude by proposing a change of mindset for product designers: to become more aware of how easily our biases seep into our services, and to put tools in place that create more inclusive experiences for our clients.