What I'm reading this week 10/13/2018: Data is inherently political. And so is code

After a Reuters article about Amazon shutting down its hiring algorithm because it was biased against women, I began re-reading articles I had saved for future research on the effects of data and AI on political, sexual, and racial minorities. I'm sharing them here. I hope they are enlightening.

https://www.technologyreview.com/the-download/612253/amazon-ditched-ai-recruitment-software-because-it-was-biased-against-women/

https://digitaltalkingdrum.com/2017/08/15/against-black-inclusion-in-facial-recognition/

https://thebaffler.com/salvos/big-brothers-blind-spot-mcneil

https://medium.com/@blaisea/do-algorithms-reveal-sexual-orientation-or-just-expose-our-stereotypes-d998fafdf477

The algorithms for AI are only as biased as the people writing them. Facial recognition software has weirded me out for a while. I don't see how it offers a significant benefit without also being used for nefarious purposes by governments and law enforcement.

Absolutely. There's a tendency to think that deployed technology is neutral, but we forget to take into account the positionality of the people deploying it.