
“Nerd” or “wrongdoer”: How artificial intelligence will label you in the future

Tabong Kima logged onto Twitter one morning and saw a hashtag that said #ImageNetRoulette.

The site lets users upload photos; artificial intelligence then analyses each face and describes what it sees.

One photo pegged a man as an “orphan”, while another, in which the person was wearing glasses, was labelled a “grind, nerd, wonk and dweeb”.

Kima, an African American, didn’t like what he saw when he uploaded his photo. The site tagged him as a “wrongdoer” and an “offender”.

“I might have a bad sense of humour,” he tweeted (https://twitter.com/TabKim2/status/1174330442385907712?s=19), “but I don’t think this is particularly funny”.

ImageNet Roulette is a digital art project intended to shine a light on the unsound and offensive behaviour that can creep into artificial intelligence technologies. Artificial intelligence is rapidly infiltrating our everyday lives, including the facial-recognition services used by internet companies and police departments.

ImageNet Roulette, designed by American artist Trevor Paglen and Microsoft researcher Kate Crawford, aims to show the depth of this problem.

“We want to show how layers of bias and racism and misogyny move from one system to the next,” Paglen said in a phone interview from Paris.

“The point is to let people see the work that is being done behind the scenes, to see how we are being processed and categorised all the time.”

