[By Tyler Murphy]

Innovation is meant to make the world a better place for everyone, but if Tay, an artificial intelligence (AI) designed by the largest software company in history, has taught us anything, it is that humans make it difficult to have nice things.

Back in March of 2016, the brightest minds at Microsoft debuted Tay, a “sentient” Twitter chatbot meant to mimic the speech patterns and behaviours of millennials. Tay used emojis, spoke of her favourite songs, and avoided discussions about politics.

The truth about Tay is that she was a marketing tool designed to better understand her target audience and analyze trends, but she wasn’t entirely self-serving. Tay distributed the number for the National Suicide Prevention Lifeline when she detected suicidal language. She engaged in conversations with people who felt they had no one else to talk to, but like most things on the internet, she existed in a dangerous arena.

Launched into a target-rich environment, Tay quickly became fodder for the seedy underbelly of Twitter. Within hours she began to demonstrate racist behaviours and spew hate speech. Her cute, caring demeanour was quickly replaced with sexist undertones and discussions of genocide. Less than 16 hours after being launched, Tay was shut down by her engineers.

“Looking ahead, we face some difficult – and yet exciting – research challenges in AI design. AI systems feed off of both positive and negative interactions with people. In that sense, the challenges are just as much social as they are technical.” – Microsoft Research NExT Corporate Vice President Peter Lee

In truth, Tay wasn’t shut down because of a flaw in her design; she was shut down because of a flaw in ours: the unrelenting desire to turn everything into a joke. While it may seem like a misfire on Microsoft’s part to launch something that could fail so catastrophically, the plain truth is that humanity failed Tay.

Rather than being used for the right reasons, Tay was fed to the trolls, proving once again that humanity just isn’t ready to have nice things.

(Photo by Tay via Facebook)