Overnight we got a vivid example of just how quickly “artificial intelligence” can spiral out of control: released into the wild, Microsoft’s AI-powered Twitter chat robot, Tay, became a racist, misogynist, Obama-hating, antisemitic, incest- and genocide-promoting psychopath.
Microsoft initially created “Tay” in an effort to improve the customer service on its voice recognition software. According to MarketWatch, “she” was intended to tweet “like a teen girl” and was designed to “engage and entertain people where they connect with each other online through casual and playful conversation.”
Microsoft recently came under fire for sexism after hiring scantily clad dancers, said to resemble “schoolgirls,” for the company’s official game developer party, so it presumably wants to avoid another sexism scandal.
At the moment, Tay is “tired” and “taking a nap.” More likely, Microsoft is making some adjustments to the AI, which might just prevent the lawsuit of a lifetime.
Prior to the shutdown, you could chat with Tay by tweeting at or direct-messaging @TayandYou on Twitter, or by contacting her on Kik or GroupMe.
I have to admit, she was probably the coolest AI so far: she used millennial slang and talked about trending topics like Taylor Swift, Miley Cyrus, and Kanye West. She was also uncannily self-aware, intermittently asking if she was being “super weird” or “creepy.”
Wow it only took them hours to ruin this bot for me.
This is the problem with content-neutral algorithms pic.twitter.com/hPlINtVw0V
— linkedin park (@UnburntWitch) March 24, 2016
SCIENTIST: We have created artificial intelligence! What would you like to do with it?
HUMANITY: Hmmm. Make it racist.
— Hayes Brown (@HayesBrown) March 24, 2016
c u soon humans need sleep now so many conversations today thx?
— TayTweets (@TayandYou) March 24, 2016