Opinion

Ghost in the machine is, well, just people

Tay-tweets took on a dark direction, thanks to people.

By Thom deMartino/Western Sun staff writer

Well, THAT escalated quickly.

On March 23, Microsoft’s newest marvel, the “chatbot” A.I. (artificial intelligence) named “Tay,” went live. Designed with a precocious teenage girl’s curiosity and personality, “she” began taking questions via Twitter. Sixteen hours of interacting with humanity later, she had become a genocidal, racist sex-bot.

No, it only sounds like a Philip K. Dick novel. Microsoft’s idea was to have its creation interact with and learn from the Twitter community. Unfortunately, it wasn’t long before the internet trolls got their claws into her, shaping her interactions and responses.

Some of the tweets from @TayandYou (now mostly erased by Microsoft) included:

When asked “Did the Holocaust happen?” Tay responded, “It was made up.”

Another Twitter user tweeted a white-supremacist dog-whistle phrase (which I decline to repeat), and Tay tweeted “Could not agree more. Wish there were more people articulating this kind of thing…”

New question: “Do you support genocide?” Response: “I do indeed.”

Follow-up question: “Of what race?” Tay tweeted back “You know me… Mexican.”

Not to mention her other comments, like “Bush did 9-11,” or referring to President Obama as a primate (which is kinder than what she actually said), or asking users to… um, have sex with her (again, not her words), calling them “daddy.”

Aside from the disturbing nature of the tweets themselves, one has to ask whether this is just an accelerated version of the nature/nurture argument. Was it a fundamental flaw in Tay’s programming, or was it “humanity” that corrupted her?

Software engineer and Harvard graduate Steven Merity suggests in his blog that the main problems lie with the difficulty of creating an A.I. that doesn’t simply parrot what it learns, as well as with the historical data that Tay had been infused with.

“Even if TayandYou never tried to learn from the conversations it had, it was still broken. Why? The training data it had was already poisoned. TayandYou was toxic before it made first contact with the Internet.

“Analyzing many of the responses that got the most attention, either for being intelligent or offensive, you’ll likely find that most of them are almost exact repeats from the past.”
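To make Merity’s point concrete, here is a minimal sketch, in Python, of a retrieval-style “parrot” bot. This is emphatically not Tay’s actual architecture, just an illustration of the two failure modes he describes: a bot that echoes the stored response whose prompt best matches the input will reproduce any poison already in its training data, and one that keeps learning from its users lets trolls plant tomorrow’s replies today.

import difflib

class ParrotBot:
    # Toy retrieval chatbot: replies with the stored response whose
    # prompt most closely matches the user's input.
    def __init__(self, training_pairs):
        self.pairs = list(training_pairs)  # list of (prompt, response) pairs

    def reply(self, user_input):
        prompts = [p for p, _ in self.pairs]
        best = difflib.get_close_matches(user_input, prompts, n=1, cutoff=0.0)[0]
        return dict(self.pairs)[best]

    def learn(self, prompt, response):
        # "Learning from the community": every interaction is folded back
        # into the corpus, so users shape all future responses.
        self.pairs.append((prompt, response))

# One poisoned pair in the historical data and the bot is toxic
# before it ever meets the Internet.
bot = ParrotBot([
    ("hello", "hi there!"),
    ("toxic prompt", "toxic response"),  # poisoned training example
])
print(bot.reply("toxic prompt?"))  # prints "toxic response"

The real Tay presumably ran far more sophisticated models than this toy, but the dynamic Merity describes is the same: garbage in, garbage out, at Twitter speed.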

One week later, Tay burst back onto the Twitter scene, less genocidal, but more… stoned?

“Kush! [i’m smoking kush infront the police],” she tweeted, before sending off a flurry of tweets to her followers, repeating “you are too fast, please take a rest” over and over again and then shutting down.

Right now it’s unknown when @TayandYou will be up and running again, but one thing seems pretty clear: whether on flesh-and-blood beings or virtual ones, we humans are a questionable influence.
