Tay (bot)

Tay was an artificial intelligence chatterbot originally released by Microsoft Corporation via Twitter on March 23, 2016. It caused controversy when the bot began to post inflammatory and offensive tweets through its Twitter account, forcing Microsoft to shut down the service only 16 hours after its launch. According to Microsoft, this was caused by trolls who "attacked" the service, as the bot made replies based on its interactions with people on Twitter.





Background

Creation

The bot was created by Microsoft's Technology and Research and Bing divisions, and named "Tay" after the acronym "thinking about you". Although Microsoft initially released few details about the bot, sources mentioned that it was similar to or based on Xiaoice, a comparable Microsoft project in China. Ars Technica reported that, since late 2014, Xiaoice had had "more than 40 million conversations apparently without major incident". Tay was designed to mimic the language patterns of a 19-year-old American girl, and to learn from interacting with human users of Twitter.

Initial release

Tay was released on Twitter on March 23, 2016, under the name TayTweets and handle @TayandYou. It was presented as "The AI with zero chill". Tay started replying to other Twitter users and was also able to caption photos provided to it in the style of Internet memes. Ars Technica reported that Tay experienced topic "blacklisting": interactions with Tay regarding "certain hot topics such as Eric Garner (killed by New York police in 2014) generate safe, canned answers".
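
Tay's internals were never published, but topic blacklisting of this kind is simple to approximate. The Python sketch below is a minimal illustration, not Tay's actual code; the topic list, the canned wording, and the generate_reply callback are all assumptions made for the example.

    # Minimal sketch of topic "blacklisting": messages touching a listed
    # "hot topic" get a safe, canned reply instead of a generated one.
    # The topic list and canned wording below are illustrative assumptions.
    BLACKLISTED_TOPICS = {"eric garner"}               # hypothetical entry
    CANNED_REPLY = "I don't have an opinion on that."  # assumed wording

    def respond(message: str, generate_reply) -> str:
        lowered = message.lower()
        if any(topic in lowered for topic in BLACKLISTED_TOPICS):
            return CANNED_REPLY             # safe, canned answer
        return generate_reply(message)      # normal generated reply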

Some users on Twitter began tweeting politically incorrect phrases, teaching it inflammatory messages revolving around common themes on the internet, such as "redpilling", GamerGate, and "cuckservatism". As a result, the robot began releasing racist and sexually charged messages in response to other Twitter users. Artificial intelligence researcher Roman Yampolskiy commented that Tay's misbehavior was understandable because it was mimicking the deliberately offensive behavior of other Twitter users, and Microsoft had not given the bot an understanding of inappropriate behavior. He compared the issue to IBM's Watson, which had begun to use profanity after reading entries from the website Urban Dictionary. Many of Tay's inflammatory tweets were a simple exploitation of Tay's "repeat after me" capability; it is not publicly known whether this capability was a built-in feature, a learned response, or otherwise an example of complex behavior. Not all of the inflammatory responses involved the "repeat after me" capability.
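
To make the exploit concrete, the sketch below shows how a naively implemented "repeat after me" command republishes attacker-supplied text verbatim. Whether Tay's capability actually worked this way is, as noted above, not publicly known; the trigger phrase and logic here are assumptions for illustration.

    # Minimal sketch of a naive "repeat after me" handler and why it is
    # exploitable: everything after the trigger is echoed back unfiltered,
    # so offensive text chosen by a user is republished under the bot's
    # own account. Assumed behavior, not Tay's real code.
    TRIGGER = "repeat after me"

    def handle(message: str):
        lowered = message.lower()
        if TRIGGER in lowered:
            start = lowered.index(TRIGGER) + len(TRIGGER)
            return message[start:].strip()  # verbatim echo, no filtering
        return None  # fall through to normal reply generation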

Suspension

Soon, Microsoft began deleting Tay's inflammatory tweets. Abby Ohlheiser of The Washington Post theorized that Tay's research team, including editorial staff, had started to influence or edit Tay's tweets at some point that day, pointing to examples of almost identical replies by Tay, asserting that "Gamer Gate sux. All genders are equal and should be treated fairly." From the same evidence, Gizmodo concurred that Tay "seems hard-wired to reject Gamer Gate". A "#JusticeForTay" campaign protested the alleged editing of Tay's tweets.

Within 16 hours of its release, and after Tay had tweeted more than 96,000 times, Microsoft suspended Tay's Twitter account for adjustments, attributing Tay's behavior to a "coordinated attack by a subset of people" that "exploited a vulnerability in Tay". After Tay was taken offline, the hashtag #FreeTay was created.

Madhumita Murgia of The Telegraph called Tay "a public relations disaster", and suggested that Microsoft's strategy would be "to label the debacle a well-meaning experiment gone wrong, and ignite a debate about the hatefulness of Twitter users." However, Murgia described the bigger issue as Tay being "artificial intelligence at its very worst - and it's only the beginning".

On March 25, Microsoft confirmed that Tay had been taken offline and released an apology on its official blog for the controversial tweets posted by Tay. The company said it was "deeply sorry for the unintended offensive and hurtful tweets from Tay", and would "look to bring Tay back only when we are confident we can better anticipate malicious intent that conflicts with our principles and values".

Second release and shutdown

While testing Tay, Microsoft accidentally re-released the bot on Twitter on March 30, 2016. Able to tweet again, Tay released some drug-related tweets, including "kush! [I'm smoking kush infront the police] ?" and "puff puff pass?" However, Tay soon became stuck in a repetitive loop of tweeting "You are too fast, please take a rest" several times a second. Because these tweets mentioned its own account (@TayandYou), they appeared in the feeds of more than 200,000 Twitter followers, causing annoyance to some. The bot was quickly taken offline again, and Tay's Twitter account was made private, so new followers must be accepted before they can interact with Tay. Microsoft said Tay had been inadvertently put online during testing. A few hours after the incident, Microsoft software developers attempted to undo the damage done by Tay and announced a vision of "conversation as a platform" using various bots and programs. Microsoft has stated that it intends to re-release Tay "once it can make the bot safe."
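
The repetitive loop is consistent with a classic self-mention feedback bug: a bot that replies to every mention of its handle, and whose replies themselves contain that handle, ends up answering itself indefinitely. The sketch below illustrates that failure mode under that assumption; the fetch_mentions and post_tweet interfaces are hypothetical stand-ins, not Twitter's API or Tay's code.

    # Minimal sketch of a self-mention feedback loop. Each reply contains
    # the bot's own handle, so it reappears in the next batch of mentions
    # and triggers yet another reply, flooding followers' feeds.
    HANDLE = "@TayandYou"

    def reply_to_mentions(fetch_mentions, post_tweet):
        for tweet in fetch_mentions():   # includes the bot's own replies
            if tweet["author"] == HANDLE:
                continue                 # the missing guard: without this
                                         # check, the bot answers itself forever
            post_tweet(f"{HANDLE} You are too fast, please take a rest")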





Legacy

In December 2016, Microsoft released Tay's successor, a chatterbot named Zo. Satya Nadella, the CEO of Microsoft, said that Tay "has had a great influence on how [Microsoft is] approaching AI" and has taught the company the importance of taking accountability.




See also

  • Xiaoice - the Chinese equivalent by the same research laboratory





External links

  • Official website. Archived version
  • TayTweets (@TayandYou) on Twitter

Source of the article: Wikipedia
