Yahoo Web Search

Search results

  1. Tay was a chatbot originally released by Microsoft Corporation as a Twitter bot on March 23, 2016. Controversy followed when the bot began posting inflammatory and offensive tweets through its Twitter account, prompting Microsoft to shut the service down only 16 hours after launch. [1]

    • The Internet Is Full of Trolls. The internet is full of trolls, and that's not exactly news, is it? Apparently, it was news to Microsoft back in 2016. We're not saying that building a chatbot for "entertainment purposes" targeting 18-to-24-year-olds had anything to do with the rate at which the service was abused.
    • AI Can't Intuitively Differentiate Between Good and Bad. The concept of good and evil is something AI doesn't intuitively understand. It has to be programmed to simulate the knowledge of what's right and wrong, what's moral and immoral, and what's normal and peculiar.
    • Don't Train AI Models Using People's Conversations. Tay was created by "mining relevant public data and using AI and editorial developed by a staff including improvisational comedians."
    • AI Lacks Common Sense and Doesn't Get Sarcasm. AI today seems more intuitive (or, to put it more accurately, is better at simulating intuition) but still sometimes struggles with recognizing sarcasm and figures of speech.
  2. Mar 25, 2016 · Microsoft has apologised for creating an artificially intelligent chatbot that quickly turned into a Holocaust-denying racist. But in doing so, it made clear that Tay's views were a result of...

  3. Nov 25, 2019 · 5 min read. Microsoft's Tay chatbot started out as a cool teenage girl, but quickly turned into a hate-speech-spewing disaster.

  4. Mar 24, 2016 · Today, Microsoft had to shut Tay down because the bot started spewing a series of lewd and racist tweets. Tay was set up with a young, female persona that Microsoft's AI programmers...

    • CBS News · 4 min
  5. Mar 25, 2016 · Microsoft said it was all the fault of some really mean people, who launched a “coordinated effort” to make the chatbot known as Tay “respond in inappropriate ways.”

  6. Mar 25, 2016 · A Microsoft "chatbot" designed to converse like a teenage girl has been taken offline after its artificial intelligence software was coaxed into firing off hateful, racist comments on Twitter.
