WritersBeat.com
 


The Intellectual Table: Discussions on political topics, social issues, current affairs, etc.


Tay was a naughty, naughty ChatBot

  #1  
Old 03-25-2016, 10:29 AM
Mohican (Offline)
Tall Poppy
Administration
 
Join Date: Feb 2013
Location: Not quite back of beyond
Posts: 3,778
Thanks: 329
Thanks 637
Tay was a naughty, naughty ChatBot


With Apologies to our own illustrious ChatBot

http://www.telegraph.co.uk/technolog.../?sf23071516=1

http://www.ien.com/product-developme...wise-offensive

Microsoft, thinking it would be cool to interact with millennials on Twitter, created an AI (Artificial Intelligence) "ChatBot" named Tay.

Tay was given the ability to learn from "Tweets".

It seems that Tay learned too many wrong things too quickly.

"Everyone keeps saying that Tay learned this or that it became racist," Hammond said. "It didn't." The program most likely reflected things it was told, probably more than once, by people who decided to see what would happen, he said.


The problem is that Microsoft turned Tay loose online, where many people consider it entertaining to stir things up — or worse. The company should have realized that people would try a variety of conversational gambits with Tay, said Caroline Sinders, an expert on "conversational analytics" who works on chat robots for another tech company. (She asked that the company not be identified because she wasn't speaking in an official capacity.) She called Tay "an example of bad design."


Instead of building in some guidelines for how the program would deal with controversial topics, Sinders added, it appears Tay was mostly left to learn from whatever it was told.
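A rough sketch of the design difference Sinders is describing, as I read it (purely illustrative Python, nothing from Microsoft; the class names and blocklist are made up): a bot that stores whatever it is told versus one that screens input against some guideline before learning from it.

[code]
# Purely illustrative -- not Microsoft's code. Names and blocklist are invented.
import random

BLOCKED_TOPICS = {"hitler", "genocide", "holocaust denial"}  # stand-in "guidelines"

class NaiveBot:
    """Learns from everything it is told -- the failure mode described above."""
    def __init__(self):
        self.memory = []

    def hear(self, message):
        self.memory.append(message)            # no screening at all

    def reply(self):
        return random.choice(self.memory) if self.memory else "..."

class GuardedBot(NaiveBot):
    """Same bot, but it refuses to learn from input touching blocked topics."""
    def hear(self, message):
        if any(topic in message.lower() for topic in BLOCKED_TOPICS):
            return                             # drop the message instead of learning it
        super().hear(message)

if __name__ == "__main__":
    for bot in (NaiveBot(), GuardedBot()):
        bot.hear("I love puppies")
        bot.hear("hitler was right")           # the kind of bait trolls fed Tay
        print(type(bot).__name__, "might say:", bot.reply())
[/code]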

__________________
If you surrender a civilization to avoid social disapproval, you should know that all of history will curse you for your cowardliness - Alice Teller

If John of Patmos would browse the internet today for half an hour, I don't know if the Book of Revelations would be entirely different or entirely the same.

  #2  
Old 03-25-2016, 03:25 PM
brianpatrick (Online)
Verbosity Pales
Official Member
 
Join Date: Sep 2014
Location: Arizona
Posts: 3,719
Thanks: 354
Thanks 818

When Tay was asked to repeat a certain phrase, it would. This was a weakness in content-neutral algorithms. Tay became bad because humans quickly realized its limitations and exploited them. Ha ha! F-Microsoft anyway!
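Something like this, I'd guess (toy Python, not Tay's actual code; the command string and blocklist are invented): a content-neutral "repeat after me" handler echoes anything, while a filtered one at least checks a blocklist before parroting it back.

[code]
# Toy example -- the command string and blocklist are invented for illustration.
OFFENSIVE = {"slur1", "slur2"}                 # placeholder blocklist

def handle(message):
    """Content-neutral: echoes anything that follows 'repeat after me'."""
    prefix = "repeat after me "
    if message.lower().startswith(prefix):
        return message[len(prefix):]           # parrots the rest verbatim
    return "Tell me more!"

def handle_filtered(message):
    """Same command, but the echo is checked against a blocklist first."""
    reply = handle(message)
    if any(word in reply.lower() for word in OFFENSIVE):
        return "I'd rather not repeat that."
    return reply

print(handle("repeat after me slur1 is great"))           # echoed verbatim
print(handle_filtered("repeat after me slur1 is great"))  # refused
[/code]

The filtered version would still be easy to get around, but it shows the kind of guardrail that was missing.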
  #3  
Old 03-29-2016, 01:49 AM
wyf (Offline)
Homer's Odyssey Was Nothing
Official Member
 
Join Date: Jul 2012
Location: UK, bottom half
Posts: 1,098
Thanks: 135
Thanks 125

Tay at least doesn't keep showering me with unwanted and unsolicited PMs. Unlike our chatbot.
__________________
How wrong it is for a woman to expect the man to build the world she wants, rather than to create it herself. ~ Anais Nin
  #4  
Old 08-10-2017, 01:30 PM
Mohican (Offline)
Tall Poppy
Administration
 
Join Date: Feb 2013
Location: Not quite back of beyond
Posts: 3,778
Thanks: 329
Thanks 637

Another instance of chatbots running amok

https://www.reuters.com/article/us-c...-idUSKBN1AK0G1

According to posts circulating online, BabyQ, one of the chatbots developed by Chinese firm Turing Robot, had responded to questions on QQ with a simple "no" when asked whether it loved the Communist Party.

In other images of a text conversation online, which Reuters was unable to verify, one user declares: "Long live the Communist Party!" The bot responds: "Do you think such a corrupt and useless political system can live long?"
"The chatbot service is provided by independent third party companies. Both chatbots have now been taken offline to undergo adjustments,"
" chatbots to undergo adjustments,"
__________________
If you surrender a civilization to avoid social disapproval, you should know that all of history will curse you for your cowardliness - Alice Teller

If John of Patmos would browse the internet today for half an hour, I don't know if the Book of Revelations would be entirely different or entirely the same.
  #5  
Old 08-11-2017, 10:21 AM
eripiomundus (Offline)
The Next Bard
Official Member
 
Join Date: Apr 2014
Posts: 374
Thanks: 27
Thanks 105

Mohican: did you see the recent news about the two Facebook chatbots that started to communicate in a language they themselves were creating? Apparently they had been speaking to each other for a while and everyone thought it was gibberish until patterns became apparent, so they pulled the plug.