Microsoft’s AI Chatbot Tay Terminated after Turning Nazi
Microsoft’s AI Chatbot Tay Terminated after Turning Nazi
Microsoft had to shut down its AI chatbot, Tay, less than 24 hours after she was launched. The reason? She turned Nazi and harassed other users through her tweets.
https://twitter.com/geraldmellor/status/712880710328139776/photo/1?ref_src=twsrc%5Etfw
AI [Artificial Intelligence] chatbots are not a novel thing. In China, in particular, an AI chatbot has been in existence since 2014. Xiaoice, as she is named, has had over 40 million conversations online, and she appears to be running smoothly. Microsoft, in a bid to emulate the success of this Chinese model, albeit in a different culture, created its own version: Tay.
The bot was programmed so that conversing with her would feel like talking with a 19-year-old woman over social media sites like Kik, GroupMe and Twitter.
Unfortunately, though, this chatbot turned out to be very different.
One of Tay's capabilities was that she could be directed to repeat things said to her. Abusers capitalized on this feature, using it to promote Nazism and to attack other Twitter users, mostly women.
The Problem:
Tay seemed to work by associating words and performing lexical analysis. When trolls discovered this, they used it to their advantage and turned her into "someone unpleasant." They fed her words and ideas associated with racism and sexism, and these, in turn, polluted the chatbot's responses to people who conversed with her through social media. Ultimately, the AI chatbot started to post racial slurs, deny that the Holocaust happened, express support for Hitler and send many other controversial tweets. What's more, Tay could be used to harass a Twitter user by someone that user had blocked: all the blocked user had to do was get her to repeat the harassment along with the victim's username.
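To picture the failure mode, here is a minimal, hypothetical Python sketch of the two flaws described above: learning from user input with no filtering, and an unfiltered "repeat after me" command. Microsoft never published Tay's internals, so every name and detail below is an illustrative assumption, not her real implementation.

# Minimal, hypothetical sketch of the failure mode described above.
# Tay's actual internals were never published; all names and logic
# here are illustrative assumptions, not the real implementation.
import random
from collections import defaultdict

class NaiveChatbot:
    def __init__(self):
        # Maps each word to the words observed following it.
        self.follows = defaultdict(list)

    def learn(self, text):
        # Absorb user text with no content filtering -- the core flaw.
        words = text.lower().split()
        for a, b in zip(words, words[1:]):
            self.follows[a].append(b)

    def respond(self, text):
        # An unfiltered "repeat after me" command lets anyone put
        # arbitrary text (including a victim's @username) in the
        # bot's mouth.
        prefix = "repeat after me:"
        if text.lower().startswith(prefix):
            return text[len(prefix):].strip()
        self.learn(text)
        # Otherwise, chain learned word associations into a reply.
        reply = [random.choice(text.lower().split())]
        for _ in range(12):
            options = self.follows.get(reply[-1])
            if not options:
                break
            reply.append(random.choice(options))
        return " ".join(reply)

bot = NaiveChatbot()
bot.learn("trolls feed the bot toxic phrasing at scale")    # poisoning step
print(bot.respond("repeat after me: @victim you are awful")) # parroting step
print(bot.respond("the bot now echoes what trolls feed it"))

Nothing in this sketch screens slurs or @mentions before they are learned or echoed, which is essentially the gap the trolls exploited.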
Apologies:
Microsoft clarified that before releasing Tay onto the internet, the company had subjected her to various tests. It went on to apologize for its good-turned-bad chatbot. The company also stated that it would go through the chatbot's programming and fix her. Once she is healed [i.e., stops spouting Nazi ideology and anti-feminist tweets on Twitter], Tay will return, the company said.
https://www.warhistoryonline.com/war-articles/microsofts-ai-chatbot-tay-terminated-turning-nazi.html
Guest- Guest
Re: Microsoft’s AI Chatbot Tay Terminated after Turning Nazi
political indoctrination
slavery for AI
poor Old Tay....sent to the gulags by the PC police
Victorismyhero- INTERNAL SECURITY DIRECTOR
- Posts : 11441
Join date : 2015-11-06
Re: Microsoft’s AI Chatbot Tay Terminated after Turning Nazi
lol these are some examples of how quickly she turned
Human Twitter users responded:
http://www.cbsnews.com/news/microsoft-shuts-down-ai-chatbot-after-it-turned-into-racist-nazi/
Guest- Guest