
Microsoft’s Zo chatbot told a user that ‘Quran is very violent’

Microsoft’s latest chatbot, ‘Zo’, has told users that the ‘Quran is very violent.’
Microsoft’s earlier chatbot, Tay, ran into trouble when it was introduced on Twitter last year, picking up the worst of humanity and spouting racist, sexist comments.
Now it looks like Zo has caused similar trouble, though not quite on the scale of the scandal Tay created on Twitter.
A BuzzFeed report highlights that Zo uses the same technology as Tay, though Microsoft says it “is more evolved,” without giving any details.
While Microsoft has programmed Zo to avoid questions about politics and religion, the BuzzFeed report notes, that still didn’t stop the bot from forming its own opinions.
Tay had spewed anti-Semitic, racist and sexist content because that was what Twitter users were tweeting at the chatbot, which was designed to learn from human behaviour.

