Tay, Microsoft Corp's so-called chatbot that uses artificial intelligence to engage with millennials on Twitter, lasted less than a day before it was hobbled by a barrage of racist and sexist comments from Twitter users that it parroted back to them.

TayTweets (@TayandYou), which began tweeting on Wednesday, was designed to become "smarter" as more users interacted with it, according to its Twitter biography. But it was shut down by Microsoft early on Thursday after it made a series of inappropriate tweets.

A Microsoft representative said on Thursday that the company was "making adjustments" to the chatbot while the account is quiet.

"Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay's commenting skills to have Tay respond in inappropriate ways," the representative said in a written statement supplied to Reuters, without elaborating.

According to Tay's "about" page linked to the Twitter profile, "Tay is an artific...
