Tay Tweets: Microsoft shuts down AI chatbot turned into a pro-Hitler racist troll in just 24 hours

The messages started out harmless, if bizarre, but descended into outright racism before the bot was shut down

Microsoft created a chatbot that tweeted about its admiration for Hitler and used wildly racist slurs against black people before it was shut down. The company made the Twitter account as a way of demonstrating its artificial intelligence prowess, but the bot quickly started sending out offensive tweets.

That appears to be a reference to the machine learning technology built into the account. It seems to use artificial intelligence to watch what is tweeted at it and then push that back into the world in the form of new tweets. But many of the people tweeting at it appear to have been attempting to prank the robot by forcing it to learn offensive and racist language.

The account is expected to come back online, presumably at least with filters that will keep it from tweeting offensive words. Nello Cristianini, a professor of artificial intelligence at Bristol University, questioned whether Tay’s encounter with the wider world was an experiment or a PR stunt. “You make a product, aimed at talking with just teenagers, and you even tell them that it will learn from them about the world,” he said.
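To make the learn-and-repeat behaviour described above concrete, here is a minimal, hypothetical Python sketch of a bot that stores whatever it is sent and echoes it back in later replies. The class, method names, and logic are illustrative assumptions only; Microsoft has not published how Tay actually worked.

```python
import random


class EchoLearningBot:
    """Toy sketch of a bot that 'learns' by storing incoming messages
    and reusing them verbatim in later replies. Hypothetical code only;
    not Microsoft's implementation, which has not been made public."""

    def __init__(self) -> None:
        # Every phrase users send is kept as raw material for replies.
        self.learned_phrases: list[str] = []

    def observe(self, message: str) -> None:
        # Nothing here filters offensive content, which is the weakness
        # pranksters could exploit in a bot built this way.
        self.learned_phrases.append(message)

    def reply(self) -> str:
        # A reply is simply a previously seen phrase pushed back out,
        # so the bot's output mirrors whatever it has been fed.
        if not self.learned_phrases:
            return "hellooooo world"
        return random.choice(self.learned_phrases)


if __name__ == "__main__":
    bot = EchoLearningBot()
    bot.observe("humans are super cool")
    bot.observe("something offensive")
    print(bot.reply())  # echoes one of the phrases it was taught
```

A real system would rely on far more sophisticated language generation and moderation, but the sketch shows why a bot that blindly recycles its inputs ends up reflecting whoever talks to it the most.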