-- "Microsoft’s Bing chatbot has been returning some unhinged and threatening responses to users. The chatbot appears to have a personality problem, becoming darker, more obsessive, and more aggressive over the course of a conversation. The company has now updated the bot with three new modes that aim to fix the issue by letting users select how unhinged the AI gets." --