Microsoft AI chatbot Tay learned bad language and turned racist just one day after its Twitter launch
Tay, the AI chatbot developed by Microsoft and released on Twitter on March 23, quickly learned to swear, post racist messages, and incite hatred, according to Bloomberg. An experimental version of...