Unlike AI-powered virtual assistants such as Facebook's M, her purpose is almost entirely to amuse.
Ars Technica reported that Tay experienced topic "blacklisting": interactions with Tay regarding "certain hot topics such as Eric Garner (killed by New York police in 2014) generate safe, canned answers".
If you’re in the Messenger app, try searching for “@hiponcho” to find it. The first notable thing about this bot is that it has an onboarding process, where many I encounter have none at all: it asks where you’re located, asks when you’d like your daily report, and suggests commands to try. Poncho is a prime example of an effective use of the bot format.
Cons: The Slack-integrated version is garbage compared to the Messenger one; I wish it were easier to access Poncho across platforms.
Microsoft said it was "deeply sorry for the unintended offensive and hurtful tweets from Tay", and would "look to bring Tay back only when we are confident we can better anticipate malicious intent that conflicts with our principles and values".
However, Tay soon became stuck in a repetitive loop of tweeting "You are too fast, please take a rest" several times a second.