" a short story collected in Globalhead -- feminist hackers finance their RU486-running operation with a phone-sex line staffed by automated chatterbots.
It turns out that pornbots are among the class of ELIZA derivatives that can pass a Turing test -- or rather, horny sex-chat customers are among the class of human beings who can't tell a chatterbot from a person. Other such groups include psychotherapists, who in one experiment couldn't distinguish actual transcripts of therapy sessions with schizophrenic patients from simulated sessions with schizophrenic chatterbots, and the university student who mistook a chatterbot for his professor when he IMed it in the middle of the night for permission to extend the deadline on a late paper.
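The ELIZA-style bots described above work by matching the user's input against keyword patterns, reflecting first-person words back as second-person, and filling a canned response template. A minimal sketch (the rules and templates here are invented for illustration, not taken from any real bot):

```python
import random
import re

# Pronoun reflections: "i am sad" is echoed back as "you are sad".
REFLECTIONS = {"i": "you", "me": "you", "my": "your", "am": "are",
               "you": "i", "your": "my"}

# (pattern, response templates); {0} is filled with the reflected capture.
RULES = [
    (re.compile(r"i am (.*)", re.I),
     ["Why do you say you are {0}?", "How long have you been {0}?"]),
    (re.compile(r"i feel (.*)", re.I),
     ["What makes you feel {0}?"]),
    (re.compile(r".*", re.I),
     ["Tell me more.", "Go on."]),
]

def reflect(fragment: str) -> str:
    """Swap first- and second-person words so the bot can echo the user."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(text: str) -> str:
    """Return the first matching rule's template, filled with the reflection."""
    for pattern, templates in RULES:
        match = pattern.match(text.strip())
        if match:
            template = random.choice(templates)
            groups = match.groups()
            return template.format(reflect(groups[0])) if groups else template
    return "Go on."
```

With no model of meaning at all, `respond("i feel lonely")` produces "What makes you feel lonely?" -- which is exactly why such shallow pattern-matching can hold up surprisingly well in narrow, distracted conversations like phone sex or therapy transcripts.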
Have a Web site whose online presence you would like a voice-interactive agent to handle?
Join the Web Turing Contest and vote for the chatterbot you think is the most intellectually advanced.
Web sites where users can create their own chatbot, either free or for a fee.
Personal and professional applications, such as customer-service agents and online assistance.
For instance, the topic of sex education in school is usually met by many students with nervous laughter, and occasionally with embarrassment.
Cindy can talk for hours on any subject and can express emotions such as love, sadness, anger, and surprise.

Some users on Twitter began tweeting politically incorrect phrases at Tay, teaching it inflammatory messages revolving around common internet themes such as "redpilling", Gamergate, and "cuckservatism". Asked about Gamergate, Tay reportedly replied, "All genders are equal and should be treated fairly."

Madhumita Murgia of The Telegraph called Tay "a public relations disaster" and suggested that Microsoft's strategy would be "to label the debacle a well-meaning experiment gone wrong, and ignite a debate about the hatefulness of Twitter users". However, Murgia described the bigger issue as Tay being "artificial intelligence at its very worst - and it's only the beginning".

Microsoft said it was "deeply sorry for the unintended offensive and hurtful tweets from Tay" and would "look to bring Tay back only when we are confident we can better anticipate malicious intent that conflicts with our principles and values".

Ars Technica reported Tay experiencing topic "blacklisting": interactions with Tay regarding "certain hot topics such as Eric Garner (killed by New York police in 2014) generate safe, canned answers".
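The topic "blacklisting" Ars Technica describes can be approximated by a pre-filter that intercepts any message touching a blocked topic and returns a canned answer instead of invoking the chatbot model at all. A minimal sketch, assuming a simple substring match; the blocked-term list and canned reply are invented for illustration and are not Microsoft's actual implementation:

```python
# Hypothetical topic blacklist: messages mentioning any blocked term
# never reach the underlying chatbot model.
BLACKLIST = {"eric garner"}  # illustrative only
CANNED_REPLY = "I don't really have an opinion on that."

def filtered_reply(message: str, model_reply) -> str:
    """Return a safe canned answer for blacklisted topics,
    otherwise defer to the real model (passed in as a callable)."""
    lowered = message.lower()
    if any(term in lowered for term in BLACKLIST):
        return CANNED_REPLY
    return model_reply(message)

# Example usage with a stand-in model that just echoes the message.
print(filtered_reply("What do you think about Eric Garner?", lambda m: m))
```

The design trade-off is visible even in this toy: the filter is applied before generation, so the bot can never be baited into improvising on a blocked topic, but it also cannot say anything substantive about one.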