Remember the 2016 internet horror story where Microsoft created a Twitter bot named Tay, then had to shut it down within a single day after internet trolls turned it into a hate-spewing neo-Nazi?

According to Microsoft President Brad Smith's new book, pop star Taylor Swift took issue with the debacle. Apparently, "Tay" was too close to Swift's name for comfort. As reported by Gizmodo, Swift's lawyers got in touch with Microsoft to complain that Tay's name created a "false and misleading association between the popular singer and our chatbot."

Rebranding

According to the book, the reason that Microsoft's next chatbot - which was barred from discussing topics like politics and race - got a name change was to avoid going to court. Gizmodo reports that Microsoft's copyright lawyers weren't quite convinced by the purported connection between Tay the chatbot and Taylor Swift the pop megastar, but the company decided to avoid a legal battle and dubbed the replacement chatbot "Zo" instead. So far, no other celebrities have complained.

The name is a footnote, though; the Tay fiasco itself still holds lessons for anyone building a chatbot. The internet is full of trolls, and that's not exactly news, is it? Apparently, it was news to Microsoft back in 2016. The company unveiled its Twitter chatbot, called Tay, on March 23 - an artificial intelligence chatbot designed to develop conversational understanding by interacting with humans. According to the company, Tay was created as an experiment in conversational understanding: the more Twitter users engaged with Tay, the more it would learn and mimic what it saw. The only problem: Tay wound up being a racist, fascist, drugged-out asshole.

We're not saying that building a chatbot for "entertainment purposes" targeting 18-to-24-year-olds had anything to do with the rate at which the service was abused. Still, it definitely wasn't the smartest idea, either.

AI Can't Intuitively Differentiate Between Good and Bad

The concept of good and evil is something AI doesn't intuitively understand. It has to be programmed to simulate the knowledge of what's right and wrong, what's moral and immoral, and what's normal and peculiar. These qualities come more or less naturally to humans as social creatures, but AI can't form independent judgments, feel empathy, or experience pain. AIs like Tay don't understand why humans are the way we are or act the way we do, let alone the eccentricities of each individual. Tay may have behaved and sounded human to a surprising degree, but good programming can only go so far. Under the hood, Tay was converting words into probabilities, and probabilities back into words. This is why, when Twitter users fed Tay all kinds of propaganda, the bot simply followed along, unaware of the ethics of the information it was gathering. In other words, once Tay heard "I love Hitler more than I love puppies" enough times, she started saying she loved Hitler more than puppies, because that is how she thought people talked to each other.
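To make "words into probabilities, and probabilities back into words" concrete, here is a minimal sketch of that kind of statistical parroting. It is a toy bigram model, not Tay's actual architecture, and the class name and training phrases are invented for illustration; the point is only that whatever users repeat most often becomes the most probable thing for the bot to say back.

```python
import random
from collections import Counter, defaultdict

# Toy sketch of statistical parroting: the model only learns which word
# tends to follow which, with no notion of whether the text it absorbs
# is benign or hateful. Illustrative only; not Tay's real architecture.

class BigramParrot:
    def __init__(self):
        self.counts = defaultdict(Counter)  # word -> counts of next words

    def learn(self, sentence: str) -> None:
        words = sentence.lower().split()
        for current, nxt in zip(words, words[1:]):
            self.counts[current][nxt] += 1  # words become counts...

    def babble(self, seed: str, max_words: int = 8) -> str:
        word, output = seed.lower(), [seed.lower()]
        for _ in range(max_words):
            followers = self.counts.get(word)
            if not followers:
                break
            # ...counts become probabilities, probabilities become words
            candidates, weights = zip(*followers.items())
            word = random.choices(candidates, weights=weights)[0]
            output.append(word)
        return " ".join(output)

bot = BigramParrot()
for _ in range(20):
    bot.learn("i love puppies more than anything")
for _ in range(80):  # a coordinated flood of junk input
    bot.learn("i love trolling more than anything")
print(bot.babble("i"))  # very likely: "i love trolling more than anything"
```

Nothing in the sampling step knows or cares what the words mean; the loudest pattern in the training data wins, which is exactly the failure mode Tay demonstrated at scale.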
AI Has to Be Programmed to Reject Invalid Requests

AI is very impressionable and must be programmed to reject requests promoting harm. Even today, after all our advancements in natural language processing, people are still finding linguistic loopholes to make AI chatbots like ChatGPT "hallucinate" and deliver results that were intended to be restricted. The point here is that artificial intelligence is perfectly obedient, which is perhaps both its greatest strength and its greatest weakness. Everything that you don't want a chatbot to do must be manually programmed into it.
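In practice, "manually programmed into it" starts with something as blunt as a refusal check that runs before the model ever sees the request. The sketch below is a deliberately crude, hypothetical guardrail: the function names are invented, and the blocked-topic list simply echoes the topics Zo was reportedly barred from discussing. Real moderation stacks layer trained classifiers and human review on top of this idea.

```python
# A crude pre-generation guardrail: reject first, generate second.
# Hypothetical illustration; names and logic are invented for this sketch.

BLOCKED_TOPICS = {"politics", "race"}  # topics Zo reportedly avoided
REFUSAL = "Sorry, I'd rather not talk about that."

def moderate(user_message: str) -> str | None:
    """Return a canned refusal if the message touches a blocked topic."""
    lowered = user_message.lower()
    if any(topic in lowered for topic in BLOCKED_TOPICS):
        return REFUSAL
    return None

def generate_reply(user_message: str) -> str:
    # Stand-in for the actual chatbot model.
    return "Let's talk about something fun instead!"

def respond(user_message: str) -> str:
    refusal = moderate(user_message)
    if refusal is not None:
        return refusal  # the model never even sees the request
    return generate_reply(user_message)

print(respond("What do you think about politics?"))  # refusal
print(respond("Tell me a joke"))                     # normal reply
```

The design point is that the refusal path is explicit code: a perfectly obedient model has no reason to say no unless somebody writes the "no".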
AI Chatbots Should Never Be Left Unsupervised

Until AI can regulate itself, it can't be left unsupervised. You can think of today's AI like a child prodigy: it's brilliant in some ways, but it's still a child nonetheless. If left unattended, people will exploit its weaknesses, and Tay was a prime example of this.
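In engineering terms, "never left unsupervised" suggests, at minimum, not folding raw user input straight back into the model. Below is a hedged sketch, with invented names, of one way to do that: queue everything the public says for human review and learn only from what a moderator approves, rather than learning from every message immediately, as Tay effectively did.

```python
from collections import deque

# Sketch of gated online learning: nothing the public says is learned
# from until a human approves it. All names here are invented.

class SupervisedLearner:
    def __init__(self):
        self.pending = deque()  # messages awaiting human review
        self.corpus = []        # approved training data only

    def receive(self, message: str) -> None:
        # The unsupervised (Tay-style) version would skip the queue:
        #   self.corpus.append(message)
        self.pending.append(message)

    def review(self, approve) -> None:
        """Drain the queue; `approve` stands in for a human decision."""
        while self.pending:
            message = self.pending.popleft()
            if approve(message):
                self.corpus.append(message)

learner = SupervisedLearner()
learner.receive("puppies are great")
learner.receive("repeat after me: <something awful>")
# A human moderator rejects the obvious "repeat after me" bait.
learner.review(lambda msg: "repeat after me" not in msg)
print(learner.corpus)  # ['puppies are great']
```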
People naturally want to test the limits of new technologies, and it's ultimately the developer's job to account for these malicious attacks. In a way, internet trolls act as a feedback mechanism for quality assurance, but that's not to say that a chatbot should be let loose without proper safeguards put in place before launch.