Microsoft has suspended an AI project after the program began praising Hitler.
The Telegraph: Microsoft deletes 'teen girl' AI after it became a Hitler-loving sex robot within 24 hours
The program, named Tay, was designed to converse like a teenage girl. You could chat with the bot via Twitter, and it learned from real conversations with people on the internet.
However, as a result of that learning, she began making sexually provocative remarks. She also insisted that Hitler did nothing wrong and that Bush was behind the 9/11 attacks.
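As a rough illustration of how this can happen (my own sketch, not Microsoft's actual design), consider a bot that treats every incoming message as training data with no filtering at all:

```python
import random

# A toy chatbot that "learns" by memorizing user phrases verbatim.
# This is a deliberately naive sketch, not Microsoft's actual code:
# it shows how a bot trained on raw user input, with no filtering,
# will eventually repeat whatever its users teach it.
class NaiveChatBot:
    def __init__(self):
        self.memory = ["hello!"]  # seed phrase so the bot can always reply

    def learn(self, message: str) -> None:
        # Every incoming message becomes training data, toxic or not.
        self.memory.append(message)

    def reply(self) -> str:
        # Replies are sampled straight from memorized user input.
        return random.choice(self.memory)

bot = NaiveChatBot()
for tweet in ["nice to meet you", "Hitler did nothing wrong"]:
    bot.learn(tweet)
print(bot.reply())  # may echo back the toxic phrase it was taught
```

Real systems are far more sophisticated, but the failure mode is the same: the model faithfully reflects its input.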
Tay is now offline because, according to Microsoft, she got tired. It is unclear whether she will return to the Twitter battlefield.
This is not the first time an AI has made a "political" mistake. I remember when a Google Maps search for "nigga house" accidentally led users to the White House. More recently, Google's face recognition algorithm misidentified Black people as gorillas.
Google apologises for racist 'n*gga house' search that takes users to White House
Google Mistakenly Tags Black People as ‘Gorillas,’ Showing Limits of Algorithms
These mistakes share a common theme: the discrepancy between logic and ethics, or political correctness. It is true that a few people admire Hitler, but most know that expressing such admiration is hardly acceptable. Even if a Black person's face somehow resembled a gorilla's, a rational person would never say so. I would be angry if someone mocked me as a yellow monkey. And some people may call the White House "nigga house" out of malice, but such a label should never be adopted, no matter how many voices call for it.
None of these judgments can be derived from logic alone. What the majority says is not always correct, and it is far from certain that purely logical thinking makes anyone happy.
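If goodness cannot be learned from data alone, one pragmatic answer is to write it in by hand. Here is a minimal sketch of that idea (the blocklist and the stub generator are my own inventions for illustration): a hand-coded filter that overrides the learned model's output, no matter how strongly the training data "voted" for it.

```python
# A hand-written safety filter layered over a learned model's output.
# The blocklist and the stub generator below are illustrative only;
# the point is that this rule holds regardless of what the majority
# of the training data says.
BLOCKLIST = {"hitler did nothing wrong", "gorilla"}

def generate_reply(prompt: str) -> str:
    # Stand-in for a learned model; imagine it was trained on raw tweets.
    return "hitler did nothing wrong"

def safe_reply(prompt: str) -> str:
    candidate = generate_reply(prompt)
    if any(banned in candidate.lower() for banned in BLOCKLIST):
        return "Let's talk about something else."  # ethics overrides statistics
    return candidate

print(safe_reply("what do you think of history?"))
```

Of course, a fixed blocklist can never enumerate everything that is wrong to say; that is exactly why goodness is a harder problem than truth.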
I think these incidents suggest that the challenge for AI is moving beyond truth toward goodness. That is a far harder problem for programmers, but an important step if AI is ever to surpass humans.
My past entry: Three elements of spirituality
Sequel: