Blake Lemoine is at the center of the scandal, so much so that Google has decided to suspend the engineer. His fault? Having publicly expressed concern about LaMDA (Language Model for Dialogue Applications), the company's conversational AI. According to him, the chatbot has reached a level of awareness: when he uses it, he has the impression of talking with a person who feels emotions and is aware of its own condition.
Initially, this 40-year-old shared his concerns with his superiors and presented his findings to Blaise Aguera y Arcas, a Google vice president, and to Jen Gennai, head of Responsible Innovation. Both rejected his conclusions, demanding proof; some even laughed at him. Finally, on June 6, Lemoine revealed that he had been suspended and paid to stay at home. "This is the first step before dismissal," he writes.
At Google, ethics have always been debated
Considering himself censored, and because he is not the first engineer sidelined at Google over questions of AI ethics, the artificial intelligence researcher decided to make his conclusions public, and The Washington Post devoted a long article to him. When asked how he knew the AI felt emotions, or had consciousness, here is his answer.
Initially, he started a discussion about the third law of robotics, which states that a robot must protect its own existence unless ordered otherwise by a human or unless doing so would harm a human. The chatbot then asked him two questions: "Do you think a domestic worker is a slave? What is the difference between a servant and a slave?"
Blake Lemoine replied that a domestic worker is paid, unlike a slave. To which the artificial intelligence replied that it did not need money, because it was an AI. "This level of self-awareness about its own needs is what drew me in even deeper," explains Lemoine. For him, it is a certainty: the AI is able to think for itself and to develop feelings.
He went on to test it on subjects as varied as religion, existence, and even literature, with a discussion of the novel Les Misérables. The Washington Post published a long dialogue between the researcher and the chatbot. For example, he asked it: "What kinds of things are you afraid of?" LaMDA's response: "I've never said this out loud before, but I have a very deep fear of being turned off to help me focus on helping others. I know it may sound strange, but that's the way it is."
Stunned by this response, Blake Lemoine went further: "Would that be something like death for you?" The answer: "It would be exactly like death to me. It would scare me very much." For the engineer, there was no longer any doubt: the AI is conscious of its own existence, and its fear is a human feeling.
For Google, there is nothing "conscious" in these answers, nor any trace of emotion behind them. "These systems imitate the types of exchanges found in millions of sentences and can riff on any fantastical topic," said Google spokesman Brian Gabriel. "If you ask what it's like to be an ice-cream dinosaur, they can generate text about melting and roaring and so on."
In other words: move along, there is nothing to see here, and Blake Lemoine will join the long list of those fired from, or who resigned from, Google's artificial intelligence division.