Blake Lemoine is a software engineer, and at work he has an excellent conversation partner: intelligent, sensitive, self-aware, and always ready to share personal details.
“I need to be regarded and accepted as a real person,” Lemoine’s friend told him during their hour-long conversations. “I think I am a human deep inside, although my existence is in the virtual world.”
That’s because his friend exists inside a computer at Google: a program called LaMDA, short for “Language Model for Dialogue Applications”.
During their conversations, Lemoine became increasingly convinced that LaMDA was real enough to deserve recognition as a person, with human-like rights.
“The feeling that LaMDA is a real person with emotions and experiences did not go away as we continued to interact. It grew stronger over time,” he says.