(2021-03-15) Wyatt Article Simulation And Simulacra

John Wyatt Article: Simulation and simulacra. The French philosopher Jean Baudrillard argued that postmodern culture had become so reliant on representations of reality that it was losing contact with the real world. In his 1981 work Simulacra and Simulation he wrote: “The territory no longer precedes the map, nor does it survive it. It is … the map that precedes the territory …”

Baudrillard described successive phases of the image: first it reflects a profound reality, then it masks and denatures that reality. In the third phase the simulation masks the absence of a profound reality; “it plays at being an appearance – it is of the order of sorcery.” Finally the simulation breaks free from reality completely – it becomes its own pure simulacrum.

Baudrillard was writing in the pre-digital era of the 1980s, yet his analysis maps uncannily onto today’s technology.

AI-powered smartphone apps offer to provide a vast range of simulated human interactions, including cognitive behavioural therapy, personal coaching and 24/7 companionship.

Woebot, a therapy chatbot, checks in on users once a day, asking questions like “How are you feeling?” and “What is your energy like today?” Alison Darcy, Woebot’s founder, said that humans open up more when they know they’re talking to a bot. “We know that often, the greatest reason why somebody doesn’t talk to another person is just stigma,” she says. “When you remove the human, you remove the stigma entirely.”

The company Eternime offers the possibility of creating a virtual clone by uploading all our personal information in digital form.

Another cloud-based company, Replika (replika.ai), offers an “AI companion who cares.”

A major goal in the race for ever more realistic simulation of human conversation is to enable AI chatbots to monitor, in real time, the emotional state of the human they are interacting with, using a variety of techniques including face, movement and voice analysis.
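To make the idea of real-time emotional monitoring concrete, here is a deliberately simplified sketch of one ingredient, text-based mood estimation. This is not any vendor’s actual system: the lexicon, function name and thresholds are invented for illustration, and production systems use trained models over face, movement and voice signals rather than word lists.

```python
# Toy illustration only: estimate a user's mood by scoring their message
# against a tiny hand-made valence lexicon. Real systems use trained
# multimodal models; this lexicon and its scores are hypothetical.
VALENCE = {
    "happy": 1.0, "great": 0.8, "fine": 0.3,
    "tired": -0.4, "sad": -0.8, "awful": -1.0, "lonely": -0.7,
}

def estimate_mood(message: str) -> str:
    """Return a coarse mood label from the average valence of known words."""
    words = [w.strip(".,!?").lower() for w in message.split()]
    scores = [VALENCE[w] for w in words if w in VALENCE]
    if not scores:
        return "unknown"
    avg = sum(scores) / len(scores)
    if avg > 0.25:
        return "positive"
    if avg < -0.25:
        return "negative"
    return "neutral"

print(estimate_mood("I feel sad and tired today"))   # -> negative
print(estimate_mood("Feeling great, really happy"))  # -> positive
```

Even this crude heuristic shows why such monitoring unsettles: the program returns a label about my inner state without understanding anything at all.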

How should we think of these ‘relationships’? Can they play a helpful role for those struggling with loneliness or mental health problems, or those merely wishing to have an honest and self-disclosing conversation? Or could synthetic relationships with AIs somehow interfere with the messy process of real human-to-human interaction?

Baudrillard’s third phase, in which the simulation masks the absence of a profound reality and “plays at being an appearance – it is of the order of sorcery,” may be seen in simulated relationships with ‘emotionally sensitive’ AI chatbots. The impression is of a human person who is empathic, caring and genuinely interested in my welfare.

But of course the chatbot doesn’t care about anything. Its ‘empathy’ is merely clever programming. The simulation is designed to mask the fact of its inauthenticity – of the absence of the profound reality of human compassion.

