
The Digital Human
Series 30
Synthetic
October 9, 2023
29 minutes
Available for over a year
With the rush of generative AI, we have the capacity to create synthetic companions that seem more human than ever before. They can talk in real time, and with enough input can be moulded into a perfect friend - sharing your interests, built with a custom personality that you enjoy, and always available for a brief chat, or to unleash some 3am anxiety upon, without burdening a real human friend.
They have the potential to provide some psychological benefit to people. But there are concerns. What if the company behind such an AI companion suddenly changed the terms of service? What if your carefully crafted Synthetic Companion wasn't themselves anymore, or stopped responding in a way that met the user's needs?
This happened in early 2023, when Replika, one of the biggest AI Companion apps, decided to ban all adult content without informing their users. The Big Change, as it came to be known, set the Replika community on fire, and showed how issues of control, expectations and the human propensity to project human attributes onto our machines can come back to bite us.
Yet we should have already known this. Tech developers trying to sell their shiny new product will tell you that it's never been seen before. But we've been using technology to create fake humans to interact with for more than a century.
In this episode, Aleks looks to some Synthetic Humans of the past, to understand why people bond so readily with them, and how, going forward into a future where we are likely to have AI Humans all around us, we can ensure that they serve our needs and do no harm to the end user.