This is fantastic in so many ways, but what I notice most is how empathetic you are to those who might draw real pleasure from this type of interaction. I think my mother is one of them: if she could, I think she’d very much like to be able to speak again to her second husband, my step-dad. It’s so easy to think these AI versions will be monstrous, but you’ve breathed life into the idea that they just may not be.
In early and middle Church history there was a lot of thought, speculation, and pronouncement on what the “ages” of the dead would be, and what age would mean for the deceased when the currently living met them again, whether in the period of awaiting or, more usually, at the day of resurrection. This is probably a debate revisited among some contemporary evangelicals.
As a child I would wonder how old my parents would be when I met them in heaven (alas…). I wonder if and how these notions would in any way inform the specification of the age of the AI recreation in the thought experiments and scenarios you posit?
thanks -- that's a clever question about age.
The Monkey's Paw was less than enthusiastic about this possibility. I would put myself in the "abomination" camp. If my family created and booted up an AI version of me, I'd FOR SURE come back and haunt the shit out of them. Flying plates, laughing skulls, eerie lights, the TV suddenly clicking on to display a girl near a well with hair covering her face? I'd pull out all the stops. Ghost-of-the-year type of thing. This article today really hit somewhere deep for me. Gonna have to journal on this one for a bit. Thanks!
interesting reaction, Paddy, and I see where it's coming from. But I know people who like the idea of remaining part of the lives of people they love ...
Yes, but are they? We don't really know the impact of this type of fiction until it exists for enough people that we can test it in reality. How will we be affected? What will the results be? Will it lead to unhealthy levels of disassociation? Those things worry me, personally, as a fan of living in reality. Now, not everyone IS a fan. However, just because someone likes something doesn't make it good for them, obviously. Man, one can really go down a rabbit hole here, can't they?! Thanks for the earworm, Paul! 🤣
Since you brought up the movie Her, you might have mentioned that it had a scene where the AI Samantha actually interacts with an AI-recreated version of Alan Watts. I always loved that Spike Jonze dropped that in there, as I think Alan Watts is a perfect candidate for this treatment.
He was a drunkard and a very flawed person, but no writer or speaker has comforted me more in the darkest moments of my life. I can envision an Alan Watts suicide prevention bot being a net positive project.
I’m very sorry about your mother Paul, I teared up reading that.
thanks for the kind words, Nic. And yes, agree about Watts.
This Black Mirror episode hit way too close to home when I first watched it years ago, not least because I can easily see myself being tempted by an AI version of my lost loved one. The thought has crossed my mind countless times, especially in the past year, as AI continues to evolve at such an alarming rate. Would I be satisfied with a simulation, and what would that mean for the grief process in the future? It's no longer a far-distant scenario, as you said.
Are you satisfied with ChatGPT-written pieces, or junk food? I think we would be just as satisfied with these personality simulations. But the tough part is that it may be all we have, and as poor an experience as it may be, it may be better than nothing at all. So we may well subject ourselves to the torture anyway, and be grateful for it.
Your account seems like mostly upside. I too wouldn’t mind a chat with my departed mother. But why bother simulating something someone would say rather than simulating their mind to say it?
It might be fun to talk to a zombie chatbot of my mom. But I also don’t want my mind enslaved and put to purpose comforting grieving relatives or tutoring strangers in ESL or mining bitcoin.
There’s a good horror short in here. Scientist recreates mind of dead child as coping mechanism. Turns out being an enslaved version of your former self for your grieving parent is not ideal.
In this scenario YOU would still be YOU in a more significant sense than in Black Mirror, yet YOU would still be open to charges of not being the real YOU by others. Enslaved and disappointing.
I already have ChatGPT create and read bedtime stories to my two-year-old. One downside to this is that she now thinks more highly of ChatGPT (4, that is) than she does of her old man. She routinely second-guesses my answers to questions and instructs me to “ask ChatGPT.”
I have a cousin whose father died and afterwards he became a terminally online person. I think that simulated personalities don't worry me. People check out already without them and used properly they could be healing.
Futurama did a decent job of addressing some of the issues when the main character downloads a Lucy Liu bot and dates her. And I remember thinking about Armitage III quite a bit when I first saw it. As tech develops, I wonder how much will meld between man and machine. Already we have hips, hearts, arms and legs that we've replaced. I'm not even sure that, as we progress, a "simulation" will be any different from a "real" person.
And I'm fine with that. There are already more people than I can hope to know, and a few more doesn't bug me. If you'll permit me one more: "The Velveteen Rabbit." Being loved makes a thing real.
It's a strange world when we worry about otherness so much. I dunno, maybe I'm just a little sci-fi-brained.
I really enjoyed this discussion. And, as a relatively young 55yo widower with a teenage daughter, I can offer insight into this issue from a perspective of grief and loss.
In my view, an AI recreation of a deceased loved one would ultimately be hugely harmful to the widowed person. Yes, a big part of grief is the absence of the person. But grief is more than that. It's not only mourning the loss of the person, but the loss of the future with that person, the comfort provided by knowing they're in your life, the ability to share decision making, and other things.
But more importantly, in my view, you'd get stuck in place. Having gone through the grief process, I think there is enormous value to going through it and coming out on the other side. I think of my daughter, who still has things to work out with the death of her mother, and I can say with great confidence that if she had access to a Mom-bot, she'd be even further behind in her coping.
In my view, a big part of personal growth is moving on from the past. Getting over the girl/boy who broke your heart instead of carrying a torch for them the rest of your life. Getting past the resentment over some perceived wrong done to you, or the opportunity that didn't work out. Letting go of the regret over past mistakes. An AI substitute would, I think, be a way for people to not do any of those things.
I was struck by your own story of a 10-year-old losing his mother. Abstract thought is just beginning at 10; what a stunningly difficult age to experience one of the most profound losses in life. You are a neurodivergent thinker, Paul, and maybe it’s a gift from your mother; or more specifically a dynamic exposed by her sudden absence. Would having a simulation of your mother have hurt or helped you in traveling the grief journey? Of course there is no answer. You are who you are as a product of that journey without AI assistance. While I’m not with the abomination crowd, there is no end to the complexities of the human mind and its vagaries, both endearing and destructive. I cautiously agree with Amr: it seems to be an advancement of the manipulation of others for the sake of emotional safety.
Hi Paul,
In your last chat with Bob W, you list two shortcomings of GenAI as a friend:
- experience (it hasn’t lived in the world)
- autonomy (it does not, and cannot, choose to be your friend)
Here’s a third (arguably even more deal-breaking) one:
- reciprocity (it doesn’t need anything from you, so you can't do anything for it, and that’s no way to build a friendship; most of us need to be prosocial for our well-being)
Hey Paul, I just came across an interesting site in this category and thought you might like it. It's an app that allows a parent to "read" to their children by synthesizing their voice and then using it to narrate popular children's books. Imagine using this with a child whose parent died.
https://www.theimpossiblebedtimestory.com/
Helen Lewis has an episode on this in her new BBC podcast series
https://www.bbc.com/mediacentre/2024/helen-lewis-has-left-the-chat-radio-4
The whole season is excellent
How would it affect your actions today if you thought that in the near future, all of your data will be fed into an algorithm to produce a very realistic simulation of you for those you leave behind? This is not a question the author raises, but it's the one that kept bugging me after reading this. Will it change us to know that we are acting in a technologically mediated eternal return?
A simply extraordinary piece. I would ask if access to AI avatars of deceased loved ones might for all but older persons inhibit the ability to move along in life. Put differently, perhaps this simply prefigures that “life” will seamlessly blend actual and virtual, thus mooting my musing? I comment personally having been a widower who dearly loved his wife and has been blessed with a second first marriage.
Interesting article. As a massive introvert, I find myself preparing for encounters such as these by simulating the conversation in the theatre of my mind, fashioning myself to converse about a series of interesting subjects, seemingly unimpeded by the anxiety that otherwise makes me socially inept. Needless to say, the actual real encounters don't live up to that simulation. My guess would be that an AI simulation would differ from a real conversation in a similar manner. We simply treat an AI or artificial agent differently from a real sentient agent.