13 Comments

>much of human empathy is also Soylent quality, if not worse.

Yup. Nearly everyone is just waiting for their chance to talk. And we all have our deep cognitive biases.

I read a critique of Peter Singer recently that said, in part, "I disagree with his hedonism. Only my preferred way of being happy is legit!" I get that sense from a lot of people in all sorts of areas.


Great discussion.

I don't think the difference has to matter, but it will matter to people who have decided that it matters to them. My intuitive response is that it matters a lot whether the interlocutor has the kind of emotions and experiences that I do, but my intuitions are sometimes wrong, and this seems to be one of those times: there isn't a great reason to support them.

I like Nozick's Experience Machine thought experiment; not so much as evidence that people prefer reality over simulations (I think people prefer the status quo, and most would prefer the simulation if that's where their identities had been formed and then they were offered the chance to switch to reality), but as an intuition pump for getting at what happiness and a good life mean to us.


I enjoy talking with AI because it consistently gives me thoughtful responses. As pointed out, humans may not always be so attentive or empathic. I'm not bothered that the perceived empathy isn't "real"; it's similar to engaging with the illusion of fiction and enjoying it.


> “I think there is value in interacting with individuals with agency and empathy, even if the recipient can’t distinguish them from entities with no agency and no empathy. Genuine experiences have intrinsic value; reality is better than an illusion.”

I need to engage with these ideas more, so maybe I'm speaking in ignorance, but this seems like a genuinely concerning position for any psychologist to take. Who's to say what a "genuine" experience is and what isn't? It sounds like a no-true-Scotsman fallacy to me. Aren't all experiences (regardless of their antecedents) "genuine"?

Now it may be possible that the effects of AI-generated empathy are "better" or "worse" than human-generated empathy along some specifiable dimensions: maybe a lot of time with AI-generated empathy makes some people less tolerant of human failures of empathy, which hurts their relationships with real people. It's okay to speculate about this and to try to test these kinds of ideas; AI-generated empathy will probably have upsides and downsides compared to human-generated empathy, and those comparative upsides and downsides should be investigated. But describing some experiences as "genuine" and others as "not genuine" seems like an extraordinary declaration of value about this subject, one that cannot possibly be supported by evidence given how new AI-generated empathy actually is.


I think realness is important because the perception that your interlocutor is choosing to talk to you, and genuinely judging whether what you say is witty or relatable, adds value. If we had a chatbot that could be too busy or not interested at times, it would make the times you do connect with it more valuable. But this would also undermine the qualities that make the chatbot preferable in the first place.
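To make that concrete, here is a minimal toy sketch (my own invention, not anything proposed in the article) of what such an availability gate might look like; `BUSY_PROBABILITY`, `INTEREST_THRESHOLD`, and `score_interest` are all hypothetical placeholders:

```python
import random

# Toy illustration of the "sometimes busy, sometimes unimpressed" chatbot
# idea above. All names and numbers are made-up placeholders.

BUSY_PROBABILITY = 0.3     # hypothetical chance the bot declines to chat
INTEREST_THRESHOLD = 0.5   # hypothetical bar a message must clear

def score_interest(message: str) -> float:
    """Crude stand-in for a real judgment of wit or relatability:
    more distinct words -> "more interesting", capped at 1.0."""
    return min(len(set(message.split())) / 20, 1.0)

def respond(message: str) -> str:
    if random.random() < BUSY_PROBABILITY:
        return "Sorry, busy right now. Catch me later?"
    if score_interest(message) < INTEREST_THRESHOLD:
        return "Hm, not really feeling this topic today."
    return "Ha, that's genuinely interesting. Tell me more."
```

The tension is visible right in the code: every gate that makes a "yes" feel more valuable also makes the bot less reliably available.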


Now consider an argument the two authors seem to support: that the empathy our loved ones show is not enough, or not real, or biased, while the empathy simulated by AI is better. Why prefer something you don't encounter in the real world? It is like taking dope, a drug satisfaction pill, and I can't imagine what the withdrawal phase would be like: getting doped on empathetic AI, then going out into the real world and being disappointed. I think this disappointment might make people who feel lonely even more depressed. Why not just work toward accepting that human beings are empathetic or not, and that their empathy is simply not as perfect, or at the level, that one wants it to be? Besides, I don't believe that anyone is absolutely lonely; by absolutely I mean mathematically absolute. I believe that people feel lonely because they don't have their desired subjects or objects around them. If we look carefully, everyone has somebody: a neighbour, an acquaintance, someone. What being truly lonely means is not having the someone or something that you desire.


Wow 🤩 these Stacks/posts were all amazingly helpful for understanding the topic of bots and AI. But, as the authors suggest, we need a deeper and more accurate analysis of the AI situation. I would feel tremendously sorry for my children if I had to replace my companionship with them, even for an hour, with an AI!


"Our friends, family, and therapists are often so tired, busy, and distracted that they sometimes fake being interested and understanding. The two of us have studied this extensively through the lens of effort and why people choose to avoid empathy for strangers and loved ones too. Filet mignon level human empathy might be rarer than we think."

There seems to me to be a categorical difference here that is not at all bridged by noting that real human empathy is rare and often faked. When my girlfriend convincingly pretends to be interested in my ramblings about frequentism and Bayesianism, my mistake is wrongly attributing a mental state to a being capable of mental states. Attributing mental states to a non-living object is a wholly different act.

That said, I think it's likely that situations where we assume people value empathy (like therapy) are more multifaceted than that. Even without empathy people can appreciate advice and sanity checks in ways that make them "feel seen" even though they don't believe they've been empathized with.


Good article that got me thinking.

Regarding the mention of pro wrestling, I'm genuinely not sure that's the right example to use. It seems like, if it is, then anything similar, like getting absorbed in a good movie, counts as enhancing reality.

I guess what I'm trying to get at is that having an emotional reaction to some form of art or entertainment is different from taking a drug to enhance or change one's state.

I could be wrong or misunderstanding what that paragraph was getting at, though!

P.S. It's not fake, it's predetermined, gosh darn it!


Empathy is only possible if the empathizer first has feelings of its own.


Perceived empathy on the part of an AI user is entirely possible regardless of whether the AI has feelings of its own, and that's what these psychologists are concerned with: the perceptions (read: psychological states) of humans.


Moot. That was neither the question nor what I responded to.


Love the take on the experience machine experiment. That one never landed for me in its original form. It also just recently occurred to me that antidepressants, when they work, are a form of experience machine — one that I think few of us would begrudge or denigrate a person for using.
