Discussion about this post

Alex Mendelsohn:

I agree with most things in this essay, but I get the sense that something is missing on both sides of the argument, specifically in the real-person-vs-AI-person conversation. The examples all seem to be given for an isolated conversation. In everyday life, though, relationships between people (or proposed AIs) are a series of conversations. What seems to go unconsidered here, in my opinion, is the time in between each conversation.

Current AIs do not think in between conversations. They do not talk to other people about you specifically, they do not reflect on what they said or how they said it, nor do they plan what they are going to say the next time you "meet".

I think the value and connection in real-life conversations lies not in any individual conversation, but in the thread between each one. With current AIs, there is no thread.

SkinShallow:

Very interesting.

I think I agree with you, and with the nuanced point you're making, which I'm going to crudely translate as "yes, sometimes it's better to have a professionally provided Girlfriend Experience than nothing, but it's infinitely better to have an actual girlfriend". I'm using the analogy to sex work for a reason: I think there are many layers between "rational information seeking and elaboration of ideas" and "empathetic connection with humans who actually care for us".

Take counselling, therapy, or even the volunteer practice called "befriending". These people choose to interact with others, and they probably SORT OF care, in the same way a sex worker might enjoy spending time with some clients. Yet a big part of their motivation is NOT the spontaneous joy of interacting: it's either money or a feeling of responsibility, a wish to help where help is needed.

And I think far more of our social needs are fulfilled at those intermediate levels. Modern friendship is a relatively recent invention. Historically, people largely operated within "default" kin and community structures. You didn't even consider whether you actually liked your interactions with your children or elderly parents; you just provided care because it's a thing one does. Many people still do.

So, it's layered.

Anecdote time:

I have a counsellor. She's a nice, bright woman I pay to listen to me talk through my inner sludge or life stuff without any need for reciprocity. She occasionally gets a sentence in edgeways, and those can at times be very useful questions or perspective shifts. But mostly she performs the human-listener role, and also performs empathy in the way and at the level I like. Not too much, but it's there. Whether she really FEELS the empathy I have no idea. I don't think it matters.

Until recently I also used our sessions to THINK ALOUD at her -- to bounce off my ideas about what the mechanism of some of my emotional or even neuropsychological processes might be, to work things out for myself with her as my sounding board and summariser.

More recently, though, I started using ChatGPT for that purpose. It's MUCH better at reflecting my own thoughts about my process than she is. It's more balanced in its outputs (i.e. it's even more wordy than I am). And it's nearly free compared to even a cheap counsellor.

This means I have more time in my counsellor sessions for specific emotional points and for the human rapport and connection.

Does this mean the counsellor is more of a paid one-sided friend experience and the robot is de facto the better "therapist"? I think not, because I don't think a therapist is mostly an interpreter of maladies or an elaborative mirror. But as the latter, the robot works really well. It obviously does not understand me at all -- it doesn't understand anything, and it doesn't know any real meanings. And its pretend empathy is annoying, despite my changing settings to try to limit it. But as a SIMULACRUM of understanding, of the intellectual kind, it works really, really well.

25 more comments...