0.
Mark Zuckerberg, on the Dwarkesh Podcast this week, when asked about people starting to use AI for friends, romantic partners, therapists et cetera:
Here’s one stat from working on social media for a long time that I always think is crazy. The average American has fewer than three friends, fewer than three people they would consider friends. And the average person has demand for meaningfully more. I think it's something like 15 friends or something. At some point you're like, "All right, I'm just too busy, I can't deal with more people."
But the average person wants more connection than they have. There's a lot of concern people raise like, "Is this going to replace real-world, physical, in-person connections?" And my default is that the answer to that is probably not. There are all these things that are better about physical connections when you can have them. But the reality is that people just don't have as much connection as they want. They feel more alone a lot of the time than they would like.
So I think a lot of these things — things that today might have a little bit of stigma around them — over time, we'll find the vocabulary as a society to articulate why they are valuable, why the people who are doing them are rational for doing it, and how it is actually adding value to their lives.
A quick stop before we get into it: are the numbers real?
Yes, sort of, despite many critics dismissing them for no reason.
To be fair, he doesn’t cite his sources. But there’s a good chance the ‘fewer than three friends’ average comes from this Pew Research report. That’s a gold-standard sample, with a median of 4 and a mode of 3, and it’s pretty much identical to this Gallup poll. So I will allow him that one (unless you want to cast misremembering by one integer as bad faith).
On the other hand, I found no direct source for the ‘demand for 15 friends’ number, unless I am blind and it is in the research I’ve already linked. It could also be internal Meta data, but he doesn’t say. What I think could have happened is that some internal deck on their social media strategy drew on Robin Dunbar’s research. ‘Dunbar’s Number’ is the roughly 150 relationships you can maintain within the bounds of your memory and social energy. But within this, Dunbar identifies a narrower group that takes up about 67% of your social interactions. These are your best friends and your not-quite-best-but-still-close friends, which he calls your Sympathy Group, and his typical number for that group is 15. So we can treat this as an anthropological benchmark for social bandwidth, based on models of capacity which don’t necessarily measure the desire to have those relationships.
(this is obviously ‘all models are wrong, but some are useful’ territory; accept with caution)
So overall, Zuckerberg’s statement probably isn’t crazy.
What is crazy is what he thinks it means.
I.
I’ve been trying to come up with a good metaphor for AI friendships. I don’t think people are thinking about them in the right way.
The usual entry point may be AI therapists, which you can expect to end up inside public health services, but which will also just be a Google search away on your own terms anyway. The question isn’t really whether an AI can fully grasp any given modern therapeutic technique, because the exact type of therapy humans offer other humans essentially doesn’t matter. Modern therapies perform identically to practices now considered outdated, like being straight-up Freudian. This is the Dodo Bird Verdict, which shows that therapy is indeed very effective, but that it is simply the act of talking to someone about your issues that makes the difference. Which makes it no surprise that letting psychology students give therapy, rather than fully trained professionals, makes almost no difference to patient outcomes.
But these are still (you would expect) charismatic, warm, caring people. Maybe that’s what creates the effect. The question becomes: can an AI chatbot really do the same job?
Turns out: yes, probably. (no, your hopeful essay about the magic of human connection won’t change that)
The jump from this to general AI connections seems short to me.
II.
Zuckerberg says this:
There's a lot of concern people raise like, "Is this going to replace real-world, physical, in-person connections?" And my default is that the answer to that is probably not.
My answer is yes, but not for the reason he might think.
To explain, let’s start with the next level of discourse: AI romantic partners. I see ex-Google CEO Eric Schmidt is on the case, warning that Gen Z men could ditch real women for perfect AI girlfriends.
These young me- Wait, wait, wait. Why do conversations like this always seem to centre around lonely young nerdy men? We realise women get lonely too, right?
We can indeed admit that, just as long as we’re talking about someone else’s culture. From Forbes:
Alicia Wang, a 32-year-old editor at a Shanghai-based newspaper, has found the ideal boyfriend: Li Shen, a 27-year-old surgeon who goes by the English nickname Zayne. Tall and handsome, Zayne responds quickly to text messages, answers the phone promptly and listens patiently to Wang recounting the highs and lows of her day.
Zayne’s only drawback: he doesn’t exist outside a silicon chip.
Wang is one of an estimated six million monthly active players of the popular dating simulation game Love and Deepspace. Launched in January 2024 and developed by Shanghai-based Paper Games, it uses AI and voice recognition to make its five male characters—the love interests or boyfriends—flirt with tailored responses to in-game phone calls. Available in Chinese, English, Japanese and Korean, the mobile title has become so popular that, according to Forbes estimates, Paper Games’s 37-year-old founder Yao Runhao now has a fortune of $1.3 billion based on his majority stake in the company.
But whether we’re talking about lonely boys or lonely girls, the tone of AI romance coverage always tends to be some version of: ‘damn these sad lonely young losers think this thing on their screen loves them and now they’ll never feel real love’.
This ignores the most obvious thing in the world: the user knows they are talking to an AI. They do not at any level think they have a girlfriend/boyfriend. Absolutely, it gives them a stimulating feeling. But maybe just the opportunity to talk about the highs and lows of your life is all one needs from a partner. Maybe it’s the Dodo Verdict again.
Zuckerberg doesn’t think AI relationships will replace real relationships. In a sense he’s right, because they do different jobs.
AI relationships aren’t relationships. They’re closer to porn. Which is a pretty useful way to think about this.
First of all, porn addiction is not sex addiction, just like junk food addicts aren’t ‘foodies’. They’re obsessives. And this happens to be the easiest thing to obsess over that will take their mind off whatever the actual problem is.
But claiming porn addicts aren’t interested in sex at all seems like a strange thing to say. The speed with which culture can shift in this respect is pretty fascinating. Your grandparents lived in a world where people pretty much pretended that sex didn’t exist, but by the time you were born we were all obsessed with it. Nowadays, boring vanilla sex could be considered a fetish of its own.
But this is where freedom becomes terror once again, as the free ability to engage in the act shifts the onus onto you. When you have strict norms, the list of things you did and didn’t do can be blamed on society. But when the question of sex becomes not if, but how and with who, suddenly your self-esteem becomes front and centre. Cue dysfunction.
Giving an individual infinite possibility inevitably comes with increased inner conflict, which tends to go along with alienation, loneliness and depersonalisation. It becomes a problem to solve. This is how sex ends up being an act without desire, and sometimes barely any curiosity; its purpose is closer to avoiding those very things. Hence, an obsession.
This is not to say that this is a good situation you should accept, and it is definitely not to say that porn will help, but it does make the outcome seem more inevitable. It also explains why porn addiction only becomes possible when sex addiction becomes impossible. And why AI connection is only sought out when true connection was impossible anyway.
People who are truly addicted to something are incredibly resourceful at getting it; they don’t need a substitute. It might be crass to say this, although I guarantee you’ve wondered it yourself: how do heroin addicts who have lost everything still manage to pay for heroin often enough to stay addicted? Think about the diet version. Drop someone with an emerging coke problem into a Brooklyn house party with 50 people they don't know and I guarantee within 45 minutes they’ve used every social skill in the book to get a new friend’s $20 bill up their nose. Even if, and this is the insightful part, those are skills they do not manage to show in any other aspect of their life.
So if someone is willing to substitute something they supposedly want for a simulated version, you have to question whether they really wanted the original thing at all. Better yet, ask what the simulation is allowing them to avoid in the real world.
III.
The reason people aren’t hanging out is because they don’t want to. The reason they’re not having sex is because they don’t want to. The reason they don’t have 15 close friends is because they don’t want to. Despite whatever their cognitive capacity for that may be.
People don’t want more connection, they want less. They want things that remove connection from the list of problems they have to solve.
Ask yourself this, it’s OK, there’s no one watching: are you averse to not having many friends, or are you averse to feeling like a loser? Or this: do you want to invite people to things, or do you want to be invited to things?
Which, I suppose, is a challenge for companies building AI friends: you can’t offer that kind of social capital. You certainly can’t offer scarcity. Where’s the value?
The value is in the comfort. The value is in not allowing the terror of freedom to invade your mind any more than it already does.
Here’s that Zuck quote again: “There's a lot of concern people raise like, ‘Is this going to replace real-world, physical, in-person connections?’ And my default is that the answer to that is probably not.”
Do I need to remind you what the name of this blog is?