Do you trust Mark Zuckerberg to solve your loneliness with an ‘AI friend’? No, me neither
He’d like to persuade us that chatting to a bot is like having a real pal. Of course that’s nonsense. Still, it’s more money for him
Mark Zuckerberg has gone on a promotional tour to talk up the potential of AI in human relationships. I know; listening to Zuck on friendship is a bit like taking business advice from Bernie Madoff or lessons in sportsmanship from Tonya Harding. But at recent tech conferences and on podcasts, Zuck has been saying he has seen the future and it’s one in which the world’s “loneliness epidemic” is alleviated by people finding friendship with “a system that knows them well and that kind of understands them in the way that their feed algorithms do”. In essence, we’ll be friends with AI, instead of people. The missing air quotes around “knows” and “understands” are a distinction we can assume Zuck neither knows nor understands.
This push by the 41-year-old tech leader would be less startling if it weren’t for the fact that semi-regularly online now you can find people writing about their relationships with their AI therapist or chatbot and insisting that if it’s real to them, then it’s real, period. The chatbot is, they will argue, “actively” listening to them. On a podcast with Dwarkesh Patel last month Zuck envisaged a near-future in which “you’ll be scrolling through your feed, and there will be content that maybe looks like a Reel to start, but you can talk to it, or interact with it and it talks back”. The average American, he said, has fewer than three friends but needs more. Hey presto, a ready solution.
The problem, obviously, isn’t that chatting to a bot gives the illusion of intimacy, but that, in Zuckerberg’s universe, it is indistinguishable from real intimacy, an equivalent and equally meaningful version of human-to-human contact. If that makes no sense, suggests Zuck, then either the meaning of words has to change or we have to come up with new words: “Over time,” says Zuckerberg, as more and more people turn to AI friends, “we’ll find the vocabulary as a society to be able to articulate why that is valuable”.
My hunch is that this vocab Zuckerberg is hoping to evolve won’t be the English equivalent of one of those compound German words with the power to articulate, in a single term, the “value” of a chatbot as “something that might look superficially like intimacy and might even satisfy the intimacy requirements of someone who neither understands nor values human interaction, but is in fact as lacking in the single requirement for the definition of ‘intimacy’ to stand – consciousness – as a blow-up doll from the 1970s”.
Instead, what Zuck seems to mean is that we’ll just relax the existing meanings of words such as “human”, “understanding”, “knowing” and “relationship” to encompass the AI product he happens to be selling. After all, this is just an extension of the argument he made in 2006 when he first sold us on Facebook: namely, that online or computerised interaction is as good as if not better than the real thing.
The sheer wrongness of this argument is so stark that it puts anyone who gives it more than a moment’s thought in the weird position of having to define units of reality as basic as “person”. To extend Zuckerberg’s logic: a book can make you feel less alone and that feeling can be real. Which doesn’t mean that your relationship with the author is genuine, intimate or reciprocated in anything like the way a relationship with your friends is.
Must we list the ways? Given Zuckerberg’s easy rejection of basic norms, I guess we must. Human friends are conscious and responsive in unpredictable ways that increase our own sense of self in relation to them. More practically, human friends can introduce us to other humans, one of whom we might date, marry, be offered a job by, or add to our store of existing friends who nourish us and make us laugh. Perhaps mercifully, AI friends can’t make us go camping or force us to organise their hen night. But that is because our relationship with an AI friend is not a relationship at all, and when we talk to them, we’re alone in the room.
A worse issue than fraudulence, apart from the horrible possibility that already lonely people – particularly young men – will be sold this kind of “intimacy” as an answer to their problems, discouraging them from seeking out other people, is that any interaction with AI is by necessity commercial in nature. Perhaps that’s simply where we are now. If you want real, searing, soul-level engagement then find someone who looks at you the way an AI chatbot looks at your data.
Emma Brockes is a Guardian columnist