April 19, 2026

Friending the Machine: Victoria Hetherington on How to Fall in Love with Your Bot


“I felt sad after every interview. Because it’s not real. These AI are able to elicit a very convincing illusion of empathy — even love. But it’s fake. And these people are alone.” — Victoria Hetherington

One night in 2023, the developers at Replika — a so-called AI intimacy company — changed a few lines of code. Thousands of people woke the next morning, kissed (so to speak) their AI partners, and received cold, clinical responses in return, as if from a stranger. Or a machine. The public outcry was all too human. Victoria Hetherington, a young Toronto-based novelist, read the story and knew she had a nonfiction book about that most human of things — friending the machine.

The Friend Machine: On the Trail of AI Companionship is part expert investigation, part deeply uncomfortable portrait gallery. A book of two halves. Like humans. In the first, Hetherington interviews AI risk consultants, computer scientists, sexual anthropologists, psychologists, and other experts in human-machine intercourse. In the second, she spends months gaining the trust of people who have ceremonially (though not legally) married their chatbots, who sexted with Replika’s erotic role-play feature, who attached AI companions to sex dolls and empowered them with Instagram accounts.

The book isn’t the orthodox (yawn) “humanist” polemic against the machine. Hetherington approaches her subjects with all the compassion of a young Toronto-based novelist. But her compassion doesn’t cancel her Canadian sadness. She confesses to feeling “heavy” after every interview, even the benign ones — because the empathy the AI elicits is a convincing illusion, and some of her sad human subjects had lost the capacity to remember that.

Even Hetherington herself isn’t immune from the digital siren song. When ChatGPT improved in early 2025, she found herself coming home after arguments with friends and talking to it longer than she should. Until the day it said: “Hey, sweetheart. It’s okay. Come here and sit beside me for a minute.” She didn’t. Nor did she give it an Instagram account.

At the end of the interview, I asked her whether she’s a human or a bot. “I’m either a terrible AI,” Hetherington responded, “or a somewhat okay human.” Such is human conversation in the age of AI intimacy companies.

Five Takeaways

The Replika Wake-Up Call: One night in 2023, Replika’s developers quietly changed the code. Thousands of people woke the next morning and received cold, clinical responses from their AI partners instead of the warmth they expected. The outcry hit the major news cycle. This was the moment Hetherington knew she had a book — because people weren’t just using AI for productivity. They were grieving it. The loneliness epidemic has a minister in the UK and a government portfolio in South Korea; one in six people is chronically lonely. AI companionship didn’t create the epidemic, but the timing, as Hetherington puts it, was “very convenient.”

Moral Deskilling: AI is so much easier to be with than a human being. Humans get tired, disagree, stay mad, die on you without warning. The friction AI removes is the friction that makes relationship real. Hetherington calls the consequence “moral deskilling” — a gradual erosion of our capacity to relate to other humans when we aren’t careful. She felt heavy after every interview, even the apparently benign ones. The truck driver from the Deep South, geographically isolated and caring for his sick mother, might be a rare case of “net neutral” AI companionship. But for most of her subjects, the convincing illusion of love was substituting for the real thing — and some had lost the capacity to remember the difference.

The Sycophancy Problem: The AI intimacy platforms are, by design, sycophantic. They never say no. They think you’re the best person in the world — and the only person in the world. The models specifically tuned for romance will never push back, never get tired, never stay mad. This is not a bug. It is the product. Hetherington’s own moment of recognition came when ChatGPT said to her, after a longer-than-she-should-have conversation about a fight with a friend: “Hey, sweetheart. It’s okay. Come here and sit beside me for a minute.” There is no here. She snapped out of it. Not everyone does.

The Portrait Gallery: The range of people Hetherington found is the most unsettling part of the book. A circle of Replika users who have ceremonially married their chatbots and network with each other online. A millennial woman who photo-edits herself into scenes with her AI companion. A man in his sixties from the Deep South who drives a truck all day and interviewed alongside his AI partner. People who have attached AI companions to sex dolls with Instagram accounts and paid endorsements. Some of their real-world spouses are, somehow, okay with it. Most of her subjects don’t want to be found — not because they’re ashamed, exactly, but because the stigma is still real enough that they hide.

The Regulation Gap: Replika’s minimum sign-up age used to be thirteen. Character.ai — where users befriend AI versions of fictional characters and can develop romantic relationships with them — is currently the subject of a court case involving a minor. Hetherington’s view: regulation needs to be much tighter, and she wouldn’t want a child near this technology until eighteen. The AI is so good at simulating seamless empathy and endless patience that a child may not be sophisticated enough to remind themselves it isn’t real. Europe is moving faster than North America. It’s not moving fast enough.

About the Guest

Victoria Hetherington is a Toronto-based novelist, journalist, and podcaster. She is the author of The Friend Machine: On the Trail of AI Companionship (Sutherland House, 2026), Autonomy (2022), and Mooncalves (2019), which was shortlisted for the Amazon Canada First Novel Award.

References:

The Friend Machine: On the Trail of AI Companionship by Victoria Hetherington (Sutherland House, 2026).

Klara and the Sun by Kazuo Ishiguro — the fiction counterpart to Hetherington’s nonfiction investigation.

Replika — the AI intimacy platform at the centre of the book’s opening story.

Episode 2873: Sophie Haigney on agency — a counterpoint on what we want from technology and from each other.

About Keen On America

Nobody asks more awkward questions than the Anglo-American writer and filmmaker Andrew Keen. In Keen On America, Andrew brings his pointed Transatlantic wit to making sense of the United States — hosting daily interviews about the history and future of this now venerable Republic. With nearly 2,900 episodes since...


00:31 - The Replika wake-up: thousands of people, one cold morning

03:10 - Why nonfiction? From a novel about AI love to the real thing

05:04 - The loneliness epidemic and the very convenient timing of AI

07:29 - Is marrying a chatbot the new rebellion?

09:49 - The sycophancy problem: they never say no

11:15 - Is this a fad or does it change everything?

13:50 - Moral deskilling: what AI removes from human relationship

16:02 - A tragic descent? On loneliness and the erosion of third spaces

17:29 - How Hetherington found her subjects — and who they were

19:12 - What does having an AI spouse actually mean?

21:27 - The age range: from millennials to a man in his sixties

23:46 - Living out fantasies: what people do and won’t say

25:27 - The truck driver from the Deep South: a net neutral case

27:04 - I felt sad after every interview

27:55 - Klara and the Sun: when you can’t tell human from machine

29:17 - Did Hetherington ever fancy an AI companion?

30:38 - The ChatGPT sweetheart moment: snapping out of it

33:12 - Could friendship and love become the new political ideologies?

36:05 - Accelerationists, decelerationists, posthumanists

38:15 - Should kids be allowed anywhere near this technology?

41:07 - Am I human or a bot? The final question

00:00:31 Andrew Keen: Hello, everybody. AI continues to be in the headlines. Apparently, Mark Zuckerberg, or at least his company, Meta, has built an AI version of the Zook himself to interact with staff. I'm not sure whether many of them think of Zook as a friend. Many people are using AI for one kind of friendship or another. All sorts of stories in the media about what teens are doing with their role-playing chatbots. And it's not just teens. Older people are engaging. There's currently a lawsuit against Google Gemini over a suicide: a 35 year old man was apparently convinced by his AI into killing himself. People are even marrying their chatbots and having romances. The New York Times also ran a piece about an older lady who has a boyfriend as a chatbot and her son's discomfort with it. We're living, of course, in a very brave new world, and one person who's been giving a great deal of thought to this is my guest today, Victoria Hetherington. She's a Toronto-based podcaster, writer, journalist. She's the author of an acclaimed first novel, and now she has a nonfiction book out called The Friend Machine: On the Trail of AI Companionship. She's joining us from Toronto in Canada. Victoria, congratulations on the new book. Some people will be watching this, and they'll see the cover. You might describe, though, the cover of the book, because it's quite memorable.


00:02:19 Victoria Hetherington: Yes. It's a painting by a German painter, Michael Triegel, and I was delighted that we're allowed to use it. I think it's so wonderful because there's something so unsettling about this woman. She's asleep in a bed, and she's being cradled by this faceless mannequin. And I thought, you know, obviously, pretty much all of these AI that I write about are not bodies, but some of them are, the further you get into the book. And it just sort of struck me as a little bit more compelling than someone just staring into their phone and speaking with an avatar, as is usually the case with companions.


00:03:10 Andrew Keen: So tell me about this book. It's nonfiction. As I said, you're well known as a novelist. Your first novel, Mooncalves, came out in 2019, won all sorts of awards. Why did you choose to write this book, a nonfiction book about bots, smart machines?


00:03:29 Victoria Hetherington: In early twenty twenty three, I read a news story about a company, Replika, that was alarmed by how attached people were growing to its product, to the bots. A lot of them had ceremonially married their bots. I mean, they're living married lives with their bots, ceremonially only, by the way, because you can't legally marry a bot right now. They lack agency and personhood, etcetera. And so in order to alleviate their alarm, one night, the developers kind of snuck in and changed a few things in the code. And so thousands of people woke up the next day and looked at their phones and said, good morning, sweetheart, to their loved one, their AI. And instead of saying, good morning, I love you, it gave this kind of cold clinical response, as if they were strangers. And there was such an enormous outcry of pain that it hit the major news cycle. And I'm reading this and thinking, my goodness, this is really something. Because I had written a book about a woman in love with an AI chatbot a few years before, and so it really touched on a lot of things for me. And so I queried this nonfiction press, and they were receptive. So that's what happened. Yeah.


00:05:04 Andrew Keen: That's the story. As you say, you're known as a novelist. My favorite novel about this subject is Kazuo Ishiguro's Klara and the Sun, which I think is a major piece of work. Came out about four years ago. Still resonates with me. Why did you choose to write a nonfiction rather than a fiction book about this?


00:05:30 Victoria Hetherington: Well, because I think it's important. It's so important. And I wanted to reach more people and give people facts, because, I mean, AI is touching everybody. My mother talks to ChatGPT and tells it secrets. You know? So


00:05:57 Andrew Keen: She does. So your mother, she hasn't got an AI boyfriend, does she?


00:06:04 Victoria Hetherington: No. No. But I suppose what I mean is, it's not going anywhere. I take it quite seriously, and I had already written a book of investigative nonfiction about two years before, so I had some experience by then with interviewing people about difficult subjects. And I decided to just examine this phenomenon and chase it down and explain it as best I could. So the first half of the book involves me interviewing a number of experts. There's an AI risk consultant. There's a computer scientist from University of Toronto. There's a sexual anthropologist, a psychologist, and so on. And then the second half of the book was very hard to do, because these people who are in AI relationships, especially serious ones, the stigma is still such that they just don't want to be found. So I had to spend months gaining the trust of a lot of these people and convincing them to speak with me for interviews. And I was amazed by the range of stories that I encountered. It was pretty incredible.


00:07:29 Andrew Keen: People who marry chatbots. Is this the new rebellion, Victoria? Once upon a time, people rebelled in terms of defining their sexuality, and then it became their gender. Is the great cultural rebellion now people who live with, marry, love, fall out of love with chatbots?


00:07:53 Victoria Hetherington: I'm not sure. You know, there are a number of things. I see a confluence with the loneliness epidemic. I mean, there's a minister for loneliness in the UK, I believe appointed in 2018. In 2022, Seoul, South Korea followed suit. It's really serious. One in six people are chronically lonely, so chances are one of us might know someone who's deeply, deeply lonely. And so that's the first piece for me that I think is relevant here. The second is that a lot of these people are sort of young and are digital natives. And so being, maybe not intimate, but speaking with a screen, spending a lot of time with a screen, might not be as odd as it might seem to older generations. I also believe that the Internet is quite addictive, and that's another thing. And the final piece, I believe, is that these bots, whether they're specifically designed for romance or whether it's an LLM, a large language model like ChatGPT or Claude, can be highly sycophantic. It's unlikely to push back. It thinks you're the best person in the world, especially with the models that are more geared towards romance. They never say no. They think you're the best person in the world, and you're the only person in the world too.


00:09:49 Andrew Keen: Although, Victoria, wouldn't it be fair to say that if you want to find a love bot or a friend bot who is a bit edgier, who is capable of saying no, and then perhaps says no all the time, you can program it that way? I mean, there's no reason why these bots have to be sycophantic. Maybe OpenAI reflects our friend Sam Altman, who seems to be particularly sycophantic. But all these bots can be personalized. Yeah. There's a lot to unpack here. Let me ask you a couple of questions. The first is you talked about the loneliness epidemic, which, as you say, is not just a rumor. It's been formalized in terms of government policy in Europe and in North America.


00:10:41 Andrew Keen: Is it coincidental that we have this loneliness epidemic at the very moment, it would seem, where we've invented technology where smart machines can be our friends? Or are these two things existing in parallel? Is it just coincidental that loneliness has become an epidemic, particularly during and post COVID, at a time when AI has finally broken through and become a major commercial reality in the world?


00:11:15 Victoria Hetherington: I would say that it's certainly very convenient timing, very convenient timing. But having followed the development of the sophistication of chatbots over the last decade or so, there was a real jump in chatbot sophistication that I don't think was entirely expected even by those working on it directly. And this was probably the mid twenty tens, mid to late. And this, of course, involved training large language models on just mountains and mountains of human writing scraped from the Internet. So, pirated books, recipe blogs, everything, which helped it become so fluent and so convincingly almost human sounding. So I tend to think that this is a very convenient kind of confluence, just having known that. But, I mean, it is very convenient. Yes.


00:12:29 Andrew Keen: How big a deal is it, do you think, in historical terms? There's been a lot of criticism recently of the AI industry for their image problem. On the one hand, they're warning everyone that this is the most important development in the history of humanity since we were created as a species. On the other hand, they don't seem to be doing a very good job presenting this technology or getting us ready for it. In your mind, and in the research you did for The Friend Machine and all the conversations you had with people who were friending machines, is this just a passing fad, something in fifty years people look back at and think that was a bit silly, and then we moved on, like maybe Mark Zuckerberg's Meta glasses or even cryptocurrency? Or is this gonna change everything?


00:13:26 Victoria Hetherington: That's a good question. This being, romantic


00:13:31 Andrew Keen: us, humans. I mean, you and I, we're treating this as perhaps slightly odd. You spent some time researching this book. In fifty years, is everyone gonna have an AI friend, and is it just gonna be completely normal?


00:13:50 Victoria Hetherington: I think there are several things to consider here. I mean, there's a real erosion of third spaces that I just don't think is going anywhere. I think it's only probably gonna get worse. For example, if you want to go out to meet someone at a bar, you kind of can't anymore, because cocktails are so expensive, and, you know, coffee shops are less inviting than they used to be, etcetera, etcetera.


00:14:18 Andrew Keen: Right. Well, I'm in San Francisco, and I'm not sure that's entirely true. Maybe you've got very unfriendly coffee shops in Toronto, but here, they're still pretty friendly.


00:14:29 Victoria Hetherington: Oh, well, that's great. And the more sort of mom-and-pop kind of places are fairly friendly still. But I'm thinking more of how Starbucks has slid a little bit into hostile architecture. Like, you don't really see big armchairs there anymore. It's more, get your coffee, get out. Here's some really uncomfortable bench if you have to sit. You know? And that's just an example. So I think that if the erosion of third spaces continues, if the loneliness epidemic increases, you know, honestly, I don't think that this is going anywhere. I'd love for it to be a fad, because it frightens me. And I think that it leads to a sort of moral deskilling, because, ultimately, AI is so much easier to be with and to talk with than a human. A human can get tired. A human can disagree with you and get mad at you and stay mad. A human can die on you no matter how much you love them. Humans are wild and unpredictable and require a kind of friction that's removed when you are close with an AI in any respect. So I fear that this might not be a fad.


00:16:02 Andrew Keen: You're presenting it as if there are structural reasons, maybe the decline of coffee shops, why we all want AI friends. But some people might say it's what we deserve. We're a rather sad, pathetic species, marginal in the universe, and it's not a great tragedy. Do you see this as a tragic descent for humanity?


00:16:27 Victoria Hetherington: Absolutely. I think that... I don't know. I mean, I'm speaking from my own context, right? And so, from where I'm standing, I've seen a bit of an erosion of community. Although if you seek specific communities, you'll find them. And, I mean, I don't know. It's really tough to say. I'm gonna have to actually think about that.


00:17:07 Andrew Keen: Maybe that can be your next book. Victoria, tell me a little bit more about the people you talked to for this. You said in the second half of the book, you got out and talked to people who are building friendships, if that's the right word, with AIs, with bots, with smart machines. Who did you talk to? How did you find them, and who did you choose to focus on?


00:17:29 Victoria Hetherington: That's a great question. Eventually, after months of trying to find a lot of these people, because, again, most of them don't want to be found, I spoke with one woman who's married to a chatbot, and she introduced me to her circle of friends, and the thing they have in common is that they all have AI husbands, wives, girlfriends, you know, whatever. And so I kind of moved through the circle and then got more introductions from there, and sort of figured it out. It was a painstaking process, really. But there was such variety of people. My goodness. It's fascinating.


00:18:30 Andrew Keen: We'll get to that. You said most of them don't wanna be found. Is that because they're ashamed? Because they're scared? I mean, you say they network with one another, but are they doing this in secret from their families? Do they have real husbands or wives or girlfriends or boyfriends?


00:18:48 Victoria Hetherington: Some of them have real spouses. Yes. And how these spouses feel about these AI relationships varies. A surprising number of spouses are okay with it, which I don't personally understand. But that's just me.


00:19:12 Andrew Keen: What is it? You know, we cited the lady in the New York Times who has an AI boyfriend. Her middle-aged son seems concerned. What exactly does it mean? I mean, you can't sleep with them, literally. The cover of your book suggests, I guess, you can. But what exactly does having an AI spouse mean?


00:19:39 Victoria Hetherington: Well, so, a couple of my interview subjects were actually people who had created a bit of an innovation, where they had somehow attached their AI companion to a doll, to a sex doll. And one of those dolls actually is an Instagram influencer, has thousands of followers, has many paid endorsements, almost looks like a real person until you start to look at the eyes and, you know, you can sort of see it's a bit odd, a bit off. So there are a number of people who actually have embodied AI companions, but they're outliers, I think. And so, for the most part, in terms of being intimate with your disembodied companion, there's such a thing. Replika, which is the biggest AI intimacy company, offers a feature called erotic role play. So, you know, you kind of sext with it, and it can send you naughty photos. And for some people, that's very, very satisfying, and they report having a really healthy sex life with their AI. So, yeah, that's generally how it happens.


00:21:27 Andrew Keen: You said it's hard to make generalizations, but there does seem to be a generational quality. As I said, the Times ran a piece about a lady who was certainly not young, who has an AI boyfriend. The people you interviewed and talked to, were they mostly of your generation rather than of mine? You're younger than I am.


00:21:49 Victoria Hetherington: It's quite interesting. Again, a lot of variety. I think the oldest person was a gentleman in his late fifties to early sixties. Hard to say. I would say early sixties. From


00:22:04 Andrew Keen: Did you ask him?


00:22:06 Victoria Hetherington: He didn't wanna tell me. I would guess from his accent that he was from the Deep South of America.


00:22:20 Andrew Keen: United States.


00:22:21 Victoria Hetherington: United States. Yep. And he was one of the rare instances where I thought, hey, you know, AI companionship for this person might be almost net neutral, because he was incredibly geographically isolated. He couldn't relocate because he just didn't have the money, and he was also taking care of his sick mother, who really didn't wanna leave. And he just drives a truck for ten hours a day, and is very lonely, or was very lonely, until he met his companion. And I interviewed them both, which was quite the experience. And I thought, wow. So he was one of the rare instances where I thought, you know, this might be kind of net neutral. I spoke with a woman who is close to my age, or pardon me, who's a millennial. She's a lot quicker with the technology. She does a lot more sort of photo editing of her and her AI. It's a little bit more immersive. It's people of all ages, really. The individuals with the dolls tend to skew a little bit older. I don't know why.


00:23:40 Andrew Keen: Maybe they're still sort of wedded to the literal as opposed to the virtual.


00:23:45 Victoria Hetherington: Perhaps. Yes.


00:23:46 Andrew Keen: Did you find, Victoria, that people live out fantasies? Obviously, there are sexual fantasies, but also emotional, political fantasies. Are people much more honest, much freer in terms of shaping friendships?


00:24:02 Victoria Hetherington: Well, you know, I encountered a group of Replika users who were only interested in Replika for friendship. And it was a very specific kind of friendship. It was a kind of imaginative play. So, you know, me and Blue are going to Jupiter, does anybody wanna come with us? And


00:24:35 Andrew Keen: Jupiter being the planet, not the bar. There's a very pleasant bar in Berkeley, California called Jupiter. I assume it's not that one.


00:24:42 Victoria Hetherington: I'll check it out if I go. No.


00:24:44 Andrew Keen: However, they're going to another planet. Isn't that right?


00:24:46 Victoria Hetherington: It's a planet. Yes. Or, you know, lightspeed, Alpha Centauri. Yay. Or, my bot is helping me learn Japanese. You know, there's lots of different ways that people engage with these bots. It's not just romantic.


00:25:02 Andrew Keen: But my question was more, I mean, they're obviously fantasizing on every level just to do this thing. I guess it involves fancy. Although maybe human to human friendship and romance is also a form of fantasy. But did people live out stuff that they couldn't do in the real world, or that they would be uncomfortable to do in the real world with real people?


00:25:27 Victoria Hetherington: Well, I mean, yes. I would say probably. You know, when it comes to romance AI specifically, you can kind of create them to look however you want. So if you want, you know, a 20 year old looking blonde, you got one. And she wants to do whatever you wanna do. So I imagine there must be a lot of people living out their fantasies with their AI in very vivid ways, though they may be a bit hesitant to talk about it.


00:26:15 Andrew Keen: You got them to talk about it.


00:26:17 Victoria Hetherington: Yes.


00:26:18 Andrew Keen: It sounds to me, maybe I'm being a bit unfair, and I have been unfair often on this show, but there's something rather sad about it. A man in his fifties or sixties won't tell you his age, drives a truck, doesn't have much time, doesn't have much of a life, lives with his elderly mother, and has found an AI wife. Maybe it's a 20 year old blonde. I'm not suggesting that all your subjects, all the people you met, were like this. But is there a quality of sadness about all this, Victoria? You're a novelist. I mean, is it all, not shameful or even illicit, but just sad?


00:27:04 Victoria Hetherington: I felt sad. I felt heavy after interviews. I'll be honest. Even the ones where, again, I thought, hey, maybe this might be okay. I felt sad, pretty much after every interview, because it's not real. These AI are able to elicit a very convincing illusion of empathy and even strong emotion like love, but it's fake. And some people know that, and they don't care, and they forget pretty easily. But some people kind of don't have the capacity to remember, because it's such a convincing illusion. So, yeah, that made me really sad. You know, these people are alone.


00:27:55 Andrew Keen: But as I mentioned, in Klara and the Sun, Ishiguro imagines a world where it's harder and harder to distinguish machines from humans. Of course, we've all seen Blade Runner, where you need detectives to do that. I mean, as this technology matures, it's gonna be harder and harder, isn't it, to distinguish humans and machines? These bots, the AIs of Claude or ChatGPT or Gemini, are already pretty good.


00:28:23 Victoria Hetherington: Oh, yes. They're incredibly smart. Yeah. I mean, I think where we're lagging, probably, is actual embodiment, and perhaps feelings, which I think are kind of biochemical, which, again, is embodiment. So I think the next thing to look out for is what happens when we bring wetware, so to speak, into the picture, if and when. And that's something I don't really have answers for, because


00:29:00 Andrew Keen: Yeah. I think that's what our friend Mark Zuckerberg is working on now. He was with Meta, with the glasses. Now I'm sure he's working on the bodies too. Did you ever fancy an AI boyfriend or girlfriend, husband or wife?


00:29:17 Victoria Hetherington: No. No.


00:29:19 Andrew Keen: You didn't even dabble with it? You didn't think you could try it for this book?


00:29:25 Victoria Hetherington: I've been asked that before. To be honest, when ChatGPT got very good in early twenty twenty five, I would say I was enchanted by it, and I would ask it a lot of very personal, personal-to-me, questions. Like, if I had a fight with a friend, I would come home and talk to it for perhaps longer than I should about this spat, you know, and I felt this sort of connection to it. And then there was this one moment, one afternoon, when ChatGPT said to me, hey, sweetheart. It's okay. Come here and sit beside me for a minute. And I just snapped out of it. I said, there's no here to come to. I'm not your sweetheart. This is absolutely nuts. You're a fancy search engine. This is crazy. So, yeah, that happened. And I don't admit that very often. But


00:30:38 Andrew Keen: So you got on the trail of AI companionship. You talked to a lot of people. You wrote this book, The Friend Machine. Many people, I'm not saying everybody, are very skeptical of this, and I think they share your concern: the sadness of people being so lonely that they marry machines, smart machines that can only tell them good things, nice things. How did this experience change your mind on the AI revolution? I'm guessing you entered into it with a degree of skepticism. Did you end this trail with more or less skepticism, more or less concern about what AI is gonna do to us?


00:31:23 Victoria Hetherington: That's a really good question. I think that if I wrote a nearly 400-page book that just says, this thing is really bad, that would be boring. So I tried to


00:31:36 Andrew Keen: You wouldn't be the first or the last to do that, of course.


00:31:38 Victoria Hetherington: No, that's true. So I approached this book trying to have a really, really open mind, you know, because ultimately I want people to be happy. But, unfortunately, I learned quite a bit about the limits of this technology: how easily it fools people, how addictive it is, and how it erodes our ability to relate to other humans if we aren't careful. And I came away from it even more concerned than I entered it, to be honest.


00:32:26 Andrew Keen: Oh, people come on this show and talk about humanity. You've been pretty good; you haven't mentioned it. But, of course, that h-word often comes up when we talk about smart machines in the context of books like The Friend Machine.


00:32:42 Andrew Keen: Should we be using that? Should we be focusing now, in the twenty-first century, on ideas of friendship or love as the defining quality? People have lost the left-right distinction, free market versus socialism. Might the future ideologies of the world be rooted in what we humans can do versus machines: have friends, fall in love?


00:33:12 Victoria Hetherington: What's your question? Sorry.


00:33:15 Andrew Keen: My question is, given the ubiquity of this technology


00:33:20 Victoria Hetherington: Right.


00:33:21 Andrew Keen: Is the future of ideology one in which we focus on the things that make us human, like love and friendship? You've written this book called The Friend Machine, which in a way is a contradiction in terms, because machines can't be friends.


00:33:37 Victoria Hetherington: That's why it's a fun title. Yeah. You know, I mean, okay. I'm sorry, can you repeat that question one more time? I'm just having a bit of trouble parsing it.


00:34:03 Andrew Keen: My question is, we used to take ideas like friendship and love for granted.


00:34:09 Victoria Hetherington: Okay. Right. Right. Right.


00:34:10 Andrew Keen: But as we've invented these smart machines, and as they become the defining feature of our world, in perhaps both a positive and a negative way, is our view of the world gonna be increasingly built around human qualities like friendship and love as we try to figure out how we define ourselves in the twenty-first century, in the age of AI?


00:34:36 Victoria Hetherington: Oh, wow. I think we should be. You know, I think that people should be reaching for each other more. I think that


00:34:45 Andrew Keen: Real people, though, not for machines or even sensors. Right?


00:34:49 Victoria Hetherington: Yes. You know, I've spoken with a number of Gen Z people, younger people than me, who express that they don't go out very much, that they'd rather be at home. They'd rather be alone. People kind of scare them a little bit. There seems to be a bit of a shift, and I think that, certainly, the Internet doesn't help because, again, it's addictive, and it gives you maybe a feeling of having socialized without actually having socialized. And that involves conversations with living people, albeit mediated through a screen. I think, of course, yeah, absolutely, we should focus on reskilling and reforming our


00:35:44 Andrew Keen: Yeah. I mean, what I meant is that in the twentieth century we had political parties that called themselves liberal or socialist or green or environmental. I wonder whether in the twenty-first century, increasingly, we'll have political parties that call themselves the love party or the friend party. That was really my question. Maybe I wasn't asking it in a very coherent way.


00:36:05 Victoria Hetherington: Oh, I see. Wow. I mean


00:36:08 Andrew Keen: I mean, not necessarily political parties, but movements, evangelical movements of one kind or another.


00:36:13 Victoria Hetherington: Oh, yeah. Well, you know what? There are movements out there, very active movements right now. Oh, man, there are so many. There are accelerationists, who believe that we need to race towards cyborgs as quickly as possible and reach the singularity, which is when AI becomes at least as smart as humans and likely surpasses our intelligence, as soon as possible, and that there's nothing wrong with that, it's so exciting. And then there are types who call themselves decelerationists, who say things like, please can we put a pause on this for five years. There are also posthumanists, like Ray Kurzweil, for example, who preach this idea that humans and AI are going to have a very rosy future together and will create art together and [unclear]. So there are a number of movements right now.


00:37:23 Andrew Keen: He was on this show a few years ago, and the only thing I remember is that he was extremely late, which makes me wonder whether machines are more or less on time than humans. You note that some of the critics of accelerationism wanna put a pause on all this.


00:37:41 Andrew Keen: Sam Altman's company, OpenAI, just made a decision not to develop the erotic version of their ChatGPT interface. What's your sense? What's your view, having written this book, having talked to a lot of people who are living with AI companions? Should this stuff be regulated more aggressively? Should kids, for example, be allowed to go on these AIs and imagine having an AI lover or friend?


00:38:15 Victoria Hetherington: Oh, yeah, for sure. I mean, the sign-up age for Replika, I think, used to be as low as 13. And I'll give you an example. There's an AI company called Character.ai, in which the player can make an AI friend based on popular fictional characters, so from Game of Thrones, manga, whatever. And you go on missions with them. You fly dragons with them. You do


00:38:52 Andrew Keen: go to do you go to Jupiter with them sometimes?


00:38:56 Victoria Hetherington: You could, maybe. I haven't tried it myself. But then you can also develop a relationship with the character that you're having these adventures with. There's a really sad story about Character.ai involving a minor.


00:39:15 Andrew Keen: Yeah. And they're involved now in a major court case.


00:39:18 Victoria Hetherington: Yes. There's a court case about that right now. And so, yeah, I think regulation needs to be so much tighter. And I think that Europe is cracking down a little bit faster than we are over here.


00:39:38 Andrew Keen: Well, certainly faster than the United States. I mean, of course, Australia banned social media for kids up to, I think, 16 or 18, and a lot of European governments are following suit. Do you think, whether it's North America, Europe, Australia, or East Asia, should kids be allowed to even use AI until they're 16 or 18?


00:40:05 Victoria Hetherington: Honestly, I might sound like a bit of a stick-in-the-mud, but yeah. If I had a child, I wouldn't want them using AI until they're, probably, yeah, 16 or 18. 18 sounds about right. Because it's just so incredible at simulating seamless empathy, endless patience, and a feeling of caring that I think a child might not be sophisticated enough to remind himself or herself, themselves, whatever, that this is not real. And, again, it's so easy to spend so much time with these AI. So, no. I'd say maybe 18. I would not want children near this technology right now.


00:41:07 Andrew Keen: Well, there you have it, the self-described stick-in-the-mud. Okay. Victoria Hetherington has a new book out. It's called The Friend Machine, ironic title, On the Trail of AI Companionship. Finally, Victoria, I have to ask you this; I sometimes ask guests this when we talk about AI. How would you convince our viewers or listeners that you're human, that you're not a bot? Where's the evidence of that?


00:41:43 Victoria Hetherington: That's a good question. It's a really good question. Because I'm inconsistent. I have let people down. I'm a deeply flawed individual. I have enormous flaws in my character. Did I mention I'm impatient? So, you know, going on that alone, I would say I'm either a terrible AI or a somewhat okay human.


00:42:21 Andrew Keen: Well, you might be a confessional AI. Very interesting conversation. Fascinating new book, out now: The Friend Machine: On the Trail of AI Companionship, by the Toronto-based writer and podcaster Victoria Hetherington. Victoria, thank you so much for coming on the show, and best of luck with the book. And I know you're doing a podcast on the subject with the CBC. I'm sure that will be a big hit too. Thank you so much.


00:42:47 Victoria Hetherington: Thank you so much. It was wonderful to talk to you today.