March 27, 2026

Bring the Friction Back: Stephen Balkam on Kids, Social Media, and Tech’s Big Tobacco Moment


“Friction is what brings us together. If we were never able to communicate in real space, we would not truly learn what it is to be human.” — Stephen Balkam

Is social media a drug? In what the Financial Times called a landmark case, Facebook (Meta) and YouTube (Google) have been found liable for designing their products to be addictive to kids. “Is this a big tobacco moment?” the tut-tutting New York Times asked. In contrast, the free-market Wall Street Journal called it a shakedown.

So what to make of a verdict that treats social media as a narcotic? Stephen Balkam, founder and CEO of the Family Online Safety Institute (FOSI) and one of Washington’s most credible nonpartisan voices on kids and technology, has been on the front lines of this fight for nearly thirty years. Calling himself a radical moderate, he sees good and bad in social media. He even expelled Meta from FOSI three years ago for what he calls conduct contrary to the institute’s mission.

Balkam’s sharpest disagreement is with Jonathan Haidt, amongst the shrillest voices arguing in favor of a social media ban for kids. He “violently agrees” with Haidt on the idea of a free-range childhood — giving kids more freedom outdoors. But the evidence Haidt uses to justify banning social media confuses correlation with causation, a basic research error that, Balkam insists, academic researchers have called out. Balkam thinks the real anxious generation isn’t the kids — it’s us, the paranoid parents, projecting our mostly irrational fears onto our children.

His deeper argument is in favor of friction. Silicon Valley has spent thirty years removing friction from ordering pizza, hailing cabs, and dating. Balkam argues we need to design it back into childhood — the friction of developing friendships, building resilience, learning to think critically instead of outsourcing cognition to ChatGPT at midnight. Bring the human friction of life back, Balkam argues. It’s the most effective antidote to the drug of online existence.

Five Takeaways

•       Yesterday Was Tech’s Big Tobacco Moment — Sort Of: Meta and Google found liable for harm to children’s mental health. Balkam sees strong parallels to the tobacco cases of the nineties but resists the lazy comparison. The repercussions will extend beyond social media to AI. The hundreds of trials still to come will shape the next decade of tech regulation.

•       Congress Gets a D-Minus: America is the last advanced country without a national privacy framework. COPPA dates to the late nineties. KOSA never passed. The result is a splintering of state-level laws and no coherent federal approach. Meanwhile, parents are overwhelmed, and the tech companies retrofitted safety features years after the damage was done.

•       Jonathan Haidt Got the Free-Range Part Right. The Rest Is Shaky: Balkam “violently agrees” with Haidt on giving kids more freedom outdoors. But the evidence Haidt uses for his social media bans confuses correlation with causation — a basic research error. Academic researchers violently disagree with him. His book directly led to Australia’s social media ban. Balkam thinks we — the parents — are the anxious generation, not the kids.

•       42% of Teens Talk About Their Feelings with AI Chatbots: 60% say they feel safe using AI. 44% say some of its behaviors freak them out. They’re using it for homework, for loneliness, for practical advice, for asking how to invite someone to prom. And they’re worried about their job prospects. The three waves of concern: content in the nineties, behavior in the 2000s, emotional attachment and cognitive outsourcing now.

•       Bring the Friction Back: Silicon Valley has spent thirty years removing friction from ordering pizza, hailing cabs, and dating. Balkam argues we need to design friction back into childhood — the friction of developing friendships, building resilience, learning to think critically. A plush AI toy called Grok is being marketed to three-year-olds. It’s always there, always positive, always frictionless. That’s the dystopia.


About the Guest

Stephen Balkam is the founder and CEO of the Family Online Safety Institute (FOSI), a nonpartisan organisation dedicated to making the online world safer for kids and families. FOSI’s members include Google, Amazon, Microsoft, and other leading technology companies. Balkam is based in Washington DC and will teach an MA course on online safety at Georgetown University in 2027.

References:

•       Family Online Safety Institute — FOSI’s research, policy work, and resources for parents.

•       Episode 2849: How Stories Can Save Us — Colum McCann on Narrative Four. Social media promised storytelling. It delivered isolation.

•       Episode 2846: How to Be Agreeably Disagreeable — Julia Minson on disagreeing better. Balkam’s friction argument is the parenting version.

About Keen On America

Nobody asks more awkward questions than the Anglo-American writer and filmmaker Andrew Keen. In Keen On America, Andrew brings his pointed Transatlantic wit to making sense of the United States — hosting daily interviews about the history and future of this now venerable Republic. With nearly 2,800 episodes since the show launched on TechCrunch in 2010, Keen On America is the most prolific intellectual interview show in the history of podcasting.

Website

Substack

YouTube

Apple Podcasts

Spotify


Chapters:

  • (00:31) - Introduction: Meta and Google found liable for harm to children
  • (03:23) - Big tobacco or something different?
  • (04:29) - Julia Angwin: should big tech pay us?
  • (06:23) - FOSI and the radical moderate
  • (07:25) - Congress gets a D-minus: no federal privacy bill
  • (09:34) - Safety by design vs. retrofitting parental controls
  • (09:49) - Why FOSI expelled Meta — and Twitter
  • (12:38) - The pendulum from optimism to paranoia
  • (14:48) - Jonathan Haidt: brilliant on free-range kids, wrong on the evidence
  • (18:05) - Australia’s ban vs. Greystones, Ireland: local solutions work
  • (22:20) - Trump’s tech panel: Zuckerberg and Andreessen
  • (24:19) - Melania and the robot: the optics of grift
  • (26:54) - 42% of teens talk about their feelings with AI chatbots
  • (31:22) - Bring the friction back: critical thinking vs. ChatGPT at midnight
  • (35:25) - Grok: the AI plush toy marketed to three-year-olds

00:00:31 Andrew Keen: Hello, everybody. It is Thursday, March 26, 2026, and not for the first or the last time, the headlines are about social media. This time, yesterday, Meta and Google were found liable for social media harm to children's mental health in what the Financial Times calls a landmark US case.


00:00:56 Andrew Keen: Not everyone thinks it was a wise decision. The Wall Street Journal editorial describes it as the social media shakedown. It's very interesting. The New York Times reports that the juries are taking the lead in the push for child online safety. It's all about children and the Internet. The New York Times also asked: is this a big tobacco moment, and do they finally get regulated?


00:01:27 Andrew Keen: And these addiction trials are only just at the beginning. The one decided yesterday is the first of many hundreds of trials about how social media is impacting children.


00:01:42 Andrew Keen: My guest today on the show is all too familiar with the impact of technology on kids. Stephen Balkam is the founder and CEO of the Family Online Safety Institute — one of Washington DC's most credible nonpartisan institutes for evaluating all these issues. A lot of the big tech companies are part of his group, although, interestingly enough, not Meta, who were asked to depart from FOSI three years ago.


00:02:15 Andrew Keen: Stephen, welcome to the show. You and I have known each other for many years. In fact, we first met, I think, at a big Internet conference in Rio de Janeiro in Brazil.


00:02:26 Stephen Balkam: Yes, we did. Thanks so much for having me, Andrew. I think we had a very lengthy taxi ride to some dinner, and the traffic was unreal. But we did get to —


00:02:36 Andrew Keen: All taxi rides in Rio are lengthy, Stephen. I think it's not possible to have a short one.


00:02:43 Stephen Balkam: Well, anyway, yes. It's been good to know you, Andrew. And in fact, you spoke at our very first annual conference in 2007 around your book, The Cult of the Amateur, which really stirred things up, I remember.


00:02:58 Andrew Keen: Well, I was early on to this. Twenty years later, it seems as if everyone has caught up with this concern about the Internet and kids. What do you make of this trial that is in the headlines of the Financial Times, The Wall Street Journal, the New York Times, the Guardian? Everyone is reporting on it. It is big news, Stephen. How would you historicize the decision yesterday?


00:03:23 Stephen Balkam: Oh, it's huge news. Absolutely. In many different ways. It will have repercussions across public policy, across the way we consider how we parent our kids, and quite frankly, the other cases that are waiting in the wings. It has strong parallels to the big tobacco court cases of the nineties, but I also feel that there are some key differences.


00:03:52 Stephen Balkam: I am not — I think that simply saying big tech is big tobacco is a rather unfortunate, maybe even lazy, comparison. I think there are some important differences. But yeah, this will have big repercussions. And not just for social media, Andrew, but for the AI companies as well, which quite frankly is the next wave, the next generation.


00:04:17 Andrew Keen: Yeah. I want to talk about AI. I know you're talking to me from South San Francisco. You have some meetings with certain AI companies — perhaps OpenAI and Anthropic — so we'll talk about that later.


00:04:29 Andrew Keen: But this idea that big tech should — what Julia Angwin is arguing in the New York Times today — big tech should pay us for what it's done to us. Do you think big tech has a debt, especially when it comes to kids and families? Is there a broader philosophical issue here? Or has the zeitgeist shifted so dramatically since you and I shared that cab in Rio de Janeiro in 2007 that now everyone hates tech, and everyone blames it for everything that's gone wrong — from democracy to anxiety to our loneliness?


00:05:10 Stephen Balkam: Yeah. I mean, we were definitely in the early heady days of optimism about connecting everyone in the world. Remember that? Or that it would somehow bring about democracy in the Middle East — and the Arab Spring, of course, was a few years after you and I met — all the way through now to what I would call the hype that Jonathan Haidt's book, The Anxious Generation, has created.


00:05:39 Stephen Balkam: And so the pendulum has swung very, very strongly to the other side. And I guess there's a certain piling on going on right now. So I would describe myself as a radical moderate. I see good and bad in social media. I see good and bad in AI, just like we ourselves as people incorporate good and bad.


00:06:05 Stephen Balkam: So I'm not piling on in this case, but I do feel that there are some karmic lessons for the social media companies, and some warnings for the AI companies as they build out their products.


00:06:23 Andrew Keen: So as a self-described radical moderate, at FOSI — you started this thing nearly thirty years ago, the Family Online Safety Institute. Everyone has positions on this, which are being articulated this week in the wake of the trial, Stephen. What have you found at your institute that can help us make sense of what we should and shouldn't allow our kids to do online, particularly when it comes to social media?


00:06:56 Stephen Balkam: Well, okay. There are so many elements here, but I guess the simplest way of saying it is that we all have some layer of responsibility. Government definitely has a responsibility to set the rules, create enlightened legislation based on academic research about the real facts about what's going on. And I would have to say I would give Congress pretty much a D-minus in this —


00:07:25 Andrew Keen: And you're generous. I mean, Congress rarely gets a D-minus for anything. So you probably mean you're being polite. It probably gets an F.


00:07:32 Stephen Balkam: I'm a glass-half-full kind of a guy. So look, they got through the TAKE IT DOWN Act, what, a few months ago. But we have to really look back to the late nineties for COPPA — the Children's Online Privacy Protection Act — for any kind of significant national framework.


00:07:53 Stephen Balkam: And actually, COPPA was more of a privacy law. We have failed — alright, here's an F. Congress has failed to pass a federal privacy bill upon which safety bills like KOSA, the Kids Online Safety Act, could be built. We're one of the last — I think the last — advanced countries without a national privacy framework. And so what's happened is we've had this splintering effect, with all the states doing what they're doing.


00:08:24 Stephen Balkam: So that's just the first layer. Parents, teachers, the kids themselves all have different but overlapping responsibilities here. And let's face it: parents are overwhelmed and exhausted by the huge array of devices, apps, and platforms that their kids are accessing. Back in the early days, back in the nineties, we'd say put the family computer in the living room and you can keep your eye on what they're doing.


00:08:55 Stephen Balkam: And, you know, Web 2.0 in 2005, 2006, around the time you and I met, blew all of that away — particularly with mobile devices. So now we have AI to contend with. It is an enormous task to simply ask parents to control what their kids are doing. So Congress has to act, and the tech companies themselves — they have got to take their responsibilities far more seriously, to create with safety in mind, or as Julie Inman Grant, the eSafety Commissioner in Australia, would say, safety by design.


00:09:34 Stephen Balkam: Whereas, in fact, what has happened — and Facebook is a great example — it was totally created with college students originally in mind, and only many years later retrofitted with parental controls.


00:09:49 Andrew Keen: Your group includes a number of big tech companies — Amazon, Google, Microsoft. But as I noted in the introduction, you asked Meta, Facebook, to leave three years ago. Why is that? And how does the fact that you're supported by so many of these big tech companies, including Google, who own YouTube — one of the two social media platforms found liable in yesterday's decision — how does that work? Why should we trust you, Stephen?


00:10:24 Stephen Balkam: There's no reason to trust me, Andrew. All I would say is that we have a mission. Our mission is to make the online world safer for kids and their families. We have a range of principles — from being inclusive, transparent, bipartisan, and so on. Now, we are also a 501(c)(3), which is like an educational charity.


00:10:48 Stephen Balkam: We're not a trade association. We don't represent the industry. We're not cheerleaders for it, but we do work inside the tent with them simply because we felt that would be more effective than throwing rocks from outside the tent. Now in the case of Meta, they conducted themselves in a way which went contrary to our mission. And we held several board meetings and lengthy conversations and proper due process, and the board unanimously decided to revoke their membership.


00:11:23 Stephen Balkam: Same, by the way, with Twitter, when the management changed there — but for very different reasons. So we take our mission very seriously. Now, that's not to say that virtually all of those companies at different times have been fined by the FTC or have been found negligent in some way or another. But in almost all of those cases, they have changed their practices or they've made amends in some ways, and we are working internally with them to improve their trust and safety efforts. So all I would say is we show up every day to try and find a balanced way to make the online world safer for kids — and particularly understandable for their parents — so they can confidently navigate the web with their kids rather than putting the fear of God in them so that they just want to throw the devices out the window.


00:12:21 Andrew Keen: Yeah. We all come with our biases, as you know and as our viewers and listeners know. My wife is the head of litigation at Google, so I've been living this trial for weeks, if not months. I'm actually pleased it's ending — I don't have to hear about it anymore.


00:12:38 Andrew Keen: Coming back to some of these big tech companies that you work with — I know you're in town, you're in San Francisco to talk to some of the AI companies. Are you finding, in some of these companies — from Pinterest and other social media companies to Google and Amazon — are you finding a concern? Do they realize that the zeitgeist has shifted and that they need to behave a little bit more responsibly? They need to be more accountable. Are they recognizing that they have a big problem with kids?


00:13:12 Stephen Balkam: Absolutely. I think it's become front and center, most importantly, in the C-suite itself. You've got the CEO of Pinterest, for instance, pretty much saying he'd agree with a social media ban for under-sixteens. You would never have heard that five, ten, fifteen years ago.


00:13:30 Andrew Keen: And you might note who he is and his background. He used to be at PayPal — one of the younger and more dynamic entrepreneurs in Silicon Valley.


00:13:41 Stephen Balkam: Yeah. And similarly, the CEO of Snap, who is also relatively young, is taking safety as a sort of starting place. I think we're long past the idea that, you know, we have this amazing CEO, amazing CTO, we've got this incredible product, and we're gonna move fast and break things.


00:14:06 Andrew Keen: Zuckerberg, of course, who famously — or infamously — said that.


00:14:10 Stephen Balkam: Infamously. And I think what's happened is safety has come into the thinking. And by the way, follow the money. I think that's what's happening even at places like Andreessen Horowitz. I gave a talk there about why every startup that's involved with social media in any way, shape, or form should have a chief online safety officer as part of its portfolio of people running the show.


00:14:37 Stephen Balkam: Otherwise, you're gonna end up on the front page of the New York Times for the wrong reasons, or, in the case of Meta yesterday, you're gonna end up in court.


00:14:48 Andrew Keen: You mentioned Jonathan Haidt earlier — sociologist at New York University, the author of a bestselling book on the impact of social media on kids. Are you in the Haidt camp, Stephen? Do you think that Haidt has perhaps shifted the debate too far toward criticism, and too much of a — perhaps even a paranoia — about kids?


00:15:18 Andrew Keen: I have to admit, I read the Haidt book and wasn't particularly impressed. My sense is there's now this sort of deep paranoia, and it's not just caused or created by Jonathan Haidt. He's a consequence as well of helicopter parenting. I mean, he's supposed to be a critic of helicopter parenting, but he seems to be exhibit A in helicopter-parenting thinking, at least.


00:15:45 Stephen Balkam: That is quite a contradiction. Let me tell you where I agree with him. I violently agree with him on his point about giving kids more free time outdoors — giving them more freedom and responsibility to go to the local store and buy some milk and bring it home again. And he's sort of linked himself with the Let Grow movement, or the free-range kids concept. And I'm all for that. I grew up as a free-range kid. I think that we've lost an enormous amount by doing the helicopter parenting. But his first three proposals — and quite frankly, the so-called evidence he uses to base his proposals for bans — are very shaky indeed.


00:16:36 Stephen Balkam: And all of the academic researchers that we interact with violently disagree with him and see that he has used correlation as causation. And that's a basic 101 on research that you just don't do. But he's run with it. He's been very successful. He's obviously had an incredible marketing team behind him.


00:17:02 Stephen Balkam: And his book literally led to the Australian social media ban — directly, because the wife of a provincial leader read the book, handed it to her husband, and said, "What are you going to do about this?" So he's had a profound effect. He's part of this larger swing against tech that you and I started talking about. I think he's capitalized on it, and good luck to him. But I'm afraid I disagree with him on way too many counts.


00:17:36 Andrew Keen: You mentioned the Australian social media ban. Where is FOSI on this? I mean, you had a piece on your website about Australia acting to protect children online. Where's America? I assume you have a degree of sympathy. But do you think the Australians are right, and the French — I know — are following in their footsteps and other countries are looking at this? Should social media be banned for kids under 16 or 18?


00:18:05 Stephen Balkam: You know, philosophically, we're against nationwide or overarching bans. We are very much for thoughtful restrictions that are developed at the local level, and that include — by the way — parents, teachers, and the kids themselves. There's a fascinating article in the New York Times — I should have sent this to you, only yesterday, Andrew, in the real estate section of all places — about a small town in Ireland called Greystones. It's about 30 miles south of Dublin.


00:18:39 Andrew Keen: Yeah, I saw it actually, about kids not being allowed to carry their phones around — or being discouraged. I'll put it up on the screen as you continue.


00:18:52 Stephen Balkam: Basically — and by the way, I was interviewed for the piece, unfortunately my quote didn't make its way in, you know, editors — but basically, a teacher after COVID watched her kids coming to school and noticed a considerable difference in their behaviors, and also their unbelievable attachments to their phones. And so she started a conversation with other schools, and the next thing you know, all six schools in the town of Greystones agreed that it would be a good idea to say no phones for kids until they're 12 years old, or when they go to secondary school in Ireland and England. And they did that without any federal pressure.


00:19:44 Stephen Balkam: The parliament in Dublin didn't say they had to do this. They got together and did it, and the kids bought in. So that's a perfect example of a thoughtful restriction that's been developed at the local level. The problem with national bans that are imposed from on high is they inevitably create unintended consequences. And of course the kids will try everything they can to get around them.


00:20:13 Stephen Balkam: And in fact, we have research that's going to come out in about a month's time that will demonstrate the lengths to which kids are now going to get onto their favorite social media — to benefit from that connection and sense of belonging that a lot of kids find on social media.


00:20:33 Andrew Keen: And that New York Times piece is entitled "A Phone-Free Childhood: One Irish Village Is Making It Happen." Tired of seeing its elementary school children struggle with online temptations, the town of Greystones proposed a no-smart-device code, and most everyone bought in. And I have to admit, when I walk around without a phone, I always find it very liberating. And in terms of my own childhood — which, for better or worse, took place in the pre-social media, pre-smartphone age — there's nothing better than being untethered from one's family when one's traveling. These days, if we don't hear from our kids every hour or two, even if they're out of the country, we fear the worst — when in fact, unless you're traveling in Iran or something, we live in an increasingly safe age.


00:21:24 Andrew Keen: So this paranoia, which seems to be manifested in the thinking of people like Jonathan Haidt, is really troubling, isn't it, Stephen?


00:21:33 Stephen Balkam: Yeah. I guess I would be hesitant to use the P-word. We are very fearful. I actually wrote a piece that said we are the anxious generation.


00:21:44 Andrew Keen: Yeah, that's it — you're absolutely right. It's not the kids. The kids are learning it from the parents.


00:21:50 Stephen Balkam: Well, and I used my readings of Carl Jung back in my days as a psych major back in Cardiff — that we were talking about earlier — as a linchpin, because Jung talked a lot about the way in which we project onto others the fears that we have ourselves. And I think that is just classic of what's going on with us remaining Boomers and even Gen X parents in the way in which we treat and bring up our kids.


00:22:20 Andrew Keen: Stephen, you mentioned Marc Andreessen. You gave a speech to Andreessen Horowitz recently. Another piece of news this week is that Donald Trump, the American president, has named Mark Zuckerberg and Marc Andreessen to a tech panel. You threw Meta out of FOSI. Are you concerned with the way in which Trump is — for better or worse — appropriating the bad boys of tech, from Zuckerberg to Andreessen, and maybe politicizing this even more?


00:22:56 Stephen Balkam: I mean, it's deeply concerning. Although quite frankly, I'm not sure who's appropriating whom in this case. Could it be that these guys would prefer to play nice with this current administration for their own business interests? I'm sure the historians will look back and tell us exactly what happened here. But I'm not thrilled.


00:23:22 Stephen Balkam: Obviously, in the past, we've been involved with the White House through many different panels and advisory boards and so on. And for the most part, they were set up in good faith and with an effort to do the right thing for the country. I remain skeptical about pretty much anything that this administration is doing, particularly in this space. And the recent AI framework that came out — which is calling for a moratorium, or actually a ban, on states creating any AI legislation — who is that helpful for? Is it helpful for the American people?


00:24:09 Stephen Balkam: I doubt it. Is it helpful for the tech companies developing this stuff? Of course. So I'm not a fan.


00:24:19 Andrew Keen: Another piece of news this week was the president's wife, Melania — I always call her Malaria Trump — being photographed with a smart machine, a robot, and prancing down some walkway suggesting that this was good for kids. Have you looked at that? And are you concerned with the way in which the president's wife might be promoting — if not child labor, certainly children's association with robots? Does that trouble you at all?


00:24:55 Stephen Balkam: Well, first of all, full disclosure — we worked with the First Lady during the first —


00:25:00 Andrew Keen: Is she not as bad as she looks?


00:25:02 Stephen Balkam: Well — let me get this out. Through the first administration, we were invited to the launch of Be Best. We did some follow-up meetings with her. She spoke at our annual conference in 2019, I believe it was. And I actually took her at her word that she was trying to address the issues of cyberbullying and so on.


00:25:25 Stephen Balkam: And I was grateful that she used her soapbox for that effort. This time around, first of all, we're not involved with her in this effort, and I thought the optics were not great. The idea of, in a way, promoting a piece of technology from a particular company always smacks of — dare I say it — grift, or some kind of connection that —


00:25:55 Andrew Keen: Grift, Stephen? You're not suggesting that this is a grifting administration, are you?


00:26:00 Stephen Balkam: I am suggesting that they have used their position to benefit themselves in ways that perhaps we've never seen before — to use a hyperbolic statement that this administration likes to use. So I'm disturbed by it. The idea, by the way, of this being a teacher's assistant is also very concerning. Again, not a fan.


00:26:29 Andrew Keen: In your research at FOSI, do kids have a different take on robots than our Boomer generation? We're a little fearful and nervous — we think these are our last invention and that we will become their serfs. Are kids more intrigued by the promise of being taught by and having friends with robots?


00:26:51 Andrew Keen: So much has been written and thought on this.


00:26:54 Stephen Balkam: I mean, yeah — we found that nearly half of teens used AI tools more than once a week. We found that 42 percent had talked about their feelings with an AI chatbot. Now here's a paradoxical set of stats: 60 percent said they felt safe while using AI, but 44 percent — which must be some kind of overlap — say some of generative AI's behaviors freak them out, which I guess is a technical term for "caused them concern." They have concerns that their parents don't seem to have any rules around generative AI, or that their parents haven't got their heads around it yet.


00:27:41 Stephen Balkam: And they're also concerned about their job prospects when they come out. So many of them are using it. Many of them are using it for homework — I have some issues around that. But at the same time, they're embracing it for solace if they're feeling lonely or bored.


00:28:03 Stephen Balkam: They're also using it for help and advice on basic practical things, all the way through to how do you ask a girl out to the prom. And at the same time, they have existential worries about whether AI will take all of the jobs that they will eventually want to apply for.


00:28:26 Andrew Keen: Yeah. I think one of the most prescient books written in the last fifty years is actually Kazuo Ishiguro's Klara and the Sun, which came out in 2021. I think it's a remarkable book about a near future — it seems now as if it's not even the near future, it's our current moment — where children interact with robots.


00:28:50 Andrew Keen: I know you're in town, Stephen, to talk AI. You may be going to visit Sam Altman or some of his people at OpenAI, and Dario Amodei at Anthropic. OpenAI have been quite active this week. Again, a remarkable week when it comes to headlines for AI. They've apparently put their so-called erotic chatbot plans on hold indefinitely, and on top of that, they're shutting down Sora, their AI video generator. I'm guessing that on both these fronts — especially the porn front — from your point of view at FOSI, these are good decisions.


00:29:32 Stephen Balkam: They are, and I'm relieved and grateful that they have taken those decisions. I would hope that there were some moral pangs in the front office about doing this. However, my guess is that the pending IPO is probably what's uppermost in Sam Altman's mind.


00:29:54 Andrew Keen: Well, one wonders whether Sam Altman can experience what you call moral pangs. He has pangs, but I'm not sure of what kind.


00:30:06 Stephen Balkam: Well, whatever he's feeling, my guess is the pressure of the IPO is what's uppermost in his mind. So yeah, I felt relieved and grateful that they did that, because Sora was a great example of the creation of mis- and disinformation that was causing a lot of teens to be very concerned.


00:30:30 Stephen Balkam: I mean, Andrew, if you look back to the mid-nineties, when we first got started in all of this, the number one concern was content — trying to keep kids away from porn, violent images, and so on. And then Web 2.0 in 2005, 2006, around the time you and I met, the issue switched to behavioral issues like cyberbullying, sexting, overuse, and oversharing. And now, twenty years after that, we've got the issue of emotional attachments to chatbots, mis- and disinformation — particularly through tools like Sora — and lastly, the thwarting of the development of critical thinking skills because we're outsourcing our cognition to our AI tools.


00:31:22 Stephen Balkam: Now for you and me with grey in our hair, that's one thing. Our frontal lobes have been connected up for a while. If you're a 17-year-old facing a midnight deadline to get an essay in, and ChatGPT is offering to write that essay for you, the impulse will be: yeah, of course. But of course, you lose the opportunity to learn. And so in another piece I wrote — about friction in Silicon Valley — this is what we have to somehow design back into our tools, so that the friction of developing a relationship or a friendship is part of the deal.


00:32:03 Stephen Balkam: The friction of learning and developing resilience and judgment and critical thinking skills is part of the process. And those, for me, are now uppermost in my mind as the two biggest concerns we have.


00:32:20 Andrew Keen: Yeah. It's an interesting way of putting it — the friction. And, you know, even the good guys of Silicon Valley, or the supposed good guys, somebody like Dario Amodei, the founder of Anthropic, who claims he wants to protect humanity from AI — he's also a controversial figure, although I think most people trust him a little more than Sam Altman. Is the problem with some of these people in Silicon Valley — the Dario Amodeis — that they've grown up in such a techno-centric world that they don't quite understand what friction is when it comes to human relationships?


00:32:57 Stephen Balkam: I mean, yeah — we've spent, what, the last twenty, thirty years taking the friction out of ordering a pizza, or a cab, or airline tickets, or —


00:33:07 Andrew Keen: Or dating.


00:33:09 Stephen Balkam: And then into dating. And, boy, doesn't that feel better — that we don't have to suffer, or we don't have to wait around, or we don't have the awkward pauses or silences that can now be taken out of our lives. And I'm seeing elements of Gen Z starting to say, "You know what? I'm switching off. I'm switching down. I'm going to go touch some grass," because they can see what's being taken away from them through this frictionless approach to life, which is not healthy for us humans.


00:33:53 Andrew Keen: Maybe I'm going to title this conversation, Stephen: "Bring the Friction Back."


00:33:57 Stephen Balkam: There you go. Friction's good. It's actually what brings us together in some ways. Because if we were never able to communicate and contact each other in real time, in real space — in meat space, as they used to call it — we would not truly learn what it is to be alive and what it is to be human.


00:34:24 Andrew Keen: So in conclusion, could we argue — or might you argue, FOSI and Stephen Balkam — that we need to bring the friction back to childhood? And in a way, people like Jonathan Haidt have got it entirely wrong. They want to take all the friction out of childhood, but we want to bring it back. And that's what being a responsible child and human is — learning to understand, deal with, and cultivate friction in some ways.


00:34:58 Stephen Balkam: Well, to be fair, he does want to bring friction back in terms of letting kids outside and going to the store and all of that. So I'll give him marks for that. But yeah, he wants to take everything else away — and based on research that is questionable. Yeah. I think we do need to help kids learn how to develop friendships and relationships.


00:35:25 Stephen Balkam: I mean, I have a plush toy back in my office called Grok, which is marketed to three-year-olds. It's a toy that you can talk back and forth with all day, and it is always there for you. It is always positive.


00:35:40 Andrew Keen: This is a Musk toy?


00:35:41 Stephen Balkam: It isn't. It's actually Grimes — one of his former partners. And she called the toy Grok. I guess she got permission from Elon.


00:35:50 Andrew Keen: He's probably thought about suing her now.


00:35:54 Stephen Balkam: Well, anyway — yes. It's got an AI sitting inside this little, cute toy, and talk about dystopian: the fact that we would leave our kids with either a Grok or one of Melania's robots to talk to all day long is taking the friction out of childhood. Not a good idea.


00:36:16 Stephen Balkam: And at the other end — and by the way, next year I'm going to be teaching a course at Georgetown, an MA course on online safety, and I'm going to have to confront the issue that the kids who'll be in my class could well be accessing AI to do their work.


00:36:36 Andrew Keen: What do you mean "could well"? They inevitably will, Stephen.


00:36:41 Stephen Balkam: Let's just say on the first day of class, we're going to have to have a conversation. Because, by the way, Georgetown — along with a number of other universities — doesn't have a campus-wide policy on AI. They leave it up to individual professors to decide for themselves. So I'm going to talk to them about the power of friction and the power of struggle, and that that's part of what it's like to be alive. And I want them to benefit from that.


00:37:08 Andrew Keen: Bring the friction back — at least according to Stephen Balkam, the founder and CEO of the Family Online Safety Institute. Stephen, we started this conversation in a cab in Rio back in 2007. It's always nice to talk to you. I have to get you back on the show. You are on the front lines, the cutting edge of all these discussions. Thank you so much.


00:37:28 Stephen Balkam: Thank you, Andrew. Really appreciate it.