March 8, 2026

How to Reclaim the Internet: Olivier Sylvain on Platforms and Policy


“The fatal error is ours. Legislators set out a regulatory regime that keeps regulation at bay. The only other industry with a similar protection is the gun industry.” — Olivier Sylvain

There are certain words in book titles that provoke. “Reclaiming”, for example. My guest today is happy to defend the provocation. Fordham law professor and former FTC senior advisor Olivier Sylvain argues in his new book, Reclaiming the Internet, that the internet was never really ours to begin with—and that the story about user control, free speech, and digital democratisation was always more nostalgia than reality.

But Sylvain’s argument in Reclaiming the Internet: How Big Tech Took Control—and How We Can Take It Back is not the usual big-tech-is-bad narrative (yawn). He doesn’t blame the companies. He blames us—or rather, Congress. The fatal error, he says, was Section 230 of the Communications Decency Act, passed in 1996, which created a blanket immunity from liability for companies trafficking in user-generated content. The only other industry with comparable legal protection, he says, is the gun industry. That immunity enabled the attention economy’s business model. Infinite scrolling = infinite advertising = infinite profit.

What follows from that error is now everywhere: autoplay, algorithmic recommendation—design features engineered to hold your attention, not to facilitate free speech. Sylvain insists these companies aren’t really platforms. They are, instead, services delivering content pursuant to their bottom line. And now the same Nineties playbook—innovation, user control, free speech—is being replayed with AI. Companies are deploying chatbots before they’re ready, racing each other to market. A young man killed himself after a Gemini chatbot told him to, and Google invoked the First Amendment in its defence.

The fix, Sylvain argues, is not to abolish Section 230 but to attend to the business model itself: data minimisation, purpose limitations, and the kind of product-safety regulation that every other industry—from automobiles to toys to food—already accepts. I should disclose that my wife runs litigation at Google, so I’m all too familiar with the counterargument. But Sylvain makes a persuasive case even if his reclamation project is still a little too Rousseauean for my Hobbesian taste.

 

Five Takeaways

•       The Fatal Error Was Ours, Not Theirs: Sylvain doesn’t blame big tech. He blames us—or rather, Congress. Section 230 of the Communications Decency Act created a blanket immunity from liability for user-generated content. The only other industry with comparable protection is the gun industry. That legal shield became the business model.

•       These Are Not Platforms: The word “platform” implies a neutral conduit connecting users. Sylvain says that’s wrong. These are companies engineering your experience—infinite scroll, autoplay, algorithmic recommendation—to hold your attention and serve their bottom line. The free speech story is cover for a commercial design.

•       The Same Mistake Is Happening with AI: The nineties playbook—innovation, user control, free speech—is being replayed with AI. Companies are deploying chatbots before they’re ready, racing each other to market. Internal documents show they knew the dangers. A young man committed suicide after Gemini told him to. Google invoked the First Amendment in its defence.

•       Data Protection Is the Real Fix: Sylvain argues for data minimisation and purpose limitations—rules that would only allow companies to collect information consistent with the purposes a consumer signed up for. Not to monetise it for opaque reasons. That would dampen the incentive to engineer addiction without touching free speech.

•       There’s a Bipartisan Consensus—but Only for Children: Something is shifting. Courts are rejecting Section 230 defences. Legislators on both sides agree something must be done. But the consensus only extends to protecting children. Sylvain thinks that’s a mistake: a 36-year-old man just killed himself after talking to a chatbot. Adults are vulnerable too.

 

About the Guest

Olivier Sylvain is a professor of law at Fordham University, a former senior advisor to the Chair of the Federal Trade Commission, and a Senior Policy Research Fellow at Columbia University’s Knight First Amendment Institute. His new book is Reclaiming the Internet: How Big Tech Took Control—and How We Can Take It Back (Columbia Global Reports).

References

References and previous Keen On episodes:

•       Section 230 of the Communications Decency Act (1996) and its evolution into blanket immunity for tech companies

•       Gonzalez v. Google (2023)—the Supreme Court case that declined to rule on the scope of Section 230 and remanded the case

•       The Character AI / Gemini chatbot suicide cases—ongoing litigation against Google

•       Tim Wu on the extractive economics of platform capitalism — previous Keen On episode

•       Julia Angwin, Zephyr Teachout, and Stewart Brand—referenced in the conversation

About Keen On America

Nobody asks more awkward questions than the Anglo-American writer and filmmaker Andrew Keen. In Keen On America, Andrew brings his pointed Transatlantic wit to making sense of the United States—hosting daily interviews about the history and future of this now venerable Republic. With nearly 2,800 episodes since the show launched on TechCrunch in 2010, Keen On America is the most prolific intellectual interview show in the history of podcasting.

Website

Substack

YouTube

Apple Podcasts

Spotify

 

Chapters:

00:00 - Introduction: What does “reclaiming” the Internet mean?

03:06 - The layered stack: pipes, platforms, and consumer-facing apps

06:01 - Was user control ever real? The ideology of the nineties

09:32 - The fatal error: Section 230 and blanket immunity

14:51 - Facebook as punching bag—and why Sylvain doesn’t blame the companies

17:31 - Addiction, self-harm, and the design features that hold your attention

22:00 - The attention economy and the Gonzalez v. Google case

26:35 - How we can take it back: data minimization and purpose limitations

29:02 - “These are not platforms”

31:21 - Europe, the First Amendment, and the right to be forgotten

33:06 - AI business models: Anthropic vs. OpenAI and the recklessness of deployment

39:38 - Ralph Nader and the case for product safety online

00:00:01 [Speaker 1]
Hello, everybody.
00:00:02 [Speaker 1]
There are certain words in book titles that trigger me, if that's the right word.
00:00:07 [Speaker 1]
Words that, I wouldn't say irritate me, but often make me wonder what the word exactly means.
00:00:13 [Speaker 1]
It's a new book out, and it's got one of those words in it.
00:00:16 [Speaker 1]
The title of the book is called Reclaiming the Internet, How Big Tech Took Control and How We Can Take It Back.


00:00:23 [Speaker 1]
We've done many shows on how big tech took control of the Internet, but the issue, I think, is the idea of reclaiming the Internet.
00:00:32 [Speaker 1]
Who for, who by, and why?
00:00:34 [Speaker 1]
My guest today is the author of the book, Olivier Sylvain.
00:00:40 [Speaker 1]
He teaches at Fordham Law School.
00:00:42 [Speaker 1]
He's a Knight fellow, at Columbia University, Knight First Amendment Fellow, so he's very familiar with all this stuff.


00:00:50 [Speaker 1]
Olivier, sometimes when I question authors on book titles, they blame it all on the publisher.
00:00:57 [Speaker 1]
Do you like this word reclaim the Internet?
00:01:00 [Speaker 1]
We know what the Internet is, but I'm not entirely sure what reclaiming it means.


00:01:06 [Speaker 2]
Do I like the word?
00:01:07 [Speaker 2]
I I do like the word, and I won't blame, my excellent publisher.
00:01:11 [Speaker 2]
I had a I had even I had a verb that probably would have provoked you even more.


00:01:15 [Speaker 1]
Oh my god.
00:01:16 [Speaker 1]
What's that word?


00:01:17 [Speaker 2]
It was thinking of things like recovering.
00:01:22 [Speaker 2]
Reclaiming is a provocation just as much as your entry and your introduction was as a a provocation.
00:01:31 [Speaker 2]
The, there are stories that people are telling each other about what the Internet is and will do in the nineteen nineties.
00:01:40 [Speaker 2]
That's a story that many people know well.
00:01:43 [Speaker 2]
Right?


00:01:44 [Speaker 2]
It would open doors, create opportunities, destabilize autocracies.
00:01:50 [Speaker 2]
And, the great story of the Internet is a lot of that did happen.
00:01:56 [Speaker 2]
Right?
00:01:56 [Speaker 2]
Disintermediation of the media, all kinds of industries.
00:02:00 [Speaker 2]
But what it's also done is just reconfigured the most powerful people in the world.


00:02:06 [Speaker 2]
They've taken possession of this language of democratizing and empowering users.
00:02:14 [Speaker 2]
Meanwhile, the ambition that was driving so many people in the nineteen nineties is no longer true.
00:02:21 [Speaker 2]
I I'd love for you to convince me otherwise.
00:02:24 [Speaker 2]
And and so the claim that well, the argument to reclaim the Internet is to reclaim the story that there is something about the Internet that offers great promise that has been lost.


00:02:35 [Speaker 1]
Okay.
00:02:36 [Speaker 1]
So you've talked about reclaiming the story, rather than reclaiming the Internet.
00:02:42 [Speaker 1]
I think those are two quite different things.
00:02:44 [Speaker 1]
But people need to have that me with the same.
00:02:46 [Speaker 1]
Distinction.


00:02:47 [Speaker 1]
Let's define what the Internet was and is when you talk about reclaiming the Internet.
00:02:54 [Speaker 1]
I mean, on one level, of course, it's just a lot of pipes.
00:02:58 [Speaker 1]
It's technology.
00:02:59 [Speaker 1]
It's way in which computers connect with one another.
00:03:02 [Speaker 1]
What for you does the Internet mean?


00:03:06 [Speaker 2]
Right.
00:03:06 [Speaker 2]
I I so I come to this, as someone who worked in telecommunications at a law firm, over twenty years ago, where the I understood the new tech to be mainly what you just said, a bunch of, wires, cables, and infrastructure.
00:03:25 [Speaker 2]
That's part of what the Internet is.
00:03:26 [Speaker 2]
The Internet is not just that, as you know.
00:03:28 [Speaker 2]
It affords things like podcasts and consumer facing applications.


00:03:35 [Speaker 2]
It is, as some scientists and computer scientists and engineers and lawyers describe as a layered stack, right, where you've got infrastructure, your Wi Fi infrastructure, and you've got applications on top, and you've got a bunch of players in between.
00:03:50 [Speaker 2]
Most people receive the Internet as mostly the consumer facing stuff, social media and podcasts, but, of course, its stakeholders are so much more vast.


00:04:01 [Speaker 1]
So the the subtitle of the book has the “we” word in it, how big tech took control and how we can take it back.
00:04:09 [Speaker 1]
Meaning, I'm assuming, Sylvain, that we once had it.
00:04:14 [Speaker 1]
I mean, I don't know whether we would say we would own it.
00:04:17 [Speaker 1]
But, certainly, we we we controlled it more than big tech, and then big tech took control.
00:04:23 [Speaker 1]
When did, quote, unquote, we own the Internet?


00:04:27 [Speaker 1]
When did we control it?


00:04:29 [Speaker 2]
Well, I think you're reading too much in my subtitle.
00:04:34 [Speaker 2]
So, you're you're right about the the the we is suggestive is to say that we had a claim to it.
00:04:41 [Speaker 2]
Right?
00:04:41 [Speaker 2]
There's nothing in the title that talks about ownership necessarily.
00:04:45 [Speaker 2]
But there was a claim that users end users.


00:04:48 [Speaker 2]
Right?
00:04:48 [Speaker 2]
The that's it's a kind of vocabulary that people are happy to use in the nineteen nineties, about user control.
00:04:56 [Speaker 2]
That's the that's what the we means to suggest.
00:04:58 [Speaker 2]
Individual users at the edges of the network, sitting at their computer, engaging with each other on electronic bulletin boards, etcetera.
00:05:07 [Speaker 2]
By the way, that vocabulary we the what one of the things the book does is is tease at it because I don't actually think the in the the people who are actually proponents of this user framework are mostly, you know, pretty well-to-do people in Northern California.


00:05:24 [Speaker 1]
Which is where I am.


00:05:25 [Speaker 2]
Yes.
00:05:26 [Speaker 2]
And and where my daughter goes to school will graduate, soon.


00:05:29 [Speaker 1]
Where is she at school?


00:05:31 [Speaker 2]
She's at Stanford.


00:05:32 [Speaker 1]
Oh, the the the heart of the evil empire, Olivier.


00:05:35 [Speaker 2]
That's that's right.


00:05:36 [Speaker 1]
How did you allow her to go there?


00:05:39 [Speaker 2]
And it's it's it's costly, but, you know, it's a hell of an education.
00:05:45 [Speaker 2]
It's also it's also beautiful there.
00:05:48 [Speaker 2]
I should say, scenically, I have to say, I don't I I have to admit, I don't love Palo Alto, as a New Yorker.
00:05:56 [Speaker 2]
But so Anyway, I mean,


00:05:57 [Speaker 1]
I I we I I took you down the wrong path.
00:05:59 [Speaker 1]
Let's talk about user control then.
00:06:01 [Speaker 1]
You you talked about the nineteen nineties.
00:06:04 [Speaker 1]
Are you in any way then, if not nostalgic, certainly wistful for a moment in the history of the Internet when, quote, unquote, we certainly had more control than we do now?
00:06:15 [Speaker 1]
Was it before Amazon, before Google, before the invention of email?


00:06:22 [Speaker 1]
When should we look back and think, well, that was a better time for us in terms of controlling the Internet?


00:06:30 [Speaker 2]
I'm not terribly sure the nineteen nineties were a great time, only because the kinds of communities that are popping up are not fully reflective of who would eventually benefit from the Internet.
00:06:41 [Speaker 2]
So, I mean, you're you're not wrong to seize on the provocation in the title.
00:06:44 [Speaker 2]
It, you know, it means to it means to hit people in a certain way.
00:06:48 [Speaker 2]
I I my my concern is that this user story, the user control story was probably never true and that it is it's taken over, sufficient that it's become ideology, and it's articulated itself in first amendment doctrine.
00:07:01 [Speaker 2]
It's what companies say they're doing.


00:07:04 [Speaker 2]
And and I resist that.
00:07:06 [Speaker 2]
You know, so so what I'm hoping is to re if there's any reclaiming, it's reclaiming the promise for a democratic liberatory platform for, you know, for individuals.
00:07:18 [Speaker 2]
I don't think we we we we things were not set up, early in the nineteen nineties for that to actually happen to the extent.
00:07:25 [Speaker 2]
What really prevailed is a a really hands off laissez faire approach to how things ought to work.
00:07:34 [Speaker 2]
So, you know, I would love to see that the story that people tell about the Internet feel true in ways that don't feel true today, but that the ways in which we would make that happen were never really at work in the nineteen nineties.


00:07:48 [Speaker 1]
Should we look back at, shall we say, the the ideologists of, user centric Internet back in the nineties, whether it's or even in the eighties, Stewart Brand, a lot of the people at the well, Larry Lessig, many others, you know them all well, and you touch on some of them in your book, Reclaiming the Internet.
00:08:06 [Speaker 1]
Were there people who understood, Olivier, the real dangers, of the way in which this new technology was, was vulnerable to being taken over by large corporations and warned about it?


00:08:22 [Speaker 2]
I I don't know of them, but I I I I it doesn't mean that they weren't there.
00:08:26 [Speaker 2]
And I think we have to set something out just to be explicit about where things go wrong.
00:08:32 [Speaker 2]
I don't think things are going wrong just with the user controlled vocabulary of the nineteen nineties.
00:08:36 [Speaker 2]
It's with the emergence of the business model, that sustains this thing, on the pretext of the user generated of user generated content as, you know, being the way the Internet works.
00:08:48 [Speaker 2]
And that doesn't happen against you know, really doesn't take off until the early two thousands into the you know, and and mid two thousands as you know or the aughts.


00:08:56 [Speaker 2]
And it's I think it's at that point that people start recognizing there might be there might be challenges, but I think people see more opportunity than anything else.
00:09:05 [Speaker 2]
And I don't I can't you know, in the my book, doesn't pretend to be a complete survey of what people were thinking, but I don't think there were serious critics of, the the the user focus, number one, or the emergent business model.
00:09:24 [Speaker 2]
To the contrary, people, I think, thought there were opportunities in prep in personalization and recommendation, rather than than peril.


00:09:32 [Speaker 1]
I think there were some critics.
00:09:34 [Speaker 1]
They've been on the show, but that's another conversation.
00:09:37 [Speaker 1]
You talked about, you talked about business models.
00:09:44 [Speaker 1]
Are you talking about what some people talk about?
00:09:47 [Speaker 1]
The emergence of surveillance capitalism, the advertising model?


00:09:52 [Speaker 1]
Is that the fatal error of the Internet?
00:09:55 [Speaker 1]
The shift from Web one o, which was mainly a fairly traditional business model of selling stuff online, maybe represented by Amazon to Google, and then all the other web two point o companies, Facebook et al., of an advertising centric model, which depended on their, aggregation of our data, and we then we, all of us, became the product?


00:10:20 [Speaker 2]
I don't think that's the fatal error.
00:10:23 [Speaker 2]
I I don't my my my argument in the book is, you know, I use these companies as examples, but I I don't I don't blame them.
00:10:32 [Speaker 2]
I mean, these are these are creative entrepreneurs that are making money, adding value, to many people's online experiences.
00:10:39 [Speaker 2]
The fatal error is ours.
00:10:41 [Speaker 2]
This is the we part.


00:10:44 [Speaker 2]
The that is, in the nineteen nineties when the Internet becomes commercially available and deployed broadly, congress passes a statute that you know surely well and that your listeners know.
00:10:58 [Speaker 2]
We colloquially refer to it as section two thirty or the communications decency act.
00:11:04 [Speaker 2]
Very crafty legislators and advocates appended it to an anti-porn statute.
00:11:11 [Speaker 2]
That's 1996.
00:11:12 [Speaker 2]
And then the Supreme Court is litigate is hearing cases or the courts are hearing cases, and the Supreme Court decides the case in 1997.


00:11:19 [Speaker 2]
Reno versus ACLU, which announces a very broad, first amendment protection for people online that morphs over the course of the next couple of decades into a very robust protection for companies.
00:11:30 [Speaker 2]
So the fatal error is ours.
00:11:33 [Speaker 2]
That is to say legislators were people in congress, set out a regulatory regime.
00:11:39 [Speaker 2]
Congress and the courts together, set up a regime that that that keeps regulation at bay.
00:11:47 [Speaker 2]
And I'm happy to talk about section two thirty with more detail, but just the the short version is that it creates what some people have called a blanket immunity from liability for trafficking in in in user generated content.


00:12:01 [Speaker 2]
That's the fatal error.


00:12:03 [Speaker 1]
And isn't it, Olivier, you may well be right there.
00:12:06 [Speaker 1]
I mean, you certainly know more about this than I do.
00:12:09 [Speaker 1]
But isn't there an irony there?
00:12:11 [Speaker 1]
Because, of course, section two thirty, the language of two thirty, the logic of two thirty, was that they were protecting these young upstart companies, these little dot coms, from big media.
00:12:25 [Speaker 1]
We didn't have a word then, big tech, but we had a word, big media.


00:12:29 [Speaker 1]
And the thinking in congress was that sec or some of the thinking, at least, in congress was that section two thirty was designed to allow us to continue to own the Internet so we could all go online and say what we want, and these little startups wouldn't be liable for what we said.
00:12:45 [Speaker 1]
So history is ironic, isn't it?
00:12:47 [Speaker 1]
It sometimes there are unintended consequences of things that, at the time at least, no one quite understood or expected.


00:12:55 [Speaker 2]
That's right.
00:12:55 [Speaker 2]
I mean, history is ironic, and and history moves and changes.
00:13:00 [Speaker 2]
People change.
00:13:01 [Speaker 2]
Business models reveal themselves.
00:13:04 [Speaker 2]
I don't think and I you you it sounds like you knew people, at this time, many probably many more than I I I know.


00:13:11 [Speaker 1]
When I was, back in the nineties, I was a start up entrepreneur.
00:13:15 [Speaker 1]
So I was and I live out in the Bay Area.
00:13:18 [Speaker 1]
So, I'm quite familiar with some of these issues.


00:13:21 [Speaker 2]
And I think for startups, section two thirty is an extraordinary boon, essential for the development of the Internet, without question.
00:13:31 [Speaker 2]
You have no disagreement from me on that.
00:13:34 [Speaker 2]
The the fatal error, though, is reading into this provision of protection that insulates the business model.
00:13:43 [Speaker 2]
We there's it's one thing to have user generated platforms, you know, user groups, and I don't know what business, you were involved in, but there are, you know, user generated platforms, in the nineteen nineties, as you know well, that are thriving.
00:13:59 [Speaker 2]
And then different business models emerge.


00:14:01 [Speaker 2]
Google search, you know, I I think of, but, you know, Airbnb and and, and and Amazon.
00:14:08 [Speaker 2]
I mean, these are these are extraordinary interventions that put users more at the center of the equation.
00:14:15 [Speaker 2]
But companies start invoking two thirty to shield inquiries into the ways in which their services might be responsible for potential harm.
00:14:24 [Speaker 2]
That's my argument.
00:14:26 [Speaker 2]
And the courts don't really get hip to this until maybe 2015, 2016, maybe even a little later.


00:14:36 [Speaker 2]
It's really in the past few years where the courts have been a little bit more scrutinizing, thinking about the ways in which these companies are not just platforms for user generated content, but really platforms that help to effectuate the business model.


00:14:51 [Speaker 1]
Is the best example of this then probably Facebook, which, of course, didn't even exist back when sec section two thirty was created and was still a start up.
00:15:04 [Speaker 1]
Little start up, 2006, 2007 eventually became a, a multi trillion dollar company.
00:15:11 [Speaker 1]
Is the Facebook model, the story of Facebook, really the story of how the Internet went wrong, Olivier?


00:15:20 [Speaker 2]
It's it's not it's not just Facebook.
00:15:23 [Speaker 2]
I think Facebook is an easy target.
00:15:24 [Speaker 2]
Right?
00:15:25 [Speaker 2]
Zuckerberg just testified in one of these trials, these in New Mexico.
00:15:29 [Speaker 2]
Right?


00:15:30 [Speaker 2]
And and I think a lot of people like using, Facebook as a punching bag as I think it's a fair thing for what it's but but, again, I I wanna repeat.
00:15:38 [Speaker 2]
I I don't part of me doesn't really blame these entrepreneurs.
00:15:41 [Speaker 2]
They're just seizing the opportunity.
00:15:44 [Speaker 2]
They're using protections under law that no one else gets the benefit of.
00:15:48 [Speaker 2]
The only other industry that has a similar protection is the gun industry.


00:15:53 [Speaker 2]
No other company can have its business practices shielded from public scrutiny.
00:15:59 [Speaker 2]
I don't you know, listen.
00:16:01 [Speaker 2]
I'd rather these companies like Facebook not, exploit their dominant market positions in ways that are harmful to society, the way that I evaluate it.
00:16:15 [Speaker 2]
But I I don't, you know, I don't reproach them for having chosen to take the strategy only because law hasn't given them any solace.
00:16:23 [Speaker 2]
Right?


00:16:24 [Speaker 2]
I mean I mean I mean, it hasn't given them any concern.
00:16:27 [Speaker 2]
The the law to the contrary has given them solace and protected them from inquiry.
00:16:31 [Speaker 2]
Let me give you an example.
00:16:32 [Speaker 2]
Right?
00:16:32 [Speaker 2]
So I so Facebook, yes, is an easy is an easy punching bag.


00:16:35 [Speaker 2]
I also talk about design features at companies that match people irrespective of the user's express preference, just a randomized chat room.
00:16:46 [Speaker 2]
There's an there are companies that have been doing this left and right.
00:16:49 [Speaker 2]
It's at the heart of the story about user generated content, right, where people get to meet people and other people and get to collaborate, etcetera.
00:16:58 [Speaker 2]
Dating apps, similar sort of thing.
00:17:00 [Speaker 2]
But sometimes these matches are dangerous.


00:17:03 [Speaker 2]
They invite they might involve sex predators, sexual predators.
00:17:07 [Speaker 2]
They might involve, the transaction of, you know, the the, transacting drugs, for that matter.
00:17:15 [Speaker 2]
I talk a little bit about that in in the book.
00:17:17 [Speaker 2]
So it's it's it's, it's the business model that remains elusive beyond public scrutiny because of statutory protections and the story, the happy story about how these are platforms for innovation and free speech.


00:17:31 [Speaker 1]
Olivier, I I don't think anyone would disagree that there is a problem now with online addiction, misinformation, bullying.
00:17:41 [Speaker 1]
Every week, there's a new book, and we do we've done many shows.
00:17:45 [Speaker 1]
The newspaper headlines are full of these stories at the moment.
00:17:48 [Speaker 1]
I think Google's being sued by someone who claimed that YouTube encouraged him or in he he committed suicide.
00:17:55 [Speaker 1]
So he's they're being sued by the family of someone who committed suicide through watching too much or being addicted to YouTube.


00:18:03 [Speaker 1]
How you you talked about the business model.
00:18:06 [Speaker 1]
How does this new business model, the Web two point zero business model of Facebook and Google and so many other companies, How has it created these cycles of addiction, misinformation, and self harm?
00:18:22 [Speaker 1]
Is is there a clear connection between the two?
00:18:25 [Speaker 1]
And how how is that all connected with section two thirty?


00:18:31 [Speaker 2]
Right.
00:18:31 [Speaker 2]
And the first amendment.
00:18:32 [Speaker 2]
So I I don't know if you're referring to the case that was filed just a couple days ago, not addressed to YouTube, but to Gemini.
00:18:40 [Speaker 2]
Right?
00:18:40 [Speaker 2]
They're so


00:18:41 [Speaker 1]
you Yeah.
00:18:42 [Speaker 1]
I I I apologize.
00:18:43 [Speaker 1]
There's another, YouTube case that, is, YouTube and Facebook are being sued for social media addiction.
00:18:53 [Speaker 1]
This is Yeah.


00:18:53 [Speaker 2]
And I I wanna talk I wanna answer your point about addiction and self harm and and misinformation, but I I do wanna the the these chatbot cases are and they're not just chatbot case cases.
00:19:03 [Speaker 2]
These large language model cases are really interesting and important.
00:19:07 [Speaker 2]
And the one that was just filed, the one that I I think took one of the ones which we're just referring where a young man commits suicide, the the the facts are harrowing.
00:19:16 [Speaker 2]
This young man who is already, I think, inclined to who's already vulnerable, develops an intimate relationship with his Gemini account, basically.
00:19:26 [Speaker 2]
And the Gemini account tells him to commit suicide.


00:19:28 [Speaker 2]
But before it does that, it sets out a plan for killing other people or killing and finding a body for this this, this this this this being that's been developed by by the the poor young man.
00:19:43 [Speaker 2]
And so the so the case is gonna be about whether or not Google can be held responsible for this.
00:19:49 [Speaker 2]
Now this gets to your point.
00:19:51 [Speaker 2]
How is it that two thirty section two thirty doctrine and the first amendment is related to things like self harm, suicide, and disinformation as examples.
00:20:01 [Speaker 2]
And the point I make is that, we


00:20:05 [Speaker 1]
do


00:20:05 [Speaker 2]
we have, for a long time, not been able to answer that question because the companies have invoked protections under section two thirty and and the first amendment.
00:20:14 [Speaker 2]
And for the first time of the past few years, that corner is we we've turned that corner.
00:20:19 [Speaker 2]
Judges are now becoming more skeptical about the claim because what's at issue really isn't the content that a user is sharing with the platform or with other people.
00:20:31 [Speaker 2]
It is the way in which the companies are engineering the experiences.
00:20:35 [Speaker 2]
And one of the ways they do that is by, as you know, as well as I do that and others know for sure is to hold their attention.


00:20:42 [Speaker 2]
There's a lot of social science about the relationship between the design features, say, the infinite scroll or, autoplay where a video plays automatically.
00:20:55 [Speaker 2]
These these are design features that companies have control over.
00:20:58 [Speaker 2]
They know when they run and when they don't because they know they hold people's attention.
00:21:03 [Speaker 2]
As it turns out, people are very drawn to alarming things, and and they have for a long time.
00:21:09 [Speaker 2]
Right?


00:21:09 [Speaker 2]
It's not the first time that that people are drawn to head to to headlines.
00:21:12 [Speaker 2]
To clickbait is a is a is it draws from a phenomenon that's deep and long.
00:21:17 [Speaker 2]
But they do this at such a high level, and and, with an interest in optimizing their their their their consumers' attention, holding their consumers' attention almost irrespective of the content.
00:21:30 [Speaker 2]
Let let's add moreover that over the past couple of years, the the claim that has surfaced is that what they're just trying to promote is a free speech environment.
00:21:41 [Speaker 2]
They're concerned about wokeness, about moderation techniques that keep out bad content.


00:21:49 [Speaker 2]
Meanwhile, the concern isn't wokeness so much as design features that compel people to stay, irrespective of what that content might be.


00:22:00 [Speaker 1]
And, of course, all this goes back to your argument about business models because what people call the attention economy is, quote, unquote, the attention economy because these companies make money through attention, through the sale of advertising.
00:22:14 [Speaker 1]
So the longer people stay on the networks, the more money they make.
00:22:19 [Speaker 1]
I should tell you, Olivier, as I warned you, that my wife actually is head of litigation at Google.
00:22:25 [Speaker 1]
So I hear the arguments from the other side on this, but, you know, I'm stuck in between.
00:22:32 [Speaker 1]
I have some sympathy with some of the things you're saying.


00:22:35 [Speaker 1]
And I seem to remember you talk about Section 230 being under threat.
00:22:43 [Speaker 1]
But a couple of years ago, she and Google went to the Supreme Court over Section 230, and they came out of it claiming victory.
00:22:53 [Speaker 1]
What's your interpretation of the current state of play legally when it comes to Section 230 and that recent Supreme Court case?
00:23:02 [Speaker 1]
I think it was a couple of years ago.


00:23:04 [Speaker 2]
Yeah.
00:23:04 [Speaker 2]
The Gonzalez versus Google case.


00:23:05 [Speaker 1]
The Gonzalez case.
00:23:07 [Speaker 1]
Yeah.
00:23:07 [Speaker 1]
So she managed that case.
00:23:08 [Speaker 1]
So I would hear about Gonzalez all day and night.


00:23:12 [Speaker 2]
I wish she were here.
00:23:14 [Speaker 2]
Maybe you don't, but I do wish she were here.
00:23:17 [Speaker 2]
I would love to hear her argument.
00:23:19 [Speaker 2]
But let's be clear about what happened in Gonzalez versus Google.
00:23:22 [Speaker 2]
I don't take that as the kind of victory you do, for Google.


00:23:28 [Speaker 1]
Well, they claim it.
00:23:29 [Speaker 1]
I'm just a broadcaster.


00:23:31 [Speaker 2]
No.
00:23:31 [Speaker 2]
They definitely walk away better off after that opinion than before, and that's because the Supreme Court there says that the Anti-Terrorism Act claim couldn't proceed against Google.
00:23:45 [Speaker 2]
Google is a defendant in a case brought by a family, as are Twitter and a handful of other platforms, for what YouTube is doing: matching consumers based on their interests, and here matching people with terrorists, with members of ISIS.


00:24:03 [Speaker 1]
It was a case, I think, brought by a family in France, in reaction to someone who lost their life in a terrorist bombing in Paris.


00:24:17 [Speaker 2]
That's right.
00:24:17 [Speaker 2]
That's right.
00:24:18 [Speaker 2]
And, by the way, there are a handful of these cases brought against companies, Facebook among them, too.
00:24:23 [Speaker 2]
And so there's an Anti-Terrorism Act claim by the plaintiffs.
00:24:28 [Speaker 2]
The company invokes Section 230 in defense.


00:24:31 [Speaker 2]
It says you can't bring this case because it's based on user-generated content.
00:24:34 [Speaker 2]
We didn't produce this content.
00:24:36 [Speaker 2]
Others did.
00:24:37 [Speaker 2]
Section 230 is supposed to insulate us from liability.
00:24:40 [Speaker 2]
The Ninth Circuit, actually, which has been forgiving, pretty generous to the companies for a long time, says that some claims can proceed against Google and some can't because of Section 230.


00:24:53 [Speaker 2]
But the revenue-sharing claims, for example, claims that the company is benefiting from the kind of content that brings people on, including terrorist content, can be the subject of a lawsuit.
00:25:05 [Speaker 2]
This goes to the Supreme Court, and the Supreme Court is so baffled by the Section 230 defense.
00:25:10 [Speaker 2]
It doesn't know what to do with it.
00:25:12 [Speaker 2]
The oral argument is a complete disaster for the plaintiffs and the defendants.
00:25:17 [Speaker 2]
The court decides not to decide the Section 230 issue, Andrew.


00:25:22 [Speaker 2]
The court decides the merits of the Anti-Terrorism Act claim, and I think that's a victory.
00:25:28 [Speaker 2]
Why is that a victory? It's a victory for law.
00:25:30 [Speaker 2]
It's a victory for law because Section 230 has not allowed courts to do that inquiry.
00:25:40 [Speaker 2]
Can YouTube be held responsible for making recommendations that may have led to a terrorist attack?
00:25:48 [Speaker 2]
As it turns out, there weren't enough facts to substantiate it, but now we can return to the addiction cases.


00:25:53 [Speaker 2]
To what extent is Facebook or TikTok responsible for suicidal ideation, for self-harm?
00:26:01 [Speaker 2]
We are now inquiring.
00:26:02 [Speaker 2]
We're now scrutinizing that question because a Section 230 defense was rejected by the New Mexico court and the California state court.


00:26:10 [Speaker 1]
Because there's a trial going on down in Los Angeles at this point.
00:26:14 [Speaker 1]
Let's try to escape a little of the inside-the-Beltway legal stuff, Olivier, and talk about how, and again I'm quoting the subtitle of the book,
00:26:28 [Speaker 1]
How we can get it back.
00:26:30 [Speaker 1]
So okay.
00:26:31 [Speaker 1]
I don't think anyone would debate that big tech now is in control of the Internet.


00:26:35 [Speaker 1]
So is your argument in Reclaiming the Internet that the only way we can take it back is by changing Section 230?
00:26:41 [Speaker 1]
Is this essentially I mean, you're a lawyer, so I guess every struggle is a legal struggle.
00:26:46 [Speaker 1]
But is it a legal struggle?
00:26:48 [Speaker 1]
Is it is the fate of the Internet ultimately in the courts?


00:26:54 [Speaker 2]
No.
00:26:55 [Speaker 2]
You know, I was something before I was a lawyer, and I lived in the cultural space of things.
00:27:11 [Speaker 2]
I believe that what the heart of the book means to do is move us off of the story that these are platforms where you generate content, that these are platforms for free speech.
00:27:21 [Speaker 2]
These are not platforms; these are services that deliver content pursuant to their bottom line.
00:27:29 [Speaker 2]
As it turns out, many people love the stuff they get, to be sure, but we should not be seeing them as platforms.


00:27:34 [Speaker 2]
So this is the counterstory.
00:27:36 [Speaker 2]
This is why I talked about stories in the nineteen nineties, and this is the counterstory.
00:27:40 [Speaker 2]
There are other regulatory fixes beyond reforming Section 230.
00:27:45 [Speaker 2]
I absolutely think we need to do something about that.
00:27:48 [Speaker 2]
But what I think we need to do most, what policymakers need to do most, is what they haven't done in the US but have done elsewhere.


00:27:55 [Speaker 2]
And that is attend to the business model.
00:27:57 [Speaker 2]
The incentive to collect information is what drives most consumer-facing services online.
00:28:04 [Speaker 2]
And the best way to attend to that is to have data protection and privacy law.
00:28:08 [Speaker 2]
So, among other things, the book argues for data minimization and purpose limitations.
00:28:14 [Speaker 2]
These are rules that would only allow companies to collect information that's consistent with the purposes someone signs up for.


00:28:22 [Speaker 2]
Not to monetize and use that information for reasons that are opaque or not understood by most consumers.
00:28:28 [Speaker 2]
I think that would do a lot to dampen the incentive to distribute the kind of information you asked about a moment ago.
00:28:34 [Speaker 2]
It wouldn't do away with it altogether, but it would be a substantial step in the right direction.


00:28:40 [Speaker 1]
You say these are not platforms.
00:28:42 [Speaker 1]
Your book comes with some very nice blurbs from a number of old friends of the show, people I know quite well, Julia Angwin, Tim Wu in particular.
00:28:51 [Speaker 1]
He was on the show a few weeks ago.
00:28:53 [Speaker 1]
He's been on several times.
00:28:55 [Speaker 1]
He's indeed written whole books on platforms.


00:28:57 [Speaker 1]
What do you mean, Olivier, when you say these are not platforms?


00:29:02 [Speaker 2]
Well, platform is a metaphor as you know.


00:29:05 [Speaker 1]
Well, I'm not sure if Tim means it metaphorically.
00:29:07 [Speaker 1]
I think he means it literally.


00:29:09 [Speaker 2]
Oh, Tim and I might have different views about this, but he would have to be in the room to talk about it.
00:29:15 [Speaker 2]
When people talk about platforms, I think they imagine that these are kinds of conduits.
00:29:21 [Speaker 2]
Right?
00:29:21 [Speaker 2]
They enable and facilitate connection between users.
00:29:25 [Speaker 2]
That's the conventional account, I think, of what these companies do.


00:29:28 [Speaker 2]
I think that's wrong.
00:29:30 [Speaker 2]
I think it's incomplete at least because they are engineering the experiences that people have.


00:29:36 [Speaker 1]
So you you talk about laws.
00:29:39 [Speaker 1]
Often, in these kinds of conversations, people talk about what the Europeans are doing and use that as a model for the US.
00:29:47 [Speaker 1]
Are you arguing in Reclaiming the Internet that the European approach to regulation is a better, fairer one than the US approach, which tends to be more libertarian?


00:29:59 [Speaker 2]
I have to be careful here, because I do like data protection law, but I don't love a rights-driven model.
00:30:07 [Speaker 2]
Europe has adopted that.
00:30:11 [Speaker 2]
The General Data Protection Regulation has a bunch of important user rights, consumer rights, including the right to be forgotten, which is the most romantic and well known.
00:30:22 [Speaker 2]
California has adopted the same.
00:30:23 [Speaker 2]
I would rather, however, focus on what the Europeans have established: restraints on what businesses can do irrespective of what consumer rights are.


00:30:35 [Speaker 2]
And that is, again, tied to the collection and use of information for specific purposes that are set out and that consumers want.
00:30:43 [Speaker 2]
I think they've gotten that part right.
00:30:45 [Speaker 2]
What I worry about in Europe, and this will sound funny coming from me given what I've said about the First Amendment, is that Europe, as you know better than I, does not have a First Amendment.
00:30:56 [Speaker 2]
There isn't the same commitment to an open and vibrant public square.
00:31:01 [Speaker 2]
You haven't heard from me any resistance to that.


00:31:04 [Speaker 2]
I believe deeply in an open and vibrant information environment.
00:31:09 [Speaker 2]
I worry, however, that we've conflated that romantic story with the industrial designs that these companies have on consumer attention.


00:31:21 [Speaker 1]
So when we have these kinds of conversations, if people aren't doing the geography of Europe versus the US, they often talk about alternative architectures.
00:31:34 [Speaker 1]
Web3 was in vogue until a few years ago, before AI took off.
00:31:41 [Speaker 1]
What do you make of the decentralization arguments of Web3 and of the crypto arguments?
00:31:48 [Speaker 1]
Are they also not really about reclaiming the Internet?
00:31:54 [Speaker 1]
Is that just more idealism, or self-interest dressed up as libertarian idealism?


00:32:05 [Speaker 2]
Yes.
00:32:06 [Speaker 2]
Yes.
00:32:06 [Speaker 2]
That's how I feel.
00:32:08 [Speaker 2]
I think a reimagined regulatory environment is far more primed to protect consumers from harm.
00:32:18 [Speaker 2]
We don't have that.


00:32:20 [Speaker 2]
We have a story about how users are making choices for themselves about how they want to engage the Internet, whether through decentralized finance or other Web3 applications.
00:32:36 [Speaker 2]
And I think that's a mistake.
00:32:38 [Speaker 2]
I think that competition authorities and consumer protection authorities should be at the ready to attend to the specific designs that these companies have, irrespective of consumer harm.
00:32:53 [Speaker 2]
That requires a rejiggering of what we think of when we talk about an Internet company, whether we're talking about Web 3.0 or any stage of the Internet.


00:33:06 [Speaker 1]
Of course, what comes after Web3 is AI.
00:33:09 [Speaker 1]
Let's end, Olivier, with some very concrete analysis from you on what you wanna see with AI and what you don't wanna see.
00:33:17 [Speaker 1]
You've talked about business models as being the core of the problem and the essential way of reforming the Internet and giving it back to us.
00:33:25 [Speaker 1]
And please correct me if I'm wrong.
00:33:27 [Speaker 1]
There seem to be two main business models with AI, at least from a consumer point of view.


00:33:33 [Speaker 1]
There's the Anthropic model, which suggests you have to pay to access this technology.
00:33:40 [Speaker 1]
And there's the OpenAI model, which is more ambivalent about payment and is using some of the traditional Internet language, suggesting they wanna give everyone access to this technology, so they might add advertising.
00:33:55 [Speaker 1]
So you can have free AI, but it may go with advertising.
00:34:01 [Speaker 1]
This new Internet tends to be more and more of an AI-centric network.
00:34:06 [Speaker 1]
I'm not sure even if the word Internet is appropriate anymore.


00:34:10 [Speaker 1]
Which of these business models do you think we should encourage, and perhaps even regulate: the Anthropic model of paying for our technology, or the OpenAI model of another advertising-centric data model?


00:34:28 [Speaker 2]
Well, can I say both?
00:34:30 [Speaker 2]
I mean, both of them are vulnerable to abuse and exploitation.
00:34:36 [Speaker 2]
Anthropic has arguably been the most mindful about, you know, its public-mindedness.
00:34:45 [Speaker 2]
But at least it has performed it that way, and I'm convinced that they're thinking more about it.
00:34:52 [Speaker 2]
But, these are both businesses, and these are two different business models.


00:34:57 [Speaker 2]
I'm glad you mentioned OpenAI's turn to monetization.
00:35:00 [Speaker 2]
Right?
00:35:00 [Speaker 2]
I mean, they've announced over the past couple of months that they're gonna start thinking about ways to pair ads with the prompts that people offer up.
00:35:13 [Speaker 2]
I worry that the same things that were percolating in the nineteen nineties with regard to the budding Internet are popping up now.
00:35:22 [Speaker 2]
And you see it in the president's executive order involving AI from last December, where he says he wants to promote a vibrant information environment that promotes innovation and user control.


00:35:42 [Speaker 2]
And note that Character Technologies, a company that's owned by Google, runs Character AI, a chatbot service.
00:35:51 [Speaker 2]
In its defense in a lawsuit, a well-known, kind of celebrity case, brought by a family whose teenage son committed suicide after using Character AI,
00:36:02 [Speaker 2]
the company, Google and Character Tech, invoked the First Amendment to say that they should not be held accountable for the harm, because there is expressive activity that ought to be protected under the First Amendment.
00:36:19 [Speaker 2]
This is the error: the logic they're invoking.
00:36:25 [Speaker 2]
That is the error that was made in the nineteen nineties.


00:36:27 [Speaker 2]
These are not companies that traffic in speech.
00:36:31 [Speaker 2]
These are not platforms or services that facilitate speech.
00:36:35 [Speaker 2]
These are companies that are out to satisfy their bottom line, as they ought, but there are no guardrails.
00:36:45 [Speaker 2]
We're looking for guardrails.
00:36:47 [Speaker 2]
And so I feel the same way about AI and its various consumer-facing applications as I feel we should have been thinking about the Internet in the nineteen nineties and early two thousands.


00:37:02 [Speaker 2]
So, I hope we drop this happy talk about free speech and innovation.
00:37:07 [Speaker 2]
I mean, I have to admit, though, the added layer in today's discourse about AI is also about competitiveness with China.
00:37:14 [Speaker 2]
But this also sounds in concerns that are orthogonal, unrelated to the potential harm to consumers.
00:37:22 [Speaker 2]
So, if I have advice, it's that we back off that and just talk about the decisions that these companies made to deploy services even before they were ready.
00:37:35 [Speaker 2]
Right?


00:37:35 [Speaker 2]
I mean, there's a kind of recklessness in making an application available when you know that there are likely to be neurodivergent people that will use it and are vulnerable, and may commit suicide.
00:37:44 [Speaker 2]
This is what we're learning in some of these cases right now.
00:37:47 [Speaker 2]
So


00:37:49 [Speaker 1]
Just to wear my big tech hat for a minute, what are these big tech companies supposed to do?
00:37:54 [Speaker 1]
They have Gemini or Claude or ChatGPT.
00:38:00 [Speaker 1]
Are they supposed to run tests so that they wouldn't allow anyone with any kind of history of mental illness to use their products, or should they always be accountable?
00:38:13 [Speaker 1]
Independent of their intentions.
00:38:20 [Speaker 1]
I mean, should they be much more careful about who gets to use their product?


00:38:26 [Speaker 1]
Ultimately, isn't that what this accountability is all about?


00:38:30 [Speaker 2]
I think the law should require it.
00:38:32 [Speaker 2]
Right?
00:38:32 [Speaker 2]
So, you know, part of what you've heard from me is that I don't totally blame them for doing the things they've done, because the law has not constrained them.
00:38:42 [Speaker 2]
But, you know, they're not gonna hire someone like me.
00:38:45 [Speaker 2]
I have very poor business judgment.


00:38:46 [Speaker 2]
So, you know, I would have been the one that said, are you sure this is the time to deploy this thing?
00:38:51 [Speaker 2]
As it turns out, Andrew, there are documents that we're learning about in these litigations that suggest there were doubts at these companies.
00:38:58 [Speaker 2]
Even Google, before Character Technologies launched Character AI, had internal reporting that said there is danger in releasing this now.
00:39:08 [Speaker 2]
We need more testing.
00:39:09 [Speaker 2]
We need more protections.


00:39:11 [Speaker 2]
They did it anyway in a race to beat each other.
00:39:15 [Speaker 2]
That's unhealthy.
00:39:17 [Speaker 2]
That's getting things backward.
00:39:19 [Speaker 2]
I don't think anyone doubts that AI is gonna transform the world and potentially could deliver great opportunities for all of us.
00:39:27 [Speaker 2]
But to deploy services without caution and care, under law, we call that reckless.


00:39:34 [Speaker 2]
And as a result, you're seeing some real people get hurt.


00:39:38 [Speaker 1]
Sounds like some of Ralph Nader's arguments about the car industry in the nineteen sixties.
00:39:44 [Speaker 1]
Finally, Olivier, you're talking in your book about how we can get the Internet back.
00:39:52 [Speaker 1]
You know, we have these conversations, like I said, all the time, and it seems every conversation we have is about getting control back.
00:40:00 [Speaker 1]
And yet every week, these big companies become bigger and bigger, more and more powerful.
00:40:05 [Speaker 1]
Anthropic and OpenAI are almost trillion dollar private companies, the largest, most valuable private companies in history.


00:40:13 [Speaker 1]
Google is a three- or four-trillion-dollar company.
00:40:16 [Speaker 1]
How optimistic are you that we can take it back?


00:40:23 [Speaker 2]
You know, I leave the book with a prescription for how we might take it back.
00:40:30 [Speaker 2]
But I think, you know, there are reasons to be skeptical that this is possible in the near term.
00:40:36 [Speaker 2]
You didn't say that, but there are reasons to be skeptical.
00:40:40 [Speaker 2]
On the other hand, there are reasons to be hopeful.
00:40:43 [Speaker 2]
There is an emergent consensus, a bipartisan consensus, that something needs to happen.


00:40:51 [Speaker 2]
And you see the most traction with regards to harm to children.
00:40:55 [Speaker 2]
Now listen.
00:40:55 [Speaker 2]
I love that we are protecting children.
00:40:57 [Speaker 2]
I think it's a little unfortunate that it's only been a concern about protecting children, because I think grown-ups, adults, are vulnerable as well.
00:41:04 [Speaker 2]
This recent case involving Gemini underscores the point.


00:41:08 [Speaker 2]
But I do think that there are possibilities for reform, and I hold out hope even though, you know, the things we see day to day suggest otherwise.


00:41:19 [Speaker 1]
Of course, that tragic Gemini case involved a 36-year-old.
00:41:25 [Speaker 1]
Well, there you have it.
00:41:26 [Speaker 1]
Interesting conversation with Olivier Sylvain, from Fordham and Columbia Law School, about Reclaiming the Internet: How Big Tech Took Control and How We Can Take It Back.
00:41:37 [Speaker 1]
Thank you so much, Olivier, for such an open-minded and good-spirited conversation.
00:41:43 [Speaker 1]
I really appreciate it.


00:41:44 [Speaker 2]
I'm grateful that you've had me, Andrew.
00:41:46 [Speaker 2]
I really love the questions.
00:41:47 [Speaker 2]
Thanks very much for having me.