April 11, 2026

Slippery Sam, Devious Dario, Honest Hassabis: Blowing Up Silicon Valley’s Cult of Personality


“The media has its own agenda, completely separate from anything going on in the real world, creating the story themselves.” — Keith Teare

Last night, somebody hurled a Molotov cocktail at Sam Altman’s Pacific Heights mansion. I live a couple of hills over, but heard nothing. Meanwhile, the New Yorker hurled its own explosive cocktail at Sam, publishing a 15,000-word hit piece rhetorically entitled “Sam Altman May Control Our Future. Can He Be Trusted?” No, of course, he can’t be trusted. Not according to the New Yorker. Especially with something as precious as, gasp, our future.

Not everyone, however, is sold on this media cult of personality. In his That Was The Week editorial, Keith Teare tells the media to take their hands off Sam. I don’t disagree. Although I’m a bit skeptical of Keith’s attempt to demonize what he defines as a “devious” Dario Amodei. Whether it’s Altman, Amodei or Google’s AI honcho Demis Hassabis, all these guys are prisoners of their company’s structures and cultures. They are also victims of today’s anti-tech hysteria. It’s one thing to blow up Silicon Valley’s cartoonish cult of personality, it’s quite another to hurl bombs at these people’s homes. Enough with all the violence – verbal or otherwise. It never ends well.

Five Takeaways

A Molotov Cocktail at Slippery Sam’s House: On Friday night, someone hurled a Molotov cocktail at Sam Altman’s Pacific Heights mansion, according to The New York Times. Andrew lives nearby and didn’t hear it. The week’s zeitgeist had already turned: a 15,000-word New Yorker hit piece by Ronan Farrow and Andrew Marantz, wall-to-wall coverage, Sam moving into Musk-like media-frenzy territory. Keith’s editorial: Hands Off Sam Altman. The personality-driven circus has caught fire. Quite literally.

Anthropic’s Mythic Model Finds Decade-Old Vulnerabilities: The actual AI news this week, drowned out by the personality circus. Anthropic’s new “Mythic” model autonomously discovered security holes in software that had eluded human experts for years. Dario refused to release it openly until the patches were complete. Treasury Secretary Bessent commented on the implications for banks and government. The signal: AI is becoming systematically better than the best humans at specialist domains. Generalists can probably relax.

Slippery Sam vs Devious Dario vs Honest Hassabis: Keith’s contrarian take: Altman is honest because he’s openly dishonest. Amodei is the devious one — a politically liberal narrative wrapped around a commercial juggernaut. Andrew’s third way is yesterday’s Mallaby interview: Demis Hassabis, the Spinozan one-faced scientist who would rather be at Princeton. But even Demis must have authorised the firing of Mustafa Suleyman. Everyone has a game plan, said Mike Tyson, until they get punched in the face.

Post of the Week: Keith Replaces WordPress in Ten Minutes: Keith’s tweet: he’s run two curation sites — seriouslyphotography.com and seriouslybc.com — on WordPress for over a decade. Last Friday afternoon, he asked Anthropic’s tools to rewrite them. Ten minutes later, both sites were rebuilt from scratch, fully responsive, WordPress gone. Cost in the old world: tens of thousands of dollars and several months. The Matt Mullenweg vs Matthew Prince debate is settled by the actual technology while the principals are still arguing.

The End of Ownership? Keith Goes Marxist: Pure capitalism, Keith argues, will produce so much abundance that scarcity ends and self-interested competition with it. “In the future there will be no ownership, or everything will be commonly owned.” Andrew calls it Marx with Tesla characteristics. Eric Ries’s forthcoming Incorruptible argues that Patagonia and Mondragon point a different way — structural ethics rather than abundance utopianism. Two visions of the post-AI economy. Both probably wrong. We’ll find out.

About the Guest

Sebastian Mallaby is the Paul A. Volcker senior fellow for international economics at the Council on Foreign Relations. A former Washington Post columnist and Economist contributing editor, he is the author of More Money Than God, The Man Who Knew (winner of the FT and McKinsey Business Book of the Year), The Power Law, and now The Infinity Machine: Demis Hassabis, DeepMind, and the Quest for Superintelligence.

References:

The Infinity Machine: Demis Hassabis, DeepMind, and the Quest for Superintelligence by Sebastian Mallaby.

• Episode 2862: Truth Is Dead — Steven Rosenbaum on AI as a spectacularly good liar. Mallaby’s quiet counter-argument.

• Episode 2860: We Shape Our AI, Thereafter It Shapes Us — Keith Teare on agency in our agentic age. Hassabis thinks he can still steer.

About Keen On America

Nobody asks more awkward questions than the Anglo-American writer and filmmaker Andrew Keen. In Keen On America, Andrew brings his pointed Transatlantic wit to making sense of the United States — hosting daily interviews about the history and future of this now venerable Republic. With nearly 2,800 episodes since the show launched on TechCrunch in 2010, Keen On America is the most prolific intellectual interview show in the history of podcasting.

Website

Substack

YouTube

Apple Podcasts

Spotify

Chapters:

  • (00:31) - A Molotov cocktail at Sam Altman’s Pacific Heights house
  • (02:41) - The New Yorker hit piece: Ronan Farrow, Andrew Marantz, 15,000 words
  • (05:36) - Slippery Sam and the zeitgeist
  • (07:39) - Brian Merchant: it’s open season for refusing AI
  • (08:09) - Anthropic’s Mythic model finds decade-old vulnerabilities
  • (10:46) - Why even release it? Dario’s narcissism
  • (12:12) - Slippery Sam vs Devious Dario
  • (14:11) - Hassabis as the third way
  • (18:29) - The Mustafa Suleiman question
  • (19:17) - Mike Tyson, Kant, Spinoza, and Hobbes
  • (22:09) - Brian Merchant and the new Luddism
  • (23:34) - Anthropic makes a new generation redundant every week
  • (23:34) - Post of the week: Keith rebuilds his sites in 10 minutes
  • (26:39) - Eric Ries on incorruptible companies
  • (30:12) - Patagonia, Berkeley Bowl, Mondragon
  • (35:43) - The end of ownership? Keith goes Marxist


00:00:01 [Speaker 1]
Hello, everybody.
00:00:02 [Speaker 1]
It's Saturday, April 11, 2026, at least in San Francisco, where I'm talking from.
00:00:09 [Speaker 1]
The news in San Francisco is that yesterday, a Molotov cocktail was hurled at the home of OpenAI CEO Sam Altman according to The New York Times.
00:00:22 [Speaker 1]
He doesn't live very far away from me.
00:00:24 [Speaker 1]
I didn't hear it or see it, but I assume it's true.


00:00:28 [Speaker 1]
Meanwhile, Keith Teare's That Was the Week editorial has a very different image.
00:00:35 [Speaker 1]
It's of Sam Altman being escorted by police, or at least people who look to be police, but they're actually wearing the badge not of the NYPD but of the New Yorker.
00:00:51 [Speaker 1]
It's an interesting fake video, the kind Keith is an expert in making, and it forms the basis of his editorial this week.
00:01:02 [Speaker 1]
Hands off Sam Altman.
00:01:04 [Speaker 1]
It's very much in response to a piece that the New Yorker wrote this week on Sam called Sam Altman May Control Our Future.


00:01:12 [Speaker 1]
Can he be trusted?
00:01:13 [Speaker 1]
Keith, I hope you didn't hurl that Molotov cocktail at Sam's house, did you?
00:01:19 [Speaker 1]
Were you up in San Francisco yesterday?


00:01:22 [Speaker 2]
Not me, Andrew.
00:01:23 [Speaker 2]
Not me.
00:01:23 [Speaker 2]
And, if you read the editorial, highly unlikely that


00:01:27 [Speaker 1]
Well, you could be, you know. Some people are very clever: they appear to be defending Sam Altman, but actually they're hurling Molotov cocktails at his house.
00:01:37 [Speaker 1]
Anyway, I trust you on that one.
00:01:39 [Speaker 1]
I don't think that's you.


00:01:40 [Speaker 2]
Isn't there a famous Shakespearean quote?
00:01:42 [Speaker 2]
He came to something me, but then he something something.
00:01:47 [Speaker 2]
Forget what I


00:01:48 [Speaker 1]
That's very useful, Keith.


00:01:49 [Speaker 2]
He came to praise me and or something.


00:01:52 [Speaker 1]
Oh, bury.
00:01:53 [Speaker 1]
Anyway, so back to, back to your friend, Sam.
00:01:58 [Speaker 1]
This video you made of him being arrested in New York City, of all places.
00:02:05 [Speaker 1]
I think it looks like it's down on the Bowery by New Yorker agents.
00:02:09 [Speaker 1]
What's happening here?


00:02:11 [Speaker 2]
So, amazingly, the New Yorker published a very long, rambling piece by two authors.


00:02:21 [Speaker 1]
Well, let's name them at least.
00:02:22 [Speaker 1]
You gotta give them their credit.
00:02:24 [Speaker 1]
Ronan Farrow and Andrew Marantz.
00:02:27 [Speaker 1]
Farrow, of course, doesn't need much of an introduction.
00:02:29 [Speaker 1]
Marantz is one of their business and tech guys.


00:02:35 [Speaker 1]
It is incredibly long.
00:02:36 [Speaker 1]
I mean, it's gotta be 15,000 words.


00:02:40 [Speaker 2]
It's it's long.
00:02:41 [Speaker 2]
It's rambling.
00:02:42 [Speaker 2]
There's nothing new in it.
00:02:44 [Speaker 2]
There's there's absolutely no insight that hasn't been published before.
00:02:48 [Speaker 2]
So it's like an obsessive gossip piece that belongs in, you know, magazines my mom would have read, but in the


00:02:59 [Speaker 1]
your mother, Keith.
00:03:00 [Speaker 1]
You're insulting your mother on this show.


00:03:04 [Speaker 2]
Well, all moms in those days were reading such magazines.
00:03:07 [Speaker 2]
They were called woman and woman's own and things like that.


00:03:14 [Speaker 1]
what did men read?
00:03:15 [Speaker 1]
In those days, they just got drunk and went and beat their wives up while the wives were reading Woman's Own.


00:03:22 [Speaker 2]
Sadly, they did, Andrew.
00:03:23 [Speaker 2]
It's actually true.
00:03:24 [Speaker 2]
They did do that.


00:03:25 [Speaker 1]
That's what happened in the North Of England anyway.
00:03:27 [Speaker 1]
I'm not sure if it was true in, North London where I grew up.
00:03:30 [Speaker 1]
But anyway, that's another story.


00:03:32 [Speaker 2]
But, but what's amazing is the piece got massive coverage and take up.
00:03:39 [Speaker 2]
Rowan was on TV.
00:03:40 [Speaker 2]
I saw him interviewed on CNN.


00:03:42 [Speaker 1]
Ronan, not Rowan.
00:03:43 [Speaker 1]
Sorry.
00:03:43 [Speaker 1]
Ronan.


00:03:44 [Speaker 2]
Ronan was on TV left and right, almost every channel, as if some major insight existed.
00:03:52 [Speaker 2]
And so it was impossible to ignore this article last week.
00:03:54 [Speaker 2]
It was everywhere.
00:03:56 [Speaker 2]
And it it contrasted with equally wall to wall news about anthropic to do with a new model that they introduced.
00:04:05 [Speaker 2]
So once again, and I think this is, like, the fifth week in a row, the headlines are dominated by Dario Amodei and Sam Altman.


00:04:13 [Speaker 2]
And then last night, after this issue had been put to bed, Altman got a Molotov cocktail thrown at his house.
00:04:23 [Speaker 2]
It's almost like we anticipated it.
00:04:26 [Speaker 2]
So


00:04:26 [Speaker 1]
We're not gonna blame the New Yorker for that, are we, Keith?
00:04:32 [Speaker 1]
There was some irate San Franciscan who read the New Yorker and thought, well, this Altman, he's so slippery that I'm gonna have to hurl a Molotov cocktail at his house.
00:04:43 [Speaker 1]
He lives in a big house, so up in Pacific Heights.
00:04:46 [Speaker 1]
So it's not actually hard to hurl it at it because


00:04:50 [Speaker 2]
Yeah.


00:04:51 [Speaker 1]
It's massive.


00:04:52 [Speaker 2]
I would say the New Yorker is just reflecting it. The reason it got taken up is that the zeitgeist around Altman is that he's dodgy, you know,


00:05:06 [Speaker 1]
Slippery Sam as I


00:05:08 [Speaker 2]
Slippery Sam.
00:05:09 [Speaker 2]
And this was just a hit piece, not even a very good hit piece, but it was a hit piece.
00:05:14 [Speaker 2]
And journalists left and right seized on it because they agree with it.
00:05:20 [Speaker 2]
And so you've got, once again, the media with its own agenda, completely separate from anything going on in the real world, creating the story themselves.
00:05:31 [Speaker 2]
And and, you know, the story is Sam Altman, which is a non story in my view.


00:05:35 [Speaker 2]
It's just not a story.


00:05:38 [Speaker 1]
Yeah.
00:05:38 [Speaker 1]
You write in your editorial that Altman is moving into Musk-like territory in terms of media frenzy.
00:05:44 [Speaker 1]
You say that this personality driven circus is mostly a side show.
00:05:48 [Speaker 1]
You don't mention your mother in in the editorial.


00:05:53 [Speaker 2]
No.
00:05:53 [Speaker 2]
I don't.


00:05:53 [Speaker 1]
You can keep that one out of it.
00:05:55 [Speaker 1]
Yeah.
00:05:55 [Speaker 1]
I mean, look.
00:05:56 [Speaker 1]
You and I sometimes disagree on Altman, but I actually agree with you here.
00:06:01 [Speaker 1]
I think it's a media conspiracy.


00:06:05 [Speaker 1]
It's the zeitgeist, and the the media is both the product of the zeitgeist and also creates it.
00:06:12 [Speaker 1]
And the people who write this stuff believe it, and the people who read it believe it.
00:06:18 [Speaker 1]
There is more than a distrust, a growing dislike, of big tech.
00:06:26 [Speaker 1]
I mean, the other piece of news this week is that Bernie Sanders and AOC, Alexandria Ocasio-Cortez, have proposed a moratorium on new AI data centers.
00:06:38 [Speaker 1]
I think that AI data centers are becoming perhaps the political stage on which a lot of this stuff's being built.


00:06:47 [Speaker 1]
Yeah.
00:06:47 [Speaker 1]
Another of the pieces that you have in this week's newsletter is by Brian Merchant.
00:06:54 [Speaker 1]
He's been on my show a couple of times, the author of Blood in the Machine, an excellent book on Luddism, on the Luddite movement both then and now.
00:07:04 [Speaker 1]
Brian entitles his piece "It's Open Season for Refusing AI."


00:07:09 [Speaker 2]
Yeah.
00:07:11 [Speaker 2]
Yeah.
00:07:11 [Speaker 2]
And and, you know, this is not new to this show.
00:07:15 [Speaker 2]
It's been our theme for several weeks in a row now because you just can't escape it.
00:07:20 [Speaker 2]
Last week, we talked about the documentary, the AI doc.


00:07:25 [Speaker 2]
And a couple of weeks before that, you know, we talked about the doomers versus the optimists and indicated that neither really represents the right point of view about AI.
00:07:38 [Speaker 2]
And I think this week, the right message is probably best squeezed out of the Anthropic news.
00:07:46 [Speaker 2]
Anthropic's latest model, which is called Mythic.
00:07:52 [Speaker 2]
I almost wanted to call it Mystic, but it's Mythic.
00:07:58 [Speaker 2]
It eclipses specialist software in cybersecurity and has exposed decades-old vulnerabilities in software that had been unknown to specialists, and it has led to Anthropic not releasing the model openly, because it could be used for cyber intrusion, until they've patched all the vulnerabilities it found.


00:08:24 [Speaker 2]
And that probably is closer to what the news should be.
00:08:29 [Speaker 2]
The news should be about the impact of AI, its growing capability, and how to use it, and what for.


00:08:39 [Speaker 1]
Yeah.
00:08:40 [Speaker 1]
I mean, to be fair, the media isn't just running huge anti-Sam stories.
00:08:46 [Speaker 1]
There's a lot of stuff about the Anthropic release this week.
00:08:52 [Speaker 1]
A lot of stuff about responses from senior administration people, Bessent, for example, on the fear of this new model.
00:09:04 [Speaker 1]
Should we be fearing it?


00:09:05 [Speaker 1]
Should banks be fearing it?
00:09:07 [Speaker 1]
Should the government be fearing it?
00:09:09 [Speaker 1]
I mean, Dario made news over the last few weeks by arguing with the government about how they should and shouldn't use his technology when it comes to defense.
00:09:25 [Speaker 1]
In a way, isn't this news confirmation of Dario's concerns?


00:09:33 [Speaker 2]
Well, you could look at it in two ways.
00:09:36 [Speaker 2]
I think one of the ways is as you just said it, and you'd say yes, because this new model is clearly very capable of going way beyond humans in a domain where some of the most intelligent humans on the planet work.
00:09:54 [Speaker 2]
So it is a signal that AI is gonna be increasingly good at things compared to the best humans in the space.
00:10:03 [Speaker 2]
We've we've talked a lot about how specialists are the most vulnerable to AI, not generalists.
00:10:10 [Speaker 2]
The generalist would say, well, now that this is true, what do we do with it?


00:10:15 [Speaker 2]
And,


00:10:16 [Speaker 1]
And that's you.
00:10:17 [Speaker 1]
And, I mean, you're more of a specialist than me, but I'm the ultimate generalist.


00:10:21 [Speaker 2]
Right.
00:10:21 [Speaker 2]
Well, that that's where Dario probably failed this week because his answer to that question was, don't release it.
00:10:30 [Speaker 2]
Even though if you released it, it could be used to, patch all of those holes.


00:10:35 [Speaker 1]
So sorry.
00:10:35 [Speaker 1]
Just I'm gonna ask you a dumb question, Keith.
00:10:37 [Speaker 1]
I mean, most of my questions, as you know, are dumb.
00:10:42 [Speaker 1]
Why is this even news?
00:10:43 [Speaker 1]
If he had this product, Mythic, that supposedly is so dangerous and can destroy the world economy and all the rest of it, why not just keep it under wraps?


00:10:54 [Speaker 1]
Why even make it a story?


00:10:57 [Speaker 2]
Well, because he can't help himself.
00:11:01 [Speaker 2]
You know, if if one wants to do the cult of personality, Dario is, a little bit of a narcissist.


00:11:08 [Speaker 1]
Oh, now you need to write a 15,000-word piece for the New Yorker on Dario, and then I could have an editorial: Hands Off.


00:11:16 [Speaker 2]
Yeah.
00:11:16 [Speaker 2]
Exactly.
00:11:17 [Speaker 2]
And he's clearly self-interested in attention.
00:11:22 [Speaker 2]
And he wants reason.
00:11:23 [Speaker 2]
And second, he wants his business to be valuable.


00:11:25 [Speaker 2]
He's announced his revenues this week for, I think, the third week in a row.
00:11:30 [Speaker 2]
He's clearly trying to grab attention for


00:11:32 [Speaker 1]
You don't I mean, what is it?
00:11:34 [Speaker 1]
I don't really understand why you don't like him, and you seem to prefer Altman to Amodei.
00:11:41 [Speaker 1]
Why?


00:11:42 [Speaker 2]
I think Altman's honest, and Amodei is the devious one.


00:11:47 [Speaker 1]
Well, Altman's honest because he's openly dishonest.


00:11:50 [Speaker 2]
Yes.
00:11:51 [Speaker 2]
He's he's very transparent.
00:11:53 [Speaker 2]
I think Amodei


00:11:54 [Speaker 1]
The one thing I did like about the New Yorker piece was the imagery.
00:11:59 [Speaker 1]
It has Sam's many faces, and and and that's true.
00:12:02 [Speaker 1]
I mean, everybody knows that there's a happy Sam, a salesman Sam.
00:12:07 [Speaker 1]
Well, there's always a salesman Sam, but also a miserable Sam, a warning Sam, a manipulative Sam.
00:12:13 [Speaker 1]
And you're saying that Amodei is really Sam, except he's trying


00:12:17 [Speaker 2]
he's he's got a different take.
00:12:19 [Speaker 2]
He's coming from a politically liberal canvas.
00:12:25 [Speaker 2]
You know, the narrative about doing good, a little bit like Google, do no evil.
00:12:31 [Speaker 2]
But that lives inside a commercial organization intent on making a lot of money.
00:12:36 [Speaker 2]
So there's a bit of a conflict of interest there.


00:12:39 [Speaker 2]
Whereas Altman is, you know, completely transparent.
00:12:43 [Speaker 2]
There's no guessing.
00:12:45 [Speaker 2]
And Amodei has succeeded in that narrative around himself, which is stunning to me, but probably because the zeitgeist wants to believe it.


00:12:54 [Speaker 1]
Well, I'm just falling into the the trap that you're warning about Oh, yeah.
00:12:58 [Speaker 1]
About, personality driven circus that


00:13:03 [Speaker 2]
Well, I I could be.


00:13:05 [Speaker 1]
By turning Amodei into Altman, you're basically playing the same game as the New Yorker.


00:13:12 [Speaker 2]
I could I could be.
00:13:13 [Speaker 2]
But, to my credit, this week's editorial is saying how awesome Anthropic's software is and how it should be embraced and implemented.
00:13:27 [Speaker 2]
And so whatever I think of Amodei personally, it doesn't extend to my views about Anthropic, and Anthropic, I believe, is fantastic.
00:13:38 [Speaker 2]
Just like OpenAI is in in its way as well.


00:13:41 [Speaker 1]
Yeah.
00:13:41 [Speaker 1]
I mean, there's no doubt.
00:13:42 [Speaker 1]
No one's gonna argue that that's not the case.
00:13:44 [Speaker 1]
I wonder if there's a third way.
00:13:45 [Speaker 1]
Our interview of the week, my Keen On America interview of the week, is with Sebastian Mallaby, who's just written a major new book about Demis Hassabis called The Infinity Machine.


00:13:58 [Speaker 1]
He told me he spent many hours in the top room of a Highgate pub talking to Demis.
00:14:06 [Speaker 1]
And Mallaby, who's a very experienced, very distinguished journalist who has written all sorts of books on venture capital, suggests that Demis is the model, in the sense that he's trying to do good, but he's also managing the reality of the world.
00:14:23 [Speaker 1]
So on the one hand, he's well versed in the ethics of Hobbes, Kant, and Spinoza.
00:14:29 [Speaker 1]
But on the other hand, he recognizes that we don't live in a Kantian world, that he has to operate within Google and within the world of AI.
00:14:38 [Speaker 1]
So Hassabis seems to represent a much more mature version of either Sam or Dario, doesn't he?


00:14:49 [Speaker 1]
I mean, I'm not sure if you've had the chance to look at the interview, but it's a really good interview because it focuses on this problem of personality cults.
00:14:59 [Speaker 1]
I mean, in a way, I guess, Sebastian has fallen into the cult by writing a book about Demis, but all these guys are struggling with historical realities that no single individual can actually deal with.


00:15:18 [Speaker 2]
Yeah.
00:15:18 [Speaker 2]
Demis, I also saw an interview with him this week where he acknowledged that if he had his way, LLMs would not have been released to consumers or businesses, but would have largely stayed inside the laboratory doing things like AlphaFold, which is solving specific problems, and taking time to do it.
00:15:43 [Speaker 2]
And, you know, so he's not really a believer in the distribution of AI to the world.
00:15:51 [Speaker 2]
He's more of a scientist who wants to apply it to specific problems.


00:15:55 [Speaker 1]
Yeah.
00:15:55 [Speaker 1]
And Sebastian said that, in our conversation, Demis said to him lots of times, I really would much rather run a research institute at Princeton.
00:16:08 [Speaker 1]
I don't like all this corporate politics, all these corporate realities and compromises.


00:16:14 [Speaker 2]
Yeah.
00:16:14 [Speaker 2]
So that that's probably a lot to do with him.
00:16:18 [Speaker 2]
That said, at Google, he was taken from DeepMind and made the boss not just of DeepMind, but of Gemini and all the other Google AI stuff.
00:16:30 [Speaker 2]
And that puts him in a position where he wears multiple hats.
00:16:34 [Speaker 2]
One and one of them is to commercialize Gemini.


00:16:37 [Speaker 2]
So he's he's pushing out.
00:16:40 [Speaker 2]
This week, they integrated NotebookLM in with Gemini.
00:16:45 [Speaker 2]
So now if you go to Gemini, there's a Notebooks area, and you could save files there and create projects and so on.
00:16:51 [Speaker 2]
So they're very much building a consumer-facing and business-facing set of tools from AI.
00:16:59 [Speaker 2]
And those things are starting to have an impact on everything, including


00:17:05 [Speaker 1]
And Sebastian also noted that Demis had a very close relationship with Sundar at Google, and that explained his comfort, because the history of DeepMind and Google is rocky.
00:17:19 [Speaker 1]
And in the new book, he also covers that early history, when there were all sorts of rows about what Google could and couldn't do.
00:17:27 [Speaker 1]
And another part of the conversation, which was really interesting, was how Demis wouldn't actually acknowledge it, but the reality is that he must have somehow authorized the firing of Mustafa Suleyman, who now runs AI at Microsoft.
00:17:46 [Speaker 1]
So he does have to compromise, and he does have to operate in the real world.
00:17:53 [Speaker 1]
He has a CEO.


00:17:54 [Speaker 1]
And, as you said, Google is increasingly an AI company.


00:17:59 [Speaker 2]
Yeah.
00:17:59 [Speaker 2]
But it's interesting, isn't it, that there's no narrative around Suleyman and him picked up by the media


00:18:07 [Speaker 1]
I just picked up on it.
00:18:08 [Speaker 1]
Maybe I should write a 15,000 word piece for the New Yorker on that.


00:18:11 [Speaker 2]
Yeah.
00:18:12 [Speaker 2]
Because clearly there are Altman-like maneuvers that must have gone on during that whole period, because he's more of a scientist than a negotiator.
00:18:30 [Speaker 2]
He doesn't get tarred with that brush, whereas Altman does. But actually, in practice, they're both in startups, highly dynamic situations and contexts, having to make decisions probably several times a day where they don't really have the evidence.


00:18:47 [Speaker 1]
And I thought Sebastian did a good job.
00:18:50 [Speaker 1]
The book has been criticized.
00:18:53 [Speaker 1]
There was a very, very hostile review in The Guardian suggesting that Mallaby had written a kind of hagiography, that he wasn't critical enough of Demis.
00:19:06 [Speaker 1]
But I think what Sebastian is acknowledging is that, operating in the real world, you can't just read Kant or Spinoza and operate like that.
00:19:15 [Speaker 1]
We all know that.


00:19:15 [Speaker 1]
You can't. Business is tricky.
00:19:18 [Speaker 1]
It's dirty.
00:19:19 [Speaker 1]
I mean, it's Mike Tyson's famous remark.
00:19:22 [Speaker 1]
Everyone has a game plan till they get punched in the face.


00:19:25 [Speaker 2]
And maybe for our listeners, Andrew, just to explain what a Kantian world looks like in this context.
00:19:31 [Speaker 2]
What does


00:19:32 [Speaker 1]
it mean?
00:19:32 [Speaker 1]
This is a tech podcast rather than a philosophy one.
00:19:36 [Speaker 1]
I'm not an expert on Immanuel Kant, but he was, of course, the great philosopher of the Enlightenment who wrote a book, I think it was the Critique of Pure Reason, which suggested that one could be good and reasonable at the same time.
00:19:53 [Speaker 1]
And, of course, Spinoza, another of the fathers, a Dutch Jewish philosopher, suggested a similar thing.
00:20:02 [Speaker 1]
And and maybe we need to credit them with good intent, but the reality of the world isn't like that.


00:20:09 [Speaker 1]
And one of the things I said to Mallaby is that whilst Dario oh, sorry.
00:20:17 [Speaker 1]
Not Dario.
00:20:18 [Speaker 1]
While Demis is reading Kant and Spinoza, or at least quoting Kant and Spinoza, he probably has also borrowed the logic of Hobbes, the reality of the world, because that's just the nature of things.


00:20:37 [Speaker 2]
Yeah.
00:20:38 [Speaker 2]
Yeah.
00:20:39 [Speaker 2]
I look.
00:20:40 [Speaker 2]
The interesting thing is that those thinkers lived at a time of huge change, and they become increasingly relevant in our time because Relevant or irrelevant?
00:20:51 [Speaker 2]
Relevant.


00:20:52 [Speaker 2]
Yeah.
00:20:53 [Speaker 2]
Because this is a time, in my lifetime at least, and I've seen a lot of change, where change has accelerated to the point where it's a daily occurrence, and you're having to adjust your worldview, you know, certainly every week.
00:21:12 [Speaker 2]
And, you know, new reality implies a new canvas for everybody.


00:21:18 [Speaker 1]
Yeah.
00:21:19 [Speaker 1]
I mean, that's why Brian Merchant's book, Blood in the Machine, which was really about the early nineteenth-century Luddite rebellion against industrial technology, is so relevant, and that's why you linked to his open season for refusing AI piece.
00:21:36 [Speaker 1]
In many ways, history is just repeating itself.


00:21:39 [Speaker 2]
Yeah.
00:21:39 [Speaker 2]
But on a new level, you know. It's interesting because the other thing that happened this week is the Artemis II mission, which by contrast seems a bit of a throwback to the nineteen sixties and seventies.


00:21:55 [Speaker 1]
An attempt.
00:21:56 [Speaker 1]
I I'm not sure.
00:21:57 [Speaker 1]
Take our minds off other stuff in The Middle East, perhaps.


00:22:00 [Speaker 2]
Yeah.
00:22:01 [Speaker 2]
But it definitely doesn't seem to represent the state of the art at all, whereas this AI stuff definitely does.
00:22:10 [Speaker 2]
And the two are gonna converge at some point where AI and and rocketry and space travel, you know, is a single industry.


00:22:20 [Speaker 1]
Yeah.
00:22:20 [Speaker 1]
And a lot of my, a lot of my other shows this week have focused on this.
00:22:25 [Speaker 1]
Did a show with another New York Times writer, Noam Scheiber, who has a new book out called Mutiny: The Rise and Revolt of the College-Educated Working Class, and he's on to something.
00:22:38 [Speaker 1]
I mean, these people go to university, and they spend $100,000 a year on being educated to acquire skills.
00:22:47 [Speaker 1]
They're almost immediately redundant.


00:22:49 [Speaker 1]
I mean, that comes back to the Anthropic news this week.
00:22:55 [Speaker 1]
Every week, Anthropic comes out with a new release that makes a new generation of college-educated people redundant or irrelevant.


00:23:04 [Speaker 2]
Yeah.
00:23:04 [Speaker 2]
I mean, you know, the post of the week this week is a tweet I did myself.
00:23:09 [Speaker 2]
I've run these two websites.
00:23:12 [Speaker 2]
One's called seriouslyphotography.com.
00:23:14 [Speaker 2]
The other one is seriouslybc.com, on WordPress for more than ten years.


00:23:21 [Speaker 2]
And what they do is curate articles, a bit like That Was The Week, in their specific domains.
00:23:26 [Speaker 2]
And at the end of the week last week, on Friday around five in the afternoon, I suddenly thought, you know, these sites suck.
00:23:36 [Speaker 2]
I should upgrade them.
00:23:38 [Speaker 2]
And I got Anthropic to rewrite them, which it did very fast.
00:23:44 [Speaker 2]
And now the two sites, which still exist, are 100% created by code.


00:23:49 [Speaker 2]
It would have cost me tens of thousands of dollars to hire people to do that, and it would have taken a couple of months.
00:23:55 [Speaker 2]
And literally in ten minutes, I had vastly improved websites that are much more responsive than they were before, and got rid of WordPress, which has become a theme, by the way.
00:24:08 [Speaker 2]
Matt Mullenweg of WordPress did a big debate a couple of weeks ago with Matthew Prince of Cloudflare, who claims to have built a WordPress replacement.


00:24:22 [Speaker 1]
Yeah.
00:24:22 [Speaker 1]
And I did an interview this week.
00:24:26 [Speaker 1]
I mean, a lot of this stuff. AI has become a kind of moral code, perhaps, to borrow a word.
00:24:35 [Speaker 1]
A couple of your pieces are about how everyone now is trying to argue that they're not using AI.
00:24:42 [Speaker 1]
So The Verge had a piece: really, you made this without AI?


00:24:45 [Speaker 1]
Prove it.
00:24:45 [Speaker 1]
Of course, you can't.
00:24:46 [Speaker 1]
Although people will probably come up with software which claims that it can figure out whether or not AI is part of it.
00:24:54 [Speaker 1]
And another Hollywood Reporter piece you ran is "Hollywood Assistants Are Using AI Despite Their Better Judgment."
00:25:00 [Speaker 1]
It's become the moral thing in itself.


00:25:02 [Speaker 1]
If you're using AI, supposedly, then you're a bad guy, which is just so childish and absurd to me.


00:25:09 [Speaker 2]
Yeah.
00:25:10 [Speaker 2]
Yeah.
00:25:10 [Speaker 2]
You know, we had friends from the UK visiting this week, and the husband is a teacher, a very good teacher.
00:25:18 [Speaker 2]
And, I think his natural inclination is to suspect the students of using AI for bad purposes.


00:25:27 [Speaker 1]
Understandably in the sense that there's some truth to that.
00:25:30 [Speaker 1]
Right?


00:25:31 [Speaker 2]
Yeah.
00:25:32 [Speaker 2]
And he's one of those teachers that actually reads the students' essays and marks them properly.
00:25:38 [Speaker 2]
And I pointed out to him that probably in the not too distant future, he won't be teaching and he won't be marking.
00:25:45 [Speaker 2]
What he'll probably be doing, if he still has a job, is mentoring and teaching human beings how to be better at being themselves.
00:25:56 [Speaker 2]
And AI will take on many of the tasks that are repeatable, if you will, and where the value doesn't exist.


00:26:06 [Speaker 2]
The value really exists in the human connection.


00:26:09 [Speaker 1]
Yeah.
00:26:09 [Speaker 1]
We are in violent agreement, Keith, unfortunately, on this. I know some of our viewers and listeners like it when we disagree, but I can't disagree with you on a lot of this stuff.
00:26:19 [Speaker 1]
One of the other shows that I recorded last week, which won't go out until next month, is with Eric Ries, who is, of course, the author of The Lean Startup, a very influential figure, and now lives in the North Bay.
00:26:34 [Speaker 1]
He has a new book out called Incorruptible: Why Good Companies Go Bad and How Great Companies Stay Great.
00:26:41 [Speaker 1]
I can't put the interview up until next month, because the book is out then.


00:26:47 [Speaker 1]
But I think you'd like the book, and I think you'll enjoy the interview, and we can discuss it next month when it goes live.
00:26:55 [Speaker 1]
But he makes a structural argument that's very much in your camp: that worrying about these individuals is really beside the point.
00:27:05 [Speaker 1]
Because in Incorruptible, he suggests that why good companies go bad and why great companies stay great has nothing to do with individuals.
00:27:14 [Speaker 1]
It has to do with their structure, their foundation.
00:27:18 [Speaker 1]
So I think it's an important argument to get us away from this personality-driven cult.


00:27:25 [Speaker 2]
Yeah.
00:27:26 [Speaker 2]
Well, look.
00:27:27 [Speaker 2]
Most startups start with a mission.
00:27:29 [Speaker 2]
Some would call it a vision, like Google's don't be evil, or Microsoft's.
00:27:34 [Speaker 2]
Yeah.


00:27:34 [Speaker 1]
I mean, of course, we talked about that in the conversation.


00:27:37 [Speaker 2]
Now, what happens? And Facebook's a great example of this.
00:27:43 [Speaker 2]
Facebook was really about getting to know people on your campus.
00:27:46 [Speaker 2]
Possibly dating them was part of it.
00:27:50 [Speaker 2]
But really, you know, there was no advertising business model.
00:27:54 [Speaker 2]
Like there was no advertising business model when Google first launched.


00:27:59 [Speaker 2]
And then comes the commercial reality of turning that into a venture-backed business with some implied returns for the investors.
00:28:08 [Speaker 2]
What happens is you bring in a team. You know, you call someone like Ron Conway, who's one of your investors.
00:28:14 [Speaker 2]
He'll introduce you to a bunch of people who've been successful at commercializing other startups.
00:28:20 [Speaker 2]
They'll come in and form a team, people like Palihapitiya.
00:28:25 [Speaker 2]
And suddenly, the modus operandi of the business becomes to make money and to do it as efficiently and, you know, as tuned as possible.


00:28:36 [Speaker 2]
When that happens, the ethos of the business still exists somewhere in the background, but it isn't any longer the...


00:28:43 [Speaker 1]
But I'm not in agreement. I mean, are you suggesting then that this happens to all businesses? Because that's not what Ries is arguing in Incorruptible.
00:28:51 [Speaker 1]
He's suggesting that if you create the right kind of structural foundation of behaving decently, of not exploiting your customers, of creating a...


00:29:04 [Speaker 2]
Yeah.


00:29:05 [Speaker 1]
And these aren't Kantian companies.
00:29:07 [Speaker 1]
They're not companies that run on Spinozist principles.
00:29:11 [Speaker 1]
But at the same time, isn't there a difference between companies like Facebook, which I think most of us would agree is profoundly immoral, and other companies which are more accountable and responsible?


00:29:22 [Speaker 2]
Well, I think if you put the prism on that of business-to-business or enterprise companies versus consumer, you'll find that in consumer, the corruption of your original vision is fast-tracked, because the only way to monetize consumers is either subscription...


00:29:42 [Speaker 1]
Oh, but no.
00:29:43 [Speaker 1]
But Ries, again, I don't wanna give away everything we talked about, but one of Ries's models is Patagonia.
00:29:49 [Speaker 1]
And in our conversation, he talked about a story where the CEO of Patagonia turned down deals which would have massively benefited Patagonia, because he just thought they were bad.
00:30:03 [Speaker 1]
And Patagonia remains perhaps the model for an ethically run company.
00:30:07 [Speaker 1]
We all love their products.


00:30:09 [Speaker 1]
So it is possible to have decent consumer companies like Patagonia.


00:30:16 [Speaker 2]
Yeah.
00:30:16 [Speaker 2]
You have to stick to your original vision and not allow anything to be executed that contradicts it, which is super hard to do, and that's why it's quite rare.
00:30:28 [Speaker 2]
And...


00:30:28 [Speaker 1]
But should we? I mean, is Ries right to make companies like Patagonia the model for the twenty-first-century company, especially in Silicon Valley, where there's so much debate about this, where there's both a concern about, but also a misunderstanding of, the anti-tech zeitgeist?
00:30:52 [Speaker 1]
I mean, you and I agree on it this week and on the New Yorker piece, but some of it's justified.
00:30:57 [Speaker 1]
I mean, it's not just all some evil media conspiracy.


00:31:03 [Speaker 2]
Look.
00:31:03 [Speaker 2]
I don't know enough about Patagonia's operations, but I will guess that they do care about reducing costs and increasing margins.
00:31:12 [Speaker 2]
And so there may be some constraints. Like, they probably wouldn't use child...


00:31:19 [Speaker 1]
Labor.
00:31:20 [Speaker 1]
To put it mildly, I mean, that goes without saying.


00:31:23 [Speaker 2]
But possibly, you know, there's other things they're doing, like expecting wealthy consumers to pay much higher prices when they buy the stuff, therefore putting it out of reach of poor people, just to make something up.


00:31:41 [Speaker 1]
They also... I mean, maybe when I run the Ries interview, we can talk about it.
00:31:46 [Speaker 1]
They also encourage recycling.
00:31:48 [Speaker 1]
So, I'm a wealthy Patagonia owner, but often you have these products.
00:31:54 [Speaker 1]
They last for years.
00:31:55 [Speaker 1]
And when the zipper breaks, you send them back, and they fix it for free.


00:31:59 [Speaker 1]
So they're not just obsessed with selling stuff.
00:32:01 [Speaker 1]
One of the other companies that came up, which was interesting, and maybe this fits back into our conversation about Sam and Silicon Valley and Anthropic and all the rest of it, is one of the kinds of companies Ries writes about in his new book: cooperatives like Mondragón, which is a very successful Spanish cooperative.
00:32:27 [Speaker 1]
Is it possible, Keith, given you're a historian of the working class, an old Marxist, and given we're back in Luddite territory, that we'll start thinking about cooperatives, those alternative corporate structures to just straight capitalist companies?


00:32:53 [Speaker 2]
You know, I grew up with a co-op.
00:32:55 [Speaker 2]
There was a co-op in my council estate, and they gave you a red book.
00:33:01 [Speaker 2]
And my mom, every week on a Thursday, would write a list of things she was buying, and you'd take it to the co-op.
00:33:08 [Speaker 2]
You'd get the food and stuff for free.


00:33:10 [Speaker 1]
Well, also on this list, of course, was her gossipy magazine.


00:33:15 [Speaker 2]
Exactly.
00:33:17 [Speaker 2]
Those would have been on the list.
00:33:19 [Speaker 2]
And they'd give you the stuff, and you'd have a co-op book with stamps in it; you got stamps every time you spent money.
00:33:27 [Speaker 2]
But you didn't pay the money there and then; you paid it afterwards, when payday happened.
00:33:31 [Speaker 2]
And so the co-op was pretty foundational to the way the whole community ran.


00:33:37 [Speaker 2]
But that was institutions trying to figure out how to work with people who didn't have cash flow.
00:33:44 [Speaker 2]
Today, debt and credit cards play the same role.
00:33:47 [Speaker 2]
And I would argue that, no, that isn't the future.
00:33:51 [Speaker 2]
The future is one where pure capitalism produces abundance, and there isn't the need to accommodate poor people, because there will be no poor people.


00:34:05 [Speaker 1]
Now you're edging into Elon territory.
00:34:07 [Speaker 1]
I think you and I will disagree on that one.
00:34:10 [Speaker 1]
But coming back to cooperatives, the whole point of a cooperative is not that you don't have to pay for the stuff at the end of the week.
00:34:19 [Speaker 1]
The whole point is it's a workaround.
00:34:22 [Speaker 1]
And another feature of the conversation with Ries is we talked about the difference between a store like Berkeley Bowl and Whole Foods, and why Whole Foods is the model for how not to run companies.


00:34:37 [Speaker 1]
So it could conceivably work. I mean, I don't know how it works for Anthropic and OpenAI.
00:34:44 [Speaker 1]
I guess their workers are owners, in a sense, there.
00:34:48 [Speaker 1]
I mean, everyone gets


00:34:50 [Speaker 2]
Well, yeah, probably 20% of the company is owned by the employees.


00:34:55 [Speaker 1]
So are they, in a sense, cooperative?
00:34:57 [Speaker 1]
I mean, a lot of this stuff comes back to the dilemma that Sam and everybody else at OpenAI wrestled with, and continues to wrestle with, about having a company that's both nonprofit and for-profit within the same shell.


00:35:13 [Speaker 2]
I think it's simpler than that.
00:35:15 [Speaker 2]
I mean, ultimately, there's math, and the math says you gotta produce value.
00:35:22 [Speaker 2]
And if you produce value, you can distribute it.
00:35:25 [Speaker 2]
And so things like ownership are temporal concerns at a primitive stage of human development.
00:35:36 [Speaker 2]
In the future, there will probably be no ownership, or, to put it the opposite way, everything will be commonly owned.


00:35:45 [Speaker 1]
Now you're sounding like Marx.
00:35:46 [Speaker 1]
What did you say? In the future, there'll be no ownership?


00:35:49 [Speaker 2]
Or, saying the same thing in a different way, everything will be commonly owned.
00:35:55 [Speaker 2]
And that's because, in abundance, the end of scarcity results in the end of self-interested competition.


00:36:06 [Speaker 1]
Yeah.
00:36:06 [Speaker 1]
Well, you've been reading too much Marx here.
00:36:09 [Speaker 1]
In the future, no.
00:36:10 [Speaker 1]
There'll be


00:36:11 [Speaker 2]
It's


00:36:12 [Speaker 1]
no ownership, according to Keith.
00:36:13 [Speaker 1]
But I think one thing we can guarantee is next week, Keith, when we meet again in April, there will remain ownership.
00:36:24 [Speaker 1]
Is that fair?


00:36:26 [Speaker 2]
There will remain ownership.
00:36:28 [Speaker 2]
You will own... now, yeah.
00:36:30 [Speaker 2]
Oh, not now, actually.
00:36:32 [Speaker 2]
Keen and I will own
00:36:34 [Speaker 2]
That Was The Week.


00:36:34 [Speaker 2]
And Jeff of Three W Capital is commenting.


00:36:39 [Speaker 1]
You always bring in these bots.


00:36:43 [Speaker 2]
No.
00:36:43 [Speaker 2]
Because they're here in Restream.
00:36:45 [Speaker 2]
It's showing them to me, and he's making the point that we should talk about Grok more because it's gonna end up being the runaway winner. Which, I will say, when I drive my Tesla, I'm constantly amazed by how good Grok is, so he's not wrong.


00:36:58 [Speaker 1]
Well, we will see.
00:36:59 [Speaker 1]
One thing is for... well, there are two things for sure.
00:37:02 [Speaker 1]
There is a future, and it will be next week.
00:37:05 [Speaker 1]
We'll meet again next Saturday, and there will certainly be ownership next week.
00:37:12 [Speaker 1]
So as always, Keith, we were in too much agreement this week.


00:37:15 [Speaker 1]
Maybe next week, we can disagree more.
00:37:17 [Speaker 1]
Thank you so much.


00:37:19 [Speaker 2]
No.