Feb. 5, 2026

Your Data Will Be Used Against You: Andrew Guthrie Ferguson on Policing in the Age of Self-Surveillance



A man was convicted by his own heartbeat — and that's just the beginning of our digital dystopia.

 

About the Guest

Andrew Guthrie Ferguson is Professor of Law at George Washington University Law School and a national expert on surveillance technologies, policing, and criminal justice. He is an elected member of the American Law Institute and the author of the PROSE Award–winning The Rise of Big Data Policing. His new book, Your Data Will Be Used Against You: Policing in the Age of Self-Surveillance (NYU Press, March 2026), examines how smart devices and digital surveillance are transforming criminal prosecution — and what the law must do to catch up.

About This Episode

Following yesterday’s conversation with Christopher Mathias about doxxing and the ethics of unmasking, Andrew Keen turns to the legal side of the same question: what happens when the data we generate about ourselves becomes evidence? Andrew Guthrie Ferguson joins the show from Washington, D.C. to discuss his new book — a deeply researched investigation into how pacemakers, smartphones, smart cars, and doorbell cameras are being used to convict people in court, and why the law has almost nothing to say about it.

The conversation moves from a man convicted by his own heartbeat to AI-powered real-time crime centres, from Eric Schmidt’s infamous privacy defence to masked ICE agents in Minneapolis, and from Bentham’s panopticon to Ferguson’s proposed “tyrant test” — a framework for designing data protections by imagining the worst leader with access to your most intimate information.

Chapters:
00:00 Introduction: Digital privacy and unmasking
The theme of digital privacy and what it means to be unmasked in a data-driven world

01:25 Meet Andrew Guthrie Ferguson
 Introducing the guest and his new book on privacy, surveillance, and the law

02:10 The Double-Edged Sword of Digital Devices
 How our everyday devices expose everyone and the complicated trade-offs that creates

03:40 From “Don’t Be Ashamed” to Privacy Nuance
 The shift from early Silicon Valley privacy optimism to a more complex reality

04:45 Regulating Government, Not Google
 Ferguson’s focus on keeping personal data out of court rather than off corporate servers

05:55 The Pacemaker Data Court Case
 How personal medical device data was used as evidence in a criminal trial

07:30 Convicted by His Own Heartbeat
 An arson and insurance fraud case where heart-rate data contradicted the suspect’s story

09:40 Google’s Three-Part Warrant System
 How tech companies helped shape rules for law enforcement access to location data

11:15 The Fourth Amendment Digital Gap
 What reasonable expectations of privacy mean in the modern digital environment

12:45 Digital Privileges and Intimate Data
 Whether certain types of personal data should be legally protected like confidential relationships

14:20 Surveillance Battles on the Ground
 Protests, law enforcement, and the evolving intelligence dynamic in Minneapolis

16:05 “Just Doing Our Job” and State Surveillance
 The common defence of surveillance practices and why it remains controversial

18:10 The Texas Drone Fleet
 Drones as first responders and the expansion of aerial policing technology

20:45 Real-Time Crime Centers and Mass Cameras
 Integrated camera networks, data fusion, and the lack of clear oversight

22:50 The Tyrant Test for Privacy Laws
 Designing privacy protections assuming the worst possible leader has access to the data

25:15 AI Supercharges Surveillance
 How artificial intelligence turns ordinary cameras into powerful tracking tools

27:30 AI-Assisted Police Reports
 Using body-camera audio and AI tools to generate reports and the implications for justice

29:10 No Turning Back From Technology
 Why abandoning digital tools isn’t realistic and why new laws may be needed instead

31:15 Closing: Every Smart Device Is Surveillance
 The idea that modern connected devices inherently function as surveillance tools

 

Links & References

Mentioned in this episode:

 

About Keen On America
Keen On America is a daily podcast hosted by Andrew Keen, the Anglo-American writer and Silicon Valley insider named by GQ magazine as one of the world’s “100 Most Connected Men.” Every day, Andrew brings his sharp Transatlantic wit to the forces reshaping the United States — interviewing leading thinkers and writers about American history, politics, technology, culture, and business. With nearly 2,800 episodes since the show’s founding on TechCrunch in 2010, Keen On America is the most prolific intellectual interview show in the history of podcasting.

Website | Substack | YouTube


1
00:00:00,120 --> 00:00:01,980
Hello, my name is Andrew Keen.

2
00:00:02,100 --> 00:00:07,890
Welcome to Keen on America, the Daily Show
about everything that matters with the

3
00:00:07,890 --> 00:00:10,890
world's leading commentators and thinkers.

4
00:00:30,735 --> 00:00:31,455
Hello everybody.

5
00:00:31,455 --> 00:00:34,754
It's Thursday, February the fifth, 2026.

6
00:00:34,754 --> 00:00:40,095
Yesterday we did a show, uh, with the
journalist Christopher Mathias, uh, who

7
00:00:40,095 --> 00:00:45,525
has a new book out, To Catch a Fascist:
The Fight to Expose the Radical Right,

8
00:00:45,525 --> 00:00:55,185
that seemed to suggest that, uh,
Antifa has a right to unmask neo-Nazis.

9
00:00:55,185 --> 00:00:58,364
And we had an interesting
conversation about the

10
00:00:59,070 --> 00:01:04,170
Unite the Right rally in Charlottesville
back in 2017 and whether one has

11
00:01:04,320 --> 00:01:08,340
the right or not to expose people
who were involved in that rally.

12
00:01:08,340 --> 00:01:12,810
It's all of course about
privacy and indeed shame.

13
00:01:12,810 --> 00:01:13,410
We are

14
00:01:13,860 --> 00:01:17,760
again talking about privacy
today, perhaps in a more legal

15
00:01:17,760 --> 00:01:22,260
way with my guest, Andrew Guthrie
Ferguson, who has a new book out.

16
00:01:22,560 --> 00:01:28,860
"Your Data Will Be Used Against You:
Policing in the Age of Self Surveillance."

17
00:01:29,310 --> 00:01:31,920
Uh, Andrew is joining
us from Washington DC.

18
00:01:31,920 --> 00:01:36,750
He teaches, uh, law at, uh, GW Law School.

19
00:01:37,050 --> 00:01:42,929
Uh, Andrew, I know this is beyond perhaps
the focus of your book, but when it comes

20
00:01:42,929 --> 00:01:49,800
to exposing what people are up to in
their political lives, what is your take

21
00:01:49,800 --> 00:01:52,740
on this extremely controversial issue?

22
00:01:53,865 --> 00:01:56,685
Controversial, of course, both
on the left and the right.

23
00:01:57,554 --> 00:02:01,185
I think it points to a point
that is discussed in the book,

24
00:02:01,185 --> 00:02:04,575
although I don't obviously touch on
that issue, which is that our data

25
00:02:04,575 --> 00:02:07,035
exposes us, it exposes everyone.

26
00:02:07,425 --> 00:02:11,954
Uh, whoever is communicating with
some kind of, uh, digital device in

27
00:02:11,954 --> 00:02:15,920
their pocket, uh, is uh, uh, exposed.

28
00:02:15,920 --> 00:02:17,570
And it has a duality to it, right?

29
00:02:17,570 --> 00:02:20,120
Sometimes it's useful for law
enforcement to find people.

30
00:02:20,390 --> 00:02:24,410
Sometimes, uh, people who are protesting
and dissenting, uh, get exposed.

31
00:02:24,410 --> 00:02:27,560
And as you were just talking about,
it can work the other way around.

32
00:02:27,950 --> 00:02:31,550
The point is, uh, uh, maybe it was
in your earlier discussion that there

33
00:02:31,550 --> 00:02:35,570
really aren't really laws or rules
or regulation or constitutional,

34
00:02:35,780 --> 00:02:37,670
uh, protections about this data.

35
00:02:38,150 --> 00:02:39,770
Uh, my focus in the book is about how

36
00:02:40,070 --> 00:02:43,340
the police and prosecutors and the
government can use data against us.

37
00:02:43,670 --> 00:02:47,480
Uh, but obviously this sort of exposure
of data, uh, has a double edge.

38
00:02:48,765 --> 00:02:49,155
Yeah.

39
00:02:49,155 --> 00:02:51,825
And a very sharp double
edge, unfortunately.

40
00:02:52,185 --> 00:02:55,515
Uh, Eric Schmidt, uh, when
he was CEO of Google, back in,

41
00:02:55,515 --> 00:03:00,015
I think it was 2008 or oh nine,
famously said about online privacy.

42
00:03:00,015 --> 00:03:03,915
Uh, you shouldn't be ashamed
of anything you do, uh, or if

43
00:03:03,915 --> 00:03:05,475
you are ashamed, don't do it.

44
00:03:05,475 --> 00:03:08,415
How would you respond to that?

45
00:03:09,255 --> 00:03:13,275
At that point at least, that was
a fairly typical Silicon Valley approach

46
00:03:14,565 --> 00:03:17,775
to the issue of privacy. Now, I think
it's a little bit more nuanced.

47
00:03:18,435 --> 00:03:21,675
Well, I think the really hard questions
about data privacy in general and about

48
00:03:21,675 --> 00:03:25,245
what information Google could collect,
uh, I think they're actually more

49
00:03:25,425 --> 00:03:29,805
narrow and tailored and perhaps solvable
issues about controlling what the

50
00:03:29,805 --> 00:03:31,905
government can do with that information.

51
00:03:31,935 --> 00:03:36,315
So, for example, I do like the
fact that I can Google in a

52
00:03:36,315 --> 00:03:38,025
search query whatever I'd like to.

53
00:03:38,460 --> 00:03:39,180
Uh, query.

54
00:03:39,540 --> 00:03:44,340
Uh, and I would wish that data could
stay between me and Google

55
00:03:44,340 --> 00:03:47,459
and not necessarily be turned into
evidence against me in a court of law.

56
00:03:47,790 --> 00:03:50,700
And part of what the book does
is, uh, rationalize or, or,

57
00:03:50,704 --> 00:03:51,900
or wrestle with that tension.

58
00:03:52,109 --> 00:03:55,380
There is a reason why we
all have acceptance of the

59
00:03:55,380 --> 00:03:56,609
seduction of self surveillance.

60
00:03:56,609 --> 00:03:57,390
There's a reason why

61
00:03:57,695 --> 00:04:01,385
we buy, you know, watches that
monitor our heartbeat or buy cameras

62
00:04:01,385 --> 00:04:02,795
that we put on our front door.

63
00:04:02,945 --> 00:04:04,235
There's a reason for that.

64
00:04:04,535 --> 00:04:07,895
Um, but what I don't think we've really
wrestled with is that that data that we

65
00:04:07,895 --> 00:04:13,205
create can be used against us in a court
of law, uh, and has been and will be.

66
00:04:13,415 --> 00:04:15,725
And, um, that presents its own danger.

67
00:04:15,725 --> 00:04:18,760
So in the book, I actually
proposed solutions not for Google

68
00:04:18,760 --> 00:04:23,255
collecting our data or data privacy
or data protection writ large,

69
00:04:23,495 --> 00:04:25,655
but more about how to regulate

70
00:04:25,914 --> 00:04:29,664
government use of it and prosecution
use of it against individuals.

71
00:04:29,935 --> 00:04:29,965
Uh.

72
00:04:32,310 --> 00:04:36,630
Does it depend, of course on
how one views government and

73
00:04:36,630 --> 00:04:38,729
its rights in terms of citizens?

74
00:04:38,729 --> 00:04:43,080
I mean, what rights should
government have, uh, Andrew when

75
00:04:43,080 --> 00:04:48,390
it comes to what you call policing
in the age of self surveillance?

76
00:04:49,169 --> 00:04:53,550
Well, I think there are certain things
that, uh, we would like to protect.

77
00:04:53,550 --> 00:04:54,599
Lemme give you a good example.

78
00:04:54,780 --> 00:04:57,960
So I think we can all
agree that a digital

79
00:04:58,365 --> 00:05:01,905
pacemaker, something that goes in
your heart that keeps your heart

80
00:05:01,905 --> 00:05:05,700
beating and records, uh, your heart
beat and gives it to your doctor,

81
00:05:06,375 --> 00:05:08,325
is a good, it's an objective

82
00:05:08,325 --> 00:05:10,485
good that is saving lives.

83
00:05:10,485 --> 00:05:14,325
It's important and it's something
that we should be investing in for

84
00:05:14,325 --> 00:05:15,555
innovation and everything else.

85
00:05:16,185 --> 00:05:20,505
At the same time, we might also agree
that that data about your smart heartbeat

86
00:05:20,865 --> 00:05:24,495
should not be available to detectives
who can go to your doctor's door with a

87
00:05:24,555 --> 00:05:26,505
warrant and get it and use it against you.

88
00:05:26,955 --> 00:05:28,635
And use it against you in a court of law.

89
00:05:28,665 --> 00:05:30,975
And that's happened in a
case I discussed in the book.

90
00:05:31,215 --> 00:05:33,525
And I think that balance is there.

91
00:05:33,645 --> 00:05:38,175
The person being charged in that case was
arguably committing a crime, but that's

92
00:05:38,175 --> 00:05:40,215
his heartbeat being used against him.

93
00:05:40,515 --> 00:05:44,445
And I think that we should be able
to design laws that balance that

94
00:05:45,105 --> 00:05:49,065
very good need for having digital
conveniences and innovations that

95
00:05:49,065 --> 00:05:50,355
have changed our world for good.

96
00:05:50,715 --> 00:05:55,495
Uh, but at the same time controlling how
that information can be used against us.

97
00:05:56,820 --> 00:06:00,330
Give me more details
on that heartbeat case.

98
00:06:00,330 --> 00:06:03,930
Why did the police wanna use
his heartbeat and what was the

99
00:06:03,930 --> 00:06:05,580
crime and what was the outcome?

100
00:06:06,090 --> 00:06:10,620
So there's this guy who basically was
committing insurance fraud using arson.

101
00:06:10,920 --> 00:06:16,380
Uh, he was burning down his house and, uh,
he came up with his whole story about how

102
00:06:16,380 --> 00:06:18,420
he had rescued everything right in time.

103
00:06:18,420 --> 00:06:21,750
And he told us, you know, it's an
elaborate lie, uh, about this in order

104
00:06:21,750 --> 00:06:24,810
to get the insurance, uh,

105
00:06:25,245 --> 00:06:27,525
money, uh, which again is a crime.

106
00:06:27,974 --> 00:06:32,175
And so detectives wanted to disprove
his story by showing that his

107
00:06:32,175 --> 00:06:35,265
heartbeat wasn't responding the way
it would've been if you were running

108
00:06:35,265 --> 00:06:39,075
around in the middle of a fire, uh,
trying to save all of your possessions.

109
00:06:39,075 --> 00:06:42,315
So they went and they got, uh, the
heartbeat and they used it against

110
00:06:42,315 --> 00:06:45,195
him, and he was sort of convicted,
uh, based on his own heartbeat.

111
00:06:45,195 --> 00:06:46,515
And again, that's a hard story.

112
00:06:46,515 --> 00:06:47,115
Like this is

113
00:06:47,665 --> 00:06:48,265
a crime.

114
00:06:48,505 --> 00:06:51,415
It's uh, uh, something that
we, we don't want to support.

115
00:06:51,415 --> 00:06:56,845
At the same time, this is intimate data
that really was sort of involuntarily

116
00:06:56,845 --> 00:07:00,355
given, like, I mean, I guess he could
have chosen not to have a pacemaker and

117
00:07:00,355 --> 00:07:03,925
not stay alive, but once you have that,
it seems like we might be able to carve

118
00:07:03,925 --> 00:07:08,065
out protections about certain sensitive
data, certain information that maybe

119
00:07:08,065 --> 00:07:09,475
we don't want the government to have.

120
00:07:11,035 --> 00:07:13,465
What's been the typical response, Andrew, of

121
00:07:14,130 --> 00:07:18,690
Silicon Valley when it comes to this
issue of self surveillance and the, the

122
00:07:18,690 --> 00:07:20,429
data that can be used by the police.

123
00:07:20,429 --> 00:07:25,020
I know Apple in particular, uh,
have been quite resistant

124
00:07:25,020 --> 00:07:29,820
to, uh, opening iPhones for the
police for one reason or another.

125
00:07:29,820 --> 00:07:35,039
Can one, uh, make some
rules about Silicon Valley's

126
00:07:35,849 --> 00:07:41,369
commitment or lack of commitment
to providing data to the police?

127
00:07:42,930 --> 00:07:47,190
I think the, the problem is that
currently the only protections are really

128
00:07:47,190 --> 00:07:48,720
coming from the companies themselves.

129
00:07:48,930 --> 00:07:50,640
There really isn't a law that controls it.

130
00:07:50,640 --> 00:07:54,870
Even if you're thinking about, you
know, uh, Google, which has been

131
00:07:54,870 --> 00:07:59,880
collecting sort of location data and
came up with its own three-part warrant

132
00:08:00,045 --> 00:08:03,585
so law enforcement could get access to
this, or what they call the Sensorvault,

133
00:08:03,585 --> 00:08:04,905
basically all of our location data.

134
00:08:05,205 --> 00:08:08,355
But it was Google's lawyers who
actually came up with the idea of

135
00:08:08,355 --> 00:08:09,675
having this three step process.

136
00:08:09,675 --> 00:08:13,215
It wasn't like a judge saying this is what
the Fourth Amendment holds, or this

137
00:08:13,215 --> 00:08:15,195
is, uh, a legislative determination.

138
00:08:15,345 --> 00:08:16,575
It was like Google's lawyers.

139
00:08:16,575 --> 00:08:20,025
Now that case is actually going before
the Supreme Court, uh, just got granted

140
00:08:20,025 --> 00:08:21,735
cert, going up to, to be decided.

141
00:08:22,005 --> 00:08:26,205
Um, but again, the point is the companies are
making these calls, so Apple has

142
00:08:26,205 --> 00:08:28,155
seen that it's in their business interest

143
00:08:28,395 --> 00:08:30,315
to be more privacy protective.

144
00:08:30,645 --> 00:08:35,595
Uh, Google has essentially, uh,
sold you a cheaper phone, but made

145
00:08:35,595 --> 00:08:38,174
the money back on the data that they
were collecting from that phone.

146
00:08:38,564 --> 00:08:43,424
Uh, and unfortunately now of course, all
of those tech companies are somewhat

147
00:08:43,669 --> 00:08:47,990
beholden to a, uh, governmental
administration in the US that is,

148
00:08:48,229 --> 00:08:49,880
you know, playing pretty tough.

149
00:08:49,880 --> 00:08:53,750
And I, I, I think it'd be hard
for, for companies to push back the

150
00:08:53,750 --> 00:08:57,920
same way, um, that they had when
they have all these other large

151
00:08:57,920 --> 00:09:00,020
government contracts, uh, at issue.

152
00:09:00,020 --> 00:09:05,089
Apple is sort of in a different boat than,
you know, Microsoft or Google or Amazon

153
00:09:05,089 --> 00:09:06,439
or any of those other big companies.

154
00:09:07,380 --> 00:09:11,670
What rights can and should we have
when it comes to these issues?

155
00:09:11,670 --> 00:09:15,210
I know that you focus in your
book on the Fourth Amendment.

156
00:09:15,210 --> 00:09:19,170
You might explain, uh, what the Fourth
Amendment is and how it connects.

157
00:09:19,530 --> 00:09:23,250
But if one is indeed guilty of a
crime burning down the house in

158
00:09:23,250 --> 00:09:26,730
order to get insurance money, why
shouldn't the government use any kind

159
00:09:26,730 --> 00:09:31,500
of data it can get access to, to make
its case if indeed we are guilty?

160
00:09:32,220 --> 00:09:35,970
Well, that is actually
almost the current law, right?

161
00:09:35,970 --> 00:09:40,560
That, uh, if you create
data, it can be used against you.

162
00:09:40,560 --> 00:09:43,205
In fact, you know, the truth of
the matter is there is nothing

163
00:09:44,040 --> 00:09:47,010
too private, too secret
that the government can't

164
00:09:47,010 --> 00:09:48,900
access, uh, with a warrant.

165
00:09:49,229 --> 00:09:54,689
So, you know, your digital diary,
your period app, your health monitor,

166
00:09:54,689 --> 00:09:58,410
your smart bed, whatever it is, uh,
if the government can get a warrant

167
00:09:58,439 --> 00:10:00,474
to get access to it, and sometimes
they don't even need a warrant.

168
00:10:01,235 --> 00:10:02,584
Um, they can use it.

169
00:10:02,584 --> 00:10:04,444
And so that is the current default.

170
00:10:04,475 --> 00:10:09,334
I think that that default, uh, devalues
sort of a sense of not just privacy, but

171
00:10:09,334 --> 00:10:11,615
also gives the government more control.

172
00:10:11,824 --> 00:10:15,845
I think that, uh, we've seen,
uh, potential abuses of that

173
00:10:15,845 --> 00:10:17,704
technology, that sort of use of

174
00:10:18,225 --> 00:10:21,975
uh, surveillance technology, even
in the last like month or two where

175
00:10:21,975 --> 00:10:26,565
we've seen a government that has seen,
uh, fit to go after journalists

176
00:10:26,565 --> 00:10:31,995
and podcasters and, uh, protesters
who are out, uh, on the streets.

177
00:10:32,355 --> 00:10:35,895
Uh, and so I think that we might
want to, uh, have a balance.

178
00:10:35,895 --> 00:10:38,445
I actually think it's a kind of
a bipartisan issue because you

179
00:10:38,445 --> 00:10:41,385
never know who's gonna be in charge
of, uh, the government, but we

180
00:10:41,385 --> 00:10:42,795
might wanna have a balance about:

181
00:10:43,010 --> 00:10:45,319
Should there be higher limits?

182
00:10:45,350 --> 00:10:48,500
You know, maybe something more than a
warrant to be able to get access to this?

183
00:10:48,740 --> 00:10:52,280
Should there be particular areas that
are just so private and so intimate

184
00:10:52,280 --> 00:10:56,540
that the police can't get it, uh,
no matter what, uh, that might

185
00:10:56,630 --> 00:10:59,780
lie under like a Fourth Amendment
understanding of like a reason why.

186
00:10:59,780 --> 00:11:00,050
Just

187
00:11:00,050 --> 00:11:04,040
explain, uh, sorry, Andrew, for,
for our, our non-legal audience,

188
00:11:04,040 --> 00:11:05,015
what the Fourth Amendment is.

189
00:11:05,775 --> 00:11:05,985
Sure.

190
00:11:05,985 --> 00:11:10,275
So the Fourth Amendment comes from the
United States Constitution that governs,

191
00:11:10,305 --> 00:11:12,795
uh, unreasonable searches and seizures.

192
00:11:12,795 --> 00:11:15,045
So taking things, but also taking data.

193
00:11:15,525 --> 00:11:19,905
Uh, the Supreme Court, uh, has said
that the controlling standard is

194
00:11:19,905 --> 00:11:23,745
whether or not we all have a reasonable
expectation of privacy in something,

195
00:11:24,105 --> 00:11:25,785
what that means in a digital age.

196
00:11:26,020 --> 00:11:28,420
It's honestly kind of an open
question still being debated.

197
00:11:28,810 --> 00:11:32,920
The court has said in a case called
Carpenter that your cell site location

198
00:11:32,920 --> 00:11:39,070
data, so the data that's revealed when
you walk around, uh, the, uh, world, uh,

199
00:11:39,070 --> 00:11:40,870
requires a warrant to get access to it.

200
00:11:41,235 --> 00:11:45,765
Uh, but there have been many other
pieces of information where the court

201
00:11:45,765 --> 00:11:49,454
has held that there is no, quote unquote,
reasonable expectation of privacy.

202
00:11:49,755 --> 00:11:53,564
And police can get it for any reason or
no reason, just because they want it.

203
00:11:54,015 --> 00:11:57,435
Um, so you could have a system
where you could have a warrant.

204
00:11:57,460 --> 00:12:00,730
Or you could even, and this is some of
the proposal in the book, go even beyond

205
00:12:00,730 --> 00:12:04,060
that to say maybe there's certain areas
that should be essentially privileged.

206
00:12:04,060 --> 00:12:06,880
It's like when you're, when a client
and a lawyer are talking, there's

207
00:12:06,880 --> 00:12:09,910
something called an attorney-client
privilege, where those communications

208
00:12:10,030 --> 00:12:11,320
just can't be used in court.

209
00:12:11,560 --> 00:12:15,310
When two married people are speaking,
that communication is privileged.

210
00:12:15,310 --> 00:12:15,670
Why?

211
00:12:15,670 --> 00:12:16,180
Because we

212
00:12:16,360 --> 00:12:21,640
value something else besides the ability
to use data against someone in a court.

213
00:12:21,820 --> 00:12:23,440
We value the sanctity of marriage.

214
00:12:23,440 --> 00:12:25,990
We value the sanctity of an
attorney-client privilege.

215
00:12:26,260 --> 00:12:29,890
And I think that there, there are some
pieces of our digital world, uh, that

216
00:12:29,890 --> 00:12:33,040
also deserve that kind of privilege, or
so I, I make the argument in the book,

217
00:12:34,030 --> 00:12:36,220
this isn't gonna make
the police very happy, is it?

218
00:12:37,035 --> 00:12:38,084
No, no, it's not.

219
00:12:38,145 --> 00:12:41,864
Um, I, I think, you know, the police
and, and honestly the tech companies

220
00:12:41,864 --> 00:12:45,824
sort of like the status quo, where
there really isn't regulation.

221
00:12:46,094 --> 00:12:48,734
The Fourth Amendment is still
debating things in kind of

222
00:12:48,734 --> 00:12:50,864
an analog pre-digital world.

223
00:12:51,224 --> 00:12:54,645
Uh, and that means the
technology proceeds apace.

224
00:12:55,010 --> 00:12:57,470
And sometimes, you know, the,
the book is honestly filled with

225
00:12:57,470 --> 00:13:03,170
stories of, uh, uses that get a
bad guy and, and, and lock 'em up.

226
00:13:03,170 --> 00:13:04,189
And I wrestle with that.

227
00:13:04,400 --> 00:13:07,310
Uh, honestly, I, I'm trying to show
that there is a positive to it.

228
00:13:07,760 --> 00:13:10,850
Uh, but I also think the danger
of that is that that same data can

229
00:13:10,850 --> 00:13:12,500
be used to, you know, track a

230
00:13:12,939 --> 00:13:17,110
police officer's ex-girlfriend, or as
we're seeing in Minneapolis, like go

231
00:13:17,110 --> 00:13:21,130
after, uh, protestors or people who
are, uh, using their First Amendment

232
00:13:21,130 --> 00:13:25,510
rights, uh, in ways that are deeply
uncomfortable, where the police, you know,

233
00:13:25,510 --> 00:13:29,470
without, uh, the proper requirements,
even if you call something a crime,

234
00:13:29,470 --> 00:13:31,120
everything becomes available.

235
00:13:31,450 --> 00:13:35,350
Uh, that you can use the same
power in a real authoritarian way.

236
00:13:36,220 --> 00:13:41,770
You used the Minnesota example, uh,
the Minneapolis example, Andrew. But

237
00:13:43,660 --> 00:13:48,015
a lot of people would argue it's outside
the law, that

238
00:13:48,015 --> 00:13:53,444
the current regime and certainly the
behavior of ICE in Minneapolis is

239
00:13:54,314 --> 00:13:56,715
unconstitutional, perhaps even illegal.

240
00:13:56,715 --> 00:14:02,085
So, uh, can that example really be used?

241
00:14:02,265 --> 00:14:08,415
I mean, you can always find examples
of the police or a state overstepping

242
00:14:08,880 --> 00:14:15,300
the boundaries, but that in
itself doesn't mean that there

243
00:14:15,300 --> 00:14:17,610
shouldn't be laws, should there?

244
00:14:18,630 --> 00:14:20,130
I do think that there should be laws.

245
00:14:20,130 --> 00:14:23,370
In fact, the book is filled with
suggestion suggestions that there

246
00:14:23,430 --> 00:14:24,810
should be laws, but there are not.

247
00:14:25,080 --> 00:14:30,000
And what's really chilling is that
the technologies that ICE and, and the

248
00:14:30,000 --> 00:14:34,920
Customs Border Patrol are using right now
in Minneapolis can be used by local law

249
00:14:34,920 --> 00:14:37,080
enforcement or the FBI in the same way.

250
00:14:37,410 --> 00:14:41,940
And there isn't a clear, uh, sense that
it is automatically unconstitutional.

251
00:14:41,940 --> 00:14:45,780
You have to wait for the Supreme Court to,
to claim that it is or state that it is.

252
00:14:46,230 --> 00:14:47,670
Um, and the thing

253
00:14:48,025 --> 00:14:50,335
that we are seeing, and
this is what's interesting:

254
00:14:50,335 --> 00:14:54,655
We are seeing a very aggressive use
of new technologies, new surveillance

255
00:14:54,655 --> 00:15:00,564
technologies as we've never seen before
against, uh, an otherwise, uh, relatively

256
00:15:00,564 --> 00:15:04,795
privileged, uh, population that was not
normally the target of surveillance.

257
00:15:05,035 --> 00:15:07,855
And so like the aperture of surveillance
has expanded to target people

258
00:15:07,855 --> 00:15:10,345
who weren't normally targeted and
suddenly everyone's saying, wow,

259
00:15:10,345 --> 00:15:12,055
this feels a little too much.

260
00:15:12,055 --> 00:15:16,255
Whereas that same technology
can be used against

261
00:15:16,995 --> 00:15:21,465
anyone, unless there was a
countervailing law or a constitutional

262
00:15:21,465 --> 00:15:25,485
rule saying it can't be, and right
now we don't really have that.

263
00:15:27,194 --> 00:15:29,954
How aggressive have the police become?

264
00:15:29,954 --> 00:15:31,965
I know that there are
lots of cases in which it

265
00:15:32,760 --> 00:15:38,550
seems as if local police forces are
becoming increasingly greedy for data.

266
00:15:38,550 --> 00:15:43,200
And there's one, uh, example where
the Texas State police has more

267
00:15:43,200 --> 00:15:45,210
than doubled their drone fleet.

268
00:15:45,750 --> 00:15:51,030
Um, are there great differences
between state police forces and

269
00:15:51,030 --> 00:15:56,100
are there any guidelines on what
the police can and can't do in

270
00:15:56,100 --> 00:15:58,170
terms of collecting data about us?

271
00:15:59,340 --> 00:16:00,060
There are no,

272
00:16:00,390 --> 00:16:01,920
like, external guidelines.

273
00:16:01,920 --> 00:16:05,310
Obviously police departments themselves
might write the rules that limit

274
00:16:05,310 --> 00:16:09,630
themselves, but there's no federal law
or state law that's going to control it.

275
00:16:09,630 --> 00:16:11,550
Actually, for
drones, as that example,

276
00:16:11,790 --> 00:16:15,840
there are a few, like, FAA laws that
sort of control, like, safety issues

277
00:16:15,840 --> 00:16:17,820
around, you know, drones flying overhead.

278
00:16:18,090 --> 00:16:18,790
But in terms of, like...

279
00:16:19,030 --> 00:16:23,170
We have many cities now that are
adopting drones as first responders.

280
00:16:23,170 --> 00:16:27,939
So imagine a 911 call goes out,
a drone responds before the police officer,

281
00:16:27,939 --> 00:16:32,140
taking a video of what's happening there,
which again, from a tactical strategy,

282
00:16:32,415 --> 00:16:34,360
uh, perspective for the police is

283
00:16:34,689 --> 00:16:35,709
very helpful for them.

284
00:16:35,890 --> 00:16:39,370
They can, if you have a realtime
crime center, uh, which again is a

285
00:16:39,370 --> 00:16:43,329
new sort of idea of like fusing all
these video streams together, analysts

286
00:16:43,329 --> 00:16:45,760
at the realtime crime center can
see what's going on with the drone.

287
00:16:46,000 --> 00:16:49,990
They can then go to the various cameras
that are right around, uh, the area.

288
00:16:49,990 --> 00:16:53,380
They can even like go get access to
like the police officer's body camera

289
00:16:53,589 --> 00:16:55,780
and be able to, to see what's going on.

290
00:16:55,990 --> 00:17:02,050
Incredibly powerful, uh, uh, change for
police to get eyes on the ground quickly.

291
00:17:02,415 --> 00:17:06,615
There aren't real, uh, you know,
like external rules about what

292
00:17:06,615 --> 00:17:08,685
they can do, uh, with that data.

293
00:17:08,984 --> 00:17:11,565
Um, sometimes it's helpful,
sometimes it's not.

294
00:17:11,835 --> 00:17:14,745
Um, but it's definitely a change and it's
a change that we haven't talked about.

295
00:17:14,745 --> 00:17:18,405
Part of the book, which is, you know, on
the theme of self surveillance, is sort

296
00:17:18,405 --> 00:17:19,694
of this democratic self surveillance.

297
00:17:19,724 --> 00:17:24,045
This is like a mediated democratic self
surveillance where we have chosen to

298
00:17:24,045 --> 00:17:28,065
spend our tax dollars, uh, on these
camera systems and these drones.

299
00:17:28,214 --> 00:17:31,480
And I think we should have a debate
about whether we're okay with them.

300
00:17:32,655 --> 00:17:37,095
When it comes to these kinds of
debates, do American citizens

301
00:17:37,095 --> 00:17:39,435
tend to be somewhat paranoid?

302
00:17:39,795 --> 00:17:42,585
There's more and more fear of
big tech on lots of different

303
00:17:42,585 --> 00:17:43,755
fronts, especially with AI.

304
00:17:43,755 --> 00:17:46,875
And I wanna talk about AI a little
bit later in our conversation, but.

305
00:17:48,600 --> 00:17:51,480
Where do we get balance in
terms of this conversation?

306
00:17:51,480 --> 00:17:54,120
On the one hand, we have
legal scholars like yourself.

307
00:17:54,510 --> 00:17:57,540
On the other hand, we have sort
of hardcore libertarians who don't

308
00:17:57,540 --> 00:18:01,139
believe that the government has any
rights to any information about us.

309
00:18:01,139 --> 00:18:06,600
Where, where, what is the model for
beginning to figure this stuff out?

310
00:18:06,600 --> 00:18:08,250
Is it back in the 19th century?

311
00:18:08,250 --> 00:18:10,709
The beginning of the print,
uh, mass printing press?

312
00:18:11,970 --> 00:18:16,860
Well, I do think we are at like a, a
moment of evolution, if not revolution,

313
00:18:16,860 --> 00:18:19,230
about, uh, the power of surveillance.

314
00:18:19,230 --> 00:18:23,490
And you can just look at China and
see what happens when you have a,

315
00:18:23,550 --> 00:18:27,780
you know, city that is like just
covered in cameras everywhere you go.

316
00:18:27,900 --> 00:18:30,810
And it's not just cameras, it's
video analytics like baked into

317
00:18:30,810 --> 00:18:32,280
the camera so you can identify

318
00:18:32,690 --> 00:18:35,240
individuals and objects and where
they've gone, whether they're

319
00:18:35,240 --> 00:18:36,500
doing anything right or wrong.

320
00:18:36,770 --> 00:18:39,140
Uh, and then connecting all those
data sets to who they are and what

321
00:18:39,140 --> 00:18:40,550
they do and their social credit score.

322
00:18:40,550 --> 00:18:44,450
And the rest, like we actually have the
technology in the world, in America.

323
00:18:44,450 --> 00:18:47,390
We're sort of developing it in
our local way because policing is

324
00:18:47,390 --> 00:18:49,460
a very local, uh, uh, business.

325
00:18:49,760 --> 00:18:52,010
Uh, and so people aren't necessarily

326
00:18:52,350 --> 00:18:57,030
putting all the pieces together
to see this kind of change in law.

327
00:18:57,270 --> 00:18:59,610
But I think when you have, I mean,
there's a change in technology, but

328
00:18:59,610 --> 00:19:02,865
when you have that kind of change in
technology and really change in power,

329
00:19:03,570 --> 00:19:06,240
we should have a corresponding
change in law and have these

330
00:19:06,240 --> 00:19:07,920
debates about whether we are okay.

331
00:19:08,130 --> 00:19:12,180
And it is true that there will
be people who are, uh, completely

332
00:19:12,180 --> 00:19:14,070
against any surveillance technology.

333
00:19:14,280 --> 00:19:18,180
And then there'll be people who are
actually very, you know, pro technology

334
00:19:18,180 --> 00:19:19,530
because they sort of like the idea that,

335
00:19:19,900 --> 00:19:22,180
you know, if we watch everything,
nothing bad will happen.

336
00:19:22,450 --> 00:19:27,610
Uh, and then hopefully we can have a
debate about how to give the data that we

337
00:19:27,610 --> 00:19:31,210
need, if we really need it in a particular
case, but maybe protect other interests

338
00:19:31,210 --> 00:19:34,510
like First Amendment interests, religious
interests, you know, medical interests.

339
00:19:34,510 --> 00:19:38,620
Like do you really need, uh, an automated
license plate camera outside the, you

340
00:19:38,620 --> 00:19:42,550
know, medical clinic or the, you know,
AA meeting or any of those things.

341
00:19:42,550 --> 00:19:43,840
Like we could have that debate.

342
00:19:44,170 --> 00:19:46,090
It is hard, but it's doable.

343
00:19:47,639 --> 00:19:52,800
You bring up the China example and
their notorious social credit system.

344
00:19:52,800 --> 00:19:54,899
There are lots of cameras
around the world, not just in

345
00:19:54,899 --> 00:19:56,280
China. And London, for example,

346
00:19:56,280 --> 00:20:00,780
I think there are more cameras in London
than anywhere else, but, uh, the UK still

347
00:20:00,780 --> 00:20:06,330
has a constitutional system as opposed
to the authoritarian system in China.

348
00:20:06,899 --> 00:20:10,229
Can one distinguish
between types of government?

349
00:20:10,229 --> 00:20:12,600
I, I think I've already
asked this question, but.

350
00:20:13,085 --> 00:20:17,345
Some governments are more responsible
in terms of how they use this data. And

351
00:20:18,665 --> 00:20:21,335
how would you compare European models?

352
00:20:21,335 --> 00:20:24,004
Obviously you've, you've
mentioned China with the US.

353
00:20:24,935 --> 00:20:28,835
Well, I think what's interesting is
I think that the, the origin story of

354
00:20:28,835 --> 00:20:34,115
how we reached this point where we have
essentially built a surveillance state,

355
00:20:34,190 --> 00:20:39,200
almost, where we can do this kind of
tracking and um, uh, monitoring of

356
00:20:39,200 --> 00:20:44,930
people but haven't panicked yet is
in part because up until right about

357
00:20:44,930 --> 00:20:49,190
this moment in American history, the
people using the technology were

358
00:20:49,545 --> 00:20:53,085
somewhat moderated and even somewhat
concerned, like the Biden administration,

359
00:20:53,385 --> 00:20:55,365
uh, was doing internal checks.

360
00:20:55,365 --> 00:20:57,165
They had, you know, an AI policy.

361
00:20:57,375 --> 00:21:02,265
They clearly were hearing voices
of civil libertarians and activists

362
00:21:02,265 --> 00:21:06,345
who were against it, and there
was a, a sense of restraint.

363
00:21:06,765 --> 00:21:10,695
I think what just has happened just
in the last few months is that we've

364
00:21:10,695 --> 00:21:13,545
seen that there are no restraints.

365
00:21:13,725 --> 00:21:14,925
The norms are gone.

366
00:21:15,794 --> 00:21:17,985
There is no law or
constitutional law to fill it.

367
00:21:18,254 --> 00:21:22,064
And so we're in a point here in
America where the parallels to

368
00:21:22,064 --> 00:21:26,745
China are actually stronger than
they were, you know, two years ago.

369
00:21:27,014 --> 00:21:31,125
And, uh, you know, and maybe you
can look to other governments

370
00:21:31,125 --> 00:21:32,175
that would do it wisely.

371
00:21:32,175 --> 00:21:36,794
And again, the European, um, uh, many
European cities have pretty elaborate

372
00:21:36,794 --> 00:21:42,195
surveillance systems and people don't seem
quite as, uh, uh, uh, worried about it.

373
00:21:42,495 --> 00:21:42,864
Um, but

374
00:21:44,320 --> 00:21:48,060
an election can change that,
and we can see how, um, power can

375
00:21:48,060 --> 00:21:52,405
be used or misused, or how the,
the people targeted can change.

376
00:21:54,015 --> 00:21:57,585
There's a huge debate, probably
outside your sphere of interest,

377
00:21:57,585 --> 00:22:02,265
Andrew, about the crisis of democracy
and the threat of authoritarianism.

378
00:22:02,775 --> 00:22:07,395
Do you think one reason why the American
democracy seems to be in crisis and we

379
00:22:07,395 --> 00:22:13,455
have this shift towards authoritarianism,
is this emerging panopticon?

380
00:22:13,455 --> 00:22:15,220
I mean, we can blame
it all, of course, on

381
00:22:16,020 --> 00:22:19,110
Trump or Stephen Miller
or somebody within ICE.

382
00:22:19,110 --> 00:22:24,330
But are there structural reasons why
democracy seems to be increasingly in

383
00:22:24,330 --> 00:22:29,520
crisis in the digital age, where the
government knows more and more about

384
00:22:29,520 --> 00:22:32,010
us through our self surveillance?

385
00:22:34,500 --> 00:22:38,250
I'm, I'm sure people have made that
argument and can make that argument.

386
00:22:38,645 --> 00:22:42,240
I, I think that what's
interesting now is that

387
00:22:42,675 --> 00:22:47,715
the Trump administration has sort
of seen the potential of this

388
00:22:47,715 --> 00:22:48,915
surveillance that has been there.

389
00:22:48,915 --> 00:22:52,185
I mean, again, the Biden administration
was investing in a lot of this technology.

390
00:22:52,305 --> 00:22:54,345
They were building out the
real time crime centers.

391
00:22:54,585 --> 00:22:58,605
They were, um, um, you know,
fighting against, you know, Ron

392
00:22:58,605 --> 00:23:01,785
Wyden, Senator Wyden's, like, the
Fourth Amendment Is Not For Sale

393
00:23:01,785 --> 00:23:04,185
Act. Like there was, like, push,
like the Biden administration was

394
00:23:04,185 --> 00:23:06,795
building this out, but I think
they thought they were doing

395
00:23:06,795 --> 00:23:08,715
it wisely, and it's only now

396
00:23:08,960 --> 00:23:12,860
where there's this, like, aggressive use
of technology that I think the Trump

397
00:23:12,860 --> 00:23:15,830
administration has realized, like, wow,
if we keep, there's these companies

398
00:23:15,830 --> 00:23:19,280
that will sell us this data and we
could track anyone we want right now.

399
00:23:19,280 --> 00:23:22,250
Maybe we're focused on, you
know, undocumented people or

400
00:23:22,250 --> 00:23:25,970
people protesting, uh, ICE, but
we could use this for anything.

401
00:23:26,000 --> 00:23:30,410
And I think we're still, they, they
may be waking up to this power and

402
00:23:30,410 --> 00:23:32,780
I think that this is a moment where

403
00:23:33,075 --> 00:23:38,205
people who are now the new
targets of, uh, of, of this power

404
00:23:38,205 --> 00:23:39,795
are, are frightened about it.

405
00:23:40,155 --> 00:23:43,275
Um, but my point to them and
even to the Trump administration

406
00:23:43,275 --> 00:23:46,125
is like, you know, President
Trump's data was used against him.

407
00:23:46,365 --> 00:23:50,625
Like they actually went back and
used his data, his texts, uh, and

408
00:23:50,625 --> 00:23:54,825
this is a bipartisan fear, like
whoever's in power may well abuse

409
00:23:54,825 --> 00:23:58,305
this power, and thus we should have
rules and regulations and limits.

410
00:23:58,515 --> 00:24:00,820
So we will prevent this
foreseeable danger.

411
00:24:02,090 --> 00:24:05,765
When it comes to the, the right
to privacy, of the American

412
00:24:06,659 --> 00:24:09,060
Founders, who do you think was the wisest?

413
00:24:09,090 --> 00:24:13,169
We began this conversation talking
about Charlottesville, of course,

414
00:24:13,169 --> 00:24:14,970
the home of Thomas Jefferson.

415
00:24:14,970 --> 00:24:19,620
I'm guessing in terms of this conversation
and the appearance of a, a panopticon

416
00:24:19,620 --> 00:24:24,240
like architecture in America, he'd be
turning in his grave. But did some of

417
00:24:24,240 --> 00:24:27,179
the founders anticipate any of this?

418
00:24:28,199 --> 00:24:28,889
I think so.

419
00:24:28,889 --> 00:24:32,850
So at the end of the book, I, I talk about
the tyrant test and I sort of use it as a

420
00:24:32,850 --> 00:24:35,340
metaphor, but also a plan of action, where

421
00:24:36,720 --> 00:24:39,120
the tyrant test says we
should imagine the worst.

422
00:24:39,300 --> 00:24:39,960
Imagine

423
00:24:40,890 --> 00:24:41,400
the tyrant.

424
00:24:41,400 --> 00:24:42,060
The tyrant test.

425
00:24:42,120 --> 00:24:42,930
Tyrant test.

426
00:24:42,930 --> 00:24:42,990
Yeah.

427
00:24:43,080 --> 00:24:45,780
So the, the, the, the solution,
one of the solutions at the end of

428
00:24:45,780 --> 00:24:47,100
the book is called the Tyrant test.

429
00:24:47,640 --> 00:24:49,950
Uh, and it basically says
we should imagine the worst.

430
00:24:49,980 --> 00:24:54,870
We should imagine the tyrant reading your
most embarrassing Google queries, uh,

431
00:24:54,870 --> 00:24:56,730
and finding out everywhere you've gone.

432
00:24:57,150 --> 00:24:57,660
And then we should

433
00:24:58,085 --> 00:24:59,075
plan accordingly.

434
00:24:59,195 --> 00:25:00,665
So what would we do structurally?

435
00:25:00,665 --> 00:25:03,125
These are big questions
that involve big answers.

436
00:25:03,125 --> 00:25:06,185
There have to be legislative changes,
there have to be judicial changes.

437
00:25:06,185 --> 00:25:08,855
We need to have a division
of power between federal and

438
00:25:08,855 --> 00:25:10,535
state and like community.

439
00:25:10,535 --> 00:25:12,995
Maybe even have things like
juries and grand juries.

440
00:25:13,325 --> 00:25:17,345
Uh, and I say, you know, the model for
this tyrant test is America, right?

441
00:25:17,345 --> 00:25:20,600
The founding fathers were
concerned about the idea of

442
00:25:21,280 --> 00:25:23,379
any entity having too much power.

443
00:25:23,560 --> 00:25:27,760
So they tried to split it apart, tried
to have contested, uh, places of power,

444
00:25:27,939 --> 00:25:31,959
and we should do the same thing with,
uh, data and, and digital surveillance.

445
00:25:32,169 --> 00:25:34,419
We should create checks,
rights and remedies.

446
00:25:34,629 --> 00:25:38,379
We should have, uh, limits of, of juries
and grand juries who could sort of vet

447
00:25:38,379 --> 00:25:41,355
whether we want this kind of surveillance
technology in their community.

448
00:25:41,770 --> 00:25:45,010
And we have to have legislative
protections and judicial, uh,

449
00:25:45,010 --> 00:25:46,719
acts, like judges have to actually

450
00:25:47,159 --> 00:25:50,340
maybe act to protect, uh,
individuals when the Fourth

451
00:25:50,340 --> 00:25:52,560
Amendment is, uh, you know, at issue.

452
00:25:52,830 --> 00:25:56,129
And if we do all of the above and
I try to lay 'em out in a more

453
00:25:56,129 --> 00:26:00,540
sophisticated way than I'm saying here
in the book, we, we're moving forward.

454
00:26:00,659 --> 00:26:03,720
It's not gonna be a solution to
every problem we have, but it's

455
00:26:03,720 --> 00:26:06,534
definitely better than what we have
now, which is essentially nothing.

456
00:26:09,345 --> 00:26:13,095
We, what we have now, of
course, is AI moving very fast.

457
00:26:13,095 --> 00:26:18,855
There was a recent, um, story
about a Perplexity public safety

458
00:26:19,365 --> 00:26:23,415
deal alarming experts when it
comes to policing, the police,

459
00:26:23,415 --> 00:26:23,895
AI.

460
00:26:23,895 --> 00:26:26,115
How does AI change all this?

461
00:26:26,115 --> 00:26:26,505
Andrew?

462
00:26:27,345 --> 00:26:29,475
So it's interesting, AI is
changing surveillance 'cause

463
00:26:29,475 --> 00:26:30,615
it's kind of supercharging it.

464
00:26:30,645 --> 00:26:35,865
So think about the cameras that we've had,
uh, on our city streets for, for ages now.

465
00:26:35,865 --> 00:26:37,995
Like everyone's kind of familiar
with these cameras, right?

466
00:26:38,235 --> 00:26:40,875
But now imagine 'em all going back
to that real time crime center

467
00:26:40,875 --> 00:26:43,245
just, uh, described, and then

468
00:26:43,650 --> 00:26:47,970
being able to track or identify any
objects, so man, woman, child, bus, van,

469
00:26:48,000 --> 00:26:52,710
blue van, blue Subaru, Tesla, whatever it
is, and be able to separate out all those

470
00:26:52,710 --> 00:26:54,810
objects and then track them back in time.

471
00:26:54,810 --> 00:26:55,680
It's like a time machine.

472
00:26:55,980 --> 00:26:57,690
And the way we do that is AI, right?

473
00:26:57,690 --> 00:26:59,815
We basically can train
the AI to match, like:

474
00:27:00,135 --> 00:27:01,245
this is what a Tesla looks like.

475
00:27:01,245 --> 00:27:02,685
This is what a cyber truck looks like.

476
00:27:02,685 --> 00:27:05,055
This is what, uh, you
know, a Ford looks like.

477
00:27:05,235 --> 00:27:09,945
And we have trained the video surveillance
so that we have turned ordinary

478
00:27:09,945 --> 00:27:14,475
cameras into something wholly new,
which is a system of tracking in major

479
00:27:14,475 --> 00:27:18,165
cities that have, you know, cameras
like Chicago's over 32,000 cameras.

480
00:27:18,495 --> 00:27:24,045
Uh, like we are developing our own
panopticon of sorts in this way.

481
00:27:24,045 --> 00:27:25,485
And behind that is AI.

482
00:27:25,805 --> 00:27:27,485
We'll see other smaller pieces.

483
00:27:27,485 --> 00:27:30,725
I just read an article about AI assisted
police reports, so it's not in the

484
00:27:30,725 --> 00:27:35,315
book, but it's the idea that you can
take the audio from, uh, police

485
00:27:35,345 --> 00:27:40,775
body camera footage and turn it into
a pre-printed ChatGPT police report.

486
00:27:40,775 --> 00:27:45,665
That then becomes the basis of the arrest,
prosecution, and the rest of, uh, the

487
00:27:45,665 --> 00:27:47,345
criminal justice system going forward.

488
00:27:47,765 --> 00:27:51,725
Fascinating new technology, changing
the relationship between police

489
00:27:51,785 --> 00:27:53,915
and, and, uh, the community.

490
00:27:53,970 --> 00:27:55,650
We're just gonna see more of it.

491
00:27:57,060 --> 00:28:01,770
It was interesting in my conversation
with, uh, Mathias, we talked about the

492
00:28:01,770 --> 00:28:04,650
rights and wrongs of wearing masks.

493
00:28:04,650 --> 00:28:10,620
From his point of view, he thinks that
the left or Antifa have a right to, uh,

494
00:28:10,620 --> 00:28:15,720
expose the identity of people on the far
right, and that

495
00:28:15,750 --> 00:28:18,120
people working for ICE have no right to wear masks.

496
00:28:18,120 --> 00:28:22,950
What's your take on masks
and anonymity in public?

497
00:28:24,135 --> 00:28:26,324
Well, I, I haven't read his
book, so I, I apologize,

498
00:28:26,415 --> 00:28:28,544
but, um, yeah, and he's not,
and to be fair, he, he's

499
00:28:28,544 --> 00:28:30,705
anything but a legal scholar.

500
00:28:30,705 --> 00:28:33,165
He's a polemicist, he's a guy on the left.

501
00:28:33,165 --> 00:28:37,455
So I'm not trying to compare, but I'm,
I'm curious as to your thinking on our

502
00:28:38,264 --> 00:28:44,355
legal right to anonymity on the right,
for example, to wear masks if we want to

503
00:28:44,355 --> 00:28:50,595
go to a, a white supremacist rally or an
Antifa rally. Should we have that right?

504
00:28:53,085 --> 00:28:56,835
It's sort of outside my expertise, but
just sort of, you know, spit balling here.

505
00:28:57,135 --> 00:29:02,024
Uh, I think that individuals should
be able to choose whether to mask

506
00:29:02,024 --> 00:29:04,875
or not and thus keep a measure
of anonymity when they protest.

507
00:29:04,875 --> 00:29:06,345
'cause it probably cuts both ways.

508
00:29:06,645 --> 00:29:09,825
I don't think law enforcement should,
I think there's a real troubling, uh,

509
00:29:09,825 --> 00:29:14,054
reality that we're seeing masked law
enforcement just because we really want

510
00:29:14,054 --> 00:29:17,895
a government, uh, and, and police power
that's accountable. Like the whole,

511
00:29:18,650 --> 00:29:25,490
you know, sense of why we might trust
giving over sort of this carceral power

512
00:29:25,700 --> 00:29:30,650
to someone else is because we should
implicitly trust that they're following

513
00:29:30,650 --> 00:29:34,520
rules, that they understand what they're
doing, and there's a transparent way

514
00:29:34,520 --> 00:29:35,420
to hold 'em accountable when they

515
00:29:35,785 --> 00:29:36,835
don't follow those rules.

516
00:29:37,045 --> 00:29:40,015
I think masking and not having
badge numbers and not doing all the

517
00:29:40,015 --> 00:29:44,005
things that we normally have with
local policing, uh, undermines that.

518
00:29:44,365 --> 00:29:48,685
Uh, but for ordinary citizens, like
I, I understand why, you know, I

519
00:29:48,685 --> 00:29:50,545
think there's a value in anonymity.

520
00:29:50,905 --> 00:29:54,025
Um, uh, and, you know, and, and protests.

521
00:29:55,995 --> 00:29:57,645
All sorts of debates endlessly.

522
00:29:57,645 --> 00:30:02,115
Andrew, again, you don't need me to tell
you this about the rise or fall of crime.

523
00:30:02,145 --> 00:30:05,145
Uh, every administration seems
to be obsessed with it, Trump in particular.

524
00:30:06,960 --> 00:30:10,440
When, uh, Jeremy Bentham in the
late, uh, 18th century came

525
00:30:10,440 --> 00:30:13,680
up with his idea of the panopticon
where everything could be seen,

526
00:30:13,680 --> 00:30:18,270
it wasn't that everything would
actually be seen, but that

527
00:30:18,270 --> 00:30:22,080
people would fear they were being
watched, which would enable them, or

528
00:30:22,080 --> 00:30:24,270
at least in Bentham's utilitarian

529
00:30:25,065 --> 00:30:28,515
mind, to behave themselves, to
obey the law, whether it was in

530
00:30:28,935 --> 00:30:31,515
prisons or schools or hospitals.

531
00:30:31,965 --> 00:30:37,485
Could one make the argument, perhaps in
a, in a utilitarian sense that all these

532
00:30:37,485 --> 00:30:43,185
cameras are actually good because it means
that people are breaking the law less?

533
00:30:44,370 --> 00:30:47,535
I think that's the rationale of the
Chinese government for, for, uh,

534
00:30:47,595 --> 00:30:48,825
putting them on their streets.

535
00:30:48,885 --> 00:30:52,425
But I, I think there's a
countervailing argument that, you

536
00:30:52,425 --> 00:30:53,715
know, the utilitarian argument

537
00:30:54,135 --> 00:30:56,205
necessarily suppresses the

538
00:30:56,580 --> 00:31:00,690
outlier, the dissenter, the voice that
people don't want to hear or, or see.

539
00:31:00,960 --> 00:31:01,950
And that's pretty American.

540
00:31:02,280 --> 00:31:06,780
Um, the idea of being able to,
uh, do what you want, even if it's

541
00:31:06,899 --> 00:31:08,250
uncomfortable to other people.

542
00:31:08,639 --> 00:31:13,620
Uh, and so I'm not sure we
necessarily live in a, a utilitarian

543
00:31:13,830 --> 00:31:15,270
world in that, that sense.

544
00:31:15,690 --> 00:31:19,830
Um, I also, I'm not sure it necessarily
works, like we've had lots of deterrents.

545
00:31:19,830 --> 00:31:22,560
So you have deterrence in, you
know, cameras and, you know,

546
00:31:22,640 --> 00:31:25,070
7-Elevens that still get
robbed, and it doesn't necessarily

547
00:31:25,220 --> 00:31:26,720
work in that same way.

548
00:31:27,200 --> 00:31:34,610
Um, but, uh, I also, I, I think it,
it is an understandable argument

549
00:31:34,850 --> 00:31:39,080
and I think that it is an argument
that needs to be pushed back against

550
00:31:39,080 --> 00:31:40,820
because we, I don't think we want

551
00:31:41,625 --> 00:31:46,125
to live in a world where the
government can see everything we do

552
00:31:46,185 --> 00:31:51,525
and thus judge us for it and possibly
penalize us, uh, us for it.

553
00:31:51,585 --> 00:31:54,555
Um, especially if that data would
then be used to prosecute you.

554
00:31:54,555 --> 00:31:59,535
Like it can be in China where if you
jaywalk, uh, you can get your whole face put

555
00:31:59,535 --> 00:32:01,370
up on the side and they ticket you for it.

556
00:32:02,640 --> 00:32:07,470
Um, certainly I don't think, uh, anyone
watching is necessarily sympathetic

557
00:32:07,470 --> 00:32:13,230
to the Chinese state, but if we had
technology, if we carry cameras or

558
00:32:13,230 --> 00:32:18,180
smartphones or AI devices around with
us, which we're increasingly doing, which

559
00:32:18,180 --> 00:32:24,300
will reveal everything we're doing, uh,
and it did indeed radically reduce crime,

560
00:32:26,764 --> 00:32:28,564
what should a government do?

561
00:32:28,564 --> 00:32:34,475
Should a civilized government, in a
sense, in this context at least,

562
00:32:34,504 --> 00:32:39,364
believe that a degree of crime isn't
a bad thing when it comes to freedom?

563
00:32:41,085 --> 00:32:45,195
I think the danger in that sentence
is the definition of crime, right?

564
00:32:45,195 --> 00:32:49,875
So if crime is seeking out an abortion
for your daughter in Texas or Idaho,

565
00:32:49,875 --> 00:32:50,774
or a place where that's criminal, no.

566
00:32:50,929 --> 00:32:51,090
No.

567
00:32:51,095 --> 00:32:51,195
Aren't

568
00:32:51,195 --> 00:32:56,504
you just questioning the nature of the law
and saying that if you don't approve of

569
00:32:56,504 --> 00:32:58,065
the law, you have the right to break it?

570
00:32:59,325 --> 00:33:06,315
Well, I think that the danger
is, like, when protest becomes

571
00:33:06,959 --> 00:33:11,370
criminalized, then it's all
about who's defining the crime.

572
00:33:11,850 --> 00:33:14,310
I think we could come to a
compromise about saying, you know,

573
00:33:14,310 --> 00:33:17,730
there are certain crimes and certain
times where some of this technology

574
00:33:17,940 --> 00:33:21,540
can be very useful, and we
should have rules about how you get

575
00:33:21,540 --> 00:33:24,210
access to it, when you can't get
access to it, and how you can't abuse it.

576
00:33:24,450 --> 00:33:28,080
But I think the one thing we do know
is if you had that magical power where

577
00:33:28,080 --> 00:33:31,500
everyone was being watched and everything
was being controlled by one person, it

578
00:33:31,500 --> 00:33:35,580
would most certainly be abused and it
would be abused for political reasons, for

579
00:33:35,580 --> 00:33:37,620
personal reasons, for financial reasons.

580
00:33:37,980 --> 00:33:40,920
And I think I would
rather live in a world where

581
00:33:41,045 --> 00:33:42,245
we didn't have that.

582
00:33:42,455 --> 00:33:46,985
And we can deal with a little level
of crime that's probably still

583
00:33:46,985 --> 00:33:51,755
gonna exist even when you have the cameras
there, but isn't gonna be misused

584
00:33:51,755 --> 00:33:53,975
in that sort of tyrannical way.

585
00:33:53,975 --> 00:33:56,405
And I think that is what would happen.

586
00:33:56,525 --> 00:33:58,745
I would bet good money
that's what will happen.

587
00:33:59,465 --> 00:34:02,255
For better or worse, Andrew, we
seem to be living in the future.

588
00:34:02,255 --> 00:34:04,985
There's a movie coming out
this year called Mercy.

589
00:34:05,580 --> 00:34:12,480
About an AI world in which
intentionality reveals whether or not

590
00:34:12,480 --> 00:34:17,699
one's guilty. Technology is moving so fast.

591
00:34:18,060 --> 00:34:24,210
How is this gonna impact
criminality and indeed intentionality?

592
00:34:24,360 --> 00:34:26,880
Can one be guilty,

593
00:34:28,050 --> 00:34:30,600
in intent, of crimes,

594
00:34:30,600 --> 00:34:33,179
if one thinks in that way?

595
00:34:34,560 --> 00:34:39,690
So if you take criminal law, you
learn about how we as a society don't

596
00:34:39,929 --> 00:34:42,720
punish people based on thoughts alone.

597
00:34:43,290 --> 00:34:49,650
You learn that we require some kind
of act, and so I imagine that whatever,

598
00:34:49,830 --> 00:34:54,240
you know, AI Hollywood wants to create,
whether it's Minority Report, you know,

599
00:34:54,240 --> 00:34:56,880
which is the earlier version of Mercy,
probably, although I haven't seen it. Yeah.

600
00:34:56,880 --> 00:34:58,590
Minority Report of course covered it,

601
00:34:58,590 --> 00:34:58,890
right?

602
00:34:58,890 --> 00:35:06,390
I mean, again, (a) it will be wrong, and
(b) it still hopefully will require

603
00:35:06,390 --> 00:35:08,730
people to do some kind of act.

604
00:35:08,730 --> 00:35:10,920
So it's not just in
your head. But more so,

605
00:35:11,610 --> 00:35:18,270
if we have these changes, we should have
corresponding legal responses: if

606
00:35:18,270 --> 00:35:23,040
we're gonna change the nature of power
between police and prosecutors and

607
00:35:23,190 --> 00:35:29,310
ordinary people, we need to rebalance the
protections for those people as well.

608
00:35:29,490 --> 00:35:32,550
Because if we just, like, follow
the technology, it's probably

609
00:35:32,550 --> 00:35:37,290
not gonna be a happy ending
to that movie.

610
00:35:38,955 --> 00:35:42,524
Yeah, I'm not sure there is a very
happy ending to that particular movie.

611
00:35:42,524 --> 00:35:46,424
Andrew, in our age of self-surveillance,
people are gonna be watching this and

612
00:35:47,234 --> 00:35:50,714
getting really, I think, very nervous,
probably thinking to themselves,

613
00:35:50,714 --> 00:35:52,575
I've gotta give up my smartphone.

614
00:35:52,754 --> 00:35:53,774
I'm not gonna go out.

615
00:35:53,774 --> 00:35:55,395
I don't want to be seen by cameras.

616
00:35:55,875 --> 00:36:03,075
What's your advice to people when it
comes to this avalanche of data, self-

617
00:36:03,075 --> 00:36:05,354
surveillance and otherwise? Should people

618
00:36:05,700 --> 00:36:09,045
consider turning their phones off

619
00:36:10,035 --> 00:36:12,915
most of the time, especially when
they're doing something, shall

620
00:36:12,915 --> 00:36:16,755
we say, morally dubious,
if not breaking the law?

621
00:36:17,985 --> 00:36:22,365
I think as a general matter, like turning
your back on new technology just isn't

622
00:36:22,365 --> 00:36:25,485
going to be a workable solution, right?

623
00:36:25,695 --> 00:36:29,385
We live by our smartphones
and our smart cars.

624
00:36:29,385 --> 00:36:31,965
You can't even buy a car now that
isn't tracking you with some

625
00:36:32,325 --> 00:36:34,935
device. Like, that is the norm.

626
00:36:35,550 --> 00:36:37,590
We should have laws to protect it.

627
00:36:37,590 --> 00:36:42,150
Again, my argument in the book isn't that
we need to have, like, data privacy laws,

628
00:36:42,150 --> 00:36:45,330
although we should; I hope someone, you
know, does the hard job of figuring out

629
00:36:45,330 --> 00:36:47,010
the data privacy, data protection world.

630
00:36:47,220 --> 00:36:50,730
But this is, like, should this
data necessarily be used against

631
00:36:50,730 --> 00:36:53,040
us in a court of law? And then,

632
00:36:53,400 --> 00:36:56,279
if so, what are the rules
by which you would do it?

633
00:36:56,430 --> 00:37:00,480
Maybe the current warrant
system, and for much of

634
00:37:00,480 --> 00:37:02,160
the data we're talking about, you
don't even need a warrant, but the

635
00:37:02,160 --> 00:37:03,600
warrant system might be too weak.

636
00:37:03,600 --> 00:37:07,259
So maybe you need something stronger and
we could actually solve that problem.

637
00:37:07,589 --> 00:37:10,440
The bigger data protection problem,
rather than having to, like, you know,

638
00:37:10,440 --> 00:37:12,360
hide your phone, is gonna be difficult.

639
00:37:12,540 --> 00:37:15,480
But we could pass laws
tomorrow saying, you know what?

640
00:37:15,540 --> 00:37:19,230
The government can't use this data
against you in a court of law.

641
00:37:19,575 --> 00:37:22,995
Google can still use it to sell
you products, but the government

642
00:37:22,995 --> 00:37:24,525
can't use it to prosecute you.

643
00:37:24,795 --> 00:37:29,175
We haven't had that world; that is sort
of a revolutionary, counterintuitive

644
00:37:29,175 --> 00:37:32,835
world, but we could have
the best of both worlds if we wanted

645
00:37:32,835 --> 00:37:34,700
to make these difficult decisions.

646
00:37:36,345 --> 00:37:40,035
Well, I think the one thing we've learned
from this conversation, Andrew, is

647
00:37:40,035 --> 00:37:44,085
that if you are planning to burn down
your own house and claim the insurance

648
00:37:44,085 --> 00:37:46,515
money, don't use a heart monitor.

649
00:37:46,515 --> 00:37:47,055
Is that right?

650
00:37:47,625 --> 00:37:51,495
That is definitely a helpful
tip about being careful that, you

651
00:37:51,495 --> 00:37:53,265
know, every smart device you own

652
00:37:53,325 --> 00:37:54,705
is a surveillance device.

653
00:37:56,115 --> 00:37:59,774
Well, on that helpful note, for
anyone planning to burn down your

654
00:37:59,774 --> 00:38:01,875
house, put your phone away.

655
00:38:01,964 --> 00:38:05,895
Although some people will be
watching this on their phone. A very

656
00:38:05,895 --> 00:38:09,254
interesting and important new book,
Your Data Will Be Used Against You:

657
00:38:09,254 --> 00:38:11,924
Policing in the Age of Self-Surveillance.

658
00:38:12,435 --> 00:38:15,615
Andrew Guthrie Ferguson,
thank you so much for a really

659
00:38:15,615 --> 00:38:16,785
interesting conversation.

660
00:38:17,145 --> 00:38:17,444
Thank you.

661
00:38:17,444 --> 00:38:17,835
It was a pleasure.