1
00:00:00,000 --> 00:00:27,640
Hey everybody, KMO here with episode number 19 of The KMO Show.

2
00:00:27,640 --> 00:00:31,280
Prepared for release on Wednesday, August 2nd, 2023.

3
00:00:31,280 --> 00:00:33,960
I don't have a guest this week.

4
00:00:33,960 --> 00:00:40,360
For the next hour, it will be just me and you, as I try, and surely fail, to squeeze

5
00:00:40,360 --> 00:00:44,760
in all of my observations and thoughts on three seemingly unrelated topics.

6
00:00:44,760 --> 00:00:51,160
The Congressional UAP hearings, the Hollywood Writers Strike, and the tech CEO cult known

7
00:00:51,160 --> 00:00:54,560
as Effective Accelerationism.

8
00:00:54,560 --> 00:00:59,440
Abbreviated as e/acc online, but that's more of a mouthful than just saying the full

9
00:00:59,440 --> 00:01:01,000
ten syllable phrase.

10
00:01:01,000 --> 00:01:04,440
I say that these are seemingly unrelated topics.

11
00:01:04,440 --> 00:01:09,520
The TLDR is that artificial intelligence is the thread tying them all together.

12
00:01:09,520 --> 00:01:13,840
The COVID pandemic, and I would argue the woke destruction of previously popular film

13
00:01:13,840 --> 00:01:19,700
franchises like Star Wars and the MCU, left the cinema multiplexes feeling like ghost

14
00:01:19,700 --> 00:01:22,240
towns these past few years.

15
00:01:22,240 --> 00:01:26,280
Since I've been back in Berryville after my winter out west, I've only seen one film

16
00:01:26,280 --> 00:01:27,560
in the movie theater.

17
00:01:27,560 --> 00:01:29,760
Guardians of the Galaxy Vol. 3.

18
00:01:29,760 --> 00:01:32,520
I liked it well enough.

19
00:01:32,520 --> 00:01:35,160
I live in a small town with only one movie theater.

20
00:01:35,160 --> 00:01:40,640
It only has one screen and has one showing per day Friday through Sunday.

21
00:01:40,640 --> 00:01:45,760
The nearest multiplex is in Rogers, Arkansas, which is over an hour's drive from here.

22
00:01:45,760 --> 00:01:50,760
Given all that, it should come as no surprise that I don't see a lot of movies in the theater,

23
00:01:50,760 --> 00:01:55,640
nor that I haven't seen either the Barbie movie starring Margot Robbie or Christopher

24
00:01:55,640 --> 00:02:00,080
Nolan's biopic of Robert Oppenheimer starring Cillian Murphy.

25
00:02:00,080 --> 00:02:03,680
Nor have I watched much of the congressional testimony of former intelligence officer David

26
00:02:03,680 --> 00:02:09,680
Grusch, who claims to have substantial evidence that the US military has several craft of

27
00:02:09,680 --> 00:02:15,980
non-human origin in their possession, along with non-human biologics, which implies the

28
00:02:15,980 --> 00:02:19,720
corpses of alien pilots of those crashed vehicles.

29
00:02:19,720 --> 00:02:23,640
But I've done a fair bit of reading on the topic, and I have some opinions.

30
00:02:23,640 --> 00:02:26,280
So let's start with that.

31
00:02:26,280 --> 00:02:28,280
You're listening to the KMO Show.

32
00:02:28,280 --> 00:02:29,280
Let's go.

33
00:02:29,280 --> 00:02:35,680
First, for the purposes of concision, I'm going to read a little bit from Wikipedia,

34
00:02:35,680 --> 00:02:42,340
just because it's got a lot of dense information that I can get on the table quickly.

35
00:02:42,340 --> 00:02:48,520
There is a Wikipedia page entitled David Grusch UFO Whistleblower Claims.

36
00:02:48,520 --> 00:02:51,200
And I'll read just a couple paragraphs here.

37
00:02:51,200 --> 00:02:57,560
In June 2023, United States Air Force officer and former intelligence official David Grusch

38
00:02:57,560 --> 00:03:02,160
publicly claimed that unnamed officials told him that the US federal government maintains

39
00:03:02,160 --> 00:03:09,080
a highly secretive UFO or UAP recovery program and is in possession of, quote, non-human,

40
00:03:09,080 --> 00:03:13,000
close quote, spacecraft, and, quote, dead pilots, close quote.

41
00:03:13,000 --> 00:03:18,040
In 2022, Grusch filed a whistleblower complaint with the Office of the Intelligence Community

42
00:03:18,040 --> 00:03:24,160
Inspector General, or ICIG, to support his plan to share classified information with the US

43
00:03:24,160 --> 00:03:27,000
Senate Select Committee on Intelligence.

44
00:03:27,000 --> 00:03:31,720
He also filed a complaint alleging retaliation by his superiors over a similar complaint

45
00:03:31,720 --> 00:03:33,760
he made in 2021.

46
00:03:33,760 --> 00:03:38,400
Grusch asserts that individuals with whom he has conversed shared the concern that American

47
00:03:38,400 --> 00:03:43,400
citizens have been killed as part of the government's efforts to cover up the information.

48
00:03:43,400 --> 00:03:48,760
In response to his June 2023 claims, both the National Aeronautics and Space Administration,

49
00:03:48,760 --> 00:03:53,920
that's NASA, and the US Department of Defense, or DoD, issued statements saying respectively

50
00:03:53,920 --> 00:03:59,260
that there is no evidence of extraterrestrial life and there is no verifiable information

51
00:03:59,260 --> 00:04:04,240
about the possession and reverse engineering of any extraterrestrial materials.

52
00:04:04,240 --> 00:04:10,480
During a congressional hearing on July 26, 2023, under the House Committee on Oversight

53
00:04:10,480 --> 00:04:14,880
and Accountability, Grusch repeated his claims alongside testimony from US fighter pilots

54
00:04:14,880 --> 00:04:19,800
Ryan Graves and David Fravor on experiences related to UFOs.

55
00:04:19,800 --> 00:04:24,360
Grusch testified that he could not elaborate publicly on some aspects of his claims, but

56
00:04:24,360 --> 00:04:29,960
offered to provide further details to representatives in a sensitive compartmented information

57
00:04:29,960 --> 00:04:30,960
facility.

58
00:04:30,960 --> 00:04:34,120
Okay, there's more, but I'm going to stop reading there, other than to say a little

59
00:04:34,120 --> 00:04:37,840
bit about David Grusch's background.

60
00:04:37,840 --> 00:04:41,800
He is a veteran of the Air Force.

61
00:04:41,800 --> 00:04:43,760
He served in Afghanistan.

62
00:04:43,760 --> 00:04:49,320
And then later he worked for the NGA, or the National Geospatial-Intelligence Agency, and

63
00:04:49,320 --> 00:04:52,820
the NRO, the National Reconnaissance Office.

64
00:04:52,820 --> 00:04:59,000
And he was representing the National Reconnaissance Office when he participated from 2019 to 2021

65
00:04:59,000 --> 00:05:03,060
in the Unidentified Aerial Phenomena Task Force.

66
00:05:03,060 --> 00:05:08,760
And that was the precursor investigation to the current office that investigates such

67
00:05:08,760 --> 00:05:15,200
things, which is called AARO, or the All-domain Anomaly Resolution Office.

68
00:05:15,200 --> 00:05:19,640
The director of AARO is Sean M. Kirkpatrick.

69
00:05:19,640 --> 00:05:22,180
He's a laser and materials physicist.

70
00:05:22,180 --> 00:05:30,040
And he, after David Grusch's most recent high-profile testimony to Congress, issued not

71
00:05:30,040 --> 00:05:36,480
a statement, an official statement through official channels, but published a letter

72
00:05:36,480 --> 00:05:42,480
on his LinkedIn page basically saying, David Grusch doesn't work for my organization, never

73
00:05:42,480 --> 00:05:45,520
has, neither do the two other witnesses.

74
00:05:45,520 --> 00:05:51,800
And Kirkpatrick has testified in front of Congress himself saying that there is no evidence

75
00:05:51,800 --> 00:05:57,200
that the United States government is in possession of materials or technology or biological samples

76
00:05:57,200 --> 00:06:00,440
of non-terrestrial origin.

77
00:06:00,440 --> 00:06:03,480
Basically he's saying, we've looked into it.

78
00:06:03,480 --> 00:06:13,760
94%, I think it was, no, 96% of the cases that they looked into had a mundane explanation

79
00:06:13,760 --> 00:06:17,960
and 4% were genuinely anomalous.

80
00:06:17,960 --> 00:06:23,260
They didn't have the information necessary to determine the exact explanation or cause

81
00:06:23,260 --> 00:06:24,260
for what was reported.

82
00:06:24,260 --> 00:06:26,240
Does that mean it was aliens?

83
00:06:26,240 --> 00:06:30,760
I'll put my cards on the table and say, I don't think so.

84
00:06:30,760 --> 00:06:36,640
I will be very, very surprised to learn that extraterrestrials in physical spaceships have

85
00:06:36,640 --> 00:06:44,560
crossed interstellar distances from other solar systems to ours to visit our planet.

86
00:06:44,560 --> 00:06:53,040
That seems entirely improbable to me, but I am determined to keep a realistically open

87
00:06:53,040 --> 00:06:54,320
mind.

88
00:06:54,320 --> 00:07:01,720
Not a completely open mind, but I'll hold my opinions as lightly as I can.

89
00:07:01,720 --> 00:07:06,400
Somebody who speaks my language on this issue is Dr. Adam Frank.

90
00:07:06,400 --> 00:07:10,140
He is a professor of astrophysics at the University of Rochester.

91
00:07:10,140 --> 00:07:13,320
He has published many editorials on this topic.

92
00:07:13,320 --> 00:07:14,760
I'm looking at one.

93
00:07:14,760 --> 00:07:16,080
This is in the New York Times.

94
00:07:16,080 --> 00:07:19,160
It was from May 30th, 2021.

95
00:07:19,160 --> 00:07:22,280
Headline is, I'm a physicist who searches for aliens.

96
00:07:22,280 --> 00:07:24,440
UFOs don't impress me.

97
00:07:24,440 --> 00:07:27,400
Now, I'm not actually going to read from this.

98
00:07:27,400 --> 00:07:28,840
It's long and time is short.

99
00:07:28,840 --> 00:07:35,160
I'll just say that he's looking for aliens by looking at the spectrographic data of light

100
00:07:35,160 --> 00:07:41,100
that passes through the atmosphere of distant planets, looking at the chemical composition

101
00:07:41,100 --> 00:07:48,240
of the atmosphere of the planet, and looking within that data for evidence of either life,

102
00:07:48,240 --> 00:07:54,200
which is to say biosignatures, or alien technology, or technosignatures.

103
00:07:54,200 --> 00:08:00,720
From my perspective, this is the sober adult way to go about looking for aliens.

104
00:08:00,720 --> 00:08:06,480
But I'm offering these opinions basically just out of honesty.

105
00:08:06,480 --> 00:08:08,640
I'm telling you what side I'm on.

106
00:08:08,640 --> 00:08:12,000
I'm not a partisan for this position.

107
00:08:12,000 --> 00:08:13,760
I'm not arguing for it.

108
00:08:13,760 --> 00:08:18,160
I'm just letting you know where I stand as I come to this issue and go through the information.

109
00:08:18,160 --> 00:08:23,040
I am going to read something that Dr. Frank wrote, but it's something much more recent.

110
00:08:23,040 --> 00:08:25,020
This is from Big Think.

111
00:08:25,020 --> 00:08:30,040
The article is from August 2nd, 2023, so that's today.

112
00:08:30,040 --> 00:08:34,320
And the title is, Here's What a Scientist Makes of Congress's UFO Hearing.

113
00:08:34,320 --> 00:08:37,040
The X-Files Was Not a Documentary.

114
00:08:37,040 --> 00:08:38,040
Key takeaways.

115
00:08:38,040 --> 00:08:43,060
The recent congressional hearing on unidentified aerial phenomena, or UAPs, highlights the need

116
00:08:43,060 --> 00:08:48,440
for more empirical research, given that most UAPs can be explained by earthly phenomena.

117
00:08:48,440 --> 00:08:53,020
Assertions about the recovery and reverse engineering of alien UAPs have drawn skepticism

118
00:08:53,020 --> 00:08:59,840
due to a glaring absence of tangible evidence. We need transparent investigations into UAPs.

119
00:08:59,840 --> 00:09:04,800
Ultimately, astrobiology is arguably more likely to provide reliable answers about the

120
00:09:04,800 --> 00:09:06,880
existence of alien life.

121
00:09:06,880 --> 00:09:10,640
Now, before I start reading this, I'll just throw out this tidbit.

122
00:09:10,640 --> 00:09:17,520
Dr. Frank has been calling for NASA to investigate alien life for a long time.

123
00:09:17,520 --> 00:09:21,160
And now, NASA has gotten into the UAP game.

124
00:09:21,160 --> 00:09:26,960
They have a committee, I think it is, to investigate claims of unidentified aerial phenomena,

125
00:09:26,960 --> 00:09:29,200
or unidentified anomalous phenomena.

126
00:09:29,200 --> 00:09:33,520
What the acronym stands for morphs a little bit from time to time, depending on where you read it.

127
00:09:33,520 --> 00:09:40,620
And his concern is that because NASA is getting into the game and imbuing this topic with

128
00:09:40,620 --> 00:09:45,760
the imprimatur of serious science and space exploration, that could do more harm than

129
00:09:45,760 --> 00:09:46,760
good.

130
00:09:46,760 --> 00:09:48,520
And I'm sympathetic to that concern.

131
00:09:48,520 --> 00:09:50,440
But Dr. Frank writes,

132
00:09:50,440 --> 00:09:54,360
Everybody's talking about aliens after last week's congressional hearing on UFOs, now

133
00:09:54,360 --> 00:09:59,520
officially rebranded as unidentified aerial phenomena, or UAPs.

134
00:09:59,520 --> 00:10:03,960
As an astrophysicist working on the remarkable science of searching for life in the universe,

135
00:10:03,960 --> 00:10:08,040
I just finished The Little Book of Aliens on this exact subject.

136
00:10:08,040 --> 00:10:11,200
I've been asked by lots of folks what I make of it all.

137
00:10:11,200 --> 00:10:12,660
How do I see the testimony?

138
00:10:12,660 --> 00:10:16,680
Is there anything in what we heard that speaks scientifically to the possibilities of life

139
00:10:16,680 --> 00:10:18,840
existing elsewhere in the universe?

140
00:10:18,840 --> 00:10:23,040
To answer that question, we have to parse out two different threads that emerged during

141
00:10:23,040 --> 00:10:25,080
the hearing.

142
00:10:25,080 --> 00:10:26,360
Extraordinary testimony.

143
00:10:26,360 --> 00:10:30,200
This involves the Navy pilots and their stories of encounters with objects behaving in ways

144
00:10:30,200 --> 00:10:32,000
that defied their expectations.

145
00:10:32,000 --> 00:10:33,360
That's putting it mildly.

146
00:10:33,360 --> 00:10:37,560
I first will say that I think it's a good thing these pilots have been willing to come

147
00:10:37,560 --> 00:10:38,560
forward.

148
00:10:38,560 --> 00:10:42,320
Removing the stigma of making these kinds of reports is the first essential step in

149
00:10:42,320 --> 00:10:45,280
figuring out what's going on with UAPs.

150
00:10:45,280 --> 00:10:49,080
I also liked the way the pilots were pretty agnostic about what was happening.

151
00:10:49,080 --> 00:10:53,000
Like the rest of us, they simply wanted a clear explanation of these encounters.

152
00:10:53,000 --> 00:10:57,920
Finally, I felt they were being entirely honest and forthright in recalling what they remembered

153
00:10:57,920 --> 00:10:59,720
about these encounters.

154
00:10:59,720 --> 00:11:01,320
So where do we go from there?

155
00:11:01,320 --> 00:11:03,520
The answer in this case is really simple.

156
00:11:03,520 --> 00:11:05,540
Do some good science.

157
00:11:05,540 --> 00:11:10,640
It's worth noting that last year, NASA, the folks who land robots on distant planets,

158
00:11:10,640 --> 00:11:14,600
convened a panel to begin a true scientific unpacking of UAPs.

159
00:11:14,600 --> 00:11:18,760
At their first press conference held this summer, the team announced that only 6% of

160
00:11:18,760 --> 00:11:22,760
the many UAP cases they studied could not be explained.

161
00:11:22,760 --> 00:11:28,480
In other words, 94% of their UAP cases had earthly causes.

162
00:11:28,480 --> 00:11:32,840
This conclusion is consistent with other studies of this kind that have been done.

163
00:11:32,840 --> 00:11:37,360
So it's safe to say that the Earth is not suddenly awash in strange and unexplainable

164
00:11:37,360 --> 00:11:40,000
phenomena sailing through the skies.

165
00:11:40,000 --> 00:11:42,160
But what about the other 6%?

166
00:11:42,160 --> 00:11:46,200
In some cases, no explanation could be found because there simply wasn't enough information

167
00:11:46,200 --> 00:11:51,600
– the all-important data that science lives on – to even propose an explanation.

168
00:11:51,600 --> 00:11:57,320
Still, some of the unexplained cases do fall into the truly strange and weird.

169
00:11:57,320 --> 00:12:00,460
This is the space where I would say the pilots' testimonies live.

170
00:12:00,460 --> 00:12:05,620
Their descriptions are definitely of the raise-the-hair-on-the-back-of-your-neck variety.

171
00:12:05,620 --> 00:12:07,480
What do we do in those cases?

172
00:12:07,480 --> 00:12:10,640
Here is the point where better science comes in.

173
00:12:10,640 --> 00:12:15,320
While we could spend lots of time trying to figure out exactly what the pilots saw, I

174
00:12:15,320 --> 00:12:17,460
feel like it's a dead end.

175
00:12:17,460 --> 00:12:20,180
Science cannot do much with personal testimonies.

176
00:12:20,180 --> 00:12:23,980
One problem, as every cop and psychologist will tell you, is that human memory is not

177
00:12:23,980 --> 00:12:25,480
a photographic record.

178
00:12:25,480 --> 00:12:30,760
Instead, it's a reconstruction that can differ from the original event in many ways,

179
00:12:30,760 --> 00:12:33,560
no matter how earnest the reporters are.

180
00:12:33,560 --> 00:12:37,840
But what's even more important, and as I describe more fully in the book, is that to

181
00:12:37,840 --> 00:12:43,120
really do science, you need hard data collected through rational search strategies from instruments

182
00:12:43,120 --> 00:12:45,160
you fully understand.

183
00:12:45,160 --> 00:12:48,720
If my colleagues and I ever want to claim that we've detected signatures of life on

184
00:12:48,720 --> 00:12:53,640
alien planets light-years away, we will have to know everything about our instruments – how

185
00:12:53,640 --> 00:12:57,520
they respond to light when the telescope is at 40 degrees Fahrenheit, and how that response

186
00:12:57,520 --> 00:13:00,640
changes when the temperature goes up to 60 degrees.

187
00:13:00,640 --> 00:13:05,020
The exact same type of knowledge is required to determine whether a UAP accelerated in

188
00:13:05,020 --> 00:13:08,480
ways that no human technology could reproduce.

189
00:13:08,480 --> 00:13:14,120
Personal testimony, onboard targeting cameras, and even military radars cannot do that.

190
00:13:14,120 --> 00:13:19,640
Ultimately, to really know if UAPs have anything to do with advanced technologies from alien

191
00:13:19,640 --> 00:13:23,800
life, we will need to set up a new kind of research program.

192
00:13:23,800 --> 00:13:25,640
And I'm all for it.

193
00:13:25,640 --> 00:13:29,880
As I've written before, an open, transparent, and scientific investigation of UAPs would

194
00:13:29,880 --> 00:13:31,880
be great.

195
00:13:31,880 --> 00:13:39,400
My personal opinion is that the alien explanation is a long, long, long shot.

196
00:13:39,400 --> 00:13:43,060
Peer-state adversaries are a much more likely explanation.

197
00:13:43,060 --> 00:13:48,640
But my belief and $4.98 will get you a Starbucks coffee, so let's do the real science.

198
00:13:48,640 --> 00:13:53,440
At the very least, it will show people how science goes about its business – a business

199
00:13:53,440 --> 00:13:58,000
that gave us cell phones that work, jet planes that don't fall out of the sky, and medical

200
00:13:58,000 --> 00:14:01,080
procedures that heal.

201
00:14:01,080 --> 00:14:05,160
And then the heading for the next section is, show me the spaceship, Mulder.

202
00:14:05,160 --> 00:14:07,440
Dr. Frank continues.

203
00:14:07,440 --> 00:14:09,500
Now on to the second thread.

204
00:14:09,500 --> 00:14:14,320
One of three witnesses at the hearing claimed that the U.S. had recovered downed UAPs of

205
00:14:14,320 --> 00:14:20,320
non-human origin, that non-human biologics, whatever that means, had also been recovered,

206
00:14:20,320 --> 00:14:24,880
and that technology from the possibly alien craft had been reverse-engineered.

207
00:14:24,880 --> 00:14:28,840
I am, to put it mildly, highly skeptical.

208
00:14:28,840 --> 00:14:31,300
First of all, such claims are nothing new.

209
00:14:31,300 --> 00:14:35,520
As I wrote about recently, and as I detail in the book, there have been ex-military officials

210
00:14:35,520 --> 00:14:38,540
making these kinds of claims going back 70 years.

211
00:14:38,540 --> 00:14:44,800
What doesn't go back 70 years is actual hold-in-your-hand evidence of such claims.

212
00:14:44,800 --> 00:14:47,600
So what we heard in the hearing was old news.

213
00:14:47,600 --> 00:14:51,380
Somebody says they heard from somebody else in the know that we have alien spaceships

214
00:14:51,380 --> 00:14:56,720
in the garage, but once again we have no actual evidence of claims so extraordinary they sound

215
00:14:56,720 --> 00:14:59,640
like they come straight out of an episode of The X-Files.

216
00:14:59,640 --> 00:15:03,320
One of my rules of thumb is that if something sounds like the plot of a science fiction

217
00:15:03,320 --> 00:15:06,320
movie, it probably is.

218
00:15:06,320 --> 00:15:10,960
There is no reason, based on any existing science, to take these truly stunning claims

219
00:15:10,960 --> 00:15:11,960
seriously.

220
00:15:11,960 --> 00:15:16,880
I am not changing this position until somebody ponies up some actual artifacts, which I will

221
00:15:16,880 --> 00:15:20,880
note are always promised to be coming, but never show up.

222
00:15:20,880 --> 00:15:23,800
In other words, show me the spaceship.

223
00:15:23,800 --> 00:15:27,200
Okay, that's not the end of the piece, but that's where I will stop reading.

224
00:15:27,200 --> 00:15:32,080
Now, I've said that the connecting thread between all of these topics is artificial

225
00:15:32,080 --> 00:15:33,080
intelligence.

226
00:15:33,080 --> 00:15:35,880
How does artificial intelligence play into this story?

227
00:15:35,880 --> 00:15:40,000
Well, let's start with Dr. Frank's call.

228
00:15:40,000 --> 00:15:41,320
Show me the spaceship.

229
00:15:41,320 --> 00:15:47,840
Yeah, you can go to YouTube and you can watch hours and hours of clearly, obviously faked

230
00:15:47,840 --> 00:15:50,000
UFO videos.

231
00:15:50,000 --> 00:15:55,500
There's a YouTube channel that is a bunch of special effects artists who analyze special

232
00:15:55,500 --> 00:15:56,500
effects.

233
00:15:56,500 --> 00:15:57,800
They describe how they're done.

234
00:15:57,800 --> 00:15:59,360
It's called Corridor Crew.

235
00:15:59,360 --> 00:16:00,640
It's a lot of fun.

236
00:16:00,640 --> 00:16:03,240
I highly recommend watching that channel.

237
00:16:03,240 --> 00:16:08,320
And they have a recurring segment that they do where they look at UFO footage and they

238
00:16:08,320 --> 00:16:11,480
can spot the artifacts of fakery.

239
00:16:11,480 --> 00:16:18,520
But something they've also documented is how good artificial intelligence is getting at

240
00:16:18,520 --> 00:16:25,960
doing special effects, doing animation, basically doing CGI that is way, way better than

241
00:16:25,960 --> 00:16:27,920
what we're used to seeing.

242
00:16:27,920 --> 00:16:35,200
And I predict that in the coming months and years, like the next two or three years, we're

243
00:16:35,200 --> 00:16:43,780
going to see a flood of basically AI-generated evidence or proof that the extraterrestrial

244
00:16:43,780 --> 00:16:50,460
hypothesis is the actual explanation for unidentified aerial phenomena, or anomalous phenomena.

245
00:16:50,460 --> 00:16:57,520
This proof will not be released by the Pentagon or any government agency, but it will be leaked

246
00:16:57,520 --> 00:17:03,120
and I'm putting leaked in air quotes, because in the long run, it's not going to hold up

247
00:17:03,120 --> 00:17:08,480
to scrutiny, but in the near term, it's going to be convincing enough to add fuel to the

248
00:17:08,480 --> 00:17:09,480
fire.

249
00:17:09,480 --> 00:17:13,960
It's not going to be released by any government agency because in the fullness of time, it

250
00:17:13,960 --> 00:17:18,960
will be revealed to be a hoax and no government agency is going to want to be on record as

251
00:17:18,960 --> 00:17:25,000
having propagated a hoax, but they will leak it deliberately just to keep this thing going.

252
00:17:25,000 --> 00:17:27,400
Now, what's the point of all of this?

253
00:17:27,400 --> 00:17:33,080
I suggest, I'm not saying this is a fact, this is just what seems likely to me, that

254
00:17:33,080 --> 00:17:37,560
the true audience for all of this is not the general public.

255
00:17:37,560 --> 00:17:45,240
The Pentagon, which is to say the US military, receives a lot of money, a lot of money to

256
00:17:45,240 --> 00:17:46,460
do what they do.

257
00:17:46,460 --> 00:17:51,740
And every now and again, various agencies try to audit the Pentagon to see if the money

258
00:17:51,740 --> 00:17:57,000
that has been allocated to them has been spent for legitimate defense purposes.

259
00:17:57,000 --> 00:17:59,460
And there's a great phrase that I want to share with you here.

260
00:17:59,460 --> 00:18:06,340
This is from a 2016 article in New York magazine's Intelligencer.

261
00:18:06,340 --> 00:18:11,120
The title of the article is, The $6 Trillion Issue You Won't Be Hearing About at Tonight's

262
00:18:11,120 --> 00:18:12,120
Debate.

263
00:18:12,120 --> 00:18:17,240
So picking up in the middle of the article, I'm not even going to try to lay the context.

264
00:18:17,240 --> 00:18:21,160
I think the bit that I'm coming to, it stands on its own.

265
00:18:21,160 --> 00:18:25,780
Paragraph begins, Trump's new defense demands, again, this is from 2016, Trump's new defense

266
00:18:25,780 --> 00:18:30,520
demands don't help us understand headlines like the one from mid-August that said, the

267
00:18:30,520 --> 00:18:37,120
Defense Department's Inspector General found more than $6.5 trillion in, quote, wrongful adjustments

268
00:18:37,120 --> 00:18:44,080
to accounting entries, close quote, in the Army's general fund for 2015 alone.

269
00:18:44,080 --> 00:18:45,760
I just love that.

270
00:18:45,760 --> 00:18:49,080
Wrongful adjustments to accounting entries.

271
00:18:49,080 --> 00:18:55,500
$6.5 trillion unaccounted for.

272
00:18:55,500 --> 00:18:58,000
The military doesn't want to talk about that.

273
00:18:58,000 --> 00:19:05,800
They claim that there's nothing to this UFO business, but they also act like they're covering

274
00:19:05,800 --> 00:19:07,280
it up.

275
00:19:07,280 --> 00:19:08,700
Why?

276
00:19:08,700 --> 00:19:14,680
I suspect it is to capture the imagination of people like Kirsten Gillibrand, who is

277
00:19:14,680 --> 00:19:17,800
one of the senators who is pushing hard for this UAP business.

278
00:19:17,800 --> 00:19:19,460
Now, maybe she's in on it.

279
00:19:19,460 --> 00:19:24,240
Maybe she's not really a UFO believer or UAP believer.

280
00:19:24,240 --> 00:19:27,200
Maybe she's just part of the kayfabe dance here.

281
00:19:27,200 --> 00:19:28,360
I don't know.

282
00:19:28,360 --> 00:19:30,760
But I think this is all theater.

283
00:19:30,760 --> 00:19:36,280
This is all a distraction meant to prevent the reckoning of, where's the money?

284
00:19:36,280 --> 00:19:38,380
Where's the money, Pentagon?

285
00:19:38,380 --> 00:19:43,100
But anyway, lots of people have expressed their frustration with the low-quality evidence

286
00:19:43,100 --> 00:19:47,880
that we have for these extraordinary claims, and I think that the evidence is about to

287
00:19:47,880 --> 00:19:54,020
get a lot more compelling because AI is getting really, really good at faking such stuff.

288
00:19:54,020 --> 00:20:03,400
Which brings us to the next issue, Hollywood.

289
00:20:03,400 --> 00:20:08,020
Screenwriters fear AI could be used to churn out a rough first draft with a few simple

290
00:20:08,020 --> 00:20:13,680
prompts, and writers may then be hired after this initial step to punch such drafts up,

291
00:20:13,680 --> 00:20:16,540
albeit at a lower pay rate.

292
00:20:16,540 --> 00:20:23,000
These concerns expose the techno-optimist lie that AI will create more jobs than it destroys.

293
00:20:23,000 --> 00:20:25,240
Millions of background actors could be put out of work.

294
00:20:25,240 --> 00:20:29,120
How many coders will it take to program their likenesses into the background?

295
00:20:29,120 --> 00:20:30,280
A handful, maybe?

296
00:20:30,280 --> 00:20:34,360
And the job of writer might remain, but it will be degraded so that writers will effectively

297
00:20:34,360 --> 00:20:39,300
be assistants to the bots, cleaning up the drafts that AI churns out.

298
00:20:39,300 --> 00:20:43,680
And what's true for this industry is going to be true for many, many more because bosses

299
00:20:43,680 --> 00:20:46,600
are always going to look for ways to use fewer workers.

300
00:20:46,600 --> 00:20:47,600
Workers are expensive.

301
00:20:47,600 --> 00:20:50,460
They have rights, and there are at least some limitations on how much you're allowed to

302
00:20:50,460 --> 00:20:51,640
exploit them.

303
00:20:51,640 --> 00:20:56,800
But bots never talk back, they never need time off, and they require no humanity.

304
00:20:56,800 --> 00:21:02,040
If the country only cares about profits for the top, human beings could become truly disposable.

305
00:21:02,040 --> 00:21:05,880
That's to say nothing of the way that Hollywood has already been degraded and stripped of

306
00:21:05,880 --> 00:21:10,920
beauty, risk-taking, and creativity by the demand to place the safest, most market-palatable

307
00:21:10,920 --> 00:21:11,920
bet.

308
00:21:11,920 --> 00:21:14,800
Now, you may not think that this fight has a lot to do with you other than creating an

309
00:21:14,800 --> 00:21:17,480
annoyance as your favorite show's production is delayed.

310
00:21:17,480 --> 00:21:20,520
You may think that these Hollywood stars and starlets have nothing in common with you

311
00:21:20,520 --> 00:21:23,920
and are privileged to even have the ability to complain about all of this.

312
00:21:23,920 --> 00:21:24,920
And you know what?

313
00:21:24,920 --> 00:21:26,040
There's some truth to that.

314
00:21:26,040 --> 00:21:29,800
After all, it is their prominence, combined with their union power, by the way, which

315
00:21:29,800 --> 00:21:34,200
is the only thing that even gives them a chance to push back on any of this.

316
00:21:34,200 --> 00:21:36,120
But this is just the beginning.

317
00:21:36,120 --> 00:21:39,160
Automation has already come for blue-collar America.

318
00:21:39,160 --> 00:21:41,720
Now it's coming for white-collar workers, too.

319
00:21:41,720 --> 00:21:45,480
Everyone now has an interest in seeing the shared threat to their livelihoods and supporting

320
00:21:45,480 --> 00:21:50,220
one another in these struggles that will draw new lines in the sand of what is acceptable

321
00:21:50,220 --> 00:21:53,360
and what is immoral in this new landscape.

322
00:21:53,360 --> 00:21:58,440
Bottom line, technology should benefit human beings, not destroy their lives.

323
00:21:58,440 --> 00:22:02,160
Because in this future that we are just catching a glimpse of, it's not that people will become

324
00:22:02,160 --> 00:22:03,160
wholly irrelevant.

325
00:22:03,160 --> 00:22:07,000
It's that the gulf between the haves and have-nots will become ever greater as the

326
00:22:07,000 --> 00:22:09,960
owner class separates more and more from the labor class.

327
00:22:09,960 --> 00:22:15,240
It's that every last sector of our lives will be colonized and commoditized for profit.

328
00:22:15,240 --> 00:22:19,720
And if this brave new world can come for Hollywood stars and starlets, what chance do ordinary

329
00:22:19,720 --> 00:22:25,400
people ultimately stand?

330
00:22:25,400 --> 00:22:28,360
You probably recognize the voice of Krystal Ball.

331
00:22:28,360 --> 00:22:33,640
That was Krystal doing a monologue on the topic of the Hollywood strike.

332
00:22:33,640 --> 00:22:38,540
Hollywood writers and actors are both on strike, and really what's at stake here is money.

333
00:22:38,540 --> 00:22:44,340
But there's a particular way that AI plays into this that is very relevant to both writers

334
00:22:44,340 --> 00:22:46,380
and actors.

335
00:22:46,380 --> 00:22:53,000
Back in the 20th century, way back, for example, a typical season of one of the Star Trek shows,

336
00:22:53,000 --> 00:22:58,400
say Star Trek: The Next Generation or Star Trek: Deep Space Nine, would have 24 episodes

337
00:22:58,400 --> 00:23:01,680
and sometimes more in a single season.

338
00:23:01,680 --> 00:23:11,600
Flash forward to today, and a prestige TV show on one of the big streaming platforms

339
00:23:11,600 --> 00:23:17,480
might have 8 episodes, or 10, or maybe 12.

340
00:23:17,480 --> 00:23:22,400
Fewer episodes written means less money paid to writers, but it's not just a matter of

341
00:23:22,400 --> 00:23:25,640
shorter seasons.

342
00:23:25,640 --> 00:23:31,760
Writers would get a fee for doing the initial work, but then the real money was in the residuals.

343
00:23:31,760 --> 00:23:40,080
Every time an episode that you wrote got shown on TV again as a rerun, you got paid again.

344
00:23:40,080 --> 00:23:45,560
And so you could get paid year after year for something that you wrote.

345
00:23:45,560 --> 00:23:46,740
Excellent.

346
00:23:46,740 --> 00:23:48,700
That's great.

347
00:23:48,700 --> 00:23:51,200
How does it work in streaming?

348
00:23:51,200 --> 00:23:54,480
There is no such thing as reruns in streaming.

349
00:23:54,480 --> 00:24:00,200
A piece of content gets put up on a streaming platform, and people watch it or they don't.

350
00:24:00,200 --> 00:24:05,680
But here's the thing, the streaming services are playing their cards really close to their

351
00:24:05,680 --> 00:24:06,680
chest.

352
00:24:06,680 --> 00:24:11,440
They're not going to tell you exactly how many times a given show has been watched.

353
00:24:11,440 --> 00:24:13,280
They'll tell you, oh, this show was a hit.

354
00:24:13,280 --> 00:24:14,280
This show was a success.

355
00:24:14,280 --> 00:24:16,520
You know, lots of people watch this.

356
00:24:16,520 --> 00:24:17,520
How many?

357
00:24:17,520 --> 00:24:18,620
Well, we're not going to say.

358
00:24:18,620 --> 00:24:21,600
That's proprietary data.

359
00:24:21,600 --> 00:24:23,920
Why would they be so coy about that?

360
00:24:23,920 --> 00:24:27,120
Well, here's a bit of speculation.

361
00:24:27,120 --> 00:24:28,960
This is from Chris Gore.

362
00:24:28,960 --> 00:24:31,840
He's reading a tweet, but you know, I'll play Chris's voice.

363
00:24:31,840 --> 00:24:36,120
Chris Gore is an independent filmmaker, and he's also the publisher of a magazine that

364
00:24:36,120 --> 00:24:41,280
I was reading back in the 80s because it dealt with independent film and obscure films and

365
00:24:41,280 --> 00:24:46,200
films that I would never really be able to see in a movie theater, except maybe at the art

366
00:24:46,200 --> 00:24:49,920
house theater, the Tivoli, down in Westport in Kansas City, which is where I lived at

367
00:24:49,920 --> 00:24:50,920
the time.

368
00:24:50,920 --> 00:24:56,520
But anyway, here's Chris Gore of Film Threat fame talking about why the streaming services

369
00:24:56,520 --> 00:25:05,440
might not want to reveal the viewership numbers and what might happen if they did.

370
00:25:05,440 --> 00:25:08,120
Hollywood is on strike.

371
00:25:08,120 --> 00:25:10,220
Everything is shut down.

372
00:25:10,220 --> 00:25:15,720
The only films being made now are from A24.

373
00:25:15,720 --> 00:25:24,440
Animation is moving ahead, but certain movies, for example, The Marvels and Dune: Part Two,

374
00:25:24,440 --> 00:25:25,700
are on the bubble.

375
00:25:25,700 --> 00:25:31,880
If the strike continues past September, those movies may not release this year.

376
00:25:31,880 --> 00:25:38,040
And things were looking up because Barbie and Oppenheimer have ignited the box office

377
00:25:38,040 --> 00:25:42,400
in a huge way, which is exciting for everybody.

378
00:25:42,400 --> 00:25:49,520
But I think it's important that we look at the long term big picture of all of this.

379
00:25:49,520 --> 00:25:51,600
This is Andrew Schulz on Instagram.

380
00:25:51,600 --> 00:25:57,260
And Andrew Schulz says, thoughts on the Hollywood strike.

381
00:25:57,260 --> 00:26:02,340
The real issue is that actors and writers want fair residual payments from the streamers.

382
00:26:02,340 --> 00:26:08,880
In order to define what is fair, the streamers will need to share how many people are actually

383
00:26:08,880 --> 00:26:10,340
watching their shows.

384
00:26:10,340 --> 00:26:11,720
And here lies the problem.

385
00:26:11,720 --> 00:26:14,400
OK, number two, this is five parts.

386
00:26:14,400 --> 00:26:19,880
Number two, my suspicion is that the streamers are refusing to share the viewership numbers,

387
00:26:19,880 --> 00:26:24,980
not because they're being cheap, but because no one is watching and revealing extremely

388
00:26:24,980 --> 00:26:28,440
low viewership would kill the stock price.

389
00:26:28,440 --> 00:26:34,800
So number three, if most of these streamers are losing money in an effort to gain market

390
00:26:34,800 --> 00:26:40,720
share, the only justification for their spending is their stock price being high.

391
00:26:40,720 --> 00:26:44,920
Once that stock price tanks with the real viewership numbers, the streamers will have

392
00:26:44,920 --> 00:26:52,960
to cut back on spending, which means far fewer shows will be greenlit, and the budgets

393
00:26:52,960 --> 00:26:58,640
for those shows will be severely reduced, which means far fewer acting gigs and writing

394
00:26:58,640 --> 00:26:59,640
gigs.

395
00:26:59,640 --> 00:27:05,880
So essentially, if the actors and directors strike is successful by making the streamers

396
00:27:05,880 --> 00:27:11,780
release the real viewership numbers, the strike will essentially force the streamers to hire fewer

397
00:27:11,780 --> 00:27:13,220
actors and directors.

398
00:27:13,220 --> 00:27:15,880
So they're striking themselves out of work.

399
00:27:15,880 --> 00:27:18,880
Just a hunch, though.

400
00:27:18,880 --> 00:27:20,400
Just a hunch.

401
00:27:20,400 --> 00:27:22,320
So what do you think of that possibility?

402
00:27:22,320 --> 00:27:27,100
That speculation that not that many people are watching these streaming services.

403
00:27:27,100 --> 00:27:28,760
How many do you subscribe to?

404
00:27:28,760 --> 00:27:35,200
I can tell you right now that I almost always have an Amazon Prime Video subscription because

405
00:27:35,200 --> 00:27:41,960
I subscribe to Amazon Prime for the free shipping, but I don't actually watch much on Prime.

406
00:27:41,960 --> 00:27:44,760
I do not have a Netflix subscription right now.

407
00:27:44,760 --> 00:27:46,840
I don't have a Disney Plus subscription.

408
00:27:46,840 --> 00:27:52,140
I do have a Paramount Plus subscription so I can watch new episodes of Star Trek: Strange

409
00:27:52,140 --> 00:27:53,400
New Worlds.

410
00:27:53,400 --> 00:27:58,160
And I do have an Apple Plus, you know, an Apple TV Plus subscription because I'm watching

411
00:27:58,160 --> 00:28:02,680
Foundation Season 2 and before that I was watching Silo.

412
00:28:02,680 --> 00:28:08,120
With Netflix, I will subscribe a couple of times a year for a month.

413
00:28:08,120 --> 00:28:12,520
You know, I'm waiting for a new season of certain shows, so I just subscribed for a

414
00:28:12,520 --> 00:28:15,760
month so that I could watch the new season of Black Mirror.

415
00:28:15,760 --> 00:28:19,200
And if there's a new season of Love, Death and Robots, well, then I'll subscribe again

416
00:28:19,200 --> 00:28:20,200
and I'll watch that.

417
00:28:20,200 --> 00:28:23,040
After I've watched the thing that I subscribed for, then I'll poke around and see if there's

418
00:28:23,040 --> 00:28:25,160
anything else that interests me.

419
00:28:25,160 --> 00:28:29,640
But you know, then I unsubscribe again because I'm cheap.

420
00:28:29,640 --> 00:28:31,600
I watch a lot of YouTube.

421
00:28:31,600 --> 00:28:36,280
I'm sure you do this, but there'll be nights when I will go to the streaming services that

422
00:28:36,280 --> 00:28:41,440
I'm subscribed to and I'll flip through a bunch of different titles and I won't select

423
00:28:41,440 --> 00:28:42,440
any of them.

424
00:28:42,440 --> 00:28:43,920
Like I'll see a movie, I'll think, oh yeah, that looks good.

425
00:28:43,920 --> 00:28:46,880
I've been meaning to watch that, but it's after nine.

426
00:28:46,880 --> 00:28:48,640
I'm not ready to commit to a two hour movie.

427
00:28:48,640 --> 00:28:52,320
I think I'll probably want to go to sleep before this thing would be over.

428
00:28:52,320 --> 00:28:57,640
So I just pop on over to YouTube and I eat away the night, you know, 15, 20 minutes at

429
00:28:57,640 --> 00:28:58,920
a time.

430
00:28:58,920 --> 00:29:03,600
But I'm not watching that much stuff on streaming and I don't watch anything, you know, on broadcast

431
00:29:03,600 --> 00:29:05,040
TV.

432
00:29:05,040 --> 00:29:10,200
Young people play games, young people are online, you know, doing social media stuff.

433
00:29:10,200 --> 00:29:13,280
Boomers are, you know, spending their nights on Facebook.

434
00:29:13,280 --> 00:29:19,560
So I think people are just watching a lot less filmed entertainment than they used

435
00:29:19,560 --> 00:29:20,560
to.

436
00:29:20,560 --> 00:29:24,200
In fact, the word filmed is anachronistic, you know, everything's shot on video these

437
00:29:24,200 --> 00:29:25,320
days.

438
00:29:25,320 --> 00:29:32,520
But we're talking about AI, and, you know, with AI, actors are in danger because studios want

439
00:29:32,520 --> 00:29:33,520
you to show up on set.

440
00:29:33,520 --> 00:29:36,120
They want to do a full scan of your body and your face.

441
00:29:36,120 --> 00:29:41,240
They want you to adopt a variety of facial expressions, maybe enact a few sample scenes

442
00:29:41,240 --> 00:29:45,760
so that they can capture the range of your voice and then pay you for one day and say

443
00:29:45,760 --> 00:29:51,000
thank you very much and use that information to make media in perpetuity.

444
00:29:51,000 --> 00:29:53,760
That's the end of acting as a paid profession.

445
00:29:53,760 --> 00:29:55,020
I agree.

446
00:29:55,020 --> 00:29:56,880
That sucks for actors.

447
00:29:56,880 --> 00:30:02,360
As for writing, you know, with TV shows, typically there's a showrunner, somebody who has a vision for

448
00:30:02,360 --> 00:30:03,360
the show.

449
00:30:03,360 --> 00:30:06,560
They're in charge of wrangling the writers and then they hire a bunch of writers.

450
00:30:06,560 --> 00:30:11,840
They get them all in a writers' room and they pitch ideas and they refine things and then

451
00:30:11,840 --> 00:30:15,480
they send people off on their own to do the actual writing.

452
00:30:15,480 --> 00:30:20,240
And while, you know, an episode of a TV show might have one or two writers credited, really

453
00:30:20,240 --> 00:30:25,240
everybody on the staff, everybody in the writers' room, had some input.

454
00:30:25,240 --> 00:30:31,240
But even without AI, you know, seasons are getting shorter and showrunners are being

455
00:30:31,240 --> 00:30:36,320
tasked with more and more of the writing duties and they have smaller writers rooms.

456
00:30:36,320 --> 00:30:39,920
You know, instead of 12 people, they might have four.

457
00:30:39,920 --> 00:30:40,920
Saves money.

458
00:30:40,920 --> 00:30:41,920
Cheaper.

459
00:30:41,920 --> 00:30:48,160
So, let me play you another clip from a different monologue by Krystal Ball.

460
00:30:48,160 --> 00:30:52,580
Whatever you think of Barbenheimer, the explosion of cultural fascination with both films is

461
00:30:52,580 --> 00:30:56,320
basically a testament to our love affair with human creativity.

462
00:30:56,320 --> 00:31:00,360
For once, studios took a risk on a few things that were truly new and different and they

463
00:31:00,360 --> 00:31:04,720
were rewarded with massive audiences and a flood of national discourse that has briefly

464
00:31:04,720 --> 00:31:09,040
recreated a monocultural event, the likes of which I really thought we might never see

465
00:31:09,040 --> 00:31:10,040
again.

466
00:31:10,040 --> 00:31:14,600
Ironically, this moment of delight in human imagination comes at a time when the very

467
00:31:14,600 --> 00:31:17,640
essence of creativity is actually under threat.

468
00:31:17,640 --> 00:31:22,280
Big tech, in order to monopolize the new world of AI, is attempting to feed their models

469
00:31:22,280 --> 00:31:27,640
with the whole world of human ingenuity, scraping every bit of language, articulated vision

470
00:31:27,640 --> 00:31:32,160
and novel innovation that they can get their hands on so that their machines might impersonate

471
00:31:32,160 --> 00:31:35,120
a bastardized version of the human spark.

472
00:31:35,120 --> 00:31:39,360
These so-called large language models can't create anything new, but by harvesting our

473
00:31:39,360 --> 00:31:43,260
musings, our pictures, our conversations, our stories, companies are hoping that the

474
00:31:43,260 --> 00:31:49,120
bots can be trained to mimic us well enough that we will accept their AI-derived products.

475
00:31:49,120 --> 00:31:53,200
Basically, they're trying to eat our souls and then sell them back to us.

476
00:31:53,200 --> 00:31:58,200
I would encourage you to go and listen to that entire monologue because Krystal Ball

477
00:31:58,200 --> 00:32:00,320
comes back again and again.

478
00:32:00,320 --> 00:32:05,380
She flexes her creativity and her prowess as a writer by finding several different ways

479
00:32:05,380 --> 00:32:12,800
to say that content created by AI is really just stolen from human creators.

480
00:32:12,800 --> 00:32:18,480
It's just a soulless, mechanistic, cut-and-paste rehash of something that was originally

481
00:32:18,480 --> 00:32:20,540
created by a human being.

482
00:32:20,540 --> 00:32:28,800
There is some truth to that, but keep in mind, this is the beginning of August 2023.

483
00:32:28,800 --> 00:32:37,800
ChatGPT first became available for anybody, any member of the public, to use in November of 2022.

484
00:32:37,800 --> 00:32:41,920
It's less than a year old, this generative AI.

485
00:32:41,920 --> 00:32:46,040
We've had generative AI in terms of the diffusion models, text-to-image generation, for more

486
00:32:46,040 --> 00:32:49,560
than a year, but not much more.

487
00:32:49,560 --> 00:32:55,320
For the LLM-powered, large language model-powered chatbots, it's less than a year that the

488
00:32:55,320 --> 00:32:57,480
public has had access to this stuff.

489
00:32:57,480 --> 00:33:01,760
This technology is very, very new.

490
00:33:01,760 --> 00:33:07,040
Krystal has assumed, as a point of ideological convenience, that this is it for AI.

491
00:33:07,040 --> 00:33:09,080
This is its peak capacity.

492
00:33:09,080 --> 00:33:11,400
It's never going to get any better than this.

493
00:33:11,400 --> 00:33:17,660
And what it can do right now is not as good as a good human writer.

494
00:33:17,660 --> 00:33:19,460
Maybe she's right.

495
00:33:19,460 --> 00:33:26,600
Maybe today, August 2nd, 2023, the natural language processing abilities of large language

496
00:33:26,600 --> 00:33:28,320
models hit the wall.

497
00:33:28,320 --> 00:33:31,560
It'll never get any better after today.

498
00:33:31,560 --> 00:33:32,560
That's possible.

499
00:33:32,560 --> 00:33:37,760
I'm going to go out on a limb, though, and say that the technology is going to continue

500
00:33:37,760 --> 00:33:43,280
to improve for a good long time, and in fact, the rate of improvement is probably going

501
00:33:43,280 --> 00:33:45,480
to increase.

502
00:33:45,480 --> 00:33:49,200
Accelerate, you might say.

503
00:33:49,200 --> 00:33:51,800
We'll come back to acceleration.

504
00:33:51,800 --> 00:33:54,200
AI?

505
00:33:54,200 --> 00:33:56,200
Not really AI.

506
00:33:56,200 --> 00:34:07,280
Capitalists, bosses, owners are using the changes in technology to claw back the gains

507
00:34:07,280 --> 00:34:10,120
of organized labor.

508
00:34:10,120 --> 00:34:12,060
But it didn't start yesterday.

509
00:34:12,060 --> 00:34:15,820
It didn't start with large language models.

510
00:34:15,820 --> 00:34:17,360
It didn't start with the gig economy.

511
00:34:17,360 --> 00:34:21,720
But you know, remember Uber, remember Amazon.

512
00:34:21,720 --> 00:34:25,920
There are lots of people doing online tasks on Amazon Mechanical Turk.

513
00:34:25,920 --> 00:34:32,000
They use Amazon as a platform to find these little micro gigs that they can do for micro

514
00:34:32,000 --> 00:34:33,280
payments.

515
00:34:33,280 --> 00:34:35,680
But Amazon doesn't consider them employees.

516
00:34:35,680 --> 00:34:38,320
It doesn't take any responsibility for them.

517
00:34:38,320 --> 00:34:39,760
It doesn't provide any benefits.

518
00:34:39,760 --> 00:34:44,360
It just stands as an intermediary between the human being doing this little micro task

519
00:34:44,360 --> 00:34:48,320
and the client of Amazon's who posted it.

520
00:34:48,320 --> 00:34:52,180
It's an intermediary, but it's not the employer.

521
00:34:52,180 --> 00:34:53,720
Same with Uber.

522
00:34:53,720 --> 00:34:56,760
Uber drivers are not employees of Uber.

523
00:34:56,760 --> 00:34:57,760
How can that be?

524
00:34:57,760 --> 00:35:04,820
I mean, they drive the cars for this service, but the service says no, no, they're just contractors.

525
00:35:04,820 --> 00:35:11,080
The technology has changed things such that the employers can step in and say, this situation

526
00:35:11,080 --> 00:35:13,440
doesn't exactly match the labor laws as written.

527
00:35:13,440 --> 00:35:17,100
So you know, we're going to interpret it to our maximum benefit.

528
00:35:17,100 --> 00:35:18,900
It's happened with drivers.

529
00:35:18,900 --> 00:35:22,600
It's happened with people doing these little micro tasks online.

530
00:35:22,600 --> 00:35:24,700
It's going to happen in a variety of industries.

531
00:35:24,700 --> 00:35:29,700
It's happening with artists, you know, with diffusion models and text-to-image generation,

532
00:35:29,700 --> 00:35:32,920
and it's happening with writers, with large language models.

533
00:35:32,920 --> 00:35:38,000
The thing is, AI is coming for everybody's livelihood, everybody in the working class

534
00:35:38,000 --> 00:35:39,000
anyway.

535
00:35:39,000 --> 00:35:45,240
And again, when I say AI is coming for, I mean, the ownership class is using AI to come

536
00:35:45,240 --> 00:35:54,840
for everybody's livelihood, but not all at the same time and not all in the same way.

537
00:35:54,840 --> 00:35:59,780
And if people in particular industries or particular job roles get together and push

538
00:35:59,780 --> 00:36:04,640
back, but only for the people in their industry, only for the people doing exactly what they

539
00:36:04,640 --> 00:36:11,080
do, that precludes a broader, working-class-wide solidarity.

540
00:36:11,080 --> 00:36:16,880
You know, if everybody were to lose their job to AI on the same day, we would understand

541
00:36:16,880 --> 00:36:22,600
as a society, we have to come to some new arrangement for provisioning people with the

542
00:36:22,600 --> 00:36:25,000
necessities of life.

543
00:36:25,000 --> 00:36:30,260
But since it's happening at different paces in different industries and in different ways,

544
00:36:30,260 --> 00:36:36,160
and people are pushing back against it only from their small perspective, you know, only

545
00:36:36,160 --> 00:36:42,800
in their little domain, their pushback is far less effective than it would be if it

546
00:36:42,800 --> 00:36:46,240
was more systemic, if it was more widespread.

547
00:36:46,240 --> 00:36:50,360
And so the creeping nature of technological change, I mean, it's moving quickly now, but

548
00:36:50,360 --> 00:36:55,520
still, from a day-to-day standpoint, things don't change that much.

549
00:36:55,520 --> 00:36:59,440
They change dramatically over the course of a few months, you know, but from day to day,

550
00:36:59,440 --> 00:37:02,000
they don't change all that much.

551
00:37:02,000 --> 00:37:04,540
And so it's easy to ignore, it's easy to put off.

552
00:37:04,540 --> 00:37:09,200
And when you finally feel the pressure enough to act, you act as an individual or you act

553
00:37:09,200 --> 00:37:14,860
as a member of a small community or as an employee in a very specific field.

554
00:37:14,860 --> 00:37:16,620
That's not going to cut it.

555
00:37:16,620 --> 00:37:23,280
So, as I say, Krystal made an assumption about the nature of large language models, and she

556
00:37:23,280 --> 00:37:25,280
made it for ideological convenience.

557
00:37:25,280 --> 00:37:28,440
She's on the side of workers against the ownership class.

558
00:37:28,440 --> 00:37:31,700
And I, you know, I agree that she should be.

559
00:37:31,700 --> 00:37:38,240
But at the same time, I think that time and events will prove her wrong about the limitations

560
00:37:38,240 --> 00:37:45,160
on the capacity of AI. Her whole position is predicated on the idea that human creativity

561
00:37:45,160 --> 00:37:50,720
is this very special field that will never be replicated, much less surpassed, by artificial

562
00:37:50,720 --> 00:37:51,720
intelligence.

563
00:37:51,720 --> 00:37:55,040
Well, I think her position is going to crumble.

564
00:37:55,040 --> 00:38:01,300
I think she needs a better position, one that acknowledges that AI will in all likelihood

565
00:38:01,300 --> 00:38:05,680
continue to improve in its capabilities, even in areas that we used to think of as being

566
00:38:05,680 --> 00:38:09,800
the exclusive domain of creative human beings.

567
00:38:09,800 --> 00:38:17,960
And that brings us to the topic of acceleration in the capacities of artificial intelligence

568
00:38:17,960 --> 00:38:32,240
and people who want to slow things down and people who want to speed things up.

569
00:38:32,240 --> 00:38:33,240
All right.

570
00:38:33,240 --> 00:38:37,160
That brings us to accelerationism.

571
00:38:37,160 --> 00:38:41,000
The name itself is pretty self-explanatory.

572
00:38:41,000 --> 00:38:42,000
It's an ism.

573
00:38:42,000 --> 00:38:47,520
So, you know, that means a belief system, a prescription of some kind.

574
00:38:47,520 --> 00:38:49,640
What's being prescribed?

575
00:38:49,640 --> 00:38:51,560
Acceleration.

576
00:38:51,560 --> 00:38:53,480
Acceleration of what, though?

577
00:38:53,480 --> 00:39:01,560
Typically, the acceleration has to do with technology, but also with capitalist forces.

578
00:39:01,560 --> 00:39:07,640
And in fact, recently, people who advocate something called effective accelerationism

579
00:39:07,640 --> 00:39:14,500
have just mashed technology and capitalism into one thing that they call techno-capital.

580
00:39:14,500 --> 00:39:20,160
Accelerationism grew out of this academic experiment in England at Warwick University called the

581
00:39:20,160 --> 00:39:23,380
Cybernetic Culture Research Unit.

582
00:39:23,380 --> 00:39:27,420
And this was formed by a guy named Nick Land, along with some other people.

583
00:39:27,420 --> 00:39:29,860
And at the time, Nick Land was a leftist.

584
00:39:29,860 --> 00:39:38,260
But at some point, he had a sudden and dramatic change of thinking and he became ultra right-wing.

585
00:39:38,260 --> 00:39:47,120
He moved to China for some reason and he basically founded a whole school of right-wing thought,

586
00:39:47,120 --> 00:39:54,220
which combined the technophilia of the singularitarian movement, you know, the people who were looking

587
00:39:54,220 --> 00:40:00,960
for a technological singularity, and people who were pretty explicitly right-wing, possibly

588
00:40:00,960 --> 00:40:02,600
even authoritarian.

589
00:40:02,600 --> 00:40:08,120
And as much as I think the word gets overused, one might even say fascistic.

590
00:40:08,120 --> 00:40:14,980
Nick Land's writing is famously impenetrable. I mean, he comes from that portion

591
00:40:14,980 --> 00:40:24,220
of the academic left, which basically makes their prose really florid and impenetrable

592
00:40:24,220 --> 00:40:29,500
because, I think, what they have to say wouldn't take that many words

593
00:40:29,500 --> 00:40:32,180
to say if they just said it in plain English.

594
00:40:32,180 --> 00:40:38,220
And because they're inspired by, you know, mid-20th century French obscurantists, that's

595
00:40:38,220 --> 00:40:39,620
just how they write.

596
00:40:39,620 --> 00:40:45,460
And Nick Land, as much as he's rejected the priorities of the left, has retained their

597
00:40:45,460 --> 00:40:46,460
writing style.

598
00:40:46,460 --> 00:40:51,420
So his writing on accelerationism is very difficult.

599
00:40:51,420 --> 00:40:52,420
It's dense.

600
00:40:52,420 --> 00:40:55,940
It is unnecessarily replete with two-dollar words.

601
00:40:55,940 --> 00:41:03,100
And because it is not very clear, you can read into it pretty much anything you want.

602
00:41:03,100 --> 00:41:06,780
And if you're a leftist and you want to hate on the right, then you can. I mean, it's a

603
00:41:06,780 --> 00:41:07,780
Rorschach test.

604
00:41:07,780 --> 00:41:09,580
You can read anything into it that you want.

605
00:41:09,580 --> 00:41:15,580
So typically, if you go to Google and you just, you know, very casually search for

606
00:41:15,580 --> 00:41:22,220
R slash ACC, which is to say right-wing accelerationism, it'll tell you that right-wing accelerationists

607
00:41:22,220 --> 00:41:28,820
want to use technology to bring back slavery and enslave women and, you know, reinstate

608
00:41:28,820 --> 00:41:30,140
the patriarchy.

609
00:41:30,140 --> 00:41:36,060
And surely there is somebody somewhere who fits that description, but it's not Nick Land.

610
00:41:36,060 --> 00:41:41,740
And it's not most right accelerationists. But, you know, it is a

611
00:41:41,740 --> 00:41:46,660
move of convenience: rather than actually taking the time to figure out what somebody you don't

612
00:41:46,660 --> 00:41:50,940
like actually means, you just ascribe to them the worst possible thing they might mean and

613
00:41:50,940 --> 00:41:52,260
just assert that it's fact.

614
00:41:52,260 --> 00:41:54,780
I mean, that is just how the left operates.

615
00:41:54,780 --> 00:42:00,100
It's how part of the right operates as well, but yeah.

616
00:42:00,100 --> 00:42:05,180
Accelerationism on its own, without any sort of modifier, just means we need to move

617
00:42:05,180 --> 00:42:06,180
faster.

618
00:42:06,180 --> 00:42:09,540
Whatever we're doing now, we need to do more of it and faster because that'll get us to

619
00:42:09,540 --> 00:42:10,540
a better place.

620
00:42:10,540 --> 00:42:16,380
Now, some people think it'll get us to a better place just in a linear progression of betterness.

621
00:42:16,380 --> 00:42:17,540
You know, things are good now.

622
00:42:17,540 --> 00:42:18,540
We push it further.

623
00:42:18,540 --> 00:42:19,540
It gets better.

624
00:42:19,540 --> 00:42:20,540
We push it further than that.

625
00:42:20,540 --> 00:42:22,260
It gets even better.

626
00:42:22,260 --> 00:42:26,860
But some people, and this is particularly true of left accelerationists, think that

627
00:42:26,860 --> 00:42:35,420
we need to accelerate and accentuate techno-capitalistic processes because that's going to break capitalism.

628
00:42:35,420 --> 00:42:40,380
And if you remember, earlier I said that AI, you know, not under its own volition,

629
00:42:40,380 --> 00:42:45,900
because it doesn't want anything itself, but rather the ownership class using AI, is

630
00:42:45,900 --> 00:42:51,860
coming for everybody's livelihood. The secure living that they, and people like them,

631
00:42:51,860 --> 00:42:59,140
and people who came before them in their industry have fought for is being taken away via AI,

632
00:42:59,140 --> 00:43:00,820
among other things.

633
00:43:00,820 --> 00:43:04,740
But it's happening at different speeds, in different ways, in different industries.

634
00:43:04,740 --> 00:43:09,740
And if it happened to everybody all at the same time, then we would understand.

635
00:43:09,740 --> 00:43:11,940
We need a new economic system.

636
00:43:11,940 --> 00:43:16,300
This economic system is not serving everybody.

637
00:43:16,300 --> 00:43:18,980
It is serving a few at the expense of everybody else.

638
00:43:18,980 --> 00:43:21,060
Now, a lot of people say that already.

639
00:43:21,060 --> 00:43:23,780
They see that already, but a lot of other people don't see it.

640
00:43:23,780 --> 00:43:29,500
If everybody lost their job to AI on the same day, you couldn't help but see it.

641
00:43:29,500 --> 00:43:34,660
And so you could offer that as an argument for accelerationism.

642
00:43:34,660 --> 00:43:37,340
Hey, this is bad.

643
00:43:37,340 --> 00:43:39,180
Our resistance to it is ineffective.

644
00:43:39,180 --> 00:43:43,500
Let's just push it as fast and as far as it'll go in the direction it's already going so

645
00:43:43,500 --> 00:43:45,740
that the whole thing breaks.

646
00:43:45,740 --> 00:43:48,220
That's the left accelerationist viewpoint.

647
00:43:48,220 --> 00:43:51,260
There are two distinct right accelerationist viewpoints.

648
00:43:51,260 --> 00:43:55,820
One of them, and Nick Land is, as I say, a famously impenetrable writer, but this is

649
00:43:55,820 --> 00:43:58,160
what I think he means.

650
00:43:58,160 --> 00:44:03,660
We should push techno-capitalist processes because that's going to bring about the creation

651
00:44:03,660 --> 00:44:09,460
of a new form of intelligence that will transcend human intelligence.

652
00:44:09,460 --> 00:44:11,180
This is the singularity.

653
00:44:11,180 --> 00:44:13,660
Most people won't get a piece of it.

654
00:44:13,660 --> 00:44:15,580
Most people aren't going to get uploaded to the cloud.

655
00:44:15,580 --> 00:44:18,960
Most people are not going to enjoy biological immortality.

656
00:44:18,960 --> 00:44:22,900
Most people are just going to get thrown away, left by the wayside, but that's

657
00:44:22,900 --> 00:44:26,500
okay because in the long run, all humans are going to die anyway.

658
00:44:26,500 --> 00:44:29,380
And what we're looking for is our glorious technological destiny.

659
00:44:29,380 --> 00:44:34,260
Now, I imagine Nick Land would think that that's a caricature of his position, but I

660
00:44:34,260 --> 00:44:40,180
think it's a fairer caricature than you'll get from most people on the left.

661
00:44:40,180 --> 00:44:45,940
So that's kind of a dark exclusionary singularity vision of right accelerationism.

662
00:44:45,940 --> 00:44:51,140
The other right accelerationist viewpoint is the one articulated by Mencius Moldbug,

663
00:44:51,140 --> 00:44:54,420
aka Curtis Yarvin.

664
00:44:54,420 --> 00:44:56,300
Curtis Yarvin is a neo-reactionary.

665
00:44:56,300 --> 00:45:00,740
He's also a monarchist, and what he would like to see is just a fragmentation of all

666
00:45:00,740 --> 00:45:06,740
of these big political powers, like the United States or the EU or China, into these tiny

667
00:45:06,740 --> 00:45:12,500
little fiefdoms, little city-states, each of which is ruled by a monarch of some sort,

668
00:45:12,500 --> 00:45:18,340
preferably a hereditary monarch so that the monarch has a long-term vision, has skin in

669
00:45:18,340 --> 00:45:23,180
the game that extends beyond his or her own life, much less beyond the current term to

670
00:45:23,180 --> 00:45:24,820
which they have been elected.

671
00:45:24,820 --> 00:45:28,140
But ultimately, it's not a technological vision.

672
00:45:28,140 --> 00:45:32,900
As far as I understand it, it really is kind of a pining for the past, an acceleration

673
00:45:32,900 --> 00:45:41,220
to a collapse of the current sprawling large integrated political systems that use technology

674
00:45:41,220 --> 00:45:47,460
to effect that integration, into something which is fragmented, individualistic not

675
00:45:47,460 --> 00:45:53,200
in terms of an individual human being's range of options in the world, but individualistic

676
00:45:53,200 --> 00:45:59,020
in the variety of cultural expression that you get if you take one country and break

677
00:45:59,020 --> 00:46:02,020
it up into 400 little city-states.

678
00:46:02,020 --> 00:46:08,780
So that's accelerationism: sort-of-neutral accelerationism, left accelerationism, and two

679
00:46:08,780 --> 00:46:10,420
varieties of right accelerationism.

680
00:46:10,420 --> 00:46:15,780
If you're in tech, you certainly know the name Marc Andreessen, but if not, he is the

681
00:46:15,780 --> 00:46:17,620
guy behind Netscape.

682
00:46:17,620 --> 00:46:24,580
He co-founded Netscape and was integral in creating the first widely used

683
00:46:24,580 --> 00:46:28,020
web browser, Netscape Navigator.

684
00:46:28,020 --> 00:46:29,620
Now I say widely used.

685
00:46:29,620 --> 00:46:31,020
It wasn't the first web browser.

686
00:46:31,020 --> 00:46:32,660
That's not what I'm saying.

687
00:46:32,660 --> 00:46:35,340
It certainly came after Mosaic, which Andreessen himself had co-written at NCSA.

688
00:46:35,340 --> 00:46:41,340
But Marc Andreessen and other people, including the pseudonymous "Beff Jezos" (a riff on Jeff Bezos), have now openly started calling

689
00:46:41,340 --> 00:46:45,180
themselves effective accelerationists.

690
00:46:45,180 --> 00:46:49,180
And that's abbreviated E slash ACC.

691
00:46:49,180 --> 00:46:50,740
And he's got it in his Twitter handle.

692
00:46:50,740 --> 00:46:58,100
I know, Elon Musk changed the name to X. Just let that go.

693
00:46:58,100 --> 00:47:02,380
So what does that mean, effective accelerationism?

694
00:47:02,380 --> 00:47:05,500
And what do effective accelerationists want?

695
00:47:05,500 --> 00:47:14,300
Well, remember Sam Bankman-Fried and the FTX crypto exchange, which crashed catastrophically.

696
00:47:14,300 --> 00:47:17,940
Bankman-Fried is either in prison or under house arrest or awaiting trial.

697
00:47:17,940 --> 00:47:20,140
I don't know exactly where he is in that process.

698
00:47:20,140 --> 00:47:26,900
But he and his parents were advocates of something called effective altruism.

699
00:47:26,900 --> 00:47:29,820
Now, maybe you've heard this: people have done audits.

700
00:47:29,820 --> 00:47:33,660
People have investigated different charitable organizations and discovered that most of

701
00:47:33,660 --> 00:47:38,760
the money that people donate to those organizations goes to the organization itself.

702
00:47:38,760 --> 00:47:40,660
It goes to pay executive salaries.

703
00:47:40,660 --> 00:47:43,980
It goes to pay for facilities.

704
00:47:43,980 --> 00:47:46,620
It basically is upkeep for the organization.

705
00:47:46,620 --> 00:47:50,060
And not much of that money actually goes to help people.

706
00:47:50,060 --> 00:47:54,960
In terms of philanthropy, it's just not very effective.

707
00:47:54,960 --> 00:48:02,260
So number-driven people, data-driven people, wanted to bring the principles and the discipline

708
00:48:02,260 --> 00:48:08,200
of data science and mathematical thinking to the topic of charitable giving.

709
00:48:08,200 --> 00:48:14,560
They wanted to find the charitable giving opportunities that would create the most benefit

710
00:48:14,560 --> 00:48:15,700
for the people in need.

711
00:48:15,700 --> 00:48:18,540
They wanted effective altruism.
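
A minimal sketch of the kind of arithmetic behind that idea, in Python. The charities and every figure below are hypothetical, invented purely to illustrate that low overhead and high effectiveness are two different measurements:

```python
# Effective-altruism-style comparison: rank giving opportunities by
# estimated cost per outcome, not by overhead ratio alone.
# All charity names and numbers are hypothetical.

charities = {
    # name: (dollars donated, dollars reaching programs, outcomes delivered)
    "Charity A": (1_000_000, 350_000, 700),    # heavy overhead
    "Charity B": (1_000_000, 900_000, 450),    # lean org, expensive intervention
    "Charity C": (1_000_000, 800_000, 2_000),  # lean org, cheap intervention
}

for name, (donated, programs, outcomes) in charities.items():
    overhead = 1 - programs / donated      # share consumed by the org itself
    cost_per_outcome = donated / outcomes  # what a donor's dollar actually buys
    print(f"{name}: {overhead:.0%} overhead, ${cost_per_outcome:,.0f} per outcome")
```

Run it and Charity B, the leanest-looking of the three, turns out to be the worst buy per outcome, which is exactly the distinction the effective altruists were trying to make rigorous.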

712
00:48:18,540 --> 00:48:24,580
Well, effective altruism is something that was championed by the Silicon Valley

713
00:48:24,580 --> 00:48:33,220
set, by tech-centric people who tend to be easy targets for disingenuous attacks, because

714
00:48:33,220 --> 00:48:38,780
they often lack social skills and they lack the ability to discern when they're being

715
00:48:38,780 --> 00:48:40,660
attacked.

716
00:48:40,660 --> 00:48:47,980
So even before the crash of FTX and the downfall of Sam Bankman-Fried, effective altruism,

717
00:48:47,980 --> 00:48:54,220
which is typically abbreviated online as EA, not Electronic Arts, but effective altruism,

718
00:48:54,220 --> 00:48:55,740
its reputation was already tarnished.

719
00:48:55,740 --> 00:48:57,260
It was already starting to get a bad name.

720
00:48:57,260 --> 00:49:02,060
But then when FTX collapsed and its poster boy, who was also the poster boy for effective

721
00:49:02,060 --> 00:49:09,660
altruism, when his reputation took a nosedive, then the rats started abandoning the sinking

722
00:49:09,660 --> 00:49:12,020
ship of effective altruism.

723
00:49:12,020 --> 00:49:17,260
And I think that effective accelerationism is a play on that.

724
00:49:17,260 --> 00:49:23,740
It's saying, hey, this is the next step in the evolution of this concept.

725
00:49:23,740 --> 00:49:30,300
But now we're taking something which is familiar. Think of Ray Kurzweil, who is probably the best

726
00:49:30,300 --> 00:49:35,460
known advocate for and articulator of the idea of the technological singularity.

727
00:49:35,460 --> 00:49:39,700
He explained the inevitability of the singularity using something he called the law of accelerating

728
00:49:39,700 --> 00:49:40,940
returns.

729
00:49:40,940 --> 00:49:41,940
It's a function of nature.

730
00:49:41,940 --> 00:49:48,300
It is a function of the universe that over time self-replicating structures take shape

731
00:49:48,300 --> 00:49:53,620
and they're slow in replication and slow in their evolution at first.

732
00:49:53,620 --> 00:49:58,900
But each replicator creates a new style or a new form of replication that moves even

733
00:49:58,900 --> 00:50:03,100
faster than the one before, orders of magnitude faster.
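
A toy numerical sketch of that claim, assuming, purely for illustration, that each new replicator era runs about a hundred times faster than the one before (the era labels and the 100x factor are assumptions, not Kurzweil's actual figures):

```python
# Toy model of the law of accelerating returns: each successive replicator
# era is assumed to run 100x faster than the previous one. The labels and
# the 100x factor are illustrative assumptions, not measured values.

eras = ["pre-DNA chemistry", "DNA / natural selection",
        "brains and language", "written culture", "information technology"]

duration = 1e9  # notional length of the first era, in years
SPEEDUP = 100   # assumed acceleration factor between eras

elapsed = 0.0
for era in eras:
    elapsed += duration
    print(f"{era:>24}: {duration:>15,.0f} years (cumulative {elapsed:,.0f})")
    duration /= SPEEDUP
```

Because the durations form a geometric series, nearly all of the elapsed time sits in the earliest, slowest eras, which is the intuition behind the "knee of the curve" idea that comes up later in this episode.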

734
00:50:03,100 --> 00:50:07,700
So we don't know exactly what came before DNA, but something did.

735
00:50:07,700 --> 00:50:13,660
And then DNA came along and it evolved at the pace of evolution via natural selection.

736
00:50:13,660 --> 00:50:20,180
But eventually some organisms, which evolved at that speed, grew big brains and

737
00:50:20,180 --> 00:50:22,580
started to think about things.

738
00:50:22,580 --> 00:50:26,820
They invented language or discovered language, however you want to think about it.

739
00:50:26,820 --> 00:50:28,100
They started writing things down.

740
00:50:28,100 --> 00:50:32,740
They started transmitting information from one generation to the next.

741
00:50:32,740 --> 00:50:39,060
And that spiraled out into all kinds of cultural forms and cultural artifacts, many of which

742
00:50:39,060 --> 00:50:45,180
encoded and transmitted information. And so a very slow evolution by natural selection

743
00:50:45,180 --> 00:50:53,420
gave rise to culture, and to an accelerated sort of evolution in the realm of cultural replicators,

744
00:50:53,420 --> 00:50:55,380
otherwise known as memes.

745
00:50:55,380 --> 00:51:00,020
That was back before the word meme came to mean a picture with a funny caption.

746
00:51:00,020 --> 00:51:06,920
So when Ray Kurzweil talked about the law of accelerating returns and how biology gives

747
00:51:06,920 --> 00:51:11,780
rise to culture, which then gives rise to information technology, which moves even faster

748
00:51:11,780 --> 00:51:16,940
than culture, which in turn will give rise to something else, which moves even more quickly

749
00:51:16,940 --> 00:51:24,060
still, he didn't tie economics into it in any explicit fashion.

750
00:51:24,060 --> 00:51:26,380
His theory was that it's a law of nature.

751
00:51:26,380 --> 00:51:27,380
It applied everywhere.

752
00:51:27,380 --> 00:51:34,340
It would apply in communist top-down authoritarian regimes as much as it would in a capitalist

753
00:51:34,340 --> 00:51:35,820
system.

754
00:51:35,820 --> 00:51:39,100
But the effective accelerationists have abandoned that.

755
00:51:39,100 --> 00:51:46,700
They have explicitly tied the benefits of accelerating technology to the benefits

756
00:51:46,700 --> 00:51:52,860
of capitalism, saying that one doesn't really work without the other, such that, without

757
00:51:52,860 --> 00:51:59,020
blushing, billionaires can say: yeah, the process that made me a billionaire (and the part that

758
00:51:59,020 --> 00:52:03,660
goes unspoken is "and impoverished so many other people"), that's good stuff.

759
00:52:03,660 --> 00:52:07,580
We want more of that, more of it and faster.

760
00:52:07,580 --> 00:52:14,100
Now, Twitter, or X, is where this is really taking shape

761
00:52:14,100 --> 00:52:15,300
and evolving.

762
00:52:15,300 --> 00:52:21,320
You can follow the evolution of effective accelerationism in real time.

763
00:52:21,320 --> 00:52:27,020
So I'm going to read to you a bit from a story in Business Insider.

764
00:52:27,020 --> 00:52:32,660
The title is Get the Lowdown on E slash ACC or Effective Accelerationism.

765
00:52:32,660 --> 00:52:39,340
And just for simplicity's sake, E slash ACC appears many times in this article.

766
00:52:39,340 --> 00:52:44,580
I'm just always going to read it as effective accelerationism.

767
00:52:44,580 --> 00:52:49,340
So the title is Get the Lowdown on Effective Accelerationism, Silicon Valley's favorite

768
00:52:49,340 --> 00:52:55,180
obscure theory about progress at all costs, which has been embraced by Marc Andreessen.

769
00:52:55,180 --> 00:52:59,140
There's an obscure theory doing the rounds in Silicon Valley as it quickly becomes the

770
00:52:59,140 --> 00:53:02,100
new ideological hobby of tech's power players.

771
00:53:02,100 --> 00:53:04,940
It's called effective accelerationism.

772
00:53:04,940 --> 00:53:10,020
On Twitter, now rebranded to X, some of the tech community's most prominent figures, including

773
00:53:10,020 --> 00:53:15,520
veteran investors Marc Andreessen and Garry Tan, have decided to include the term effective

774
00:53:15,520 --> 00:53:20,580
accelerationism in their usernames as a badge of allegiance to the vision.

775
00:53:20,580 --> 00:53:24,980
So what exactly are the underlying tenets of effective accelerationism and why is it

776
00:53:24,980 --> 00:53:27,620
having a moment right now?

777
00:53:27,620 --> 00:53:29,700
Let's start with the name.

778
00:53:29,700 --> 00:53:34,440
It's a bit of a play on effective altruism, the social movement focused on an evidence-led

779
00:53:34,440 --> 00:53:39,860
form of philanthropy, which was infamously embraced by Sam Bankman-Fried, the disgraced

780
00:53:39,860 --> 00:53:43,300
founder of crypto exchange FTX.

781
00:53:43,300 --> 00:53:47,180
The ideas of effective accelerationism appear to have their genesis in the theories of Nick

782
00:53:47,180 --> 00:53:51,260
Land, a British philosopher who lectured at the University of Warwick and who has come

783
00:53:51,260 --> 00:53:55,340
to be known as the father of the broader accelerationism movement.

784
00:53:55,340 --> 00:54:00,760
The more formalized effective accelerationism idea has taken shape on Twitter and through

785
00:54:00,760 --> 00:54:04,060
Substack newsletters since 2022.

786
00:54:04,060 --> 00:54:06,300
The basic idea of the philosophy is this.

787
00:54:06,300 --> 00:54:11,880
In a technological age, the powers of innovation and capital should be exploited to their extremes

788
00:54:11,880 --> 00:54:18,100
to drive radical social change, even if that means completely upending today's social order.

789
00:54:18,100 --> 00:54:25,820
The first effective accelerationism post, co-authored by users named at Zestular, at Creatine Cycle,

790
00:54:25,820 --> 00:54:32,220
at Based Beff Jezos, and at Bayeslord, said technology and market forces, which they term

791
00:54:32,220 --> 00:54:38,540
techno-capital, are accelerating with a force that quote, cannot be stopped, close quote.

792
00:54:38,540 --> 00:54:43,900
Quote, techno-capital can usher in the next evolution of consciousness, creating unthinkable next-generation

793
00:54:43,900 --> 00:54:48,260
life forms and silicon-based awareness, close quote, the post said.

794
00:54:48,260 --> 00:54:52,820
In an effective accelerationist world, no idea that offers hypothetical value should

795
00:54:52,820 --> 00:54:58,660
be considered too absurd, too dangerous, too out there to make a reality.

796
00:54:58,660 --> 00:55:04,760
For effective accelerationist adherents, the path of progress at all costs would, in theory,

797
00:55:04,760 --> 00:55:09,460
make possible any imaginative idea with a purported benefit to humanity.

798
00:55:09,460 --> 00:55:13,620
That could mean justifying the development of something as outlandish as Dyson spheres,

799
00:55:13,620 --> 00:55:18,100
physicist Freeman Dyson's theoretical megastructures which would surround a star to harvest its

800
00:55:18,100 --> 00:55:24,800
energy, or something closer on the horizon, like artificial general intelligence.
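
The Dyson sphere example is worth a quick back-of-the-envelope, because the scale is the whole point. The Sun's luminosity below is a measured constant; the world-consumption figure is a rough early-2020s ballpark, so treat the output as an order-of-magnitude sketch, not a precise claim:

```python
# Why a Dyson sphere is the canonical "outlandish" megaproject:
# compare the Sun's total output to humanity's current power draw.
# World consumption (~600 EJ of primary energy per year) is a rough estimate.

SOLAR_LUMINOSITY_W = 3.8e26       # watts, measured
WORLD_ENERGY_J_PER_YR = 6.0e20    # joules per year, rough ballpark
SECONDS_PER_YEAR = 3.156e7

world_power_w = WORLD_ENERGY_J_PER_YR / SECONDS_PER_YEAR  # ~1.9e13 W
ratio = SOLAR_LUMINOSITY_W / world_power_w                # ~2e13

print(f"Humanity's average power draw: ~{world_power_w:.1e} W")
print(f"Full solar capture:            ~{SOLAR_LUMINOSITY_W:.1e} W")
print(f"That's roughly {ratio:.0e} times current civilization.")
```

A civilization capturing its star's full output would command roughly twenty trillion times the power we use today, which is why the example shows up whenever "progress at all costs" needs a far endpoint.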

801
00:55:24,800 --> 00:55:26,220
So I'm going to stop now.

802
00:55:26,220 --> 00:55:27,980
The next section is why is it happening now?

803
00:55:27,980 --> 00:55:31,060
Well, I think I've already telegraphed that.

804
00:55:31,060 --> 00:55:38,820
One, it's taking the baton from effective altruism, and two, it is the rollout of the

805
00:55:38,820 --> 00:55:45,460
large language model chatbots which has got everybody so excited about artificial intelligence.

806
00:55:45,460 --> 00:55:48,820
Now I've mentioned before, although maybe this is the first time you're listening to

807
00:55:48,820 --> 00:55:55,220
one of my podcasts, in the 90s I went to grad school for philosophy, and my area of emphasis

808
00:55:55,220 --> 00:56:00,980
was the philosophy of science and the philosophy of mind, and I was laser-focused on artificial

809
00:56:00,980 --> 00:56:01,980
intelligence.

810
00:56:01,980 --> 00:56:08,060
I was obsessed with the future of artificial intelligence back in the 90s.

811
00:56:08,060 --> 00:56:14,460
That kind of went away over the next few decades, and for a time I was explicitly a techno-doomer.

812
00:56:14,460 --> 00:56:20,100
I thought that the techno-industrial system was about to crash because of a lack of hydrocarbon

813
00:56:20,100 --> 00:56:21,100
energy.

814
00:56:21,100 --> 00:56:27,700
Well, the crash didn't come, and AI is here, and things are getting weird and moving fast

815
00:56:27,700 --> 00:56:30,700
now, and here's the thing.

816
00:56:30,700 --> 00:56:32,020
I don't know.

817
00:56:32,020 --> 00:56:38,580
I don't have an intellectual conviction as to whether we have hit the knee of the curve, the inflection

818
00:56:38,580 --> 00:56:43,300
point in this rate of increase where things are just going to move faster and faster from

819
00:56:43,300 --> 00:56:50,380
now on, or if we're experiencing a moment of punctuated equilibrium where things jumped

820
00:56:50,380 --> 00:56:55,700
quickly with the introduction of the large language models, but then they'll slow down

821
00:56:55,700 --> 00:57:02,940
as the entire culture learns to absorb this potentially disruptive innovation.

822
00:57:02,940 --> 00:57:06,180
And I don't even know what to hope for.

823
00:57:06,180 --> 00:57:11,060
I've done a lot of things for money over the years, but the thing that I've done the longest

824
00:57:11,060 --> 00:57:15,940
is podcasting, which certainly would not be possible without the internet and without

825
00:57:15,940 --> 00:57:16,940
social media.

826
00:57:16,940 --> 00:57:22,940
I mean, I suppose podcasting just as an RSS feed, a way of delivering audio files to people,

827
00:57:22,940 --> 00:57:28,100
would be possible without social media, but how would you tell people about your podcast?

828
00:57:28,100 --> 00:57:29,540
How would they discover it?
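
For the curious, that "just an RSS feed" framing is literally true at the plumbing level. A minimal sketch, with entirely made-up URLs and metadata: a podcast is an RSS 2.0 document whose items carry enclosure tags pointing at audio files, and a podcast app is anything that polls the feed and downloads new enclosures.

```python
# Minimal podcast feed skeleton (RSS 2.0 with an <enclosure>).
# Every URL and detail here is made up for illustration.

FEED = """<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0">
  <channel>
    <title>Example Podcast</title>
    <link>https://example.com/podcast</link>
    <description>A hypothetical show.</description>
    <item>
      <title>Episode 19</title>
      <pubDate>Wed, 02 Aug 2023 00:00:00 GMT</pubDate>
      <enclosure url="https://example.com/audio/ep19.mp3"
                 length="43210987" type="audio/mpeg"/>
      <guid>https://example.com/audio/ep19.mp3</guid>
    </item>
  </channel>
</rss>"""

print(FEED)  # serve this text file over HTTP and any podcast app can subscribe
```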

829
00:57:29,540 --> 00:57:36,620
Really, my livelihood has been built around the internet for my entire adult life, and

830
00:57:36,620 --> 00:57:40,820
yet I can easily entertain the idea that my life would have been better if the internet

831
00:57:40,820 --> 00:57:44,700
had never been developed or if I had been born earlier.

832
00:57:44,700 --> 00:57:50,180
So I don't really have to spell out how artificial intelligence is related to this particular

833
00:57:50,180 --> 00:57:55,780
topic as I'm stringing together talk of the UAP congressional hearings and the Hollywood writers

834
00:57:55,780 --> 00:57:56,780
and actors strike.

835
00:57:56,780 --> 00:58:02,940
I mean, this particular topic is explicitly about Silicon Valley and Silicon Valley tech

836
00:58:02,940 --> 00:58:07,020
lords who think, yeah, everything that we've been doing has made the world better.

837
00:58:07,020 --> 00:58:09,780
We need to do a lot more of it a lot faster.

838
00:58:09,780 --> 00:58:16,240
And we cannot slow down regardless of who claims that they have been hurt or any claims

839
00:58:16,240 --> 00:58:22,780
that we are plunging ahead heedlessly into increasingly dangerous territory.

840
00:58:22,780 --> 00:58:27,820
The singularitarian vision, the idea that we're headed for a technological

841
00:58:27,820 --> 00:58:33,260
singularity which will basically solve all problems and create a whole host of new ones,

842
00:58:33,260 --> 00:58:41,540
but solve all the problems that we currently face, that has been the sort of cult religion

843
00:58:41,540 --> 00:58:46,920
of the Silicon Valley tech elite for quite a while now, but it's morphing in this moment.

844
00:58:46,920 --> 00:58:55,980
It is morphing from a technological singularity to a techno-capital singularity.

845
00:58:55,980 --> 00:59:00,940
So what do we do about it?

846
00:59:00,940 --> 00:59:05,100
I hate to say it, but as far as I can see, there's nothing to do.

847
00:59:05,100 --> 00:59:12,420
Just watch. Pay attention to what's happening, or don't.

848
00:59:12,420 --> 00:59:17,700
Occupy yourself with whatever inspires you, whatever gives you energy, whatever motivates

849
00:59:17,700 --> 00:59:23,100
you to get off the couch and go interact with the real world and people.

850
00:59:23,100 --> 00:59:24,100
Do that.

851
00:59:24,100 --> 00:59:29,560
I mean, I'm clearly obsessing over these topics and making lots of media about them, thinking

852
00:59:29,560 --> 00:59:33,360
about them, talking about them, writing about them.

853
00:59:33,360 --> 00:59:38,660
But the whole time I realized that I'm mostly just an observer here.

854
00:59:38,660 --> 00:59:41,660
I mean, I'm a citizen.

855
00:59:41,660 --> 00:59:51,380
I am a participant in the economy, but I seem to have very little leverage over these larger

856
00:59:51,380 --> 00:59:52,900
processes.

857
00:59:52,900 --> 00:59:57,720
Even with my interest and my background, I'm hard pressed to keep pace with developments

858
00:59:57,720 --> 01:00:02,020
in artificial intelligence and to understand what's happening.

859
01:00:02,020 --> 01:00:03,060
So what do we do about it?

860
01:00:03,060 --> 01:00:07,700
I mean, here's the call for feedback.

861
01:00:07,700 --> 01:00:12,160
What do you think we could possibly do about it?

862
01:00:12,160 --> 01:00:13,420
You could reject technology.

863
01:00:13,420 --> 01:00:18,700
I mean, I certainly wouldn't be the first person to say, hey, don't upgrade your cell

864
01:00:18,700 --> 01:00:19,700
phone.

865
01:00:19,700 --> 01:00:20,700
Don't get a new one.

866
01:00:20,700 --> 01:00:23,780
When the current one breaks, don't replace it.

867
01:00:23,780 --> 01:00:25,100
You've heard people say that before.

868
01:00:25,100 --> 01:00:26,100
You may have nodded.

869
01:00:26,100 --> 01:00:28,820
You may have thought, yeah, my life would probably be better.

870
01:00:28,820 --> 01:00:32,060
And then you went and replaced your phone with a newer model.

871
01:00:32,060 --> 01:00:37,140
It's kind of like saying, wasn't the world better when we didn't drive so much, when

872
01:00:37,140 --> 01:00:39,940
we walked more or we rode horses?

873
01:00:39,940 --> 01:00:44,900
Well, you still get in your car and you drive to work probably.

874
01:00:44,900 --> 01:00:50,740
Maybe you say, and I've heard people say this, I will never interact with an artificial intelligence.

875
01:00:50,740 --> 01:00:53,140
Yeah, you will.

876
01:00:53,140 --> 01:00:55,180
You certainly will.

877
01:00:55,180 --> 01:01:00,100
You probably already do and don't know it, but you certainly, certainly will.

878
01:01:00,100 --> 01:01:04,620
I mean, that's kind of like saying, I'll never show ID.

879
01:01:04,620 --> 01:01:05,940
I'll never get a driver's license.

880
01:01:05,940 --> 01:01:07,860
I'll never get a passport.

881
01:01:07,860 --> 01:01:12,700
I'll never knuckle under and be the subservient little citizen who shows ID upon command.

882
01:01:12,700 --> 01:01:15,020
Yeah, you will.

883
01:01:15,020 --> 01:01:16,020
You may not like it.

884
01:01:16,020 --> 01:01:19,260
You may explicitly object to it.

885
01:01:19,260 --> 01:01:22,260
You might rail against it, but you'll knuckle under.

886
01:01:22,260 --> 01:01:24,060
You'll fall in line.

887
01:01:24,060 --> 01:01:30,300
And with AI, at least not at first and at least not

888
01:01:30,300 --> 01:01:34,340
only, it won't be a matter of you being coerced into using this thing.

889
01:01:34,340 --> 01:01:37,940
It'll just be that your life gets easier when you accept the blessings that are bestowed

890
01:01:37,940 --> 01:01:40,700
upon you by artificial intelligence.

891
01:01:40,700 --> 01:01:48,060
And I hadn't planned to talk about this, so I'll keep it short and light.

892
01:01:48,060 --> 01:01:53,900
This sort of observer status that I'm describing, you know, that I occupy in terms of technology

893
01:01:53,900 --> 01:02:01,900
and culture, I explicitly apply it to electoral politics as well.

894
01:02:01,900 --> 01:02:10,820
I just refuse to get bent out of shape over, you know, which corporate party is in ascendance

895
01:02:10,820 --> 01:02:12,620
in a given moment.

896
01:02:12,620 --> 01:02:13,900
I don't live in a swing state.

897
01:02:13,900 --> 01:02:16,180
My vote does not matter.

898
01:02:16,180 --> 01:02:22,860
And ultimately, nobody's vote matters because policy does not reflect the needs or the preferences

899
01:02:22,860 --> 01:02:24,380
of the majority of voters.

900
01:02:24,380 --> 01:02:29,060
It reflects the needs and the preferences of the donor class.

901
01:02:29,060 --> 01:02:34,100
People like Marc Andreessen, people like Jeff Bezos.

902
01:02:34,100 --> 01:02:35,780
Politics serves them.

903
01:02:35,780 --> 01:02:37,420
It doesn't serve you.

904
01:02:37,420 --> 01:02:38,420
It doesn't serve me.

905
01:02:38,420 --> 01:02:40,060
And it doesn't matter how you vote.

906
01:02:40,060 --> 01:02:41,980
That's not going to change.

907
01:02:41,980 --> 01:02:48,500
So we're coming up on an election year, a presidential election year, at a time when

908
01:02:48,500 --> 01:02:58,420
AI is getting really, really good at faking stuff, at manipulating us, at not only curating

909
01:02:58,420 --> 01:03:03,460
the content that other humans have created, which will get us angry and get us animated,

910
01:03:03,460 --> 01:03:08,540
but also creating variations on it just algorithmically, automatically.

911
01:03:08,540 --> 01:03:16,660
If you are either constitutionally susceptible to that sort of political outrage, or if you

912
01:03:16,660 --> 01:03:21,180
over time have been transformed from somebody who's kind of a take it or leave it, easy

913
01:03:21,180 --> 01:03:27,580
going kind of person into a political fanatic, you're going to get your strings pulled all

914
01:03:27,580 --> 01:03:31,940
day every day in the next couple of years.

915
01:03:31,940 --> 01:03:37,940
And you might think that you are the driver, that you are the puppet master, but you're

916
01:03:37,940 --> 01:03:39,620
not.

917
01:03:39,620 --> 01:03:44,300
Up until this point, the puppet masters, while they have used technology, they have used

918
01:03:44,300 --> 01:03:49,060
artificial intelligence, they've used social media, they've used vast computer networks

919
01:03:49,060 --> 01:03:53,700
and surveillance systems to gather data on you and figure out what it is that will get

920
01:03:53,700 --> 01:03:56,920
you to respond in the way that they want you to respond.

921
01:03:56,920 --> 01:03:59,020
It's mostly been orchestrated by humans.

922
01:03:59,020 --> 01:04:02,980
But I think over the next decade, that's going to shift.

923
01:04:02,980 --> 01:04:09,340
And the puppet masters will more and more become the AI.

924
01:04:09,340 --> 01:04:12,260
Will that be for the good or will it be for the bad?

925
01:04:12,260 --> 01:04:13,580
I don't know.

926
01:04:13,580 --> 01:04:15,820
I don't know.

927
01:04:15,820 --> 01:04:19,740
All I do know is it's not going to serve me.

928
01:04:19,740 --> 01:04:22,740
It's not going to improve my quality of life to get bent out of shape over it.

929
01:04:22,740 --> 01:04:27,020
So I am resolved for this coming election cycle.

930
01:04:27,020 --> 01:04:30,060
I'm not going to advocate for any particular candidate.

931
01:04:30,060 --> 01:04:33,520
I don't care who you vote for.

932
01:04:33,520 --> 01:04:35,080
Not interested.

933
01:04:35,080 --> 01:04:43,780
I'm just watching the introduction of artificial intelligence, specifically large

934
01:04:43,780 --> 01:04:49,060
language models and diffusion models for creating images, but also new types of tech that will certainly

935
01:04:49,060 --> 01:04:51,700
be coming online in the coming months and years.

936
01:04:51,700 --> 01:04:58,620
I'm just watching this election to see how these new forces are coming into play.

937
01:04:58,620 --> 01:05:01,740
Now, I'm not omniscient.

938
01:05:01,740 --> 01:05:04,100
I won't be able to see everything happening.

939
01:05:04,100 --> 01:05:05,500
I'll have to intuit.

940
01:05:05,500 --> 01:05:07,180
I'll have to piece things together.

941
01:05:07,180 --> 01:05:13,220
I'll have to look to other people for their perspective and for their interpretations.

942
01:05:13,220 --> 01:05:17,240
But when it comes to electoral politics, that's where I am.

943
01:05:17,240 --> 01:05:20,220
My interest is artificial intelligence.

944
01:05:20,220 --> 01:05:25,700
Politics is one arena in which it makes itself felt, in which it manifests.

945
01:05:25,700 --> 01:05:30,040
And that's how I'm going to treat politics, as an arena in which something that I'm interested

946
01:05:30,040 --> 01:05:32,460
in plays out.

947
01:05:32,460 --> 01:05:41,100
But in terms of who wins the contest, I can't say I don't care, but I'm resolved not to

948
01:05:41,100 --> 01:05:48,840
make a point of talking about my preferences, my hopes, my desires, or more importantly,

949
01:05:48,840 --> 01:05:54,520
my irritation with other people who have different hopes and desires and intentions.

950
01:05:54,520 --> 01:06:01,420
If there's a prescription at the end of all this, it's just be kind, be patient, be tolerant,

951
01:06:01,420 --> 01:06:03,420
be forgiving.

952
01:06:03,420 --> 01:06:10,740
This is going to be a weird, confusing, exciting, stimulating time for all of us.

953
01:06:10,740 --> 01:06:15,700
And all of us, at one point or another, are going to get worked up in a way that we do

954
01:06:15,700 --> 01:06:22,740
things and say things that we will regret, you know, when our blood is running cooler.

955
01:06:22,740 --> 01:06:27,900
If you yourself get out of hand, forgive yourself.

956
01:06:27,900 --> 01:06:31,580
If somebody you know gets out of hand, forgive them.

957
01:06:31,580 --> 01:06:37,180
If they irritate you, take some time away, you know?

958
01:06:37,180 --> 01:06:42,220
Better to not talk than to have stupid arguments that don't really serve you or them.

959
01:06:42,220 --> 01:06:48,420
You know, you just make yourselves vehicles for this larger orchestrated contest where,

960
01:06:48,420 --> 01:06:54,260
no matter whether you win or lose the given argument, you lose.

961
01:06:54,260 --> 01:07:01,420
As Matthew Broderick's character learned in that film in the 1980s, what was that film?

962
01:07:01,420 --> 01:07:03,140
WarGames.

963
01:07:03,140 --> 01:07:11,640
For some games, like global thermonuclear war, the only way not to lose is not to play.

964
01:07:11,640 --> 01:07:16,660
So the cultural struggle around, you know, the coming election, I'm just announcing

965
01:07:16,660 --> 01:07:19,700
my intention not to play.

966
01:07:19,700 --> 01:07:20,700
All right.

967
01:07:20,700 --> 01:07:23,820
Well, I think that brings us to the end of this podcast.

968
01:07:23,820 --> 01:07:25,540
Thank you so much for listening.

969
01:07:25,540 --> 01:07:29,180
I put up a whole bunch of stuff on my Patreon feed.

970
01:07:29,180 --> 01:07:30,900
Almost none of it is behind the paywall.

971
01:07:30,900 --> 01:07:35,980
So if you see a link of mine that goes to Patreon, don't assume that you have to be

972
01:07:35,980 --> 01:07:40,980
a Patreon supporter of mine in order to access the content there.

973
01:07:40,980 --> 01:07:44,020
This is mostly for people on Twitter.

974
01:07:44,020 --> 01:07:45,020
Or X.

975
01:07:45,020 --> 01:07:52,780
I also have a Substack, and you can find that, and you can find links to my Substack articles

976
01:07:52,780 --> 01:07:54,140
on my Patreon feed.

977
01:07:54,140 --> 01:07:57,260
The Patreon feed has got the easiest of URLs.

978
01:07:57,260 --> 01:08:01,100
It's patreon.com slash KMO.

979
01:08:01,100 --> 01:08:06,980
Or, for a rhyme and for added accessibility, KMO.show is another place where you can get tuned into

980
01:08:06,980 --> 01:08:11,020
my content, which takes various forms and exists on various platforms.

981
01:08:11,020 --> 01:08:12,460
All right.

982
01:08:12,460 --> 01:08:13,460
Enough pimping.

983
01:08:13,460 --> 01:08:16,100
I will talk to you again pretty soon.

984
01:08:16,100 --> 01:08:24,500
Stay well.

