1
00:00:00,000 --> 00:00:20,440
KMO Show Episode Number 29, Sunday, February 23rd, 2025.

2
00:00:20,440 --> 00:00:22,460
Hey everybody, KMO here.

3
00:00:22,460 --> 00:00:25,720
And in this episode of the podcast, I'm going to share the first part of a conversation

4
00:00:25,720 --> 00:00:30,860
I recorded just today with Kopernikon, somebody I know from Substack.

5
00:00:30,860 --> 00:00:35,080
This was our first conversation, so I shouldn't really say I know him all that well.

6
00:00:35,080 --> 00:00:39,520
I don't know where he is in the world other than he is in the Mountain Time Zone.

7
00:00:39,520 --> 00:00:43,080
And I also know that his Substack bio reads as follows.

8
00:00:43,080 --> 00:00:48,680
"A happy little concerned citizen and humble meme farmer who is quite tired of the BS present

9
00:00:48,680 --> 00:00:53,960
in modern politics and the intentional violence and evil committed by the progressive left.

10
00:00:53,960 --> 00:00:58,080
I have an MS in anthropology."

11
00:00:58,080 --> 00:01:04,280
And I will just say that I was struggling with Skype when it was time to start with

12
00:01:04,280 --> 00:01:05,280
the interview.

13
00:01:05,280 --> 00:01:09,640
I did not get my input set correctly, so even though I was speaking into a nice microphone,

14
00:01:09,640 --> 00:01:15,140
well, not that nice, it's a black Snowball, but it's better than the one that actually

15
00:01:15,140 --> 00:01:16,140
did the recording.

16
00:01:16,140 --> 00:01:21,760
It sounds for all the world as if it was the built-in mic in my laptop that picked up my

17
00:01:21,760 --> 00:01:22,760
voice.

18
00:01:22,760 --> 00:01:24,080
Sorry about that.

19
00:01:24,080 --> 00:01:27,920
But the conversation was supposed to be about science fiction, and we do return to science

20
00:01:27,920 --> 00:01:29,240
fiction from time to time.

21
00:01:29,240 --> 00:01:33,880
But here's my conversation with Kopernikon.

22
00:01:33,880 --> 00:01:35,360
You're listening to the KMO show.

23
00:01:35,360 --> 00:01:40,600
I'm your host, KMO, and I am joined by Kopernikon, somebody who, like most of my guests these

24
00:01:40,600 --> 00:01:42,400
days, I know through Substack.

25
00:01:42,400 --> 00:01:44,320
Kopernikon, good to talk to you.

26
00:01:44,320 --> 00:01:45,320
Good to meet you.

27
00:01:45,320 --> 00:01:49,480
It's been great getting to chat with you ahead of the show, and I'm happy to get started.

28
00:01:49,480 --> 00:01:56,320
Well, my Substack is all about science fiction and futurism, and you contacted me after I

29
00:01:56,320 --> 00:02:01,200
had posted a comment on one of your posts having to do with cyberpunk.

30
00:02:01,200 --> 00:02:07,000
So I wonder if you'd be interested in recounting briefly the point of that post.

31
00:02:07,000 --> 00:02:13,200
Well, so initially I responded, I remember I was walking around the house reading your

32
00:02:13,200 --> 00:02:19,960
response because I thought it was brilliant, because you actually understood the details

33
00:02:19,960 --> 00:02:23,600
of some of the things that were said in that post, and a lot of people just sort of blazed

34
00:02:23,600 --> 00:02:25,000
past it.

35
00:02:25,000 --> 00:02:31,040
The exact bit: the post itself was a discussion about how, in my opinion, the cyberpunk genre

36
00:02:31,040 --> 00:02:33,400
is mostly dead at this point.

37
00:02:33,400 --> 00:02:39,200
And it's dead because the future it described is a future past in the same way as 1950s

38
00:02:39,200 --> 00:02:40,200
futurism is.

39
00:02:40,200 --> 00:02:46,240
So the specific sub-part of that post that you responded to was a discussion about how

40
00:02:46,240 --> 00:02:49,240
neon lights have been replaced by LEDs.

41
00:02:49,240 --> 00:02:54,640
And while that's a minor aesthetic change in the context of something like a film or

42
00:02:54,640 --> 00:03:00,040
a television show, in a societal sense that's very significant, because neon lights are much

43
00:03:00,040 --> 00:03:04,880
more expensive than LEDs, so you have much more limited lighting, so you have much reduced

44
00:03:04,880 --> 00:03:05,920
surveillance.

45
00:03:05,920 --> 00:03:09,080
So that means that you can have people sliding underground.

46
00:03:09,080 --> 00:03:13,140
You can have people moving around the edges of society in a very real and physical sense

47
00:03:13,140 --> 00:03:18,000
that can't exist in the modern world because everything is classified and everything is

48
00:03:18,000 --> 00:03:19,000
surveilled.

49
00:03:19,000 --> 00:03:23,900
And we have bizarre forms of pseudo social credit systems in the West and actual social

50
00:03:23,900 --> 00:03:30,600
credit systems in the East that ensure that everybody has to play by the same rules as

51
00:03:30,600 --> 00:03:35,080
those who lead and those who lead make the rules, of course.

52
00:03:35,080 --> 00:03:40,280
I also brought in the notion of Jevons paradox.

53
00:03:40,280 --> 00:03:46,040
For many, many years, my podcast was almost entirely devoted to the notion of a fast collapse

54
00:03:46,040 --> 00:03:53,880
of industrial society because of peak oil or some other bottleneck in the complex support

55
00:03:53,880 --> 00:03:56,920
system that industrial civilization requires.

56
00:03:56,920 --> 00:04:04,040
I have renounced all participation in Doomer narratives since then, but there are

57
00:04:04,040 --> 00:04:09,400
a lot of tropes that come with that which stick with me and are useful from time to time.

58
00:04:09,400 --> 00:04:14,360
And one of those is Jevons paradox, which is the idea that if we invent a way to do

59
00:04:14,360 --> 00:04:18,920
something that we were doing before but more efficiently, rather than banking the savings,

60
00:04:18,920 --> 00:04:20,320
we just do more of it.

61
00:04:20,320 --> 00:04:24,840
Yes, that's definitely something that we've observed through the beginning of the 21st

62
00:04:24,840 --> 00:04:26,960
century, especially when it comes to things like information.

63
00:04:26,960 --> 00:04:32,420
I mean, AI can now pump out a tremendous amount of information, and we're not using this new

64
00:04:32,420 --> 00:04:35,680
ability to make gains.

65
00:04:35,680 --> 00:04:42,080
We're using it to just increase the velocity at which bad information can be created.

66
00:04:42,080 --> 00:04:43,080
That is true.

67
00:04:43,080 --> 00:04:50,240
In the case of the neon lights and the LEDs, though, LEDs, they use less electricity.

68
00:04:50,240 --> 00:04:53,640
They're more expensive to manufacture, but they last longer.

69
00:04:53,640 --> 00:04:59,360
So over time, your amortized cost for the LED is lower than for the neon lights.

70
00:04:59,360 --> 00:05:05,920
And neon is just one noble gas among many that gets used in what we call neon lighting.

71
00:05:05,920 --> 00:05:13,120
But we could just replace the neon lights with LEDs and call it a day, but we don't.

72
00:05:13,120 --> 00:05:14,400
We do more and more.

73
00:05:14,400 --> 00:05:19,560
And that is the application of Jevons paradox in this instance.

74
00:05:19,560 --> 00:05:27,120
I mentioned my peak oil Doomer days because, scrolling through your Substack feed,

75
00:05:27,120 --> 00:05:31,680
you are reading a lot of the same books that I read like a decade and a half ago and in

76
00:05:31,680 --> 00:05:35,160
many instances, you know, interviewed the authors.

77
00:05:35,160 --> 00:05:41,840
And in one instance, which is a pretty significant one, given how you've talked about it, longtime

78
00:05:41,840 --> 00:05:46,000
listeners to my podcasting efforts will certainly know the name John Michael Greer.

79
00:05:46,000 --> 00:05:51,680
I have interviewed him a dozen or more times over the years and met him in person many

80
00:05:51,680 --> 00:05:55,920
times, participated in the same events, things like that.

81
00:05:55,920 --> 00:06:00,280
We largely disagree about the trajectory of industrial civilization these days.

82
00:06:00,280 --> 00:06:08,360
But I have full respect for him, for his work ethic and his creativity and his intelligence.

83
00:06:08,360 --> 00:06:14,440
But the book of his that you speak very highly of is The Ecotechnic Future.

84
00:06:14,440 --> 00:06:20,160
So I would encourage you to praise that book in whatever way is meaningful

85
00:06:20,160 --> 00:06:21,160
to you.

86
00:06:21,160 --> 00:06:26,920
Well, so "I more or less have spent a little bit of time looking for that book" is a good

87
00:06:26,920 --> 00:06:27,920
way to describe it.

88
00:06:27,920 --> 00:06:30,680
I read Archeofuturism.

89
00:06:30,680 --> 00:06:33,800
I thought that was kind of a mess and didn't really enjoy it.

90
00:06:33,800 --> 00:06:39,240
But I can see why it's been widely discussed and widely spread, that specific book.

91
00:06:39,240 --> 00:06:46,720
I read Breaking Together, which is another book that discusses sort of peak oil limits

92
00:06:46,720 --> 00:06:47,800
to growth, that kind of thing.

93
00:06:47,800 --> 00:06:52,760
And I also read The Limits to Growth, which recently had a new edition in 2022.

94
00:06:52,760 --> 00:06:57,640
The Limits to Growth always seemed overly alarmist, and in every version that I've seen

95
00:06:57,640 --> 00:07:00,000
of it, it's overly alarmist.

96
00:07:00,000 --> 00:07:03,920
And that's coming from somebody who does think that we're going to undergo major, major

97
00:07:03,920 --> 00:07:06,960
restructuring due to resource shortages.

98
00:07:06,960 --> 00:07:11,520
Archeofuturism is kind of a vanity project.

99
00:07:11,520 --> 00:07:22,720
But The Ecotechnic Future, I think, is the only book that provides a social and technical

100
00:07:22,720 --> 00:07:29,280
and pseudo-industrial framework for looking at the way human societies are likely to develop

101
00:07:29,280 --> 00:07:34,960
as they transition from universal resource abundance, which is to say that per person

102
00:07:34,960 --> 00:07:39,680
the amount of energy and the amount of raw resources is very high, to a post-resource

103
00:07:39,680 --> 00:07:46,160
abundance period, where the amount of energy and the amount of resources per person is

104
00:07:46,160 --> 00:07:47,560
dramatically reduced.

105
00:07:47,560 --> 00:07:54,800
And partially that will be just because there's limits to energy production and there's a

106
00:07:54,800 --> 00:07:56,880
lot of people and a lot of industry using it.

107
00:07:56,880 --> 00:08:03,360
And partially it will be because the quality of our ore has gone down dramatically since

108
00:08:03,360 --> 00:08:07,880
we were first building civilization in the 1800s.

109
00:08:07,880 --> 00:08:13,080
You used to be able to walk up to, there were places you could go in the world where you

110
00:08:13,080 --> 00:08:19,580
could find like copper ore, for example, that was two parts rock and one part copper.

111
00:08:19,580 --> 00:08:24,480
And now it's 199 parts rock and one part copper.

112
00:08:24,480 --> 00:08:27,520
And that is the standard for good ore.

113
00:08:27,520 --> 00:08:32,560
And while we are going to move toward recycling, most of our industry, most of our civilization

114
00:08:32,560 --> 00:08:34,360
is tooled for immediate development.

115
00:08:34,360 --> 00:08:42,000
So The Ecotechnic Future provides an excellent framework for looking at the ways in which

116
00:08:42,000 --> 00:08:48,120
human societies essentially are going to be forced to change in a broader sense.

117
00:08:48,120 --> 00:08:54,800
To a degree, there's a little bit of more Doomer-oriented collapse, I don't

118
00:08:54,800 --> 00:08:58,120
want to say theology... collapse messaging.

119
00:08:58,120 --> 00:08:59,120
There we go.

120
00:08:59,120 --> 00:09:02,400
More Doomer-oriented collapse messaging in the book, but that's not really what it's

121
00:09:02,400 --> 00:09:09,240
about, and it is not, in my opinion, integral to the value of the text, because the value

122
00:09:09,240 --> 00:09:15,280
of the text is establishing a framework for examining these major social and industrial

123
00:09:15,280 --> 00:09:18,200
transitions that we're likely to see one way or the other.

124
00:09:18,200 --> 00:09:21,200
I like the idea of going out into space and mining asteroids.

125
00:09:21,200 --> 00:09:22,880
Don't think I've discounted that.

126
00:09:22,880 --> 00:09:27,480
A lot of people say, well, we can get more minerals from here or there or whatever.

127
00:09:27,480 --> 00:09:33,040
But I think that there's a lot of people that are good at science fiction and a lot of people

128
00:09:33,040 --> 00:09:39,160
that are good at producing high technology and who understand high technology extremely

129
00:09:39,160 --> 00:09:46,480
well, but they don't understand mining or metallurgy. A lot of the processes that

130
00:09:46,480 --> 00:09:52,200
we currently use in order to extract gold were invented in the 1600s, and they're still

131
00:09:52,200 --> 00:09:53,200
used.

132
00:09:53,200 --> 00:10:01,240
And that's because mining and metallurgy are not the same as the production of high technology.

133
00:10:01,240 --> 00:10:08,520
So I think that there's sort of an information imbalance to a degree in a lot of the futurists

134
00:10:08,520 --> 00:10:12,240
because they don't have as much experience looking at where the resources are coming

135
00:10:12,240 --> 00:10:14,460
from in comparison to where the resources are going to.

136
00:10:14,460 --> 00:10:20,200
So that's my plug for why I don't think that we're going to go on a gigantic stellar adventure

137
00:10:20,200 --> 00:10:21,200
soon.

138
00:10:21,200 --> 00:10:25,840
I find it likely it's going to happen, but I don't think it'll happen soon.

139
00:10:25,840 --> 00:10:28,520
I hope it will.

140
00:10:28,520 --> 00:10:35,140
But the structure of The Ecotechnic Future provides an excellent structure and an excellent way

141
00:10:35,140 --> 00:10:40,920
to examine these ideas in a context that doesn't really approve or disapprove of them.

142
00:10:40,920 --> 00:10:44,000
It simply states: these are the patterns that we've seen through history.

143
00:10:44,000 --> 00:10:46,680
These are the patterns we're likely to see in the future.

144
00:10:46,680 --> 00:10:52,520
And assuming current trends are roughly, roughly predictive of future trends, this is probably

145
00:10:52,520 --> 00:10:57,580
what we're going to observe as the resource crunch starts to become severe on various

146
00:10:57,580 --> 00:11:01,080
civilizations in the world.

147
00:11:01,080 --> 00:11:04,520
Is The Ecotechnic Future the only book by John Michael Greer that you've read?

148
00:11:04,520 --> 00:11:06,240
Yes, so far it is.

149
00:11:06,240 --> 00:11:07,960
Do you follow his blog?

150
00:11:07,960 --> 00:11:10,720
I don't, but maybe I should.

151
00:11:10,720 --> 00:11:14,800
Well I don't know if he's still keeping up this level of productivity, but for a time

152
00:11:14,800 --> 00:11:17,800
he was putting out two and three books a year.

153
00:11:17,800 --> 00:11:22,400
And his blog at the time was called The Archdruid Report, because he really was the Archdruid

154
00:11:22,400 --> 00:11:27,240
of one of the major North American druidic organizations.

155
00:11:27,240 --> 00:11:32,960
And he has since rebranded his blog; it's now called Ecosophia.

156
00:11:32,960 --> 00:11:35,080
But he's written a lot of books.

157
00:11:35,080 --> 00:11:42,880
And I read the Ecotechnic Future.

158
00:11:42,880 --> 00:11:43,880
And there it is, Ecosophia.

159
00:11:43,880 --> 00:11:46,560
All right, go ahead.

160
00:11:46,560 --> 00:11:52,120
He has written a lot of books and I read all of them for a time.

161
00:11:52,120 --> 00:11:54,880
I'm not keeping up with his output anymore.

162
00:11:54,880 --> 00:11:57,360
But two things I want to say about the Ecotechnic Future.

163
00:11:57,360 --> 00:12:03,200
First is that it's nonfiction, but he wrote a novel which basically illustrates what he's

164
00:12:03,200 --> 00:12:04,200
talking about.

165
00:12:04,200 --> 00:12:06,080
It's called Star's Reach.

166
00:12:06,080 --> 00:12:08,760
And you could classify it as a science fiction novel.

167
00:12:08,760 --> 00:12:12,600
There's not much advanced technology in it, but it is a speculative tale about the future.

168
00:12:12,600 --> 00:12:16,960
So from my perspective, it does count as science fiction.

169
00:12:16,960 --> 00:12:20,480
The other thing that I want to say about it, other than that I don't remember the details

170
00:12:20,480 --> 00:12:24,240
all that well because I did read it a long time ago, but I know John Michael Greer's

171
00:12:24,240 --> 00:12:28,880
thoughts on these matters just from conversation with him and reading his blog and things like

172
00:12:28,880 --> 00:12:30,760
that.

173
00:12:30,760 --> 00:12:36,440
The first book of his that I ever read was called Apocalypse Not.

174
00:12:36,440 --> 00:12:43,240
And while he does respect the limits to growth and peak oil narratives, he also has been

175
00:12:43,240 --> 00:12:48,520
critical, as long as I've known him, of the fast collapse scenario, which he finds to

176
00:12:48,520 --> 00:12:53,480
be, I think, alarmist and sensational.

177
00:12:53,480 --> 00:12:58,280
And he says, no, the collapse of this civilization is going to take a couple of centuries and

178
00:12:58,280 --> 00:12:59,960
it's going to be a stair step.

179
00:12:59,960 --> 00:13:07,320
He calls it a catabolic collapse where there will be a crisis and a discontinuity and then

180
00:13:07,320 --> 00:13:08,760
we'll get things back together.

181
00:13:08,760 --> 00:13:12,720
We'll tighten it up, and for the optimists, they'll be able to make a case

182
00:13:12,720 --> 00:13:17,440
for the idea that we're back on an upward trajectory.

183
00:13:17,440 --> 00:13:22,600
But really, it's just a stabilizing period before the next step, the next fall down the

184
00:13:22,600 --> 00:13:24,440
staircase.

185
00:13:24,440 --> 00:13:27,000
And he says that's going to take a couple of centuries.

186
00:13:27,000 --> 00:13:35,040
And in the process, we're going to move from the faltering growth-based, high-energy-throughput

187
00:13:35,040 --> 00:13:40,800
modality that we're in now into a sort of reclamation or scavenger phase where we're

188
00:13:40,800 --> 00:13:46,080
basically just tearing down the remnants of the old in order to repurpose the materials

189
00:13:46,080 --> 00:13:50,520
and to keep going at a lower energy throughput level.

190
00:13:50,520 --> 00:13:54,480
And eventually we'll get to what he calls the ecotechnic future, where we're much more

191
00:13:54,480 --> 00:14:01,440
integrated with the cycles of the ecosystem.

192
00:14:01,440 --> 00:14:06,680
And in Star's Reach, they're sort of at that transition between those next two stages.

193
00:14:06,680 --> 00:14:09,000
And you've read the book much more recently than I have.

194
00:14:09,000 --> 00:14:11,080
So if I've misrepresented it, please let me know.

195
00:14:11,080 --> 00:14:13,560
No, I think you've represented it pretty accurately.

196
00:14:13,560 --> 00:14:18,520
I think that his big breakthrough, in my opinion, in The Ecotechnic Future that allows for this

197
00:14:18,520 --> 00:14:25,160
framing is the recognition of human civilization as another form of ecology.

198
00:14:25,160 --> 00:14:29,160
With the anticipation that it adheres to more or less the same rules as other ecological

199
00:14:29,160 --> 00:14:30,160
systems.

200
00:14:30,160 --> 00:14:34,240
I think that's the big breakthrough that he made in that book.

201
00:14:34,240 --> 00:14:40,600
And that's what I'm most interested in, in terms of structure and point.

202
00:14:40,600 --> 00:14:46,480
And that makes sense for him to write through the next, sort of the one after next, the transition

203
00:14:46,480 --> 00:14:47,660
after next.

204
00:14:47,660 --> 00:14:55,040
In terms of a stepping, that's more or less what I read in the book, that

205
00:14:55,040 --> 00:14:58,120
you're going to have different levels of catastrophe over time.

206
00:14:58,120 --> 00:15:00,880
I'm not fully on board with that.

207
00:15:00,880 --> 00:15:02,720
There's another book I've read.

208
00:15:02,720 --> 00:15:06,680
Have you heard of The Collapse of Complex Societies by Joseph Tainter?

209
00:15:06,680 --> 00:15:07,680
Oh, yeah.

210
00:15:07,680 --> 00:15:10,840
Yeah, I read Tainter and I've listened to many interviews with him.

211
00:15:10,840 --> 00:15:15,600
As far as I remember, I've never talked to him, but I'm definitely well versed in his,

212
00:15:15,600 --> 00:15:16,600
you know,

213
00:15:16,600 --> 00:15:25,040
I think his work is a more accurate rendition of what social collapse looks like.

214
00:15:25,040 --> 00:15:29,560
And that is frequently referred to as apocalyptic civilization collapse because the kinds of

215
00:15:29,560 --> 00:15:34,040
historians who look into it are the kinds who get paid by a bloated civilization and

216
00:15:34,040 --> 00:15:39,000
not farmers or mechanics who are working on the sidelines during the

217
00:15:39,000 --> 00:15:45,160
collapse, because for them it's generally an increase in quality of life.

218
00:15:45,160 --> 00:15:47,680
I think the first thing I'd point to is The Limits to Growth.

219
00:15:47,680 --> 00:15:52,280
The Limits to Growth model made a number of assumptions that don't seem to be accurate

220
00:15:52,280 --> 00:15:53,280
in the long term.

221
00:15:53,280 --> 00:15:55,920
And I was going through this because since The Limits to Growth was published, we now

222
00:15:55,920 --> 00:16:01,640
have a half century of data, which is why I think it was overly alarmist to begin with.

223
00:16:01,640 --> 00:16:07,040
But the half century of data that we're looking at implies to me that we're more likely to

224
00:16:07,040 --> 00:16:11,480
see, there's various models that they use.

225
00:16:11,480 --> 00:16:18,320
We're likely to see a population stabilization after a population collapse.

226
00:16:18,320 --> 00:16:21,960
And the population collapse that is currently happening is the demographic collapse everybody's talking

227
00:16:21,960 --> 00:16:23,520
about right now.

228
00:16:23,520 --> 00:16:29,200
But it seems very likely to me that we're going to stabilize around four billion people.

229
00:16:29,200 --> 00:16:31,080
Industrial output is probably going to remain the same.

230
00:16:31,080 --> 00:16:37,520
And we're actually going to see a higher quality of life on the other side of this cultural

231
00:16:37,520 --> 00:16:41,720
chasm that we're sort of looking at right now.

232
00:16:41,720 --> 00:16:43,680
But it's going to be very different.

233
00:16:43,680 --> 00:16:47,840
The ecology of humanity and the way that human civilizations relate to the world will be

234
00:16:47,840 --> 00:16:53,840
very different on the other side than they are now, both in terms of industrial and technical

235
00:16:53,840 --> 00:16:54,840
capability.

236
00:16:54,840 --> 00:16:58,720
I'd imagine we'll keep most of our technical knowledge through all of this.

237
00:16:58,720 --> 00:17:04,440
But in terms of average quality of life and average resources consumed, it seems very,

238
00:17:04,440 --> 00:17:12,600
very likely that we're going to shift into almost an aristocratic culture, an aristocratic

239
00:17:12,600 --> 00:17:18,880
type of civilization where you have those who can afford the factory-produced goods

240
00:17:18,880 --> 00:17:22,480
because they will be very expensive because shipping will be expensive and there won't

241
00:17:22,480 --> 00:17:23,720
be a huge demand for them.

242
00:17:23,720 --> 00:17:26,320
So they'll be partially artisanal.

243
00:17:26,320 --> 00:17:30,880
And then those that mostly consume that which is produced in their own communities.

244
00:17:30,880 --> 00:17:36,960
So I'd imagine something like, you know, that's a good way to describe it.

245
00:17:36,960 --> 00:17:45,840
The 1700s, 1700s peasants with cell phones and indoor plumbing kind of situation.

246
00:17:45,840 --> 00:17:48,280
You mentioned aristocracy.

247
00:17:48,280 --> 00:17:54,120
Aristocracy is a word that I've been thinking a lot about just because of the fiction that

248
00:17:54,120 --> 00:17:55,120
I'm writing.

249
00:17:55,120 --> 00:17:58,560
There's an aristocratic class in the setting.

250
00:17:58,560 --> 00:18:03,280
And the word aristocracy, it means rule by the best.

251
00:18:03,280 --> 00:18:10,440
We live in a democratic society and we tend to favor and see, you know, democratic norms

252
00:18:10,440 --> 00:18:17,920
as being genuinely good and like objectively preferable to oligarchy or plutocracy or,

253
00:18:17,920 --> 00:18:20,360
you know, monarchy or particularly aristocracy.

254
00:18:20,360 --> 00:18:26,480
The whole notion of aristocracy seems offensive to us, you know, in a democratic context.

255
00:18:26,480 --> 00:18:31,960
And yet, you know, we also talk about elites and they do tend to be, you know, better educated

256
00:18:31,960 --> 00:18:36,040
and more financially successful.

257
00:18:36,040 --> 00:18:40,160
And you know, you could say that they didn't come upon their gains

258
00:18:40,160 --> 00:18:43,680
fairly, but, you know, what's fair?

259
00:18:43,680 --> 00:18:52,200
So I just wanted to put in a word of support for the notion of aristocracy, qua rule

260
00:18:52,200 --> 00:18:55,480
by people who are more accomplished.

261
00:18:55,480 --> 00:19:00,040
So actually, I'm not sure how far through my Substack you got, but last summer I published

262
00:19:00,040 --> 00:19:02,160
an article.

263
00:19:02,160 --> 00:19:09,800
I believe it was called The Return of Kings, a Reasoned Argument for Monarchy.

264
00:19:09,800 --> 00:19:15,080
And my current political belief is actually that I am a monarchist, for a couple of complex

265
00:19:15,080 --> 00:19:17,400
reasons. A Reasoned Case for Monarchy, Return of Kings.

266
00:19:17,400 --> 00:19:20,200
That's the name of the article.

267
00:19:20,200 --> 00:19:27,040
And aristocracy, so there's negative things you can say about aristocracy, especially

268
00:19:27,040 --> 00:19:29,060
when they abuse their power.

269
00:19:29,060 --> 00:19:38,680
We currently have a pseudo-aristocratic system globally, and in the West especially, because

270
00:19:38,680 --> 00:19:44,840
the elites currently, like the Bezos types and the ones you don't know about, have power

271
00:19:44,840 --> 00:19:48,480
on par with most major governments as individuals.

272
00:19:48,480 --> 00:19:53,640
They don't exercise power in the same way, but they have the power.

273
00:19:53,640 --> 00:20:00,960
But with a monarchy, I find it likely that under most forms of a structured monarchy,

274
00:20:00,960 --> 00:20:03,900
so my recommendation in this case, and this is one of the articles that I'm working on

275
00:20:03,900 --> 00:20:11,600
right now, is how to actually set up a monarchy. With a monarchy, you know that one of the kids

276
00:20:11,600 --> 00:20:15,120
of the king is going to be the next king.

277
00:20:15,120 --> 00:20:19,400
And that's really helpful because you can aim their education that way.

278
00:20:19,400 --> 00:20:25,200
One of the big problems we have with our current elites is that they're terrible leaders in

279
00:20:25,200 --> 00:20:26,200
the West.

280
00:20:26,200 --> 00:20:29,800
Our elites are awful leaders and that's because they were trained to be lawyers, bankers,

281
00:20:29,800 --> 00:20:30,800
and businessmen.

282
00:20:30,800 --> 00:20:34,560
They were never trained to be statesmen.

283
00:20:34,560 --> 00:20:38,880
They've probably read Plato here and there, but they were never trained on the works of

284
00:20:38,880 --> 00:20:43,640
Napoleon, they were never trained on a lot of these historical works that good leaders

285
00:20:43,640 --> 00:20:45,120
should be trained on.

286
00:20:45,120 --> 00:20:53,560
So one of the big issues right now in the West is that leadership is the class of scoundrels,

287
00:20:53,560 --> 00:20:57,280
thieves, and lawyers.

288
00:20:57,280 --> 00:21:02,000
Whereas under a monarchy, you can get bad kings, absolutely.

289
00:21:02,000 --> 00:21:06,440
But at least you know the kid's going to be the next king, or one of the king's kids is

290
00:21:06,440 --> 00:21:07,920
going to be the next king.

291
00:21:07,920 --> 00:21:14,720
And so you can train them for that job instead of training them for whatever the school system

292
00:21:14,720 --> 00:21:17,520
pumps out.

293
00:21:17,520 --> 00:21:25,040
Another advantage of aristocracy or monarchy in this case, monarchy with the nobility specifically,

294
00:21:25,040 --> 00:21:32,960
is that the monarchy can give up the spiritual consensus that we kind of have now in the

295
00:21:32,960 --> 00:21:35,520
West of democratic governance.

296
00:21:35,520 --> 00:21:42,600
And in being able to do that, you can remove the idea of egalitarianism under the law,

297
00:21:42,600 --> 00:21:43,600
because it's the king's law.

298
00:21:43,600 --> 00:21:45,200
If you're in a kingdom, it's the king's law.

299
00:21:45,200 --> 00:21:51,360
Like if you're in Thailand, you don't insult the royal family or you will get arrested.

300
00:21:51,360 --> 00:21:58,000
But because it's the king's law, or because it's the council, if you live in one of the

301
00:21:58,000 --> 00:22:06,480
cities, then what that means is they don't have to create this bureaucratic apparatus

302
00:22:06,480 --> 00:22:10,240
in order to micromanage people to give the impression that everyone's treated equally

303
00:22:10,240 --> 00:22:14,400
because that's mostly what the bureaucratic apparatus does.

304
00:22:14,400 --> 00:22:18,800
Instead you can have the count say, "That building's about to fall down, it's in my city, fix it

305
00:22:18,800 --> 00:22:20,680
or I'm taking your hat."

306
00:22:20,680 --> 00:22:21,760
Simple as that.

307
00:22:21,760 --> 00:22:26,840
It's not some faceless bureaucratic drone who's going to give you fines and maybe they'll

308
00:22:26,840 --> 00:22:29,440
check up on you in six months or a year.

309
00:22:29,440 --> 00:22:33,280
It's the guy right there who can put his foot down and make a judgment.

310
00:22:33,280 --> 00:22:37,800
And I think a lot of people would rather respond to the type of leadership where you can actually

311
00:22:37,800 --> 00:22:43,120
speak to the leader who's telling you that you've messed up rather than the type of leadership

312
00:22:43,120 --> 00:22:49,160
that's attempting to get you to have an argument with a stack of city, county and corporate

313
00:22:49,160 --> 00:22:53,160
policies.

314
00:22:53,160 --> 00:22:57,720
It just reminded me of something that John Michael Greer has talked about as well in

315
00:22:57,720 --> 00:23:05,080
terms of the shape of society as we go to a lower energy throughput status and that

316
00:23:05,080 --> 00:23:07,440
is a return of feudalism.

317
00:23:07,440 --> 00:23:14,160
But he defines feudalism as rule by personal relationships, which is to say the feudal

318
00:23:14,160 --> 00:23:16,400
lord knows his serfs.

319
00:23:16,400 --> 00:23:19,800
They are not some faceless mass off in the distance.

320
00:23:19,800 --> 00:23:26,200
They're certainly not some geographically distributed demographic that he's only in

321
00:23:26,200 --> 00:23:30,000
touch with through administrative systems.

322
00:23:30,000 --> 00:23:35,040
You go to the lord of the manor and you pledge your service to him.

323
00:23:35,040 --> 00:23:39,320
And when the king calls upon him to produce soldiers, he knows the names of the people

324
00:23:39,320 --> 00:23:42,200
that he's going to send into the king's army.

325
00:23:42,200 --> 00:23:47,640
So John Michael Greer talks about how, as larger administrative systems that are dependent

326
00:23:47,640 --> 00:23:51,840
on electricity and computers and that sort of thing falter,

327
00:23:51,840 --> 00:23:57,120
we are going to have not necessarily a transition back to something which is definitely recognizable

328
00:23:57,120 --> 00:24:01,360
from some previous historical period, but something which is definitely much more premised

329
00:24:01,360 --> 00:24:06,800
on personal relationships between the lords and the vassals.

330
00:24:06,800 --> 00:24:08,360
I find that to be very likely.

331
00:24:08,360 --> 00:24:12,760
I mean, we're already kind of seeing a push for it in really weird ways.

332
00:24:12,760 --> 00:24:18,200
My monarchy article got a tremendous amount of traction, surprisingly enough, when I first

333
00:24:18,200 --> 00:24:20,880
published it in comparison to what I usually get.

334
00:24:20,880 --> 00:24:27,280
And you're seeing a lot more people attempt to develop personal relationships in their

335
00:24:27,280 --> 00:24:30,840
communities in non-conventional ways.

336
00:24:30,840 --> 00:24:36,320
I would define conventional as between 1900 and 1980.

337
00:24:36,320 --> 00:24:39,240
Between 1900 and 1980, you've got the people in your community.

338
00:24:39,240 --> 00:24:46,280
You've got your mayor and you have these little book clubs and you have your exchange club

339
00:24:46,280 --> 00:24:47,320
and whatnot.

340
00:24:47,320 --> 00:24:53,960
But the way people are forming relationships now are almost best described as microcultures.

341
00:24:53,960 --> 00:24:55,780
They're forming microcultures online.

342
00:24:55,780 --> 00:25:01,760
Young men are forming a lot of different microcultures online, each with its own internal conceptualization

343
00:25:01,760 --> 00:25:04,520
of reality.

344
00:25:04,520 --> 00:25:11,800
And if any of those small communities becomes the dominant community in a place, so if it

345
00:25:11,800 --> 00:25:18,840
localizes from online to a physical location, an example is they all move into the same place.

346
00:25:18,840 --> 00:25:24,320
You get a Discord server of, say, 500 or 600 people that are all basically on the same

347
00:25:24,320 --> 00:25:29,160
page and decide to move to the same town of 10,000 people or 20,000 people.

348
00:25:29,160 --> 00:25:32,880
All of a sudden, they're going to become an extremely dominant force because they will

349
00:25:32,880 --> 00:25:41,200
be able to flex a degree of cultural and social control that no one else can do.

350
00:25:41,200 --> 00:25:42,280
And so I agree with you.

351
00:25:42,280 --> 00:25:48,980
I think we're going to see a transition from a system of faceless administration to a more

352
00:25:48,980 --> 00:25:52,280
personal system one way or another.

353
00:25:52,280 --> 00:25:57,160
Although I do think that, so the question was whether this was going to happen soon

354
00:25:57,160 --> 00:26:00,280
or in several hundred years.

355
00:26:00,280 --> 00:26:06,520
And I think that a year ago, I would have said it's probably going to happen soon.

356
00:26:06,520 --> 00:26:13,620
So, within the next 30 to 60 years. With the current administration and the recent election,

357
00:26:13,620 --> 00:26:17,760
it seems more likely to me that we're looking at maybe 200 years out before we transition

358
00:26:17,760 --> 00:26:19,300
into those systems.

359
00:26:19,300 --> 00:26:24,160
The reason being that the US more or less had the options of going through and still

360
00:26:24,160 --> 00:26:28,840
has the options of going through either a Balkanization or an imperial transition.

361
00:26:28,840 --> 00:26:31,600
It looks like they're aiming for an imperial transition.

362
00:26:31,600 --> 00:26:34,840
So from Republic to Empire.

363
00:26:34,840 --> 00:26:39,200
So Warhammer 40K is more predictive than half of science fiction out there.

364
00:26:39,200 --> 00:26:42,480
I love the Warhammer lore.

365
00:26:42,480 --> 00:26:45,640
I'm actually listening to a Ciaphas Cain novel right now.

366
00:26:45,640 --> 00:26:46,640
Excellent.

367
00:26:46,640 --> 00:26:47,640
Okay.

368
00:26:47,640 --> 00:26:50,880
Hey, I'm going to go and refill my coffee.

369
00:26:50,880 --> 00:26:56,240
While I do that, let me encourage you to Google, I think it's Pine Bluff, Arkansas.

370
00:26:56,240 --> 00:27:01,440
I think there's a guy with a lot of money who's buying up lots of property in Pine Bluff.

371
00:27:01,440 --> 00:27:09,040
And basically, you know, it's a town in serious economic decline, with lots of abandoned buildings

372
00:27:09,040 --> 00:27:13,360
and lots of sort of derelict properties, and he's just buying them up.

373
00:27:13,360 --> 00:27:18,580
You know, I don't know exactly what his plan is, but he's going to be the Duke of that,

374
00:27:18,580 --> 00:27:21,400
you know, of that little feudal plot.

375
00:27:21,400 --> 00:27:24,760
There's a few wealthy people that are doing that around the country.

376
00:27:24,760 --> 00:27:26,720
I'm going to look up Pine Bluff specifically, though.

377
00:27:26,720 --> 00:27:27,720
Thank you.

378
00:27:27,720 --> 00:27:38,520
So it is very tempting to focus on current events and the Trump-Vance-Musk administration

379
00:27:38,520 --> 00:27:41,400
and historical resistance to it.

380
00:27:41,400 --> 00:27:43,440
But I do want to get back to science fiction.

381
00:27:43,440 --> 00:27:44,440
Absolutely.

382
00:27:44,440 --> 00:27:45,440
Yeah.

383
00:27:45,440 --> 00:27:53,720
You talked about the fact that basically the timeframe that cyberpunk was describing is,

384
00:27:53,720 --> 00:27:57,400
if not entirely past, I mean, we've lived through enough of it that it's clearly not

385
00:27:57,400 --> 00:27:58,400
going to happen that way.

386
00:27:58,400 --> 00:28:00,360
It doesn't look like Neuromancer.

387
00:28:00,360 --> 00:28:01,360
It doesn't look like Blade Runner.

388
00:28:01,360 --> 00:28:07,000
You know, the original Blade Runner takes place, the movie takes place in 2019, which

389
00:28:07,000 --> 00:28:08,900
is past.

390
00:28:08,900 --> 00:28:14,240
What I love about Blade Runner 2049 is that it says, nope, that's how it went.

391
00:28:14,240 --> 00:28:15,920
That's what things were like in 2019.

392
00:28:15,920 --> 00:28:17,960
We're just, you know, we're sticking with it.

393
00:28:17,960 --> 00:28:18,960
We're not retconning.

394
00:28:18,960 --> 00:28:19,960
We're not apologizing.

395
00:28:19,960 --> 00:28:23,080
We're just going to continue the story in that universe.

396
00:28:23,080 --> 00:28:31,760
You know, if you don't gravitate to cyberpunk, what's your science fiction subgenre of choice?

397
00:28:31,760 --> 00:28:37,740
So I enjoyed cyberpunk as much as the next guy back in the 90s when I was a kid.

398
00:28:37,740 --> 00:28:42,680
But it does seem to be somewhat, for lack of better phrasing, outdated in the same way

399
00:28:42,680 --> 00:28:48,800
that 40s, 30s serials are outdated because you're right, history took a different turn.

400
00:28:48,800 --> 00:28:53,680
It took a different turn, and somehow things are worse, because it has a big fake corporate

401
00:28:53,680 --> 00:28:54,840
smile plastered on it.

402
00:28:54,840 --> 00:29:00,920
So the subgenres of science fiction that I really am interested in... so, I like science

403
00:29:00,920 --> 00:29:06,480
fiction in principle because it allows you to discuss ideas that you can't discuss in

404
00:29:06,480 --> 00:29:10,520
other genres or that are difficult to discuss in other genres.

405
00:29:10,520 --> 00:29:15,600
Science fiction is really good at opening people's eyes, opening your eyes to potential

406
00:29:15,600 --> 00:29:20,920
futures or potential ideas or potential social concepts or political concepts that don't

407
00:29:20,920 --> 00:29:23,200
even exist otherwise.

408
00:29:23,200 --> 00:29:26,440
So I've always been into more odd science fiction.

409
00:29:26,440 --> 00:29:32,120
I just finished Project Hail Mary, which made me think of The Three-Body Problem. And going

410
00:29:32,120 --> 00:29:36,720
down to The Three-Body Problem, I would say that is probably one of the highest-rated

411
00:29:36,720 --> 00:29:39,520
science fiction trilogies I've ever read.

412
00:29:39,520 --> 00:29:42,360
If you've read it, you know what I'm talking about.

413
00:29:42,360 --> 00:29:46,360
It's the only science fiction book that's ever given me

414
00:29:46,360 --> 00:29:48,760
really, really weird dreams.

415
00:29:48,760 --> 00:29:51,440
But it has.

416
00:29:51,440 --> 00:30:00,960
So The Three-Body Problem essentially posits this question in the form of science fiction.

417
00:30:00,960 --> 00:30:06,880
It takes the Fermi paradox and it also asks the question: what happens if there is

418
00:30:06,880 --> 00:30:09,560
not a limit to technological development?

419
00:30:09,560 --> 00:30:12,680
What if you can actually develop your way through a brick wall?

420
00:30:12,680 --> 00:30:17,720
What if there's no limit to how good your technology can get or the things

421
00:30:17,720 --> 00:30:18,720
it can do?

422
00:30:18,720 --> 00:30:22,000
What would the universe look like?

423
00:30:22,000 --> 00:30:23,840
And how would humanity react to that?

424
00:30:23,840 --> 00:30:26,240
It answers a bunch of interesting social questions.

425
00:30:26,240 --> 00:30:31,320
In some ways it predicted things; it has impressively good predictive capacity.

426
00:30:31,320 --> 00:30:39,400
It predicted some of the social phenomena that have happened both in the East and in

427
00:30:39,400 --> 00:30:47,520
the West from when it was published in 2007 to now.

428
00:30:47,520 --> 00:30:51,760
Basically the way humans behave when all of their problems are taken away and the breakdown

429
00:30:51,760 --> 00:30:54,440
of hierarchy and gender and all sorts of weird stuff.

430
00:30:54,440 --> 00:30:56,280
It talks about that.

431
00:30:56,280 --> 00:31:04,280
It discusses the way that people can respond to problems in self-defeating ways, which

432
00:31:04,280 --> 00:31:10,680
I think is a lot of what the environmentalist activists are kind of doing right now.

433
00:31:10,680 --> 00:31:16,400
And what happens when you take that too far and the way society will bounce back from

434
00:31:16,400 --> 00:31:23,400
it or have a social riposte, so to speak, a backlash against bad ideas.

435
00:31:23,400 --> 00:31:28,480
I really like The Three-Body Problem because it contains all of these really complex social

436
00:31:28,480 --> 00:31:36,600
and scientific concepts in a system, in a narrative that no one else has ever written

437
00:31:36,600 --> 00:31:37,600
before.

438
00:31:37,600 --> 00:31:39,760
And I really appreciate that.

439
00:31:39,760 --> 00:31:42,480
Another book that I really enjoyed is Accelerando.

440
00:31:42,480 --> 00:31:44,880
There's also a review of that.

441
00:31:44,880 --> 00:31:51,920
Accelerando is a book that discusses the limits to AI and our ability to control them and

442
00:31:51,920 --> 00:31:57,640
the way in which AI are likely to function in practice rather than function in theory

443
00:31:57,640 --> 00:32:03,960
and how that can create its own nasty set of paperclip-maximizer

444
00:32:03,960 --> 00:32:09,480
problems, with the way that humans sort of are going to be, and currently are, treating AI

445
00:32:09,480 --> 00:32:14,520
as a magic box that will just give them things if they program it right.

446
00:32:14,520 --> 00:32:17,040
And then I like some of the weirder science fiction out there.

447
00:32:17,040 --> 00:32:21,560
The Strugatsky novels I've really enjoyed so far.

448
00:32:21,560 --> 00:32:24,080
The Doomed City, Roadside Picnic.

449
00:32:24,080 --> 00:32:28,000
I'm currently, excuse me, you asked what I was reading.

450
00:32:28,000 --> 00:32:35,480
I am currently reading through Monday Begins on Saturday, which is fascinating to me because,

451
00:32:35,480 --> 00:32:39,040
for those listeners familiar with the SCP Foundation,

452
00:32:39,040 --> 00:32:43,560
this is the book that I think created the idea for the SCP Foundation.

453
00:32:43,560 --> 00:32:45,480
It's a great book.

454
00:32:45,480 --> 00:32:53,440
And until very recently, there was not really an equivalent to it in modern history, until

455
00:32:53,440 --> 00:32:57,160
somebody created the SCP Wiki, which isn't as good as it used to be.

456
00:32:57,160 --> 00:32:58,920
It used to be a lot of fun to read through that.

457
00:32:58,920 --> 00:33:04,720
But Monday Begins on Saturday is the Soviet novel that sort of kicked off that genre.

458
00:33:04,720 --> 00:33:09,760
And then the Doomed City and Roadside Picnic kind of kicked off the Soviet fiction genre

459
00:33:09,760 --> 00:33:10,760
in general.

460
00:33:10,760 --> 00:33:16,080
But yeah, I like science fiction with interesting ideas that it wants to discuss more than science

461
00:33:16,080 --> 00:33:19,040
fiction with any specific place in the universe.

462
00:33:19,040 --> 00:33:22,520
I'm planning to start the Xeelee Sequence here at some point.

463
00:33:22,520 --> 00:33:27,200
I've heard that it's quite good, but I haven't gotten around to reading it yet.

464
00:33:27,200 --> 00:33:30,200
And then the reason I brought up Project Hail Mary right at the beginning is it's sort of

465
00:33:30,200 --> 00:33:35,240
like the opposite of The Three-Body Problem.

466
00:33:35,240 --> 00:33:39,960
While The Three-Body Problem is a story spanning hundreds of years, complex characters, and

467
00:33:39,960 --> 00:33:46,540
massive scientific arcs and hostile alien intelligences, Project Hail Mary does the

468
00:33:46,540 --> 00:33:49,240
opposite of all of those things.

469
00:33:49,240 --> 00:33:54,400
One character over a very brief period of time with very consistent technological development

470
00:33:54,400 --> 00:33:56,080
and friendly aliens.

471
00:33:56,080 --> 00:34:02,080
So I just find them fascinating inversions of each other, those two pieces of work.

472
00:34:02,080 --> 00:34:06,440
And Project Hail Mary also has a lot less to say politically and socially.

473
00:34:06,440 --> 00:34:12,860
Well, I just recently listened to the audiobook of the Hail Mary project or Project Hail Mary.

474
00:34:12,860 --> 00:34:16,680
And I should tell people who are not familiar with it, it's by Andy Weir, who is the author

475
00:34:16,680 --> 00:34:18,240
of The Martian.

476
00:34:18,240 --> 00:34:22,960
The Martian, and famously the quote from that film is, "I'm going to science the

477
00:34:22,960 --> 00:34:24,360
shit out of this."

478
00:34:24,360 --> 00:34:25,360
Yes.

479
00:34:25,360 --> 00:34:31,640
Andy Weir loves to write, like, hard sci-fi in "the universe is a problem to be solved"

480
00:34:31,640 --> 00:34:38,360
mode. A lot of people hate science fiction because it takes that viewpoint, or at least

481
00:34:38,360 --> 00:34:39,680
it used to.

482
00:34:39,680 --> 00:34:46,000
Like the pre-New Wave, like what you're calling the sort of outdated pulps and serials.

483
00:34:46,000 --> 00:34:47,800
Yeah, it's pulpy.

484
00:34:47,800 --> 00:34:48,800
That's a good way to describe it.

485
00:34:48,800 --> 00:34:54,960
Yeah, I mean, but you know, he approaches it with humor and goodwill and, you know,

486
00:34:54,960 --> 00:34:58,840
a modern sort of up-to-date scientific acumen.

487
00:34:58,840 --> 00:35:01,200
And I absolutely love his books.

488
00:35:01,200 --> 00:35:03,240
So I enjoyed the book.

489
00:35:03,240 --> 00:35:05,260
I absolutely enjoyed the book.

490
00:35:05,260 --> 00:35:10,640
It's just bizarre, absolutely the opposite of The Three-Body Problem trilogy in every

491
00:35:10,640 --> 00:35:11,720
way.

492
00:35:11,720 --> 00:35:13,800
Now I have not read The Three-Body Problem.

493
00:35:13,800 --> 00:35:18,920
I did watch the first season of the Netflix show, and there's also a Chinese

494
00:35:18,920 --> 00:35:21,480
TV adaptation of it, which I've seen.

495
00:35:21,480 --> 00:35:26,560
I don't know how much of the novel I'm missing, you know, by relying on those TV adaptations.

496
00:35:26,560 --> 00:35:29,360
Well, so I can give you a breakdown.

497
00:35:29,360 --> 00:35:32,200
The first thing is that the novel is big.

498
00:35:32,200 --> 00:35:35,760
So in the TV show, you've got like these five main characters, and that's because they could

499
00:35:35,760 --> 00:35:39,640
afford five, you know, B list actors.

500
00:35:39,640 --> 00:35:43,080
And in the novel, you're talking about hundreds of different people and hundreds of different

501
00:35:43,080 --> 00:35:45,120
government agencies kind of thing.

502
00:35:45,120 --> 00:35:50,300
So, the difference between, like, a James Bond film and what an actual intelligence

503
00:35:50,300 --> 00:35:53,360
apparatus looks like is a good description of the difference between the TV show and

504
00:35:53,360 --> 00:35:54,400
the book.

505
00:35:54,400 --> 00:35:58,960
The other thing I would say about The Three-Body Problem is it's very likely to kind

506
00:35:58,960 --> 00:36:02,520
of fall apart in the second and third seasons of the television show.

507
00:36:02,520 --> 00:36:08,440
And the main reason for that is because the first book, the whole of book one of The

508
00:36:08,440 --> 00:36:11,640
Three-Body Problem, is best thought of as a prologue to the other two.

509
00:36:11,640 --> 00:36:17,200
Well, it's my understanding that the Netflix series is not taking the story in sequence,

510
00:36:17,200 --> 00:36:21,080
that it is bringing in elements from later novels earlier, in the first season.

511
00:36:21,080 --> 00:36:23,160
Yeah, it kind of has to do that.

512
00:36:23,160 --> 00:36:24,280
It is doing that.

513
00:36:24,280 --> 00:36:32,160
But so the Netflix series is adapting it chronologically, whereas

514
00:36:32,160 --> 00:36:37,120
the books are telling stories about different people that each lived through the same like

515
00:36:37,120 --> 00:36:38,720
three or four hundred years.

516
00:36:38,720 --> 00:36:45,000
Which, Project Hail Mary is fun in that there's two timelines.

517
00:36:45,000 --> 00:36:50,760
One is the current timeline where this guy by himself, this human guy is in a different

518
00:36:50,760 --> 00:36:55,920
solar system trying to figure out basically how to save the world.

519
00:36:55,920 --> 00:36:59,440
And then we're getting alternate chapters where, you know, it's the backstory of how

520
00:36:59,440 --> 00:37:00,440
he came here.

521
00:37:00,440 --> 00:37:01,440
Yeah.

522
00:37:01,440 --> 00:37:06,160
And of course, you know, late in the novel, there are twists revealed in the backstory

523
00:37:06,160 --> 00:37:09,680
chapters which paint everything in a new light.

524
00:37:09,680 --> 00:37:10,680
Very clever.

525
00:37:10,680 --> 00:37:11,680
Yeah, I thought it was clever.

526
00:37:11,680 --> 00:37:12,800
I enjoyed it.

527
00:37:12,800 --> 00:37:18,200
One of my favorite book series, if you've read this one, is We Are Legion (We Are Bob).

528
00:37:18,200 --> 00:37:24,120
I listened to the audiobooks of the first few of those and they are fun, but they didn't

529
00:37:24,120 --> 00:37:26,960
grab me enough for me to have listened to all of them.

530
00:37:26,960 --> 00:37:29,520
OK, I just finished the fourth or fifth.

531
00:37:29,520 --> 00:37:32,120
I think it's the fifth one.

532
00:37:32,120 --> 00:37:33,120
Then they are a serial.

533
00:37:33,120 --> 00:37:37,800
I would describe them as dumb, fun science fiction.

534
00:37:37,800 --> 00:37:41,560
They're hard science in that they're sublight, but they're soft science in that they kind

535
00:37:41,560 --> 00:37:43,520
of start getting out there pretty quick.

536
00:37:43,520 --> 00:37:48,200
But Andy Weir is a much harder science fiction author.

537
00:37:48,200 --> 00:37:52,080
I think that he puts a lot of effort into doing things the right way.

538
00:37:52,080 --> 00:37:53,780
And I really appreciate that.

539
00:37:53,780 --> 00:37:57,760
And I enjoyed Project Hail Mary and I will be writing a review of it on my

540
00:37:57,760 --> 00:37:58,760
Substack pretty soon here.

541
00:37:58,760 --> 00:37:59,760
Let's see.

542
00:37:59,760 --> 00:38:02,520
Have you read much of the Strugatsky work?

543
00:38:02,520 --> 00:38:06,920
You know, I started to read Roadside Picnic many, many years ago and I don't know how

544
00:38:06,920 --> 00:38:07,920
far I got into it.

545
00:38:07,920 --> 00:38:10,320
I don't think I finished it.

546
00:38:10,320 --> 00:38:15,520
I really enjoy the Strugatsky novels, but they're very avant-garde in comparison to

547
00:38:15,520 --> 00:38:18,280
normal science fiction is a good way to put it.

548
00:38:18,280 --> 00:38:23,440
As the film Stalker would support, it's very avant-garde.

549
00:38:23,440 --> 00:38:27,200
Well and the video game to a degree.

550
00:38:27,200 --> 00:38:32,360
What I really appreciate about, I think, Soviet fiction is that to keep the spirit of the

551
00:38:32,360 --> 00:38:40,080
book, they had to completely rewrite the plot and the characters

552
00:38:40,080 --> 00:38:44,000
in the film and to keep the spirit of the book, they had to completely rewrite the plot,

553
00:38:44,000 --> 00:38:48,100
the characters, and the setting in the video game.

554
00:38:48,100 --> 00:38:51,220
But the spirit is the same between all three of them, which I think is fascinating because

555
00:38:51,220 --> 00:38:53,520
most genres don't do that.

556
00:38:53,520 --> 00:38:58,880
Most genres don't need to completely uproot everything but but the core themes in order

557
00:38:58,880 --> 00:39:01,520
to switch medium.

558
00:39:01,520 --> 00:39:05,720
But for Soviet fiction, it does seem to be that way, which I find fascinating.

559
00:39:05,720 --> 00:39:10,520
I also identify with Soviet fiction partially because the United States at this

560
00:39:10,520 --> 00:39:16,480
point and this is why the current administration is such a big deal and why I keep oscillating

561
00:39:16,480 --> 00:39:20,720
back to politics, but I'm going to try to stay out of it for this conversation.

562
00:39:20,720 --> 00:39:24,000
But the United States was 35 percent public spending.

563
00:39:24,000 --> 00:39:28,440
Our GDP was 35 percent public spending, and in the United Kingdom, 50 percent of their

564
00:39:28,440 --> 00:39:30,440
GDP is public spending.

565
00:39:30,440 --> 00:39:35,920
We are on the edge of just living in a Soviet style economy in the West in most countries

566
00:39:35,920 --> 00:39:36,920
at this point.

567
00:39:36,920 --> 00:39:42,200
And so it makes sense that some people might identify more with Soviet-

568
00:39:42,200 --> 00:39:43,960
style fiction.

569
00:39:43,960 --> 00:39:49,320
You know, one place where current concerns and science fiction definitely overlap to

570
00:39:49,320 --> 00:39:54,200
a degree that one simply cannot ignore is the role of emerging artificial intelligence

571
00:39:54,200 --> 00:39:58,320
and particularly how it's going to impact people in terms of, you know, any individual's

572
00:39:58,320 --> 00:40:00,520
ability to make a living.

573
00:40:00,520 --> 00:40:04,040
You've mentioned in passing, and I don't think you've said it explicitly, but I know

574
00:40:04,040 --> 00:40:09,800
that you're referencing Rudyard Lynch and his references to the mouse utopia.

575
00:40:09,800 --> 00:40:15,040
A, you know, a total singularity, cornucopia situation where everybody gets everything

576
00:40:15,040 --> 00:40:21,840
they want is not necessarily the best case scenario or a favorable outcome, but nor is

577
00:40:21,840 --> 00:40:27,600
you know, everybody suddenly being rendered useless and, you know, feeling as though they

578
00:40:27,600 --> 00:40:32,120
have to rely on the charity of the oligarchs.

579
00:40:32,120 --> 00:40:36,360
You know, people need to be able to make a living or at least believe that they're making

580
00:40:36,360 --> 00:40:37,680
a living.

581
00:40:37,680 --> 00:40:43,640
There's a nonfiction writer named Kai-Fu Lee, who wrote a book called AI Superpowers.

582
00:40:43,640 --> 00:40:49,040
And then he wrote another book after that that was addressing, you know, the challenges

583
00:40:49,040 --> 00:40:51,700
of artificial intelligence in the near future.

584
00:40:51,700 --> 00:40:55,520
But he's doing so in partnership with a fiction writer and he's creating these little fictional

585
00:40:55,520 --> 00:41:03,360
vignettes, and at least one of the chapters in the book is about a company whose supposed

586
00:41:03,360 --> 00:41:08,960
whole reason for being is to take workers who have been displaced by automation and

587
00:41:08,960 --> 00:41:11,680
find new jobs for them.

588
00:41:11,680 --> 00:41:16,000
And another company that does the same thing starts up and the first company has got a

589
00:41:16,000 --> 00:41:18,360
success rate of like 20 percent.

590
00:41:18,360 --> 00:41:21,160
And the second company has a success rate in the high 90s.

591
00:41:21,160 --> 00:41:24,160
And you know, the CEO of the first company is like, that's impossible.

592
00:41:24,160 --> 00:41:25,240
There's no way.

593
00:41:25,240 --> 00:41:28,840
And he assigns one of his employees basically to infiltrate this other company and figure

594
00:41:28,840 --> 00:41:29,840
out what they're doing.

595
00:41:29,840 --> 00:41:34,160
And, you know, they're just basically having displaced people play

596
00:41:34,160 --> 00:41:36,560
video games and calling it work.

597
00:41:36,560 --> 00:41:41,560
You know, these people are told that they're consulting on construction projects and things

598
00:41:41,560 --> 00:41:45,440
or that they're driving telepresence construction robots or things like this.

599
00:41:45,440 --> 00:41:46,920
But they're really not doing anything.

600
00:41:46,920 --> 00:41:52,120
The company just exists to take government subsidies and distribute them to displaced

601
00:41:52,120 --> 00:41:53,120
workers.

602
00:41:53,120 --> 00:41:56,120
But, you know, to give it a word, it's basically adult daycare.

603
00:41:56,120 --> 00:42:01,680
But the adults who are being taken care of believe they're at work, which is necessary

604
00:42:01,680 --> 00:42:03,880
for their sense of worth.

605
00:42:03,880 --> 00:42:09,040
And you know, there are possible futures that look kind of like that, in which case, do

606
00:42:09,040 --> 00:42:13,360
you want to actually know the real nature of the world in which you live?

607
00:42:13,360 --> 00:42:18,160
Or do you want to live with the comforting illusion that you are actually somebody who,

608
00:42:18,160 --> 00:42:23,360
you know, supports a family and does something meaningful and useful for his society?

609
00:42:23,360 --> 00:42:26,520
Well, I mean, like, I'd rather leave.

610
00:42:26,520 --> 00:42:30,160
I think that's what many of the large social movements in the West are about right now:

611
00:42:30,160 --> 00:42:35,600
different groups of people attempting to leave the social system in different ways

612
00:42:35,600 --> 00:42:37,560
to make an exit.

613
00:42:37,560 --> 00:42:38,560
Yeah.

614
00:42:38,560 --> 00:42:45,240
In terms of AI development, I don't know how much better it's going to get.

615
00:42:45,240 --> 00:42:50,760
My suspicion is that we're closing in on the limit of how good we can get it.

616
00:42:50,760 --> 00:42:53,920
You had DeepSeek come out, and they said it was a $5 million system.

617
00:42:53,920 --> 00:43:02,400
But I doubt that; the Chinese government was heavily involved in releasing DeepSeek.

618
00:43:02,400 --> 00:43:08,080
I don't know what AI will be able to replace well. It seems like it's going to be able

619
00:43:08,080 --> 00:43:12,760
to replace a lot of the bureaucracy well, and is likely to be used as a replacement

620
00:43:12,760 --> 00:43:16,680
for a lot of the bureaucracy as the current administration is ripping chunks of the bureaucracy

621
00:43:16,680 --> 00:43:19,480
out.

622
00:43:19,480 --> 00:43:29,620
But taking more of a long-term view, AI might be able to emulate a lot of the things

623
00:43:29,620 --> 00:43:30,880
that people can do.

624
00:43:30,880 --> 00:43:36,960
But I'm kind of worried about... are you familiar with the training data issue that

625
00:43:36,960 --> 00:43:37,960
AI seems to have?

626
00:43:37,960 --> 00:43:39,560
Well, say more about that.

627
00:43:39,560 --> 00:43:40,560
That's a big one.

628
00:43:40,560 --> 00:43:46,640
So AI has a hard time training on AI-generated data.

629
00:43:46,640 --> 00:43:50,680
Now you can somewhat train an AI on AI-generated data.

630
00:43:50,680 --> 00:43:52,640
That's less true now than it was a year ago.

631
00:43:52,640 --> 00:43:54,720
Yeah, it's less true now than it was a year ago.

632
00:43:54,720 --> 00:44:00,920
But I find that a lot of institutions are going to require highly specialized AI

633
00:44:00,920 --> 00:44:03,720
systems in order to run.

634
00:44:03,720 --> 00:44:07,120
So if you're running a factory with an AI or something like that, you have your own

635
00:44:07,120 --> 00:44:11,240
AI system, and you've trained it based on the things that people have done.

636
00:44:11,240 --> 00:44:15,200
What that effectively does is it locks your factory into its current structure.

637
00:44:15,200 --> 00:44:19,180
And if you change anything, the AI is going to start having problems with that.

638
00:44:19,180 --> 00:44:23,320
Because the only way to train it on the new things that you're doing is to train it on

639
00:44:23,320 --> 00:44:25,880
the things that it itself has done.

640
00:44:25,880 --> 00:44:30,760
So I do think that we're looking at a future economy that's going to be radically different.

641
00:44:30,760 --> 00:44:37,680
We're seeing academia getting hammered right now, partially through the

642
00:44:37,680 --> 00:44:40,720
administration's bureaucratic changes, but also partially because large swaths of people

643
00:44:40,720 --> 00:44:43,720
no longer think it's worth the money.

644
00:44:43,720 --> 00:44:46,580
And it's developed into sort of a factory system.

645
00:44:46,580 --> 00:44:53,640
In many ways, bureaucracies are kind of like AI in that you program them to do a thing.

646
00:44:53,640 --> 00:44:55,720
They do the thing relatively well.

647
00:44:55,720 --> 00:45:01,200
But if you change it, they're extremely sluggish and slow to respond.

648
00:45:01,200 --> 00:45:05,520
Manual labor is going to be a human thing for the foreseeable future.

649
00:45:05,520 --> 00:45:08,840
Complex data and analytics, that's going to be a human thing for most of the foreseeable

650
00:45:08,840 --> 00:45:12,800
future outside of a few specific applications.

651
00:45:12,800 --> 00:45:21,000
The real issue is going to come down to how much AI can get integrated into governance

652
00:45:21,000 --> 00:45:22,720
and management.

653
00:45:22,720 --> 00:45:27,080
Because if it gets fully integrated economically, it's going to be able to replace a lot of

654
00:45:27,080 --> 00:45:28,080
people.

655
00:45:28,080 --> 00:45:33,520
Now, it could be just that instead of banking the savings, we'll just do more stuff.

656
00:45:33,520 --> 00:45:37,680
But a lot of people aren't really inclined to keep themselves busy when they're not

657
00:45:37,680 --> 00:45:41,760
otherwise being stimulated by an economic need.

658
00:45:41,760 --> 00:45:47,400
So what I could see and what I think is that we're in an evolutionary sort of Darwinian

659
00:45:47,400 --> 00:45:55,880
bottleneck where the people that have a difficult time acting outside of a group-expectations

660
00:45:55,880 --> 00:46:02,000
mentality are not going to be having a whole lot of kids for the next generation or two.

661
00:46:02,000 --> 00:46:07,200
And what you're going to get is people that either have very poor impulse

662
00:46:07,200 --> 00:46:14,040
control or people that have a personal sense of duty and ambition; they are going to be the ones

663
00:46:14,040 --> 00:46:18,240
who lead humanity after this, because those are going to be the only two groups that

664
00:46:18,240 --> 00:46:19,240
have kids.

665
00:46:19,240 --> 00:46:25,640
And that is going to lend itself to a more aristocratic type of social transition.

666
00:46:25,640 --> 00:46:29,520
Even if it's not official, what you're going to end up with is two groups of people with

667
00:46:29,520 --> 00:46:37,600
phenotypes that are distinctly different in terms of their capacity for future planning.

668
00:46:37,600 --> 00:46:40,720
All right.

669
00:46:40,720 --> 00:46:42,080
That was Kopernikon.

670
00:46:42,080 --> 00:46:46,740
There will be more, but sorry, folks, the remainder of the conversation will be behind

671
00:46:46,740 --> 00:46:48,040
a paywall.

672
00:46:48,040 --> 00:46:53,480
So it will be in an episode of the C-Realm Vault podcast and also on my Substack for

673
00:46:53,480 --> 00:46:54,960
paid subscribers there.

674
00:46:54,960 --> 00:47:00,360
I will say that in the second part of the conversation, we do find something to disagree

675
00:47:00,360 --> 00:47:01,400
about.

676
00:47:01,400 --> 00:47:04,960
It's not like we have a knock-down, drag-out argument or anything like that.

677
00:47:04,960 --> 00:47:13,960
But we do go back and forth a bit on just how likely or unlikely it is that companies

678
00:47:13,960 --> 00:47:19,600
like, well, any tech company, really, there's no need to pick one out, would be very abusive

679
00:47:19,600 --> 00:47:27,480
in their use of any implants that one might get, particularly cognitive implants, cybernetic

680
00:47:27,480 --> 00:47:29,300
implants in the brain.

681
00:47:29,300 --> 00:47:33,520
If you are not listening to this on Substack, I would recommend that you check out my

682
00:47:33,520 --> 00:47:39,840
Substack, and from there you can find links to other people's stuff that I have really enjoyed.

683
00:47:39,840 --> 00:47:42,800
Substack is like a blog in the posts section.

684
00:47:42,800 --> 00:47:46,360
It's more like your typical social media in the notes section.

685
00:47:46,360 --> 00:47:49,640
And then there's also a like section or liked section.

686
00:47:49,640 --> 00:47:53,760
So you can see which items I have read and liked, but not bothered to comment on.

687
00:47:53,760 --> 00:47:56,720
And if you're listening to this and you are a Substack user, but you do not follow my

688
00:47:56,720 --> 00:47:59,840
account, well, I would invite you to do so.

689
00:47:59,840 --> 00:48:04,280
Also, if you are new to my podcasting work, or just new to my work in general and you don't

690
00:48:04,280 --> 00:48:09,640
know who the heck I am, I would recommend that you check out Fear and Loathing in the

691
00:48:09,640 --> 00:48:10,640
Kuiper Belts.

692
00:48:10,640 --> 00:48:15,280
It is available as a Kindle book or in paperback.

693
00:48:15,280 --> 00:48:20,440
In the fullness of time, it will also be an audio book, but not just yet.

694
00:48:20,440 --> 00:48:21,520
All right.

695
00:48:21,520 --> 00:48:23,640
Thank you very much for listening.

696
00:48:23,640 --> 00:48:30,640
Stay well.

