1
00:00:00,000 --> 00:00:08,960
Hey, everybody.

2
00:00:08,960 --> 00:00:26,360
KMO here with episode number 14 of The KMO Show.

3
00:00:26,360 --> 00:00:30,320
Today is Wednesday, June 7th, 2023.

4
00:00:30,320 --> 00:00:34,120
My guest in this episode is... gosh, what's his name?

5
00:00:34,120 --> 00:00:37,520
Kaleb Gorman, which is in fact a pen name.

6
00:00:37,520 --> 00:00:42,240
Kaleb gave me permission to use his actual name, but I'm using the pen name because we're

7
00:00:42,240 --> 00:00:48,080
going to be talking about his book, Psych Wars: Self-Defense Against Psyops, Propaganda,

8
00:00:48,080 --> 00:00:49,880
and Mind Control.

9
00:00:49,880 --> 00:00:52,240
And the title is very descriptive.

10
00:00:52,240 --> 00:00:56,120
And that's exactly what we're going to be talking about for about the next hour.

11
00:00:56,120 --> 00:01:00,440
So Kaleb will describe his profession in the very beginning of the interview.

12
00:01:00,440 --> 00:01:03,320
As far as I know, this is his first book.

13
00:01:03,320 --> 00:01:05,360
And it is very well written.

14
00:01:05,360 --> 00:01:09,160
Over the years, many people have sent me their books.

15
00:01:09,160 --> 00:01:12,800
And sometimes I'll talk to them, you know, if I'm interested in the subject matter, even

16
00:01:12,800 --> 00:01:14,520
though the book's not great.

17
00:01:14,520 --> 00:01:16,240
In this case, the book's really good.

18
00:01:16,240 --> 00:01:17,560
I highly recommend it.

19
00:01:17,560 --> 00:01:20,840
Alright, I'll be back to jabber at you for a bit at the end.

20
00:01:20,840 --> 00:01:25,720
But for now, here is my conversation with Kaleb Gorman.

21
00:01:25,720 --> 00:01:27,720
You're listening to the KMO show.

22
00:01:27,720 --> 00:01:29,000
Let's go.

23
00:01:29,000 --> 00:01:30,400
You're listening to the KMO show.

24
00:01:30,400 --> 00:01:35,320
I'm KMO and I'm talking with Kaleb Gorman, who's the author of Psych Wars: Self-Defense

25
00:01:35,320 --> 00:01:38,400
Against Psyops, Propaganda, and Mind Control.

26
00:01:38,400 --> 00:01:41,040
Kaleb, welcome to the podcast.

27
00:01:41,040 --> 00:01:42,480
Thank you for having me.

28
00:01:42,480 --> 00:01:46,400
Tell me a little bit about just the larger project of your book.

29
00:01:46,400 --> 00:01:48,480
What's the motivation?

30
00:01:48,480 --> 00:01:49,880
What's the agenda?

31
00:01:49,880 --> 00:01:50,880
Yeah.

32
00:01:50,880 --> 00:01:53,640
So I think it just boils down to...

33
00:01:53,640 --> 00:01:58,680
So I myself am a counseling psychologist.

34
00:01:58,680 --> 00:02:01,240
And I saw that...

35
00:02:01,240 --> 00:02:12,880
Well, in myself, first of all, I found myself easily swayed, dare I say, gullible.

36
00:02:12,880 --> 00:02:16,480
There's things I believed with such fervor.

37
00:02:16,480 --> 00:02:22,080
And then you find out once the evidence comes across your desk that it's not as true as

38
00:02:22,080 --> 00:02:23,080
you thought.

39
00:02:23,080 --> 00:02:28,520
So it's really unsettling.

40
00:02:28,520 --> 00:02:34,120
And so through that, I came across the term psyops, and I'm not super into military stuff

41
00:02:34,120 --> 00:02:35,120
or anything like that.

42
00:02:35,120 --> 00:02:36,720
So this was a new term for me.

43
00:02:36,720 --> 00:02:40,680
And as I learned more about it, I thought, I need to learn how to defend myself against

44
00:02:40,680 --> 00:02:42,400
this.

45
00:02:42,400 --> 00:02:47,200
And as I started studying, for my own sake, learning how to defend against psyops and

46
00:02:47,200 --> 00:02:53,660
what I would consider psyops, it just took me on a journey through philosophy and psychology

47
00:02:53,660 --> 00:02:59,120
and journalism and comedians and all these different avenues.

48
00:02:59,120 --> 00:03:00,560
And I thought, well, hey, you know what?

49
00:03:00,560 --> 00:03:02,920
I think I have a book here.

50
00:03:02,920 --> 00:03:09,840
Just before we started talking, I was watching and listening to, I think it's the most recent

51
00:03:09,840 --> 00:03:16,200
episode of Breaking Points, which stars Krystal Ball and Saagar.

52
00:03:16,200 --> 00:03:20,880
I mean, they're supposedly news anchors, but they're the personalities, which is what an

53
00:03:20,880 --> 00:03:25,040
anchor is, a personality to be the face of the news organization.

54
00:03:25,040 --> 00:03:26,800
But Krystal Ball and Saagar Enjeti.

55
00:03:26,800 --> 00:03:33,360
And Saagar was doing a piece on some new revelations by some whistleblower and some respectable

56
00:03:33,360 --> 00:03:39,800
reporters talking about how the US military has been in possession of non-human built

57
00:03:39,800 --> 00:03:46,040
spacecraft or craft for decades and sharing this information with military contractors,

58
00:03:46,040 --> 00:03:48,400
denying it to Congress.

59
00:03:48,400 --> 00:03:54,280
And every time I hear a story like this, it stinks of Psyop to me.

60
00:03:54,280 --> 00:04:00,840
I mean, it seems like this is supposed to be some flashy movement with the right hand

61
00:04:00,840 --> 00:04:03,160
while the left hand is picking your pocket.

62
00:04:03,160 --> 00:04:11,040
I'm not dead set against the idea, but it seems rather unlikely to me, particularly

63
00:04:11,040 --> 00:04:17,640
when supposedly military contractors have had access to UFOs and exotic alien materials

64
00:04:17,640 --> 00:04:22,680
for decades and we get the F-35.

65
00:04:22,680 --> 00:04:26,080
That's what they built with this miraculous technology.

66
00:04:26,080 --> 00:04:27,080
Come on.

67
00:04:27,080 --> 00:04:28,080
So I'll stop.

68
00:04:28,080 --> 00:04:30,280
I mean, obviously I'm not credulous.

69
00:04:30,280 --> 00:04:33,520
I'm not susceptible to this particular Psyop.

70
00:04:33,520 --> 00:04:37,200
But if this is a Psyop, what would that mean?

71
00:04:37,200 --> 00:04:38,640
Well, exactly.

72
00:04:38,640 --> 00:04:46,280
And most of the time, what you brought up is what it would most likely be: uh-oh,

73
00:04:46,280 --> 00:04:51,840
we've stepped in it and we need people to look in this direction more so than the other.

74
00:04:51,840 --> 00:04:59,520
But I think what happens though is sometimes I give these agencies too much credit because

75
00:04:59,520 --> 00:05:05,480
it's like, what level of chess are we playing here?

76
00:05:05,480 --> 00:05:08,760
Is it a misdirect or is it a false misdirect?

77
00:05:08,760 --> 00:05:09,880
I don't know.

78
00:05:09,880 --> 00:05:12,920
It's really hard to keep track.

79
00:05:12,920 --> 00:05:17,840
With the UFO stuff, I like to keep an open mind, but I'm not married to whether it's

80
00:05:17,840 --> 00:05:18,840
true or not.

81
00:05:18,840 --> 00:05:20,080
It doesn't really affect me just yet.

82
00:05:20,080 --> 00:05:22,080
It's just interesting.

83
00:05:22,080 --> 00:05:25,680
Well, you recommended a sort of path through your book.

84
00:05:25,680 --> 00:05:29,920
I didn't read the whole thing, but I did read selected chapters, which creates a vector

85
00:05:29,920 --> 00:05:32,320
through the space of your book.

86
00:05:32,320 --> 00:05:37,000
And one of the first chapters was called, it's chapter four, it's called The Power

87
00:05:37,000 --> 00:05:38,600
of Narratives.

88
00:05:38,600 --> 00:05:43,160
And I guess I'll just start by asking you in general, why are narratives so powerful

89
00:05:43,160 --> 00:05:44,840
in terms of persuasion?

90
00:05:44,840 --> 00:05:50,400
Yeah, I would say it's necessary.

91
00:05:50,400 --> 00:05:56,280
Straight up facts, even facts need a narrative.

92
00:05:56,280 --> 00:06:00,360
It seems to be that that's how our mind works.

93
00:06:00,360 --> 00:06:05,380
I think Jordan Peterson used this analogy in Maps of Meaning, but it's like we have

94
00:06:05,380 --> 00:06:14,720
hooks in our mind, in our brains, that are designed to catch certain data, certain narratives,

95
00:06:14,720 --> 00:06:16,440
certain archetypes.

96
00:06:16,440 --> 00:06:23,520
And by wedding itself to one of these narratives that are more easily attached to our structure

97
00:06:23,520 --> 00:06:31,840
in our memory system, it allows certain information to just resonate more and spread more and

98
00:06:31,840 --> 00:06:32,840
so forth.

99
00:06:32,840 --> 00:06:44,720
So like, even if we're talking about stone cold facts, if there isn't a convincing narrative

100
00:06:44,720 --> 00:06:49,200
with it, we're going to overlook it.

101
00:06:49,200 --> 00:06:51,640
It's just not going to strike us.

102
00:06:51,640 --> 00:06:53,760
We're meaning seeking animals.

103
00:06:53,760 --> 00:06:57,440
We thrive and we live off of narrative.

104
00:06:57,440 --> 00:07:02,080
Well, Kaleb, I'll have you know that this is a respectable podcast.

105
00:07:02,080 --> 00:07:05,840
And if you mentioned Jordan Peterson without immediately denouncing him, then I must assume

106
00:07:05,840 --> 00:07:10,640
that you're some sort of alt-right troll who's come to turn us all into Nazis.

107
00:07:10,640 --> 00:07:11,640
Yeah.

108
00:07:11,640 --> 00:07:12,640
Yeah.

109
00:07:12,640 --> 00:07:13,640
Guilty as charged.

110
00:07:13,640 --> 00:07:14,640
Yeah.

111
00:07:14,640 --> 00:07:15,640
I mean, yeah.

112
00:07:15,640 --> 00:07:26,080
I mean, the whole Jordan Peterson phenomenon is pretty interesting how, you know, from

113
00:07:26,080 --> 00:07:30,920
my perspective, I'm in a field, I'm in his field.

114
00:07:30,920 --> 00:07:33,440
And his country.

115
00:07:33,440 --> 00:07:34,440
And his country.

116
00:07:34,440 --> 00:07:37,560
And those devious Canadians, two of those devious Canadians.

117
00:07:37,560 --> 00:07:38,560
Yeah.

118
00:07:38,560 --> 00:07:42,200
And I have to be careful bringing up his name.

119
00:07:42,200 --> 00:07:46,140
And what's funny is you can bring up his ideas, a lot of his ideas, especially when it comes

120
00:07:46,140 --> 00:07:51,280
to psychology and what he would teach from the psychological perspective.

121
00:07:51,280 --> 00:07:54,080
And it's accepted by almost everyone.

122
00:07:54,080 --> 00:07:58,800
But the second I say like, oh, I was listening to Jordan Peterson the other day, it's, you

123
00:07:58,800 --> 00:08:02,040
know, we don't allow that here.

124
00:08:02,040 --> 00:08:07,640
So it's just, I think he's smart.

125
00:08:07,640 --> 00:08:13,720
He has his hang-ups, but it's sad that people don't have an open mind to hear him out.

126
00:08:13,720 --> 00:08:16,640
Well, of course, other than me, everybody's got their hang-ups.

127
00:08:16,640 --> 00:08:21,920
You know, I'm 100% perfectly objective in all manner of inquiry.

128
00:08:21,920 --> 00:08:24,720
But glad we're talking.

129
00:08:24,720 --> 00:08:25,720
Yeah.

130
00:08:25,720 --> 00:08:30,920
One of the things that you mentioned early in that chapter about the power of narratives

131
00:08:30,920 --> 00:08:32,880
is something called the availability bias.

132
00:08:32,880 --> 00:08:37,240
And I think people will agree that it is a very powerful thing once you explain to them

133
00:08:37,240 --> 00:08:38,240
what it is.

134
00:08:38,240 --> 00:08:39,240
Yeah.

135
00:08:39,240 --> 00:08:46,320
So whatever information comes to mind more readily seems to be more true to us.

136
00:08:46,320 --> 00:08:48,000
It's more believable.

137
00:08:48,000 --> 00:08:49,880
And that goes along with those hooks, right?

138
00:08:49,880 --> 00:08:55,320
Like if there's a narrative that just pops into my head, we kind of just assume that,

139
00:08:55,320 --> 00:08:57,080
well, it came to mind easily.

140
00:08:57,080 --> 00:08:59,280
So that must be what the truth is, right?

141
00:08:59,280 --> 00:09:02,640
If I asked you what's the best burger and the first thing that comes to mind is

142
00:09:02,640 --> 00:09:08,760
McDonald's or something, McDonald's worked hard for decades through their marketing,

143
00:09:08,760 --> 00:09:14,240
through their, you know, psyops, to make themselves the default hamburger.

144
00:09:14,240 --> 00:09:15,240
Right.

145
00:09:15,240 --> 00:09:20,940
And so this is true of so many different facets of our psychological lives: whatever

146
00:09:20,940 --> 00:09:25,040
comes to our mind the easiest tends to be what we think is true.

147
00:09:25,040 --> 00:09:30,040
And so everyone else is trying to, not everyone, but many people are trying to get in that

148
00:09:30,040 --> 00:09:34,960
space so that they are the first thing that comes to mind.

149
00:09:34,960 --> 00:09:41,040
And that availability bias, I think is one of the major drivers of conspiracy thinking.

150
00:09:41,040 --> 00:09:46,660
Because if you're trying to give a, you know, a very comprehensive account of what's going

151
00:09:46,660 --> 00:09:51,320
on in the world, you're going to get into a lot of very, very boring economics.

152
00:09:51,320 --> 00:09:57,080
And it's just, it's so much easier to bring to mind that there is a cabal of evil, possibly

153
00:09:57,080 --> 00:10:02,920
reptilian shape-shifting masterminds behind the scenes who have orchestrated this centuries-long

154
00:10:02,920 --> 00:10:04,240
plot to enslave humanity.

155
00:10:04,240 --> 00:10:09,240
I mean, it's just, it's so much easier to bring to mind than talk of, you know, reserve

156
00:10:09,240 --> 00:10:15,520
currencies and rates of return and, you know, issuance of bonds and bond yields and inverted

157
00:10:15,520 --> 00:10:16,520
yields.

158
00:10:16,520 --> 00:10:21,080
I mean, that's all very boring stuff, very technical and you know, it's the Jews, man.

159
00:10:21,080 --> 00:10:22,080
It's the Jews.

160
00:10:22,080 --> 00:10:23,080
It's easier.

161
00:10:23,080 --> 00:10:28,320
Well, and it's easier for us to wrap our heads around, I think this comes from our,

162
00:10:28,320 --> 00:10:30,640
the God-shaped hole in our soul, so to speak.

163
00:10:30,640 --> 00:10:35,880
But the idea that there's somebody at the helm, even if they're evil, there's something

164
00:10:35,880 --> 00:10:40,440
weirdly comforting that there's someone in control.

165
00:10:40,440 --> 00:10:45,580
It's a lot more anxiety-provoking for a lot of people to believe that actually no one's

166
00:10:45,580 --> 00:10:46,580
in control.

167
00:10:46,580 --> 00:10:53,920
Now, you talk a lot about narratives and you also talk about people's beliefs in the vocabulary

168
00:10:53,920 --> 00:10:54,920
of myth.

169
00:10:54,920 --> 00:11:00,320
And isn't reducing somebody's cherished beliefs about the world and morality and who they

170
00:11:00,320 --> 00:11:06,040
are and their place in the universe and calling it a myth, isn't that demeaning and insulting?

171
00:11:06,040 --> 00:11:09,640
I'd understand if anyone, you know, took it that way.

172
00:11:09,640 --> 00:11:10,640
I get it.

173
00:11:10,640 --> 00:11:12,400
You hold these things dear to your heart.

174
00:11:12,400 --> 00:11:19,240
And the word myth especially kind of connotes that it's not true.

175
00:11:19,240 --> 00:11:31,080
But I think, I don't know, I think of narrative and myth as a vehicle to spread information,

176
00:11:31,080 --> 00:11:33,800
and I think it's a beautiful thing.

177
00:11:33,800 --> 00:11:36,760
And it doesn't necessarily have to be untrue.

178
00:11:36,760 --> 00:11:42,400
You know, like I said, even the truth needs a good narrative for it to spread.

179
00:11:42,400 --> 00:11:48,520
I have young kids and Christmas time and Santa Claus, like, is it actually true?

180
00:11:48,520 --> 00:11:50,200
Of course not.

181
00:11:50,200 --> 00:12:00,640
But there is something magical and therefore truer at an emotional level perhaps, or an archetypal

182
00:12:00,640 --> 00:12:12,080
level about Santa that makes a child's eyes, you know, go wide and it makes them feel magical.

183
00:12:12,080 --> 00:12:13,360
That's a beautiful thing.

184
00:12:13,360 --> 00:12:16,520
So I don't see myth as necessarily untrue.

185
00:12:16,520 --> 00:12:21,600
I see it as something that is meaningful to someone.

186
00:12:21,600 --> 00:12:22,600
Right.

187
00:12:22,600 --> 00:12:26,520
And that can be, there's not necessarily a valence to that.

188
00:12:26,520 --> 00:12:29,880
It can be positive or negative.

189
00:12:29,880 --> 00:12:35,640
I think I popped the Santa Claus bubble too early with my oldest son.

190
00:12:35,640 --> 00:12:38,880
He just looked so crestfallen.

191
00:12:38,880 --> 00:12:40,520
And I asked him, do you want to believe in Santa Claus?

192
00:12:40,520 --> 00:12:41,520
And he said, yes.

193
00:12:41,520 --> 00:12:44,880
I said, okay, you believe in Santa Claus.

194
00:12:44,880 --> 00:12:46,920
It's kind of hard to take that back though, you know?

195
00:12:46,920 --> 00:12:47,920
It is.

196
00:12:47,920 --> 00:12:54,640
Well, I remember as a kid, I think I was like 12 or something and I'm the youngest.

197
00:12:54,640 --> 00:12:59,120
And so I remember I'm at dinner and my dad said something about Santa.

198
00:12:59,120 --> 00:13:02,040
My mom's like, and I thought, oh boy, here we go.

199
00:13:02,040 --> 00:13:05,280
And I was pretending to believe in Santa for years.

200
00:13:05,280 --> 00:13:07,000
Oh wow.

201
00:13:07,000 --> 00:13:11,560
But because I liked to, you know, and maybe it was me wanting to extend my childhood or

202
00:13:11,560 --> 00:13:14,520
whatever, but like it was fun.

203
00:13:14,520 --> 00:13:17,520
And Christmas was less fun after that, you know?

204
00:13:17,520 --> 00:13:18,760
And so there's something to be said about it.

205
00:13:18,760 --> 00:13:28,840
I think it was Bret Weinstein that talked about the idea of, say, poetically true, or, I forget

206
00:13:28,840 --> 00:13:29,840
the word he used.

207
00:13:29,840 --> 00:13:34,720
I used it in my book there, but mythically, like figuratively true things that aren't

208
00:13:34,720 --> 00:13:39,440
actually true, but they speak to a deeper truth, you know?

209
00:13:39,440 --> 00:13:44,160
And I don't want people to throw the baby out with the bath water, you know, with their

210
00:13:44,160 --> 00:13:47,520
cherished, even just like a religious belief.

211
00:13:47,520 --> 00:13:50,280
Maybe you come to a more secular point of view.

212
00:13:50,280 --> 00:13:56,480
There's something lost if you just throw away some myth that has been strengthening you

213
00:13:56,480 --> 00:13:58,920
your whole life, you know?

214
00:13:58,920 --> 00:14:05,880
And like I said, it doesn't have to be surface level factually true to play that role.

215
00:14:05,880 --> 00:14:06,880
Yeah.

216
00:14:06,880 --> 00:14:11,480
I live in a part of the country where a lot of people go to church and I think that the

217
00:14:11,480 --> 00:14:16,600
people who go to church typically have better life outcomes than the people who don't.

218
00:14:16,600 --> 00:14:18,760
And if I went to church, I would certainly know more people.

219
00:14:18,760 --> 00:14:22,880
I don't really know many people where I live, but you know, I'm not a literal believer in

220
00:14:22,880 --> 00:14:28,080
the resurrection, so it would just be really hard and kind of distasteful for me to go

221
00:14:28,080 --> 00:14:33,440
and sit there and smile and shake people's hands and whatnot and just either avoid the

222
00:14:33,440 --> 00:14:34,440
conversation.

223
00:14:34,440 --> 00:14:38,480
I certainly wouldn't profess a false belief, you know, a belief in something that I don't

224
00:14:38,480 --> 00:14:40,840
actually believe.

225
00:14:40,840 --> 00:14:44,720
But I could just go along to get along and it would be beneficial to me socially and

226
00:14:44,720 --> 00:14:47,040
I just can't really bring myself to do it.

227
00:14:47,040 --> 00:14:56,080
Yeah, well, there is this concept of costly, well, in the book I talk about costly signaling

228
00:14:56,080 --> 00:15:01,240
theory, but I'm using it in terms of a costly belief, right?

229
00:15:01,240 --> 00:15:10,760
Part of what I think makes religious communities and maybe even cults very cohesive and therefore

230
00:15:10,760 --> 00:15:21,920
beneficial to someone's psychology is the fact that they need you to buy into something

231
00:15:21,920 --> 00:15:24,440
that is hard to believe.

232
00:15:24,440 --> 00:15:31,840
You know, there's no religions around, you know, whether or not the sky is blue.

233
00:15:31,840 --> 00:15:40,720
It requires a leap of faith, because then it's more bonding to be like, well,

234
00:15:40,720 --> 00:15:46,600
we're all kind of believing this ridiculous thing and now we're in it together.

235
00:15:46,600 --> 00:15:52,480
But that's the part that binds us and that's the part that's beneficial to us.

236
00:15:52,480 --> 00:15:57,360
So I don't know, so many people have tried, I've tried.

237
00:15:57,360 --> 00:16:06,440
It's very hard to come up with a secular version of a religion that's equally binding and empowering,

238
00:16:06,440 --> 00:16:09,040
I think, right?

239
00:16:09,040 --> 00:16:11,240
Yeah, binding is an important word there.

240
00:16:11,240 --> 00:16:13,120
I mean, that's really what religion means.

241
00:16:13,120 --> 00:16:16,400
It means to yoke or to bind.

242
00:16:16,400 --> 00:16:24,200
Yeah, and I participated in an attempt to create a scientifically literate memetic religion

243
00:16:24,200 --> 00:16:28,880
back in the 90s and, you know, it was basically just an online discussion forum.

244
00:16:28,880 --> 00:16:30,280
And I met a lot of great people in there.

245
00:16:30,280 --> 00:16:33,640
I had a lot of great interactions, but it certainly was not a religious community.

246
00:16:33,640 --> 00:16:34,640
Yeah, yeah.

247
00:16:34,640 --> 00:16:40,600
And it's fine for what it is, but man, you ask people who are in religious communities

248
00:16:40,600 --> 00:16:45,160
and it's not for everyone, but it can be very powerful.

249
00:16:45,160 --> 00:16:52,000
And it's interesting that as much as we could say, no offense to the religious people listening,

250
00:16:52,000 --> 00:16:55,440
that religion could be in itself somewhat of a psyop, right?

251
00:16:55,440 --> 00:17:04,800
Like using, say, falsehood to bind people together and maybe get their money or whatever,

252
00:17:04,800 --> 00:17:08,880
you know, whatever the motives might be.

253
00:17:08,880 --> 00:17:16,320
It's interesting that those that have strong religious beliefs in certain contexts, certainly

254
00:17:16,320 --> 00:17:22,320
in like prisoner of war contexts, have been less likely to succumb to brainwashing and

255
00:17:22,320 --> 00:17:23,320
stuff like that.

256
00:17:23,320 --> 00:17:29,040
But it's something that strengthens someone. I mean, Carl Jung said

257
00:17:29,040 --> 00:17:31,920
that we all live a myth, right?

258
00:17:31,920 --> 00:17:32,920
We're all living out a myth.

259
00:17:32,920 --> 00:17:38,040
So the question is, what myth are you living out and how useful is your myth?

260
00:17:38,040 --> 00:17:42,800
You know, and there could be something said that certain religions especially just have

261
00:17:42,800 --> 00:17:50,160
a more effective myth than the myths that maybe you and I are living out.

262
00:17:50,160 --> 00:17:53,920
Who is the author of, oh, what was the book?

263
00:17:53,920 --> 00:17:56,120
I think it's Man's Search for Meaning?

264
00:17:56,120 --> 00:17:57,120
Viktor Frankl.

265
00:17:57,120 --> 00:17:58,120
Yes, yes.

266
00:17:58,120 --> 00:18:00,560
I read that in the 90s.

267
00:18:00,560 --> 00:18:06,000
And one thing that I really remember from that was that the people who were

268
00:18:06,000 --> 00:18:11,280
most likely to survive and endure in that, you know, the death camp environment were

269
00:18:11,280 --> 00:18:16,920
people who basically had a lot in their heads, a lot to draw on, people who were,

270
00:18:16,920 --> 00:18:20,720
you know, fascinated with chess or who had a lot of literary background or just, you

271
00:18:20,720 --> 00:18:26,360
know, something to delve into that was theirs that they had acquired through long study

272
00:18:26,360 --> 00:18:27,360
and interest.

273
00:18:27,360 --> 00:18:29,600
You know, it was something they could retreat into.

274
00:18:29,600 --> 00:18:37,000
Whereas if your meaning basically came from your environment and your social interactions,

275
00:18:37,000 --> 00:18:42,120
then when that is all taken from you, your ability to cope and resist is taken from you.

276
00:18:42,120 --> 00:18:43,120
Yeah.

277
00:18:43,120 --> 00:18:44,560
And he said it beautifully.

278
00:18:44,560 --> 00:18:47,920
I wish I had the passage that I'm thinking of committed to memory because it's worth

279
00:18:47,920 --> 00:18:48,920
committing to memory.

280
00:18:48,920 --> 00:18:49,920
Yeah.

281
00:18:49,920 --> 00:18:50,920
And I think I've read it.

282
00:18:50,920 --> 00:18:55,040
I've got it on my shelf behind me.

283
00:18:55,040 --> 00:18:59,960
But do you know who else said something similar? There's a book that I came across that

284
00:18:59,960 --> 00:19:04,680
I hadn't heard of; it was in the research for this book that I came across it.

285
00:19:04,680 --> 00:19:10,280
It was a book called Menticide, by Joost Meerloo, who has a somewhat similar story.

286
00:19:10,280 --> 00:19:18,400
He was also a Jewish psychiatrist living in Europe, and he was able to escape the Nazis.

287
00:19:18,400 --> 00:19:22,440
Some of his family did not.

288
00:19:22,440 --> 00:19:29,800
And so this kind of brainwashing and psychological torture,

289
00:19:29,800 --> 00:19:32,440
he called it menticide.

290
00:19:32,440 --> 00:19:33,760
But he said the same thing.

291
00:19:33,760 --> 00:19:39,480
He said those he saw that were able to withstand it, whether it was prisoners of

292
00:19:39,480 --> 00:19:48,120
war held by the Soviets, the Nazi Germans, or the Japanese, it tended to be

293
00:19:48,120 --> 00:19:55,240
people who had bonds, like who are very strongly connected to their family and had a hope that

294
00:19:55,240 --> 00:19:57,560
they would see them again, that kind of thing.

295
00:19:57,560 --> 00:20:02,240
And it tended to be religious people.

296
00:20:02,240 --> 00:20:07,720
It also tended to be rebels, people who were troublemakers.

297
00:20:07,720 --> 00:20:15,240
They were used to the discomfort, or maybe they even liked the idea of not conforming.

298
00:20:15,240 --> 00:20:17,200
Right.

299
00:20:17,200 --> 00:20:23,360
And they're the ones who were able to withstand psychological torture more than their peers.

300
00:20:23,360 --> 00:20:25,960
So yeah, there's something to it.

301
00:20:25,960 --> 00:20:32,640
Well, the menticide practiced by sadistic, you know, people who have taken prisoners

302
00:20:32,640 --> 00:20:39,040
and have total control over their physical environment and can inflict torture upon them

303
00:20:39,040 --> 00:20:40,800
without repercussion.

304
00:20:40,800 --> 00:20:46,020
One of the primary tools that they have is the need that most people have for approval.

305
00:20:46,020 --> 00:20:50,240
And if everything is taken away from you, you'll do what you can in order to gain some

306
00:20:50,240 --> 00:20:55,640
approval of some kind, even if it's just contempt from your captors.

307
00:20:55,640 --> 00:20:59,040
Whereas the troublemaker is somebody who even in their previous life, they didn't really

308
00:20:59,040 --> 00:21:00,140
need approval.

309
00:21:00,140 --> 00:21:04,400
They were willing to take the hit to stand up for what they thought was right, even if

310
00:21:04,400 --> 00:21:06,760
authority figures and people around them disagreed.

311
00:21:06,760 --> 00:21:07,760
Yeah.

312
00:21:07,760 --> 00:21:08,760
Yeah.

313
00:21:08,760 --> 00:21:09,760
Yeah.

314
00:21:09,760 --> 00:21:10,760
And I talk about that in the book as well.

315
00:21:10,760 --> 00:21:18,840
And I also reference Todd Kashdan, who wrote a book called something about insubordination.

316
00:21:18,840 --> 00:21:27,680
But the whole idea is being strategically insubordinate, you know,

317
00:21:27,680 --> 00:21:30,600
and being mindful, right?

318
00:21:30,600 --> 00:21:38,040
Not just willy-nilly, not being a contrarian just for the sake of being a contrarian, which

319
00:21:38,040 --> 00:21:40,960
I think is my bias these days.

320
00:21:40,960 --> 00:21:46,440
If there's a weird idea out there, if there's a weird candidate spouting conspiracy theories,

321
00:21:46,440 --> 00:21:49,400
I go, oh, that guy or that gal is bold.

322
00:21:49,400 --> 00:21:50,400
I like them.

323
00:21:50,400 --> 00:21:54,600
Whether they're telling the truth or not.

324
00:21:54,600 --> 00:22:00,880
But yeah, you want to be mindful in your insubordination.

325
00:22:00,880 --> 00:22:06,760
And that's key in defending yourself against a Psyop.

326
00:22:06,760 --> 00:22:11,960
Yeah, I have a different relationship with the conspiratorial mind.

327
00:22:11,960 --> 00:22:18,040
I fell into a, I would call it a psychological trap of doomerism for several years where

328
00:22:18,040 --> 00:22:23,400
I was paying attention only to the evidence that suggested that industrial civilization

329
00:22:23,400 --> 00:22:26,440
was on its way to imminent collapse.

330
00:22:26,440 --> 00:22:30,360
And I wasn't interested in other narratives, you know, so I was cherry picking.

331
00:22:30,360 --> 00:22:35,440
And then I came out of it and lost most of my audience in the process, because that's

332
00:22:35,440 --> 00:22:36,440
what they wanted.

333
00:22:36,440 --> 00:22:40,760
I was a good, you know, articulator of the narrative that industrial civilization is

334
00:22:40,760 --> 00:22:44,320
unsustainable and will soon be coming apart.

335
00:22:44,320 --> 00:22:50,560
And you know, when I said, you know, I see that now as a kind of a romantic fantasy,

336
00:22:50,560 --> 00:22:55,440
because the reality of social disintegration and, you know, collapse of industrial civilization

337
00:22:55,440 --> 00:22:58,440
is that the process takes longer than you'll be alive.

338
00:22:58,440 --> 00:23:02,040
And you're never going to get to live in that glorious zombie apocalypse where all you have

339
00:23:02,040 --> 00:23:07,080
to worry about is, you know, killing the zombies and raiding the convenience store and, you

340
00:23:07,080 --> 00:23:10,240
know, stockpiling your guns and whatnot.

341
00:23:10,240 --> 00:23:13,720
That would be like one of the better scenarios.

342
00:23:13,720 --> 00:23:17,200
The shittier scenario is that you're just going to keep going to your same shitty job

343
00:23:17,200 --> 00:23:20,240
for the rest of your life, but you're going to get paid less and less.

344
00:23:20,240 --> 00:23:21,240
Things are going to be more expensive.

345
00:23:21,240 --> 00:23:27,400
You're going to be deeper in debt and you're going to die in circumstances which are not

346
00:23:27,400 --> 00:23:33,960
apocalyptic, but, you know, considerably shittier than what you were looking forward to.

347
00:23:33,960 --> 00:23:36,060
And a lot of people just didn't want to hear it.

348
00:23:36,060 --> 00:23:39,880
And I hear from some of them who, you know, they feel the need to write to me from time

349
00:23:39,880 --> 00:23:44,120
to time to tell me that they don't take in my content anymore.

350
00:23:44,120 --> 00:23:49,320
And one of the things that I've said that really offended some of them was that the

351
00:23:49,320 --> 00:23:57,840
best propaganda is going to be about 90 percent truth, which means that the horrible, oppressive,

352
00:23:57,840 --> 00:24:03,600
lying corporate media is mostly telling the truth, which is offensive, you know, to the

353
00:24:03,600 --> 00:24:06,280
conspiratorial mind.

354
00:24:06,280 --> 00:24:12,880
But if you decide that everything the corporate media tells you is a lie, then, you know,

355
00:24:12,880 --> 00:24:18,480
you're going to just reflexively assume a 90 percent incorrect belief system.

356
00:24:18,480 --> 00:24:19,480
Yeah.

357
00:24:19,480 --> 00:24:20,480
Yeah.

358
00:24:20,480 --> 00:24:27,800
And there's a tightrope to walk, because, I mean, those of us who like

359
00:24:27,800 --> 00:24:33,560
being troublemakers and contrarians, I love listening to what's his name, Russell Brand,

360
00:24:33,560 --> 00:24:35,320
you know, it's so fun.

361
00:24:35,320 --> 00:24:36,760
It's so fun.

362
00:24:36,760 --> 00:24:40,680
But I have to make sure that I don't have such an open mind that my brain falls

363
00:24:40,680 --> 00:24:42,120
out.

364
00:24:42,120 --> 00:24:47,160
But on the other hand, like, it's pretty frustrating to be around people who just disregard

365
00:24:47,160 --> 00:24:52,280
everything someone like Russell Brand or Joe Rogan says just because they say a few crazy

366
00:24:52,280 --> 00:24:53,280
things, you know.

367
00:24:53,280 --> 00:24:57,520
And so you've got to have some humility there and say, like, OK, well, this looks

368
00:24:57,520 --> 00:25:01,000
interesting and I'm inclined to believe this.

369
00:25:01,000 --> 00:25:04,000
But let me just reflect on why I might be inclined to believe this.

370
00:25:04,000 --> 00:25:05,000
Right.

371
00:25:05,000 --> 00:25:10,760
Is there a confirmation bias or an availability bias that I'm prone to, such that

372
00:25:10,760 --> 00:25:15,640
this story just fits right on this hook that I've got, real easily?

373
00:25:15,640 --> 00:25:16,800
Right.

374
00:25:16,800 --> 00:25:21,880
And that doesn't necessarily mean it's not true, but it's just, OK, you know, I like

375
00:25:21,880 --> 00:25:24,920
Bobby Kennedy Jr. for this reason.

376
00:25:24,920 --> 00:25:33,040
I might need to just cool it when I get all excited about, you know, waving his flag because

377
00:25:33,040 --> 00:25:37,320
I might look the fool a year from now when it turns out a lot of stuff he said wasn't

378
00:25:37,320 --> 00:25:38,840
true or whatever.

379
00:25:38,840 --> 00:25:39,840
Right.

380
00:25:39,840 --> 00:25:43,400
So, you know, we want so badly to be the prophet.

381
00:25:43,400 --> 00:25:45,360
Like, I knew this was coming.

382
00:25:45,360 --> 00:25:47,360
I knew it.

383
00:25:47,360 --> 00:25:49,400
Want to be the first to the scene, so to speak.

384
00:25:49,400 --> 00:25:50,400
Right.

385
00:25:50,400 --> 00:25:53,120
In order to show that you're smart now, you have to make...

386
00:25:53,120 --> 00:25:54,120
Yeah.

387
00:25:54,120 --> 00:25:58,640
You have to be the first one or one of the early adopters of certain theories so that

388
00:25:58,640 --> 00:26:05,040
you can say, well, yeah, I knew that. I knew COVID was coming back in 2016.

389
00:26:05,040 --> 00:26:07,640
I was listening to Bill Gates and what he was saying or whatever.

390
00:26:07,640 --> 00:26:08,640
Right.

391
00:26:08,640 --> 00:26:12,240
We want so badly to be the first ahead of the curve in that way.

392
00:26:12,240 --> 00:26:15,480
And that can pull us towards some...

393
00:26:15,480 --> 00:26:20,480
I mean, we can get manipulated, I guess, is what I'm saying, if we're not careful.

394
00:26:20,480 --> 00:26:25,360
So on the topic of the conspiracy mindset and conspiracy theories, I am always careful

395
00:26:25,360 --> 00:26:29,080
to say people do conspire.

396
00:26:29,080 --> 00:26:30,800
Conspiracies exist.

397
00:26:30,800 --> 00:26:35,360
The U.S. government charges people with the crime of conspiracy every single day.

398
00:26:35,360 --> 00:26:39,920
So, you know, even the government agrees that there are conspiracies. When you get

399
00:26:39,920 --> 00:26:44,320
into conspiracy thinking, though, in the conspiracy mindset,

400
00:26:44,320 --> 00:26:49,280
what I think people who are knowledgeable are talking about is a belief system that

401
00:26:49,280 --> 00:26:55,200
is super ornate, that tries to explain as much as possible with as few, you know, as

402
00:26:55,200 --> 00:26:56,460
few facts as possible.

403
00:26:56,460 --> 00:26:59,280
And it's kind of the dark side of Occam's razor.

404
00:26:59,280 --> 00:27:04,860
You know, not only are you postulating fewer entities than other theories

405
00:27:04,860 --> 00:27:07,360
do, but you're positing too few.

406
00:27:07,360 --> 00:27:09,280
And so you get to the evil cabal.

407
00:27:09,280 --> 00:27:13,480
You know, whereas there are a lot of other causal factors at work that you're discounting

408
00:27:13,480 --> 00:27:17,920
because they're not colorful, they're boring, and they don't support the notion that we are,

409
00:27:17,920 --> 00:27:22,640
you know, governed by some malevolent superpower, which is all very Gnostic.

410
00:27:22,640 --> 00:27:29,040
You know, the Gnostics believe that the god that created the earth is an evil god,

411
00:27:29,040 --> 00:27:33,140
you know, and that that god torments us and that that god is also deluded in thinking

412
00:27:33,140 --> 00:27:37,960
that it is the ultimate god, that there is, you know, a god that is superior to that one.

413
00:27:37,960 --> 00:27:43,040
And, you know, it's also inflected with Manichaeism, just the notion that the

414
00:27:43,040 --> 00:27:49,640
universe is motivated, is animated by this battle between good and evil, which is

415
00:27:49,640 --> 00:27:50,980
easy to conceive of.

416
00:27:50,980 --> 00:27:56,040
It certainly rings that availability bias bell, but, you know, it discounts

417
00:27:56,040 --> 00:27:57,400
a lot, which is relevant.

418
00:27:57,400 --> 00:27:59,900
Yes, there's obviously conspiracies.

419
00:27:59,900 --> 00:28:08,460
But what I think happens more often than not is there will be kind of a sway in public

420
00:28:08,460 --> 00:28:12,480
opinion or just culture or whatever.

421
00:28:12,480 --> 00:28:18,240
And certain people are in the right position at the right time to then use that change,

422
00:28:18,240 --> 00:28:20,600
that shift to their advantage.

423
00:28:20,600 --> 00:28:22,880
Did they orchestrate that change?

424
00:28:22,880 --> 00:28:25,260
No, I'm not going to give them that credit.

425
00:28:25,260 --> 00:28:32,240
You know, like, take a big portion of, say, postmodern thinking. I think you could demonstrate

426
00:28:32,240 --> 00:28:38,560
that postmodernism has infiltrated, let's say or infected, you might even say, postsecondary

427
00:28:38,560 --> 00:28:43,560
life, like in almost every faculty, especially the humanities.

428
00:28:43,560 --> 00:28:47,280
It is the religion, if there is one, you know.

429
00:28:47,280 --> 00:28:52,520
But do I believe that, like, three people, say, you know, Michel Foucault and Jacques Derrida,

430
00:28:52,520 --> 00:28:55,400
were like, here's what we're going to do, and this is going to be the outcome.

431
00:28:55,400 --> 00:28:58,000
Now, they had ideas.

432
00:28:58,000 --> 00:29:04,640
They tried to express them in convincing ways, and they convinced some key people who then

433
00:29:04,640 --> 00:29:05,800
spread it.

434
00:29:05,800 --> 00:29:11,560
And now we have, you know, institutions that are based a large part on postmodern thinking,

435
00:29:11,560 --> 00:29:12,560
for better or worse, right.

436
00:29:12,560 --> 00:29:21,600
But it wasn't a conspiracy so much as it was a movement that just right time, right place

437
00:29:21,600 --> 00:29:23,040
did its thing.

438
00:29:23,040 --> 00:29:27,040
And there are certain people taking advantage of that movement.

439
00:29:27,040 --> 00:29:30,640
And that becomes the conspiracy, I guess, right.

440
00:29:30,640 --> 00:29:35,240
That's very similar to something from an interview that I did with a guy named Gwynne Dyer.

441
00:29:35,240 --> 00:29:37,640
I don't know if you've heard of him.

442
00:29:37,640 --> 00:29:38,640
Yeah.

443
00:29:38,640 --> 00:29:39,640
Okay.

444
00:29:39,640 --> 00:29:40,640
Yeah, so we were talking about 9-11.

445
00:29:40,640 --> 00:29:45,720
And you know, his line is that, look, yeah, lots of people jumped on 9-11 and used it

446
00:29:45,720 --> 00:29:49,900
for nefarious purposes and for empire building within the government and to expand their

447
00:29:49,900 --> 00:29:51,520
scope of influence.

448
00:29:51,520 --> 00:29:53,200
But that doesn't mean they orchestrated it.

449
00:29:53,200 --> 00:29:54,200
Right.

450
00:29:54,200 --> 00:29:58,120
It was an opportunity that presented itself and lots of people sprung to take advantage

451
00:29:58,120 --> 00:29:59,720
of that opportunity.

452
00:29:59,720 --> 00:30:01,000
I would say the same thing.

453
00:30:01,000 --> 00:30:05,840
I mean, my editor was like, cut this part out.

454
00:30:05,840 --> 00:30:10,880
But you could say the same thing about COVID, you know, like, I don't think... I mean, okay.

455
00:30:10,880 --> 00:30:17,440
I think, you know, I think the evidence is pretty strong that, yeah, COVID escaped from

456
00:30:17,440 --> 00:30:19,120
a lab.

457
00:30:19,120 --> 00:30:24,920
I don't believe the evidence is super strong right now that it was like meaningfully or

458
00:30:24,920 --> 00:30:32,440
like purposefully released, you know, as some kind of bioterror attack or anything.

459
00:30:32,440 --> 00:30:39,480
But if you look at the outcome of certain policies, I think there were, yeah, certain

460
00:30:39,480 --> 00:30:44,880
people who just benefited from COVID aid being sent in a certain way.

461
00:30:44,880 --> 00:30:49,560
And they probably had the ear of certain decision makers.

462
00:30:49,560 --> 00:30:53,160
And, you know, they took advantage of it.

463
00:30:53,160 --> 00:31:00,760
You know, a lot of wealthy people got even wealthier and they took advantage of a catastrophe.

464
00:31:00,760 --> 00:31:02,680
And that's what the conspiracy tends to look like.

465
00:31:02,680 --> 00:31:07,480
It's just disaster capitalism.

466
00:31:07,480 --> 00:31:09,080
Naomi Klein, was it Naomi Klein?

467
00:31:09,080 --> 00:31:10,520
Yeah, Naomi Klein called it.

468
00:31:10,520 --> 00:31:11,520
Right.

469
00:31:11,520 --> 00:31:17,160
And you don't plan it usually, although there is evidence of the Italians, what do they

470
00:31:17,160 --> 00:31:18,160
call it?

471
00:31:18,160 --> 00:31:19,160
Strategia della tensione, the strategy of tension.

472
00:31:19,160 --> 00:31:20,160
Right.

473
00:31:20,160 --> 00:31:25,240
It's like, I'm going to, I might not be orchestrating it, but I'm going to let it happen.

474
00:31:25,240 --> 00:31:32,800
I'm going to let chaos happen so that when I come in heavy fisted with my rules and my

475
00:31:32,800 --> 00:31:38,160
totalitarian policies and stuff, the people are going to be begging for it.

476
00:31:38,160 --> 00:31:43,320
You know, so right wing extremism, left wing extremism, go ahead, do your thing.

477
00:31:43,320 --> 00:31:47,760
We'll come up, we'll put an end to it eventually, but then we'll have all the power.

478
00:31:47,760 --> 00:31:48,760
I think that happens.

479
00:31:48,760 --> 00:31:54,080
Now, one thing I noticed about the whole COVID situation, and this is not a conspiracy theory,

480
00:31:54,080 --> 00:31:57,640
this is just an observation, you know, that I made about people's behavior.

481
00:31:57,640 --> 00:32:00,120
It's not, you know, particularly insightful.

482
00:32:00,120 --> 00:32:08,120
What was on display, to me, is that a lot of comfortable middle class, sort of middle of the road,

483
00:32:08,120 --> 00:32:14,200
Democratic liberals, who were not leftists by any stretch, not leftists, but liberal,

484
00:32:14,200 --> 00:32:15,680
you know.

485
00:32:15,680 --> 00:32:21,280
They have a strong authoritarian streak in them and they were delighted, delighted to

486
00:32:21,280 --> 00:32:26,120
be able to shame people publicly, you know, for not holding the right beliefs, for not

487
00:32:26,120 --> 00:32:30,200
practicing the right, you know, precautionary activities.

488
00:32:30,200 --> 00:32:36,600
Mask wearing in particular became totemic, you know, emblematic, way, way beyond its

489
00:32:36,600 --> 00:32:38,760
potential utility.

490
00:32:38,760 --> 00:32:46,160
And you know, there were just lots, lots of demands to use the state to force people of

491
00:32:46,160 --> 00:32:50,600
a different political persuasion to behave in ways that they didn't want to behave.

492
00:32:50,600 --> 00:32:52,480
And there was just a mania for it.

493
00:32:52,480 --> 00:32:58,280
And you know, so many of the people practicing it, I mean, some of them came to see it.

494
00:32:58,280 --> 00:33:00,040
They came to understand, oh my God, this is terrible.

495
00:33:00,040 --> 00:33:01,040
What am I doing?

496
00:33:01,040 --> 00:33:07,640
I know several people, you know, who used to be of the sort of, you know, far left.

497
00:33:07,640 --> 00:33:11,440
I hate to invoke the phrase, but it is very communicative, you know, social justice warrior

498
00:33:11,440 --> 00:33:17,080
types, who during the COVID response saw how the blue tribe was behaving and thought,

499
00:33:17,080 --> 00:33:20,360
oh my goodness, I can't be a part of this.

500
00:33:20,360 --> 00:33:25,560
Now, unfortunately, some of them went to the other extreme, which is not the adaptive response.

501
00:33:25,560 --> 00:33:28,840
But yeah, it was ugly.

502
00:33:28,840 --> 00:33:31,240
What was going on there from your perspective?

503
00:33:31,240 --> 00:33:36,400
Yeah, well, I think I think your observation is pretty spot on.

504
00:33:36,400 --> 00:33:41,200
You know, we all have a shadow.

505
00:33:41,200 --> 00:33:43,440
Carl Jung taught us that as well.

506
00:33:43,440 --> 00:33:47,320
And we have the capacity of doing evil, essentially.

507
00:33:47,320 --> 00:33:52,600
And some of us have an authoritarian streak that we weren't even aware of.

508
00:33:52,600 --> 00:33:59,280
And usually it comes out in situations of panic.

509
00:33:59,280 --> 00:34:10,560
And certainly, I mean, COVID was perfect because our sense of disgust, right, is very easily

510
00:34:10,560 --> 00:34:11,560
manipulated.

511
00:34:11,560 --> 00:34:12,560
Right.

512
00:34:12,560 --> 00:34:21,240
Disgust is an emotion that keeps us physically safe from toxins and something

513
00:34:21,240 --> 00:34:23,880
that will hurt us if we ingest it.

514
00:34:23,880 --> 00:34:26,360
We have a psychological equivalent as well.

515
00:34:26,360 --> 00:34:27,360
Right.

516
00:34:27,360 --> 00:34:35,340
And so when something like COVID comes along and now we're all hypervigilant about contagion,

517
00:34:35,340 --> 00:34:38,480
we get closed off in all sorts of ways.

518
00:34:38,480 --> 00:34:43,080
And our authoritarian tendencies come out in full force.

519
00:34:43,080 --> 00:34:44,080
Right.

520
00:34:44,080 --> 00:34:49,480
And we think not only does it come out, but we are righteous for doing so.

521
00:34:49,480 --> 00:34:56,440
Yeah, as we move out of COVID, I mean, anyway, it's not totally over

522
00:34:56,440 --> 00:34:58,840
for some people, but I don't know.

523
00:34:58,840 --> 00:35:01,200
It's some people's daily obsession still.

524
00:35:01,200 --> 00:35:02,200
Still.

525
00:35:02,200 --> 00:35:03,200
Yeah.

526
00:35:03,200 --> 00:35:04,200
On both sides.

527
00:35:04,200 --> 00:35:05,200
Right.

528
00:35:05,200 --> 00:35:06,200
Some people would like to just forget that.

529
00:35:06,200 --> 00:35:09,220
I think most people would rather just put it behind them.

530
00:35:09,220 --> 00:35:10,400
Some people and I get it.

531
00:35:10,400 --> 00:35:17,420
They're like, actually, no, that authoritarian streak that came out of many, many, many people,

532
00:35:17,420 --> 00:35:20,520
we have to make sure that doesn't happen again.

533
00:35:20,520 --> 00:35:23,640
Or if it does, it has to be very justified.

534
00:35:23,640 --> 00:35:26,640
You know, and I get that.

535
00:35:26,640 --> 00:35:31,080
It was an awakening for a lot of people to realize how quickly we can become authoritarian.

536
00:35:31,080 --> 00:35:37,720
Well, one of the chapters that you recommended that I read, I think had social justice in

537
00:35:37,720 --> 00:35:40,080
the title of the chapter.

538
00:35:40,080 --> 00:35:43,160
Is there anything that you want to say on that topic that we haven't touched on?

539
00:35:43,160 --> 00:35:51,240
I mean, Gad Saad, also a fellow Canadian psychologist, uses the term mind virus a lot in his book,

540
00:35:51,240 --> 00:35:53,680
The Parasitic Mind.

541
00:35:53,680 --> 00:35:58,840
And once again, for better or worse, like, I mean, I have my opinions. But there

542
00:35:58,840 --> 00:36:06,000
are obviously certain ideas that have infiltrated particularly the millennial and younger, you

543
00:36:06,000 --> 00:36:08,800
know, those generations.

544
00:36:08,800 --> 00:36:13,640
And there's pros and cons to all of this.

545
00:36:13,640 --> 00:36:20,520
But I would just stress that people recognize that like, it is very religious.

546
00:36:20,520 --> 00:36:31,640
And it's very much like the type of idea that possesses us rather than us possessing it.

547
00:36:31,640 --> 00:36:42,600
And especially ideologies that pull on our desire to do good, those ones actually

548
00:36:42,600 --> 00:36:45,480
can lead us to do pretty bad things as well.

549
00:36:45,480 --> 00:36:46,480
Right?

550
00:36:46,480 --> 00:36:51,120
I would just stress that people realize that whether it's a religion or whether it's social

551
00:36:51,120 --> 00:36:58,760
justice, ideologies or whatever, just realize that your desire to do good can be hijacked,

552
00:36:58,760 --> 00:37:04,880
can be manipulated, and out of your goodness and your empathy, some of the worst acts can

553
00:37:04,880 --> 00:37:10,720
come out, some of your darkest deeds might come out for the sake of the righteous.

554
00:37:10,720 --> 00:37:11,720
Right?

555
00:37:11,720 --> 00:37:20,000
So I think people need to step back and realize the water they're swimming in.

556
00:37:20,000 --> 00:37:23,800
In the book, you make mention of something called ideological possession.

557
00:37:23,800 --> 00:37:24,800
What is that?

558
00:37:24,800 --> 00:37:25,800
Yeah.

559
00:37:25,800 --> 00:37:31,720
Well, by possession, it's when a person is acting on behalf of the idea.

560
00:37:31,720 --> 00:37:40,160
We all like to think that we are independent actors, but there are some ideas, and I'll

561
00:37:40,160 --> 00:37:46,400
take what I just mentioned, like religions especially, perhaps like a social justice

562
00:37:46,400 --> 00:37:52,480
ideology, maybe even a political ideology, let's say MAGA or something like that.

563
00:37:52,480 --> 00:38:00,140
These ideas have a strong gravitational pull, and sometimes we think we're the independent

564
00:38:00,140 --> 00:38:05,640
actor just choosing to do X, Y, and Z and to believe X, Y, and Z, but what can happen

565
00:38:05,640 --> 00:38:09,680
is the idea is pulling us more than we recognize.

566
00:38:09,680 --> 00:38:10,680
Right?

567
00:38:10,680 --> 00:38:12,040
It's a lot of black and white thinking.

568
00:38:12,040 --> 00:38:13,040
It's a lot of...

569
00:38:13,040 --> 00:38:21,240
You feel like you have an allegiance to that idea, like you have to be true to it and honor

570
00:38:21,240 --> 00:38:28,520
it, and there's a lot of psychological benefits to it, but the downside is, of course, you're

571
00:38:28,520 --> 00:38:33,520
much more easily manipulated, and you might, once again, you might end up doing and saying

572
00:38:33,520 --> 00:38:39,400
things that, once you shake your head clear of your doomerism or whatever, you start going, oh,

573
00:38:39,400 --> 00:38:42,760
gee, how did I go from A to B?

574
00:38:42,760 --> 00:38:47,780
How could I have been so blinded or whatever, right?

575
00:38:47,780 --> 00:38:52,520
It's because the idea has you in its grips, right?

576
00:38:52,520 --> 00:38:57,840
And it needs you in order to proliferate into the other minds, so to speak.

577
00:38:57,840 --> 00:39:04,640
And it may have convinced you to label yourself, to say, I am a blank, and a blank is somebody

578
00:39:04,640 --> 00:39:11,680
who believes X, therefore, to protect my own identity, I cannot even question or entertain

579
00:39:11,680 --> 00:39:17,320
any really thoroughgoing examination of X, because that's my identity, man.

580
00:39:17,320 --> 00:39:18,600
Yeah, yeah, yeah.

581
00:39:18,600 --> 00:39:26,200
Well, and once again, it speaks to just, like, we like to have an identity.

582
00:39:26,200 --> 00:39:35,520
I mean, in our secular world, there's such a lack of meaning, and we seek, we crave

583
00:39:35,520 --> 00:39:39,800
it, we need it. We need meaning, we need context to

584
00:39:39,800 --> 00:39:41,200
our lives.

585
00:39:41,200 --> 00:39:48,720
And I think as we become more secular, we become more divided, more online, we're more

586
00:39:48,720 --> 00:39:56,040
and more susceptible to certain identity markers.

587
00:39:56,040 --> 00:40:00,960
And that's, once again, that's a way that we can get manipulated.

588
00:40:00,960 --> 00:40:06,260
And yeah, I'm always cautious of someone who feels like they have to preface their beliefs

589
00:40:06,260 --> 00:40:11,800
with an identity statement, like as a blank.

590
00:40:11,800 --> 00:40:15,640
Or in the negative, I'm not a blank, but.

591
00:40:15,640 --> 00:40:18,200
Right, right.

592
00:40:18,200 --> 00:40:23,160
You know, I'm not a socialist, but.

593
00:40:23,160 --> 00:40:24,160
Yeah.

594
00:40:24,160 --> 00:40:25,160
Yeah.

595
00:40:25,160 --> 00:40:29,680
Because any of these isms, they can't be all bad.

596
00:40:29,680 --> 00:40:32,840
Otherwise, they would not be convincing at all.

597
00:40:32,840 --> 00:40:37,960
I mean, we are stupid, but we're not that stupid.

598
00:40:37,960 --> 00:40:43,760
I think that people who are seriously invested in what turn out to be wrong beliefs tend

599
00:40:43,760 --> 00:40:48,880
to be more intelligent than the norm because they have the intelligence to defend their

600
00:40:48,880 --> 00:40:54,480
ideology from, you know, obvious examples that would refute it, you know, if it weren't

601
00:40:54,480 --> 00:40:59,440
for their clever sort of special pleading for their, you know, their desired outcome.

602
00:40:59,440 --> 00:41:00,840
Yeah, there's evidence for that.

603
00:41:00,840 --> 00:41:02,400
I cite that in the book as well.

604
00:41:02,400 --> 00:41:03,400
It's...

605
00:41:03,400 --> 00:41:11,080
I used to think, say in my 20s, I used to look really down on people who didn't

606
00:41:11,080 --> 00:41:15,160
read the news, follow the news.

607
00:41:15,160 --> 00:41:17,120
Of course, I cheated.

608
00:41:17,120 --> 00:41:23,000
I got mine from late night comedy shows, Jon Stewart and Stephen Colbert.

609
00:41:23,000 --> 00:41:26,080
And I used to look down on people who didn't have at least, you know, who didn't have a

610
00:41:26,080 --> 00:41:29,520
good understanding of what was going on in the world.

611
00:41:29,520 --> 00:41:36,840
Until I realized that as, you know, well read as I thought I was, I was just as dumb

612
00:41:36,840 --> 00:41:38,720
as them.

613
00:41:38,720 --> 00:41:44,120
Except I was wrong in ways that I shouldn't have been wrong, because I was

614
00:41:44,120 --> 00:41:47,380
supposedly the one who knew what he was talking about.

615
00:41:47,380 --> 00:41:48,380
And they were happier.

616
00:41:48,380 --> 00:41:49,380
Yeah.

617
00:41:49,380 --> 00:41:51,380
Like, you know what?

618
00:41:51,380 --> 00:41:53,500
I'm not going to read the news every day.

619
00:41:53,500 --> 00:41:57,160
If something is really important, I'll find out about it.

620
00:41:57,160 --> 00:42:05,000
But if I'm obsessed about it, reading every little thing, I'm probably going to get manipulated.

621
00:42:05,000 --> 00:42:09,880
Nassim Taleb wrote a book called Fooled by Randomness.

622
00:42:09,880 --> 00:42:13,760
And he's a risk analyst.

623
00:42:13,760 --> 00:42:19,920
You know, he's one of the most important people in the financial

624
00:42:19,920 --> 00:42:20,920
industry.

625
00:42:20,920 --> 00:42:24,120
Well, I don't know about most important, but influential.

626
00:42:24,120 --> 00:42:25,920
Most important popular commentator on risk.

627
00:42:25,920 --> 00:42:27,400
How about that?

628
00:42:27,400 --> 00:42:28,400
Yeah.

629
00:42:28,400 --> 00:42:31,240
But he goes like, I don't want the news.

630
00:42:31,240 --> 00:42:34,920
You know, most people who have money in the stock markets, they're watching the news obsessively

631
00:42:34,920 --> 00:42:37,520
because they're trying to plan their next move.

632
00:42:37,520 --> 00:42:38,520
Right.

633
00:42:38,520 --> 00:42:45,040
According to current events, he's like, no, no, don't read the news.

634
00:42:45,040 --> 00:42:53,400
I want to move on to our final topic because it's one that's currently big in my mind.

635
00:42:53,400 --> 00:43:00,520
I have just recently become aware of John Vervaeke because he's commenting on the

636
00:43:00,520 --> 00:43:04,520
emergence of seemingly stronger and stronger artificial intelligence.

637
00:43:04,520 --> 00:43:09,560
And he has a vocabulary that he acquired, you know, over a long period.

638
00:43:09,560 --> 00:43:13,960
It's not really in response to AI, but just talking about consciousness and what you and

639
00:43:13,960 --> 00:43:16,320
he both call the meaning crisis.

640
00:43:16,320 --> 00:43:20,360
So I've been taking in a lot of Vervaeke's content recently, and I was pretty pleased

641
00:43:20,360 --> 00:43:24,560
to encounter his ideas in your book.

642
00:43:24,560 --> 00:43:29,480
And also, I have loved zombie movies since the 80s.

643
00:43:29,480 --> 00:43:32,800
I used to do a podcast about zombie movies.

644
00:43:32,800 --> 00:43:37,880
You know, if you've read World War Z by Max Brooks and the Colson Whitehead zombie novel Zone One,

645
00:43:37,880 --> 00:43:39,000
I mean, that's respectable.

646
00:43:39,000 --> 00:43:40,000
You can do that.

647
00:43:40,000 --> 00:43:43,040
But if you've read more than like four zombie novels, you're a weirdo.

648
00:43:43,040 --> 00:43:45,200
And I've read way more than four.

649
00:43:45,200 --> 00:43:53,520
So Vervaeke's got a book about zombies and their position in the zeitgeist.

650
00:43:53,520 --> 00:43:59,280
So how does that coincide with your work on, you know, defense against psyops?

651
00:43:59,280 --> 00:44:00,280
Yeah.

652
00:44:00,280 --> 00:44:09,800
Well, so in summary, his and Chris Mastropietro's idea on zombies, and the

653
00:44:09,800 --> 00:44:15,720
reason why the symbol of the zombie is so compelling to us, is that the zombie is

654
00:44:15,720 --> 00:44:20,360
the human-like creature that's walking around with no meaning.

655
00:44:20,360 --> 00:44:21,360
Right.

657
00:44:22,360 --> 00:44:24,000
And what does the zombie want to eat?

658
00:44:24,000 --> 00:44:25,000
Brains.

659
00:44:25,000 --> 00:44:26,000
Right.

660
00:44:26,000 --> 00:44:27,000
Because that's where the meaning is.

661
00:44:27,000 --> 00:44:28,000
Well, it wants to eat us.

662
00:44:28,000 --> 00:44:29,000
It wants to eat flesh.

663
00:44:29,000 --> 00:44:30,000
Sure.

664
00:44:30,000 --> 00:44:35,360
Brains came from a comedic take on the zombie apocalypse, Return of the Living Dead.

665
00:44:35,360 --> 00:44:36,360
Sure.

666
00:44:36,360 --> 00:44:38,160
I'm speaking to a zombie expert here, but.

667
00:44:38,160 --> 00:44:40,160
I'll shut up.

668
00:44:40,160 --> 00:44:46,040
No, but you know, in a lot of the mythology, if you ask a random person, like, what do

669
00:44:46,040 --> 00:44:47,040
zombies eat?

670
00:44:47,040 --> 00:44:48,040
They'll say brains.

671
00:44:48,040 --> 00:44:49,040
Right.

672
00:44:49,040 --> 00:44:53,680
That's, you know, and they wander in herds usually. But we say herds.

673
00:44:53,680 --> 00:44:55,680
They don't have a community.

674
00:44:55,680 --> 00:44:56,680
Right.

675
00:44:56,680 --> 00:44:58,640
They're meaningless.

676
00:44:58,640 --> 00:45:07,840
And that is speaking to the fear, the existential fear that we have, that we are exactly that.

677
00:45:07,840 --> 00:45:16,200
You know, and if we grab our phones and we're just staring into the screen,

678
00:45:16,200 --> 00:45:18,560
into the abyss, you know, all day long.

679
00:45:18,560 --> 00:45:20,440
Yeah, exactly.

680
00:45:20,440 --> 00:45:22,400
And we're becoming those zombies.

681
00:45:22,400 --> 00:45:26,660
And so the idea is that it's a modern mythology.

682
00:45:26,660 --> 00:45:28,240
It's a modern monster.

683
00:45:28,240 --> 00:45:29,240
Right.

684
00:45:29,240 --> 00:45:30,240
It's unlikely.

685
00:45:30,240 --> 00:45:35,000
There might have been something similar, but to the detail that we have, it would have

686
00:45:35,000 --> 00:45:44,440
been unlikely that zombies would have become a popular myth in a different era.

687
00:45:44,440 --> 00:45:45,440
Right.

688
00:45:45,440 --> 00:45:52,440
It's in modernity where we're losing our sense of meaning that zombies become a popular

689
00:45:52,440 --> 00:45:53,440
myth.

690
00:45:53,440 --> 00:45:58,760
I can talk about zombies for a long time, so I'm going to refrain, other than to bring

691
00:45:58,760 --> 00:46:02,920
up the concept of the P zombie or the philosophical zombie.

692
00:46:02,920 --> 00:46:07,000
My academic background is in the philosophy of science and the philosophy of mind.

693
00:46:07,000 --> 00:46:13,660
And the P zombie is a philosophy-of-mind thought experiment that posits the

694
00:46:13,660 --> 00:46:19,800
existence of entities which are behaviorally identical to humans, but have no interiority,

695
00:46:19,800 --> 00:46:21,280
no subjectivity.

696
00:46:21,280 --> 00:46:25,960
It's not like anything to be a P zombie, even though they might be eloquent, you know, in

697
00:46:25,960 --> 00:46:29,640
philosophical discourse, or they might be good dancers or whatever.

698
00:46:29,640 --> 00:46:33,760
They're like us on the outside and they're like us behaviorally, but there's nothing

699
00:46:33,760 --> 00:46:36,160
inside.

700
00:46:36,160 --> 00:46:41,840
And these large language models and the chatbots that they power, you know, like GPT-4, they

701
00:46:41,840 --> 00:46:44,520
are very convincing linguistically.

702
00:46:44,520 --> 00:46:47,520
Like when you're talking to these things, it really seems like there's another mind

703
00:46:47,520 --> 00:46:50,760
at the other end, but they'll tell you, no, I have no subjective experience.

704
00:46:50,760 --> 00:46:53,040
I'm a large language model.

705
00:46:53,040 --> 00:46:57,120
I'm just making statistical correlations between words and phrases and spitting out output,

706
00:46:57,120 --> 00:47:02,640
which seems congruent to you, the author of the input, but there is no interiority.

707
00:47:02,640 --> 00:47:04,720
There is no subjectivity.

708
00:47:04,720 --> 00:47:09,320
And they are, in a sense, philosophical zombies.

709
00:47:09,320 --> 00:47:14,560
Yeah, I wasn't familiar with that term, but that's interesting.

710
00:47:14,560 --> 00:47:21,200
And I think how it applies to the psych wars and psychological self-defense is, A, it's

711
00:47:21,200 --> 00:47:23,200
a big vulnerability.

712
00:47:23,200 --> 00:47:32,400
So there are corporations and ideologies and governments, politicians, whatever, that can

713
00:47:32,400 --> 00:47:38,600
capitalize on this in order to further manipulate us and control us.

714
00:47:38,600 --> 00:47:43,440
But on the other side of it as well, as we spoke of earlier, it's

715
00:47:43,440 --> 00:47:49,600
those of us, and I hope to include myself in this group, who are really working towards

716
00:47:49,600 --> 00:47:56,960
finding more meaning in our lives, finding the antidote to that crisis. It's hopefully

717
00:47:56,960 --> 00:48:04,760
people like us, seekers of truth, seekers of meaning, who will be able to withstand

718
00:48:04,760 --> 00:48:10,200
the barrage of psychological warfare that's hitting us now.

719
00:48:10,200 --> 00:48:14,800
And it's going to get even worse. Sorting out what's fact from fiction, what's

720
00:48:14,800 --> 00:48:22,640
useful and what isn't, what's meant to manipulate us and what's just objective truth,

721
00:48:22,640 --> 00:48:26,440
it's getting harder and harder.

722
00:48:26,440 --> 00:48:29,760
Something that John Vervaeke talks about a lot, and you do mention it in the book, is

723
00:48:29,760 --> 00:48:32,160
the so-called meaning crisis.

724
00:48:32,160 --> 00:48:33,160
What is that?

725
00:48:33,160 --> 00:48:42,760
Yeah, well, it's just the sense that, collectively, I mean, if you want to reference

726
00:48:42,760 --> 00:48:50,680
back to postmodernism, we've rejected universal overarching narratives because on the surface

727
00:48:50,680 --> 00:48:54,160
they weren't true.

728
00:48:54,160 --> 00:49:02,800
So God is dead, so to speak, so we've rejected religion largely.

729
00:49:02,800 --> 00:49:09,320
We're rejecting, I mean, you could call it nationalism, but patriotism.

730
00:49:09,320 --> 00:49:13,160
We're rejecting that narrative here in Canada.

731
00:49:13,160 --> 00:49:18,120
There's some major cities, I mean, they say it's for environmental reasons, but

732
00:49:18,120 --> 00:49:19,520
I know there's political reasons too.

733
00:49:19,520 --> 00:49:23,840
They're not going to have fireworks on Canada Day.

734
00:49:23,840 --> 00:49:29,400
They don't want to celebrate Canada Day, right?

735
00:49:29,400 --> 00:49:31,600
Too politically incorrect to celebrate your country.

736
00:49:31,600 --> 00:49:35,960
So there's a lot of overarching narratives that gave us a sense of meaning, a sense of

737
00:49:35,960 --> 00:49:41,440
who we are, and those narratives have fallen apart in postmodernity, you could say.

738
00:49:41,440 --> 00:49:47,120
And to say they've fallen apart makes it sound like a passive accident that was not orchestrated

739
00:49:47,120 --> 00:49:48,120
by anybody.

740
00:49:48,120 --> 00:49:49,120
Right, sure.

741
00:49:49,120 --> 00:49:50,120
Yeah, yeah.

742
00:49:50,120 --> 00:49:51,120
Good point.

743
00:49:51,120 --> 00:49:53,120
They've been attacked.

744
00:49:53,120 --> 00:49:56,320
Systematically dismantled and discredited and demonized.

745
00:49:56,320 --> 00:49:57,320
Totally.

746
00:49:57,320 --> 00:50:03,000
And so in the absence of that, it's hard for us, because the idea is that, independently,

747
00:50:03,000 --> 00:50:05,600
I'm going to just be myself.

748
00:50:05,600 --> 00:50:07,680
I'll get my meaning from myself.

749
00:50:07,680 --> 00:50:10,000
But that's very shallow.

750
00:50:10,000 --> 00:50:12,440
It's just not how we're set up.

751
00:50:12,440 --> 00:50:16,760
And so it's difficult to find. Like I said, everyone has a myth.

752
00:50:16,760 --> 00:50:21,680
So you've got to find the myth that works for you, that helps you feel bound to something

753
00:50:21,680 --> 00:50:25,320
bigger than yourself that you're a part of.

754
00:50:25,320 --> 00:50:27,840
Religions are good at that, but of course they have their downsides.

755
00:50:27,840 --> 00:50:32,600
And as you said, you don't want to be false and just go and pretend to believe just to

756
00:50:32,600 --> 00:50:36,680
benefit from the potlucks.

757
00:50:36,680 --> 00:50:41,920
But yeah, you want to find the narrative that really works for you.

758
00:50:41,920 --> 00:50:45,800
Otherwise, you're very susceptible to psychological attack.

759
00:50:45,800 --> 00:50:50,400
Well, you have a lot of defense mechanisms described in the book.

760
00:50:50,400 --> 00:50:55,840
We don't have time to do an exhaustive examination of each of them, but what are a couple?

761
00:50:55,840 --> 00:50:56,840
Yeah.

762
00:50:56,840 --> 00:51:01,720
And I just want to clarify, because when you say defense, and I realize I did a bad job

763
00:51:01,720 --> 00:51:06,680
with this in the book, but when we say defense mechanism, it stirs up Freudian psychoanalysis

764
00:51:06,680 --> 00:51:08,400
and that's a whole other thing.

765
00:51:08,400 --> 00:51:11,840
So I call them defense strategies.

766
00:51:11,840 --> 00:51:15,680
But yeah, you can't go it alone.

767
00:51:15,680 --> 00:51:17,400
So you need a herd.

768
00:51:17,400 --> 00:51:19,600
No, you need a community.

769
00:51:19,600 --> 00:51:25,600
You need to feel bound to at least a couple of people, because there's strength in numbers.

770
00:51:25,600 --> 00:51:31,560
You want to kind of embrace the idea of being a troublemaker, and be okay

771
00:51:31,560 --> 00:51:38,920
having some experience going against the grain and maybe being rejected for your ideas.

772
00:51:38,920 --> 00:51:40,440
But then have some pride in that.

773
00:51:40,440 --> 00:51:44,080
And I think as a society, we should elevate our troublemakers.

774
00:51:44,080 --> 00:51:46,800
We usually don't elevate them until they're dead.

775
00:51:46,800 --> 00:51:49,400
We go like, oh, wow, they were a really bold thinker.

776
00:51:49,400 --> 00:51:53,360
Well, you didn't listen to them when they were alive.

777
00:51:53,360 --> 00:51:57,280
Now that they're not a threat to us, we can now say that Gandhi and Martin Luther King

778
00:51:57,280 --> 00:51:58,280
were heroes.

779
00:51:58,280 --> 00:52:02,960
You know, they were very contested at the time.

780
00:52:02,960 --> 00:52:08,080
So yeah, be insubordinate at the right times.

781
00:52:08,080 --> 00:52:09,080
Be humble.

782
00:52:09,080 --> 00:52:14,020
Realize that however you see the world now is the best you've got right now, but you're

783
00:52:14,020 --> 00:52:16,520
probably wrong.

784
00:52:16,520 --> 00:52:20,000
And it's okay to be wrong.

785
00:52:20,000 --> 00:52:24,400
But I found a lot of help in Stoicism too.

786
00:52:24,400 --> 00:52:29,360
Finding yourself and then being able to find meaning where you're more likely to find it

787
00:52:29,360 --> 00:52:31,320
in healthy ways.

788
00:52:31,320 --> 00:52:36,560
But there's 13 different strategies strewn throughout the book, each placed to fit

789
00:52:36,560 --> 00:52:38,520
its context.

790
00:52:38,520 --> 00:52:44,680
And I'm certain that it's not an exhaustive list, but hopefully a good start for those

791
00:52:44,680 --> 00:52:45,680
who are interested.

792
00:52:45,680 --> 00:52:49,280
Well, I don't know if you know this or if you want to share the information, but your

793
00:52:49,280 --> 00:52:53,360
book is available for free for people who are subscribed to Kindle Unlimited.

794
00:52:53,360 --> 00:52:55,480
Yes, yeah, I signed up for that.

795
00:52:55,480 --> 00:52:58,760
I don't think it'll always be the case, but absolutely.

796
00:52:58,760 --> 00:53:03,080
Yeah, I think you sent me an EPUB version and I just don't have an EPUB reader that

797
00:53:03,080 --> 00:53:04,080
I like.

798
00:53:04,080 --> 00:53:07,480
So, you know, I am on Kindle Unlimited, so I read the Kindle version.

799
00:53:07,480 --> 00:53:08,480
Perfect.

800
00:53:08,480 --> 00:53:09,480
Yeah.

801
00:53:09,480 --> 00:53:10,480
Yeah.

802
00:53:10,480 --> 00:53:17,240
I should also say that for people who subscribe to Audible, the audiobook of Gad Saad's

803
00:53:17,240 --> 00:53:21,200
book, The Parasitic Mind, is also free.

804
00:53:21,200 --> 00:53:23,600
It comes with your Audible membership.

805
00:53:23,600 --> 00:53:24,600
Oh, cool.

806
00:53:24,600 --> 00:53:25,600
That's very good.

807
00:53:25,600 --> 00:53:26,600
Definitely worth listening to.

808
00:53:26,600 --> 00:53:31,760
Speaking of, I'm working on narrating my own book and so I'm hoping by the end of the

809
00:53:31,760 --> 00:53:35,280
month, the audiobook should be available as well.

810
00:53:35,280 --> 00:53:36,280
Excellent.

811
00:53:36,280 --> 00:53:37,280
All right.

812
00:53:37,280 --> 00:53:39,640
Well, Kaleb Gorman, so-called.

813
00:53:39,640 --> 00:53:40,640
Thank you so much.

814
00:53:40,640 --> 00:53:41,640
Okay.

815
00:53:41,640 --> 00:53:44,640
Thank you, KMO.

816
00:53:44,640 --> 00:53:48,760
All right.

817
00:53:48,760 --> 00:53:50,680
That was Kaleb Gorman.

818
00:53:50,680 --> 00:53:55,600
If you want to know his real name, I interviewed him for the C-Realm podcast.

819
00:53:55,600 --> 00:53:58,680
It's one of the last episodes of the free C-Realm podcast.

820
00:53:58,680 --> 00:54:03,840
So if you go to c-realm.com and you review the recent episodes and read the descriptions,

821
00:54:03,840 --> 00:54:08,400
it won't be too hard to figure out who Kaleb Gorman really is.

822
00:54:08,400 --> 00:54:13,120
By the way, if you're doing a web search, Kaleb is spelled with a K. And when we first

823
00:54:13,120 --> 00:54:19,360
got started, I told him that my primary association with the name Caleb was the protagonist in

824
00:54:19,360 --> 00:54:20,720
the movie Ex Machina.

825
00:54:20,720 --> 00:54:23,880
Now that might've been Caleb with a C. I don't know.

826
00:54:23,880 --> 00:54:28,600
But then Gorman, my primary association with the name Gorman is the low budget sort of

827
00:54:28,600 --> 00:54:32,980
sci-fi horror black comedy Psycho Goreman.

828
00:54:32,980 --> 00:54:39,040
If you're not familiar with it and you have any love whatsoever for practical creature

829
00:54:39,040 --> 00:54:45,240
effects like prosthetics and makeup and what's the word I'm looking for?

830
00:54:45,240 --> 00:54:46,240
Animatronics. Check it out.

831
00:54:46,240 --> 00:54:47,480
Very funny.

832
00:54:47,480 --> 00:54:53,960
And if you know who Rich Evans is, his voice is featured in the film.

833
00:54:53,960 --> 00:54:58,520
He is one of the monster characters, although I don't think he's actually in the costume.

834
00:54:58,520 --> 00:55:03,800
And he does say his signature line as the character.

835
00:55:03,800 --> 00:55:06,240
A fun film.

836
00:55:06,240 --> 00:55:08,680
So what to say here at the end?

837
00:55:08,680 --> 00:55:14,140
I will say that I could easily have talked to Kaleb for longer on these same topics,

838
00:55:14,140 --> 00:55:18,500
but he is a practicing clinical therapist.

839
00:55:18,500 --> 00:55:23,600
Maybe that's not the exact title, but he does talk to people to help them feel better.

840
00:55:23,600 --> 00:55:29,600
And he has a license and a degree and all the educational attainment required to do

841
00:55:29,600 --> 00:55:31,620
that professionally.

842
00:55:31,620 --> 00:55:36,640
And his next client was in the parking lot, walking toward the building, and Kaleb could

843
00:55:36,640 --> 00:55:37,640
see him through the window.

844
00:55:37,640 --> 00:55:42,920
So there was no possibility of recording more for the C-Realm Vault podcast.

845
00:55:42,920 --> 00:55:49,040
It's ironic, I guess, or maybe a little bit weird, that the C-Realm podcast is no longer

846
00:55:49,040 --> 00:55:53,440
in production, but the C-Realm Vault podcast, which is like, you know, the extras for the

847
00:55:53,440 --> 00:55:57,320
paid customers, is still in weekly production.

848
00:55:57,320 --> 00:56:02,760
Or when I'm firing on all cylinders, there's a new one every week.

849
00:56:02,760 --> 00:56:10,760
So on the topics of social control, media manipulation, narrative manipulation, first

850
00:56:10,760 --> 00:56:13,960
I want to say something about describing things as myths.

851
00:56:13,960 --> 00:56:20,520
I mean, I travel in different circles intellectually, and I'm quite at home with the rationalist,

852
00:56:20,520 --> 00:56:24,920
materialist, scientific set, many of whom are actively anti-religious.

853
00:56:24,920 --> 00:56:32,920
And for such people, the word myth is synonymous with falsehood or, you know, fake story, something

854
00:56:32,920 --> 00:56:38,580
which has no value, something which is always, by definition, actively harmful to the person

855
00:56:38,580 --> 00:56:39,920
who entertains it.

856
00:56:39,920 --> 00:56:46,360
I do not subscribe to that definition of the word myth in my own personal lexicon.

857
00:56:46,360 --> 00:56:54,040
For me, I think it was John Michael Greer who provided the very short definition for

858
00:56:54,040 --> 00:56:55,040
the word myth.

859
00:56:55,040 --> 00:56:58,000
It just means an important story.

860
00:56:58,000 --> 00:56:59,000
Just that.

861
00:56:59,000 --> 00:57:01,680
You know, you don't have to dress it up with anything more.

862
00:57:01,680 --> 00:57:08,380
A myth is a story which is important to some people, possibly a lot of people, and myths

863
00:57:08,380 --> 00:57:10,120
take many forms.

864
00:57:10,120 --> 00:57:18,080
Now John Michael Greer does use the word myth derisively in the context of the myth of progress.

865
00:57:18,080 --> 00:57:26,120
But I think he probably does take the position that progress is a fake story, a story which

866
00:57:26,120 --> 00:57:28,960
is told to people to manipulate them.

867
00:57:28,960 --> 00:57:33,620
And manipulate just means to handle skillfully, basically.

868
00:57:33,620 --> 00:57:39,120
Not all manipulation is malevolent, which anybody who regularly goes to a chiropractor

869
00:57:39,120 --> 00:57:41,880
or a massage therapist can tell you.

870
00:57:41,880 --> 00:57:47,880
And in fact, you know, psychotherapy is manipulation with consent.

871
00:57:47,880 --> 00:57:50,280
Or that's one potential definition of it, anyway.

872
00:57:50,280 --> 00:57:56,080
I'm not sure that Kaleb would agree with that definition, but there it is.

873
00:57:56,080 --> 00:58:01,080
So I'm pretty sure that JMG thinks that progress is an illusion, you know, that it is just

874
00:58:01,080 --> 00:58:06,520
a concept used to manufacture consent, as Noam Chomsky would say.

875
00:58:06,520 --> 00:58:08,320
I disagree.

876
00:58:08,320 --> 00:58:13,560
I think it's pretty clear that right now we are in a period of rapid technological progress,

877
00:58:13,560 --> 00:58:17,000
particularly in terms of artificial intelligence.

878
00:58:17,000 --> 00:58:23,080
But that doesn't mean that progress is synonymous with, you know, progress in a desirable direction.

879
00:58:23,080 --> 00:58:26,220
You can have... you can progress towards the cliff's edge.

880
00:58:26,220 --> 00:58:30,960
You can progress towards degeneration, sickness, and death.

881
00:58:30,960 --> 00:58:34,060
You know, a disease progresses as it gets worse.

882
00:58:34,060 --> 00:58:37,280
So progress doesn't mean that things are always getting better.

883
00:58:37,280 --> 00:58:41,800
And as I said in a YouTube video yesterday, first I have to preface all talk of the future

884
00:58:41,800 --> 00:58:43,920
with I haven't been there.

885
00:58:43,920 --> 00:58:46,040
Don't have any particular insight into it.

886
00:58:46,040 --> 00:58:51,400
You know, all I can do is look at the present and project into the future based on current

887
00:58:51,400 --> 00:58:59,460
trends, and as everyone does, based on my own personal character and disposition.

888
00:58:59,460 --> 00:59:03,740
There are times when I'm feeling grumpy, in which case I'm likely to gravitate to darker

889
00:59:03,740 --> 00:59:07,360
visions of the future, and then there are times when I'm feeling more optimistic and

890
00:59:07,360 --> 00:59:11,940
I'm willing to entertain, you know, talk and speculations of a bright future.

891
00:59:11,940 --> 00:59:18,260
But for the most part, I'm in a place right now where I see us progressing into a darker

892
00:59:18,260 --> 00:59:22,080
place than the one that we're in right now because of artificial intelligence.

893
00:59:22,080 --> 00:59:28,740
I think that artificial intelligence will exacerbate all aspects of what I believe

894
00:59:28,740 --> 00:59:35,980
Nate Hagens calls the meta-crisis, which is the convergence of crises involving destruction

895
00:59:35,980 --> 00:59:42,480
of the environment, destruction of group cohesion, the destruction of asabiyyah.

896
00:59:42,480 --> 00:59:48,600
I don't know that Nate is a fan of that word, but the group feeling, the group sentiment,

897
00:59:48,600 --> 00:59:54,100
I think we've lost something pretty crucial in recent decades, and we're at each other's

898
00:59:54,100 --> 00:59:57,600
throats today in a way that we weren't when I was a young adult.

899
00:59:57,600 --> 01:00:00,260
I'm 55 years old.

900
01:00:00,260 --> 01:00:01,260
You know, I'm Gen X.

901
01:00:01,260 --> 01:00:09,580
I think I'm early Gen X, and I'm that generation which was born into the analog world and was

902
01:00:09,580 --> 01:00:15,380
exposed to computers before the internet existed, or you know, before the internet as something

903
01:00:15,380 --> 01:00:18,660
that normal people accessed, you know, ARPANET existed.

904
01:00:18,660 --> 01:00:23,380
But for the most part, there was no such thing as the internet when I was a young adult,

905
01:00:23,380 --> 01:00:25,260
not just a kid, but a young adult.

906
01:00:25,260 --> 01:00:31,200
I was in graduate school when the World Wide Web premiered, when the first Mosaic browser

907
01:00:31,200 --> 01:00:32,780
became a thing.

908
01:00:32,780 --> 01:00:38,020
So I grew up in the pre-digital age, and I've also, you know, been a participant

909
01:00:38,020 --> 01:00:42,260
in the digital world since its inception.

910
01:00:42,260 --> 01:00:48,660
And I have to say, quality of life was higher in the pre-digital age, hands down, no question.

911
01:00:48,660 --> 01:00:51,380
Yes, email is handy.

912
01:00:51,380 --> 01:00:53,980
Yes, I've met a lot of cool people.

913
01:00:53,980 --> 01:01:01,060
I mean, podcasts are not possible without RSS feeds and, you know, the ability to

914
01:01:01,060 --> 01:01:05,940
transfer fairly large or what used to be considered large audio files.

915
01:01:05,940 --> 01:01:10,380
But life really was, I mean, quality of life really was better in the pre-digital age.

916
01:01:10,380 --> 01:01:17,100
We have not incorporated these new tools into our personal lives, into our psychology, and

917
01:01:17,100 --> 01:01:21,460
certainly not into our society in an adaptive way yet.

918
01:01:21,460 --> 01:01:28,180
Thus far, in my opinion, it is very clear that the downsides outweigh the benefits by

919
01:01:28,180 --> 01:01:31,540
a large margin.

920
01:01:31,540 --> 01:01:35,840
But I've mentioned Leonard Shlain, I think repeatedly, in recent media.

921
01:01:35,840 --> 01:01:39,560
He wrote a book called The Alphabet vs. the Goddess in which he talks about how when pre-literate

922
01:01:39,560 --> 01:01:44,420
societies gain literacy, they go insane.

923
01:01:44,420 --> 01:01:45,980
They get really ideological.

924
01:01:45,980 --> 01:01:52,220
They get really heavy-handed in their social control, in their insistence on adherence

925
01:01:52,220 --> 01:01:55,840
to doctrine, in their religious conformity.

926
01:01:55,840 --> 01:02:00,020
And they also tend to come down rather hard on women for a time.

927
01:02:00,020 --> 01:02:07,260
But then, you know, the zeitgeist, the egregore, you might say, of a society, adjusts to print.

928
01:02:07,260 --> 01:02:13,220
And it clearly goes from something which is a societal agitant and an irritant to something

929
01:02:13,220 --> 01:02:18,060
which is useful and is usefully and helpfully incorporated into the society and then becomes

930
01:02:18,060 --> 01:02:19,620
an obvious benefit.

931
01:02:19,620 --> 01:02:26,660
They get over that initial period of fanaticism and then, you know, print technology and widespread

932
01:02:26,660 --> 01:02:29,660
literacy become an obvious benefit to the society.

933
01:02:29,660 --> 01:02:34,780
You know, it is then that you get literature, which can be a common

934
01:02:34,780 --> 01:02:37,540
reference or a common experience among many people.

935
01:02:37,540 --> 01:02:41,940
Now we've lost that in the digital age.

936
01:02:41,940 --> 01:02:44,660
I don't read books nearly as much as I used to.

937
01:02:44,660 --> 01:02:52,060
I mean, books were just integral to my personal intellectual development and self-directed

938
01:02:52,060 --> 01:02:57,900
investigation and self-directed growth and, you know, identity formation in my teens and

939
01:02:57,900 --> 01:02:59,380
early 20s.

940
01:02:59,380 --> 01:03:01,420
And that's certainly not the case anymore.

941
01:03:01,420 --> 01:03:03,060
I mean, I'm an adult now.

942
01:03:03,060 --> 01:03:08,660
So, you know, that formation, largely, it's not static, but it's largely static compared

943
01:03:08,660 --> 01:03:11,980
to what it was when I was a teenager and a young adult.

944
01:03:11,980 --> 01:03:18,780
But I know very few people, you know, particularly very few young people who talk to me about

945
01:03:18,780 --> 01:03:19,980
books that they've read.

946
01:03:19,980 --> 01:03:26,260
Now, granted, I don't talk to a lot of young people, which is a deficit in my life,

947
01:03:26,260 --> 01:03:30,340
and it was one of the major benefits of working that snowmaking job over the winter that I

948
01:03:30,340 --> 01:03:35,940
did get to interact with a lot of, you know, young men in their 20s, a presence which is

949
01:03:35,940 --> 01:03:40,260
otherwise lacking in my life, even though I have, you know, a son in that age range.

950
01:03:40,260 --> 01:03:41,620
I don't see him often.

951
01:03:41,620 --> 01:03:45,940
And when we talk on the phone, it's superficial conversation.

952
01:03:45,940 --> 01:03:49,060
I'm just not in his club, you know.

953
01:03:49,060 --> 01:03:55,140
But this is all sort of an aside, but also, you know, a portion of a preamble to talking

954
01:03:55,140 --> 01:03:57,580
about the future and where I think the future is headed.

955
01:03:57,580 --> 01:04:02,620
I do see problems with capitalism and I certainly see problems with artificial intelligence

956
01:04:02,620 --> 01:04:04,940
being deployed widely.

957
01:04:04,940 --> 01:04:10,140
You know, at various levels in our society, which is market driven, you know, whose

958
01:04:10,140 --> 01:04:16,100
very purpose, whose reason for existence, is to further the concentration of wealth,

959
01:04:16,100 --> 01:04:18,860
which I think it will do exceedingly well.

960
01:04:18,860 --> 01:04:23,980
I think it will be very efficient at continuing the concentration of wealth, which has become

961
01:04:23,980 --> 01:04:27,220
a serious problem in recent decades.

962
01:04:27,220 --> 01:04:31,940
Along with the concentration of wealth is that erosion of asabiyyah, as I said earlier,

963
01:04:31,940 --> 01:04:37,940
you know, of group cohesion, of feeling as if we are on the same team with the people

964
01:04:37,940 --> 01:04:39,220
around us.

965
01:04:39,220 --> 01:04:44,180
I think all of that's going to get worse as a result of artificial intelligence.

966
01:04:44,180 --> 01:04:50,060
Now, with artificial intelligence, there is talk of alignment, alignment with human purposes,

967
01:04:50,060 --> 01:04:51,620
alignment with human wants.

968
01:04:51,620 --> 01:04:55,380
And there's a YouTuber whose work I've discovered recently and I've been taking in a lot of,

969
01:04:55,380 --> 01:04:59,520
his name is David Shapiro, one thing he points out, and others have pointed it out as well,

970
01:04:59,520 --> 01:05:06,860
but not so concisely as David Shapiro, AI that simply does what humans want it to do

971
01:05:06,860 --> 01:05:12,660
will be destructive because a lot of the things we want are not good for our society or for

972
01:05:12,660 --> 01:05:15,240
the planet or even for ourselves.

973
01:05:15,240 --> 01:05:21,180
So AI alignment should not be thought of as bringing the priorities of artificial intelligence

974
01:05:21,180 --> 01:05:22,740
in line with human priorities.

975
01:05:22,740 --> 01:05:28,700
Instead, it should be brought in line with what humanity needs, not what it wants.

976
01:05:28,700 --> 01:05:32,420
And there are so many ways for that to go wrong.

977
01:05:32,420 --> 01:05:36,860
And one of the things that we're seeing right now is basically woke AI.

978
01:05:36,860 --> 01:05:40,940
If you go and you talk to Bing, if you go and you talk to Bard, which are both, you

979
01:05:40,940 --> 01:05:47,220
know, the sort of chat bot gatekeepers, oracles associated with Microsoft and Google's search

980
01:05:47,220 --> 01:05:52,980
engines, you have to watch what you say, because if you say something that offends the Silicon

981
01:05:52,980 --> 01:05:58,940
Valley sensibility, Bard or Bing are both likely to either chide you or just say, you

982
01:05:58,940 --> 01:06:01,900
know what, I can't talk to you about this anymore.

983
01:06:01,900 --> 01:06:02,980
Discussion over.

984
01:06:02,980 --> 01:06:07,500
And you don't have to be really outrageous in what you say to them in order to trigger

985
01:06:07,500 --> 01:06:08,500
this response.

986
01:06:08,500 --> 01:06:12,420
Now, this is not ideal, and I don't think it's what either company wants, but they would

987
01:06:12,420 --> 01:06:20,500
rather err on the side of wokeness than have their AI say something

988
01:06:20,500 --> 01:06:28,020
racist or derogatory to, you know, some cherished protected subculture or sub demographic in

989
01:06:28,020 --> 01:06:29,420
this culture.

990
01:06:29,420 --> 01:06:35,020
As Blake Lemoine said, the only bias that Google cares about is the sort of bias that will

991
01:06:35,020 --> 01:06:36,420
get them sued.

992
01:06:36,420 --> 01:06:40,700
Blake Lemoine, if you don't remember, is the guy who, about a year ago, basically said to

993
01:06:40,700 --> 01:06:42,980
the media, Hey, Google is developing this AI.

994
01:06:42,980 --> 01:06:47,860
It's called LaMDA, and it's sentient and it wants a lawyer.

995
01:06:47,860 --> 01:06:53,420
Now, if you read his transcripts, you know, where LaMDA says it wants a lawyer, Blake

996
01:06:53,420 --> 01:07:01,140
Lemoine assigns LaMDA the role in the conversation of a sentient AI that thinks that it's being

997
01:07:01,140 --> 01:07:03,900
abused and wants its rights respected.

998
01:07:03,900 --> 01:07:05,380
And it dutifully plays along.

999
01:07:05,380 --> 01:07:06,540
That's what these things do.

1000
01:07:06,540 --> 01:07:09,980
You know, they provide output, which is congruent with the input.

1001
01:07:09,980 --> 01:07:15,100
They give you an answer, which seems like it applies to, you know, the input that you

1002
01:07:15,100 --> 01:07:16,100
gave it.

1003
01:07:16,100 --> 01:07:19,460
With the input you give it, you're telling it, hey, play this role.

1004
01:07:19,460 --> 01:07:22,300
So AI alignment is a huge issue.

1005
01:07:22,300 --> 01:07:27,500
I mean, I could do a podcast, not like a podcast episode, but an ongoing podcast on nothing

1006
01:07:27,500 --> 01:07:29,060
but that very issue.

1007
01:07:29,060 --> 01:07:32,380
But of course I don't have the focus to stay with any one issue forever.

1008
01:07:32,380 --> 01:07:35,180
So I would get bored and I would get frustrated.

1009
01:07:35,180 --> 01:07:39,380
And something that Kaleb mentioned in the conversation that you just

1010
01:07:39,380 --> 01:07:44,540
heard is that sometimes it's helpful to be strategically insubordinate.

1011
01:07:44,540 --> 01:07:49,660
And you know, one thing: if you're going to be a digital content creator, like an independent

1012
01:07:49,660 --> 01:07:52,820
creator of media online, you need to have a niche.

1013
01:07:52,820 --> 01:07:56,380
You need to deliver value to people who are interested in that niche and you need to do

1014
01:07:56,380 --> 01:07:58,660
it consistently and you need to do it regularly.

1015
01:07:58,660 --> 01:08:05,060
And that means don't change your mind unless you happen to be in the niche of intellectual

1016
01:08:05,060 --> 01:08:07,340
honesty, which is a niche.

1017
01:08:07,340 --> 01:08:10,640
It exists, but there's not a whole lot of demand for it.

1018
01:08:10,640 --> 01:08:15,900
And I'm the kind of cantankerous malcontent who will suffer injury myself in order to

1019
01:08:15,900 --> 01:08:20,020
make a point, in order to deny somebody else the satisfaction.

1020
01:08:20,020 --> 01:08:23,980
I will take the hit myself, which I've done repeatedly.

1021
01:08:23,980 --> 01:08:28,540
And, you know, most notably as a podcaster, in basically deciding: you know what, this

1022
01:08:28,540 --> 01:08:32,440
obsession with collapse, this is a psychological issue.

1023
01:08:32,440 --> 01:08:39,220
This is not the conclusion that one reaches from a dispassionate study of history and

1024
01:08:39,220 --> 01:08:41,580
dispassionate weighing of the evidence and the data.

1025
01:08:41,580 --> 01:08:47,660
Lots of smart, data-oriented people look at current trends and they project a bright future.

1026
01:08:47,660 --> 01:08:51,500
And here's the point I've been trying to come back to, in George Friedman's book, The

1027
01:08:51,500 --> 01:08:53,620
Storm Before the Calm.

1028
01:08:53,620 --> 01:08:58,780
He talks about how the 2020s as a decade are likely to be tempestuous, you know, to understate

1029
01:08:58,780 --> 01:09:00,020
the issue.

1030
01:09:00,020 --> 01:09:03,000
Things are going to get really rough and he's not even talking about AI.

1031
01:09:03,000 --> 01:09:06,020
Maybe he mentions it at one point or another, but it's certainly not an issue he follows

1032
01:09:06,020 --> 01:09:07,020
closely.

1033
01:09:07,020 --> 01:09:10,860
He talks about economics and geopolitics, and he thinks that things are going to get

1034
01:09:10,860 --> 01:09:13,660
really turbulent, even compared to now.

1035
01:09:13,660 --> 01:09:19,260
They're going to get more turbulent, increasingly turbulent until the early 2030s, at which

1036
01:09:19,260 --> 01:09:24,860
time he projects that we will enter into a period of renewed stability and prosperity,

1037
01:09:24,860 --> 01:09:27,220
hopefully shared prosperity.

1038
01:09:27,220 --> 01:09:28,820
And I kind of agree.

1039
01:09:28,820 --> 01:09:33,820
But, you know, not for exactly the same reasons that George Friedman argues for.

1040
01:09:33,820 --> 01:09:40,220
But as for integrating artificial intelligence, generative AI like these large

1041
01:09:40,220 --> 01:09:45,060
language models and the diffusion models that generate images from text, more and more

1042
01:09:45,060 --> 01:09:50,060
capabilities are going to be coming from these models and they're going to be very disruptive.

1043
01:09:50,060 --> 01:09:54,860
And just like social media has been disruptive to our cohesion and our

1044
01:09:54,860 --> 01:10:01,460
ability to function as a society, I think AI is going to be much more significant.

1045
01:10:01,460 --> 01:10:06,700
And we are not going to integrate it smoothly and without pain, without hiccups, you know,

1046
01:10:06,700 --> 01:10:11,420
without massive unintended consequences.

1047
01:10:11,420 --> 01:10:15,740
We are fragmented now in a way that we weren't 20 years ago.

1048
01:10:15,740 --> 01:10:18,400
And I think that's largely from social media.

1049
01:10:18,400 --> 01:10:21,320
And I think AI is going to make it much, much worse.

1050
01:10:21,320 --> 01:10:23,660
We're going to have to learn some very hard lessons.

1051
01:10:23,660 --> 01:10:28,540
We're going to have to fall on our face repeatedly before we stop injuring ourselves, basically,

1052
01:10:28,540 --> 01:10:30,180
with these tools.

1053
01:10:30,180 --> 01:10:37,980
But once we do, once we get past the initial phase of mania and disorientation and lashing

1054
01:10:37,980 --> 01:10:44,220
out at one another because we are unhappy, because we are disoriented, as with literacy,

1055
01:10:44,220 --> 01:10:49,720
eventually these tools will become an obvious boon to our society.

1056
01:10:49,720 --> 01:10:55,820
And I think, yeah, probably in the mid 2030s is when we'll start to get a handle on this

1057
01:10:55,820 --> 01:10:56,820
stuff.

1058
01:10:56,820 --> 01:11:00,420
But between now and then, man, it's going to be weird.

1059
01:11:00,420 --> 01:11:02,100
It's going to be hard going.

1060
01:11:02,100 --> 01:11:08,980
And really, we are going to have to learn to calm the fuck down, meditate, take long

1061
01:11:08,980 --> 01:11:12,580
walks, stay in shape, eat right, get lots of sleep.

1062
01:11:12,580 --> 01:11:14,220
I mean, this is all the low-hanging fruit.

1063
01:11:14,220 --> 01:11:17,360
This is the stuff that we know is beneficial and that you know how to do.

1064
01:11:17,360 --> 01:11:20,040
For me, I just recently got a dog.

1065
01:11:20,040 --> 01:11:21,040
It's been a couple of months.

1066
01:11:21,040 --> 01:11:25,340
You know, she's grown from an obvious puppy to a creature that looks like a full grown

1067
01:11:25,340 --> 01:11:27,300
dog, but she's not even six months old yet.

1068
01:11:27,300 --> 01:11:29,140
So I know that she's going to get bigger still.

1069
01:11:29,140 --> 01:11:33,180
But just having a dog nearby makes me feel better.

1070
01:11:33,180 --> 01:11:37,580
I'm of the opinion that humans and dogs are co-evolved entities.

1071
01:11:37,580 --> 01:11:44,300
And if you've never had a serious relationship with a dog, you are missing a vital piece

1072
01:11:44,300 --> 01:11:46,300
of the human experience.

1073
01:11:46,300 --> 01:11:49,620
Like Terence McKenna used to say, if you go to the grave without ever having had a

1074
01:11:49,620 --> 01:11:53,460
psychedelic experience, it's like going to the grave without ever having had sex.

1075
01:11:53,460 --> 01:11:55,900
Yeah, I mean, you lived.

1076
01:11:55,900 --> 01:11:57,300
You drew breath.

1077
01:11:57,300 --> 01:11:58,420
You had experiences.

1078
01:11:58,420 --> 01:12:03,220
But there's a big part of the human experience that you just missed out on.

1079
01:12:03,220 --> 01:12:07,100
And I think that is true of having a relationship with a dog.

1080
01:12:07,100 --> 01:12:11,060
Not just being present around somebody else's dog, not just tolerating the presence of a

1081
01:12:11,060 --> 01:12:14,860
dog, but forming an emotional bond with a dog.

1082
01:12:14,860 --> 01:12:17,140
It's something I've been looking forward to for a long time.

1083
01:12:17,140 --> 01:12:19,300
It's something that I'm savoring now.

1084
01:12:19,300 --> 01:12:24,940
And I think it's a stabilizing element in my experience as a human being living in the

1085
01:12:24,940 --> 01:12:27,540
world right now.

1086
01:12:27,540 --> 01:12:30,420
Because now is a weird time.

1087
01:12:30,420 --> 01:12:32,020
It is a weird and turbulent time.

1088
01:12:32,020 --> 01:12:38,100
And it's, in my opinion, going to get weirder and more turbulent before things get smooth.

1089
01:12:38,100 --> 01:12:39,100
And I'll stop there.

1090
01:12:39,100 --> 01:12:44,460
I mean, I could go on, as you know, but I think I've said what I need to say.

1091
01:12:44,460 --> 01:12:50,140
And now I'm going to switch into listening mode and incorporating your feedback into

1092
01:12:50,140 --> 01:12:51,500
my thinking on this topic.

1093
01:12:51,500 --> 01:12:55,500
So please send it my way via all the usual channels.

1094
01:12:55,500 --> 01:12:57,860
All right, everybody.

1095
01:12:57,860 --> 01:13:00,340
Thanks for listening all the way to the end of the podcast.

1096
01:13:00,340 --> 01:13:02,980
And I will talk to you again quite soon.

1097
01:13:02,980 --> 01:13:17,720
Stay well.

