1
00:00:00,000 --> 00:00:24,760
Hey everybody.

2
00:00:24,760 --> 00:00:27,360
Today my guest is Kenneth E. Harrell.

3
00:00:27,360 --> 00:00:32,040
He is a cybersecurity professional and a science fiction novelist.

4
00:00:32,040 --> 00:00:36,960
And he came to my attention when I got a notification from Substack saying that Kenneth had started

5
00:00:36,960 --> 00:00:39,360
to recommend my Substack blog.

6
00:00:39,360 --> 00:00:41,720
So thanks to Kenneth for that.

7
00:00:41,720 --> 00:00:49,640
Anyway, Substack's full of men who write science fiction novels who are unpublished by the

8
00:00:49,640 --> 00:00:52,680
official publishing apparatus.

9
00:00:52,680 --> 00:00:56,480
Not surprising given the direction of the cultural winds these days.

10
00:00:56,480 --> 00:01:01,520
But they may not get published, may not be widely read, but they're sure fun to talk

11
00:01:01,520 --> 00:01:02,520
to.

12
00:01:02,520 --> 00:01:06,720
So here's a conversation with Kenneth.

13
00:01:06,720 --> 00:01:14,040
I am joined by Kenneth Harrell, a fellow Substack writer, science fiction author, and thinker

14
00:01:14,040 --> 00:01:16,960
of things technological and futuristic.

15
00:01:16,960 --> 00:01:19,080
Kenneth, it is good to talk to you.

16
00:01:19,080 --> 00:01:21,120
Hey, it's good to be here.

17
00:01:21,120 --> 00:01:26,720
Alright, so I'm looking at the Amazon description of one of your novels.

18
00:01:26,720 --> 00:01:29,400
It's called Awakening.

19
00:01:29,400 --> 00:01:34,360
And I'm going to read just a little bit of the description here.

20
00:01:34,360 --> 00:01:39,240
In the aftermath of humanity's last golden age, civilization reached unimaginable heights,

21
00:01:39,240 --> 00:01:42,480
only to crumble into dust in an instant.

22
00:01:42,480 --> 00:01:46,540
Now thousands of years later, Earth is a transformed world.

23
00:01:46,540 --> 00:01:51,240
Nature has reclaimed the planet's greatest cities and colossal abandoned megastructures

24
00:01:51,240 --> 00:01:54,400
stand as eerie testaments to a forgotten past.

25
00:01:54,400 --> 00:02:00,600
So I'll stop there and let you continue in a more informal way talking about this book.

26
00:02:00,600 --> 00:02:05,560
Okay, so I had this idea in my head for quite a while.

27
00:02:05,560 --> 00:02:07,320
The story idea had been rolling around.

28
00:02:07,320 --> 00:02:12,620
I had images, concepts in my head, talked to friends about it, but I just hadn't had

29
00:02:12,620 --> 00:02:18,360
a chance to actually sit down and write it.

30
00:02:18,360 --> 00:02:23,520
After a layoff, I was doing a reassessment of what I really wanted to put my energy

31
00:02:23,520 --> 00:02:28,640
and efforts into, because frankly, I'd kind of been frustrated by work.

32
00:02:28,640 --> 00:02:34,520
I reached a reasonable level in my career, but I wasn't getting a whole lot of life satisfaction

33
00:02:34,520 --> 00:02:35,520
out of it.

34
00:02:35,520 --> 00:02:42,600
And there were a lot of things that I'd already done and experienced.

35
00:02:42,600 --> 00:02:47,160
And I just started to decenter work a lot in life in terms of what's really important

36
00:02:47,160 --> 00:02:48,160
to me.

37
00:02:48,160 --> 00:02:51,080
I didn't want to move into the director level.

38
00:02:51,080 --> 00:02:53,840
So I started to think about what I really wanted to do.

39
00:02:53,840 --> 00:02:54,840
I talked to my wife about it.

40
00:02:54,840 --> 00:02:59,600
And she's like, well, you've been wanting to write that story, you know, and at first

41
00:02:59,600 --> 00:03:03,800
the idea was in the form of a screenplay, but I've been thinking of it more in the form

42
00:03:03,800 --> 00:03:07,960
of a novel or maybe a graphic novel.

43
00:03:07,960 --> 00:03:15,080
So one day in 2018, I sat down and started writing out my basic ideas on a whiteboard.

44
00:03:15,080 --> 00:03:21,160
I started thinking about the history of how this world was going to work, what the technologies

45
00:03:21,160 --> 00:03:26,000
were, did a ton of research, read a lot of different papers.

46
00:03:26,000 --> 00:03:30,040
And all of the technologies in the book are based on something.

47
00:03:30,040 --> 00:03:36,160
It's just I sort of add, you know, sort of science fiction elements to it.

48
00:03:36,160 --> 00:03:39,000
But that's what got me into writing.

49
00:03:39,000 --> 00:03:44,840
Then once the pandemic hit, I had a lot of time on my hands.

50
00:03:44,840 --> 00:03:52,160
I was working at home, you know, full time and just had a lot of time to think. I wasn't

51
00:03:52,160 --> 00:03:54,000
really going out much.

52
00:03:54,000 --> 00:03:58,520
So I just dove into book world, both reading and writing.

53
00:03:58,520 --> 00:04:04,080
And so as scenes would come to me, once I had my outline, I would write them out because

54
00:04:04,080 --> 00:04:08,320
I write completely out of sequence, whatever scene happens to come to me, I just write

55
00:04:08,320 --> 00:04:10,000
it and then I put it in.

56
00:04:10,000 --> 00:04:11,520
Where does this go in my outline?

57
00:04:11,520 --> 00:04:12,520
And then I put it there.

58
00:04:12,520 --> 00:04:18,600
So I might be thinking about a scene between, you know, say a character and their,

59
00:04:18,600 --> 00:04:22,560
you know, their parents or between two different characters or flashbacks.

60
00:04:22,560 --> 00:04:25,840
I'll just write that scene and I know where it's going to go.

61
00:04:25,840 --> 00:04:29,360
And I sort of put it in that area and move on.

62
00:04:29,360 --> 00:04:32,280
But I have a habit of writing out of sequence.

63
00:04:32,280 --> 00:04:36,720
So it sounds like you plot out the story before you start writing it.

64
00:04:36,720 --> 00:04:41,400
I usually do, but it sort of evolves like it's never locked in.

65
00:04:41,400 --> 00:04:47,240
Things happen, as you know, during the course of writing a story, you get different ideas.

66
00:04:47,240 --> 00:04:52,880
You know, my approach is such that I will outline it.

67
00:04:52,880 --> 00:04:56,920
But I don't know if you've ever had this experience, but it really does feel like the story is

68
00:04:56,920 --> 00:04:58,880
kind of coming from somewhere else.

69
00:04:58,880 --> 00:04:59,880
Oh, yeah.

70
00:04:59,880 --> 00:05:03,800
And I'm basically just sitting there trying to very quickly write down what it is that

71
00:05:03,800 --> 00:05:04,880
I hear.

72
00:05:04,880 --> 00:05:09,680
But once that starts going, like once you can get that sort of voice of the muse going,

73
00:05:09,680 --> 00:05:11,640
the story then writes itself.

74
00:05:11,640 --> 00:05:15,100
You know, you just set up a scene, and you know what you want to achieve by the end

75
00:05:15,100 --> 00:05:16,640
of that scene.

76
00:05:16,640 --> 00:05:21,800
And then it becomes a matter of really just trusting the muse, trusting what you hear,

77
00:05:21,800 --> 00:05:22,800
and then writing it down.

78
00:05:22,800 --> 00:05:26,960
But I'd say it's kind of a collaborative process, the way that I see writing.

79
00:05:26,960 --> 00:05:30,840
It's basically you and the muse or whatever your process is.

80
00:05:30,840 --> 00:05:34,520
And basically just listening to the story as it comes to you.

81
00:05:34,520 --> 00:05:42,600
Well, you know, earlier today, I was looking at your Substack and you don't seem to post

82
00:05:42,600 --> 00:05:45,760
long form essays all that regularly.

83
00:05:45,760 --> 00:05:50,560
But you can learn a lot about somebody by just, you know, clicking over to their likes

84
00:05:50,560 --> 00:05:55,800
and seeing what other things on Substack they've liked and also their activity.

85
00:05:55,800 --> 00:05:58,240
You can see what they've reposted.

86
00:05:58,240 --> 00:06:04,160
And it seems that a lot of your interest and concern, at least when you're on Substack,

87
00:06:04,160 --> 00:06:12,840
involves presentation to the public, basically relating to an audience as a creator online.

88
00:06:12,840 --> 00:06:16,000
Is that central to your thoughts these days?

89
00:06:16,000 --> 00:06:24,280
Yeah, I mean, I joined Substack essentially to, you know, market the novels.

90
00:06:24,280 --> 00:06:29,800
I sort of talked to some marketing people, but I just kind of didn't like the approach

91
00:06:29,800 --> 00:06:30,880
that they wanted to take.

92
00:06:30,880 --> 00:06:34,200
They really wanted to go hard and heavy into social media.

93
00:06:34,200 --> 00:06:35,920
I've kind of given up on social media.

94
00:06:35,920 --> 00:06:40,320
I've got a lot of opinions about it, but I just really didn't want to do that.

95
00:06:40,320 --> 00:06:45,720
I wanted to have someplace where I could go, where I could talk about, you know, my novels,

96
00:06:45,720 --> 00:06:50,320
where I could sort of explain what my perspective is on my own stories.

97
00:06:50,320 --> 00:06:54,640
And I didn't really think that any of the other platforms were really appropriate

98
00:06:54,640 --> 00:06:55,840
for that.

99
00:06:55,840 --> 00:06:57,640
But I had heard about Substack for a while.

100
00:06:57,640 --> 00:07:02,680
I had seen articles there for a while, and it seemed like a space where you could do that.

101
00:07:02,680 --> 00:07:08,400
And that's why I started writing on Substack.

102
00:07:08,400 --> 00:07:12,360
And what sort of people are you connecting with on Substack?

103
00:07:12,360 --> 00:07:16,480
Mostly other writers, but I'm also interested in people that are interested in things like

104
00:07:16,480 --> 00:07:23,280
AI, cybersecurity, technology, books in general.

105
00:07:23,280 --> 00:07:28,600
I'm really not on Substack for political stuff just because I think that political things

106
00:07:28,600 --> 00:07:32,760
right now have an outsized influence on everyone's life.

107
00:07:32,760 --> 00:07:35,280
And I'm not sure that's the healthiest way to live.

108
00:07:35,280 --> 00:07:36,280
I agree.

109
00:07:36,280 --> 00:07:41,040
I was thinking if you were alive back in the 1800s and something went down in Washington,

110
00:07:41,040 --> 00:07:44,640
you found out about it like a month and a half later.

111
00:07:44,640 --> 00:07:46,760
Now you can watch it as it happens.

112
00:07:46,760 --> 00:07:53,160
And I'm just not so sure that the original intent was for our minds

113
00:07:53,160 --> 00:07:55,800
to be so heavily involved in politics.

114
00:07:55,800 --> 00:07:57,240
And I just don't like it.

115
00:07:57,240 --> 00:07:59,800
I've grown exhausted by it all.

116
00:07:59,800 --> 00:08:03,040
So I try to stay in book world.

117
00:08:03,040 --> 00:08:08,960
I read other authors, I write my own stories.

118
00:08:08,960 --> 00:08:13,140
I do a lot of reading of just things that I find interesting.

119
00:08:13,140 --> 00:08:17,680
Like I read a Russian study on gravitational waves a while back.

120
00:08:17,680 --> 00:08:18,840
Didn't understand most of it.

121
00:08:18,840 --> 00:08:21,280
But when I was growing up, my mother challenged me.

122
00:08:21,280 --> 00:08:24,160
She said, if you don't understand something, basically just keep studying it until you

123
00:08:24,160 --> 00:08:25,360
do.

124
00:08:25,360 --> 00:08:28,380
And so I will often read things that are way over my head.

125
00:08:28,380 --> 00:08:30,800
But I try to understand them.

126
00:08:30,800 --> 00:08:35,920
And when it comes to certain things, especially with regards to like physics, the only thing

127
00:08:35,920 --> 00:08:40,640
that I really regret is that the extent of my understanding of some things in physics

128
00:08:40,640 --> 00:08:44,760
really stops at metaphor, because to truly understand it, you really have to get into

129
00:08:44,760 --> 00:08:45,920
the mathematics of it.

130
00:08:45,920 --> 00:08:47,600
And I'm just not there.

131
00:08:47,600 --> 00:08:49,360
Yeah, I'm not either.

132
00:08:49,360 --> 00:08:54,400
And also, I'm not all that interested in reading something that's really super technical.

133
00:08:54,400 --> 00:08:55,400
Yeah, yeah.

134
00:08:55,400 --> 00:09:02,800
Well, my novel is technical to a degree, but it's mostly explanations of how the technologies

135
00:09:02,800 --> 00:09:04,120
work.

136
00:09:04,120 --> 00:09:08,040
I remember when I was looking around for an editor, this was when

137
00:09:08,040 --> 00:09:12,840
I was going to take the traditional publishing route, which I have now given up on. I came

138
00:09:12,840 --> 00:09:17,960
across a couple of editors that just told me rather strange things like, oh, it's

139
00:09:17,960 --> 00:09:18,960
too techie for me.

140
00:09:18,960 --> 00:09:23,400
I'm like, well, I kind of explain how every technology works.

141
00:09:23,400 --> 00:09:28,400
Had another guy who just couldn't grok the idea of an artificial planetary ring.

142
00:09:28,400 --> 00:09:31,440
He's like, well, I just don't see how you would ever be able to build that.

143
00:09:31,440 --> 00:09:36,480
And I'm like, well, I kind of did explain it in the book.

144
00:09:36,480 --> 00:09:38,280
So yeah.

145
00:09:38,280 --> 00:09:40,040
Well, what's more?

146
00:09:40,040 --> 00:09:45,880
I mean, Larry Niven's Ringworld is, you know, a decades-old classic, a well-

147
00:09:45,880 --> 00:09:48,000
established chestnut in science fiction.

148
00:09:48,000 --> 00:09:54,080
So you know, if your editor can't understand a planetary ring,

149
00:09:54,080 --> 00:09:58,940
much less a Ringworld or a Banks orbital, then they're probably just not suited to editing

150
00:09:58,940 --> 00:09:59,940
a sci-fi novel.

151
00:09:59,940 --> 00:10:00,940
Yeah.

152
00:10:00,940 --> 00:10:02,160
And it really wasn't even on that scale.

153
00:10:02,160 --> 00:10:05,920
The planetary ring I was talking about is something that would be orbital.

154
00:10:05,920 --> 00:10:07,360
It is sort of around the planet.

155
00:10:07,360 --> 00:10:12,100
Larry Niven basically took a planet and turned it into a strip: you take a planet,

156
00:10:12,100 --> 00:10:18,200
you unfurl it and put it on the inside of some type of incredible megastructure

157
00:10:18,200 --> 00:10:21,680
made out of God knows what.

158
00:10:21,680 --> 00:10:23,720
And yeah, mine wasn't even at that scale.

159
00:10:23,720 --> 00:10:25,880
It's just a planetary ring, nothing like that.

160
00:10:25,880 --> 00:10:26,880
And I explained it in the book.

161
00:10:26,880 --> 00:10:29,160
I'm like, hey, it's molecular self assembly.

162
00:10:29,160 --> 00:10:32,200
It's, you know, fully explainable.

163
00:10:32,200 --> 00:10:37,280
But because I'd been thinking about different ways that you could build, you

164
00:10:37,280 --> 00:10:39,160
know, large-scale structures in space,

165
00:10:39,160 --> 00:10:41,600
And so I was like, well, what about just, you know, self assembly?

166
00:10:41,600 --> 00:10:47,080
And I got the idea from reading a couple of articles a while back about experiments that

167
00:10:47,080 --> 00:10:52,960
have been done with essentially robots that, you know, sort of build and construct things.

168
00:10:52,960 --> 00:10:55,160
And I extrapolated that to nanotechnology.

169
00:10:55,160 --> 00:11:00,000
I'm like, well, what if we took nanotechnology and just combined it with AI, where you could

170
00:11:00,000 --> 00:11:03,960
have like generative physicality.

171
00:11:03,960 --> 00:11:11,400
So you know, like in the same way that we will use an LLM or a generative AI model

172
00:11:11,400 --> 00:11:13,080
to generate a picture?

173
00:11:13,080 --> 00:11:16,200
Well, you could do the same thing with matter.

174
00:11:16,200 --> 00:11:20,000
And so I sort of set up a situation in my book where that's possible.

175
00:11:20,000 --> 00:11:24,280
You basically give a description of what it is that you're trying to make, what the

176
00:11:24,280 --> 00:11:25,840
function should be.

177
00:11:25,840 --> 00:11:29,480
And then basically, the nanotech will just build that.

178
00:11:29,480 --> 00:11:33,760
You don't need to necessarily know the nuts and bolts of how something works.

179
00:11:33,760 --> 00:11:37,880
You just say, I need something that, you know, has the following features and functions in

180
00:11:37,880 --> 00:11:38,880
this way.

181
00:11:38,880 --> 00:11:41,380
And then it will just sort of build itself in front of you.

182
00:11:41,380 --> 00:11:44,640
But I did really explain how all of the different technologies work.

183
00:11:44,640 --> 00:11:49,320
So to the degree that it's really technical, it's just an explanation of how those things

184
00:11:49,320 --> 00:11:50,320
work.

185
00:11:50,320 --> 00:11:51,320
Right.

186
00:11:51,320 --> 00:11:57,900
You know, I'm looking at your three titles here on Amazon: Awakening, Body

187
00:11:57,900 --> 00:12:00,300
of Work, City of Dreams.

188
00:12:00,300 --> 00:12:01,680
And there was one other I saw.

189
00:12:01,680 --> 00:12:03,120
Where is it?

190
00:12:03,120 --> 00:12:05,840
I guess Body of Work and City of Dreams.

191
00:12:05,840 --> 00:12:09,080
Oh, there's Body of Work: The Enforcer.

192
00:12:09,080 --> 00:12:10,080
Right.

193
00:12:10,080 --> 00:12:14,260
So the Body of Work series is set in the same universe.

194
00:12:14,260 --> 00:12:20,040
So one of the things that happens in Awakening is that mankind starts to go out and spread

195
00:12:20,040 --> 00:12:22,280
across the galaxy.

196
00:12:22,280 --> 00:12:25,640
And I wanted to go back and say, hey, you know, what happened to those colonies?

197
00:12:25,640 --> 00:12:28,840
You know, Earth may not be around anymore.

198
00:12:28,840 --> 00:12:31,760
But what about all those colonies that humans created?

199
00:12:31,760 --> 00:12:38,960
So Body of Work is set in one of those colonies, a place called the Union of Worlds.

200
00:12:38,960 --> 00:12:44,200
And they essentially were a colony of Earth and Mars that broke off and became its own

201
00:12:44,200 --> 00:12:47,280
thing in the post-collapse era.

202
00:12:47,280 --> 00:12:52,640
Well, I noticed that you have only Kindle editions listed here.

203
00:12:52,640 --> 00:12:56,080
Are there not paperback editions as well?

204
00:12:56,080 --> 00:12:59,760
There's a paperback edition of Awakening.

205
00:12:59,760 --> 00:13:02,560
But Body of Work series right now is just Kindle.

206
00:13:02,560 --> 00:13:06,000
And that's because it is a short story.

207
00:13:06,000 --> 00:13:10,480
And I wasn't so sure that it was worth it.

208
00:13:10,480 --> 00:13:14,640
It just doesn't seem to be very cost effective to have a print version.

209
00:13:14,640 --> 00:13:18,000
I have a couple of print versions that I gave to friends.

210
00:13:18,000 --> 00:13:23,080
But I just thought that for what you're getting, the length of the book is just better in Kindle

211
00:13:23,080 --> 00:13:24,080
form.

212
00:13:24,080 --> 00:13:25,080
Gotcha.

213
00:13:25,080 --> 00:13:27,760
Well, I've published on Kindle and paperback.

214
00:13:27,760 --> 00:13:29,600
I have not done a hardback.

215
00:13:29,600 --> 00:13:34,240
And I would very much like to do an audiobook because most of the sci-fi I take in these

216
00:13:34,240 --> 00:13:39,200
days I take in as, you know, Audible books.

217
00:13:39,200 --> 00:13:42,820
And, you know, Amazon owns Audible.

218
00:13:42,820 --> 00:13:46,680
So they're very integrated there.

219
00:13:46,680 --> 00:13:51,440
They have a function where an AI narrator will read your novel.

220
00:13:51,440 --> 00:13:56,600
And I haven't done it because, you know, I want different character voices.

221
00:13:56,600 --> 00:13:59,720
I think you can choose different character voices.

222
00:13:59,720 --> 00:14:05,480
They have like a production option where there's a couple of different choices on how you can

223
00:14:05,480 --> 00:14:07,000
generate that.

224
00:14:07,000 --> 00:14:09,640
And I think one of them is that production option.

225
00:14:09,640 --> 00:14:14,520
I also found another company that will do audiobooks.

226
00:14:14,520 --> 00:14:17,000
I'll have to find it and send it to you afterward.

227
00:14:17,000 --> 00:14:22,880
But you can technically do this via ElevenLabs.

228
00:14:22,880 --> 00:14:29,320
However, long generations tend to have more of a compute demand.

229
00:14:29,320 --> 00:14:34,200
And you'll start noticing a degradation in audio quality the longer it goes on.

230
00:14:34,200 --> 00:14:37,760
Also, it doesn't always pronounce certain types of words correctly.

231
00:14:37,760 --> 00:14:41,280
Like if you have a lot of specialty terms, it may mispronounce those.

232
00:14:41,280 --> 00:14:45,080
And you can add phonetic corrections in.

233
00:14:45,080 --> 00:14:46,920
But it's very labor intensive.
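
A rough sketch of that phonetic-correction step in Python: build a substitution table of specialty terms and apply it before the text ever reaches the TTS engine. The terms here are invented for illustration, and synthesize() is a placeholder for whatever TTS call you're using, not ElevenLabs' actual API.

    import re

    # Hypothetical pronunciation table: invented specialty terms mapped to
    # respellings the TTS engine can say. Fill it with whatever yours mangles.
    PHONETIC_FIXES = {
        "Xylos": "ZY-loss",
        "Aetheria": "eh-THEER-ee-ah",
    }

    def apply_phonetic_fixes(text: str) -> str:
        """Respell specialty terms so the TTS engine pronounces them correctly."""
        for term, spoken in PHONETIC_FIXES.items():
            # Whole-word match so a term embedded in another word is left alone.
            text = re.sub(rf"\b{re.escape(term)}\b", spoken, text)
        return text

    # audio = synthesize(apply_phonetic_fixes(chapter_text))  # synthesize() is hypothetical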

234
00:14:46,920 --> 00:14:50,480
But technically, if you're willing to put in the time and you're willing to do the editing

235
00:14:50,480 --> 00:14:55,580
and you're willing to accept that it's a lot of generation,

236
00:14:55,580 --> 00:14:59,400
you can do an entire audiobook using ElevenLabs.

237
00:14:59,400 --> 00:15:03,000
But it's not at the place where it should be.

238
00:15:03,000 --> 00:15:07,680
Ideally, what would happen is you would take your EPUB file, you would upload it.

239
00:15:07,680 --> 00:15:10,520
And then you would say, okay, I want this voice for these characters.

240
00:15:10,520 --> 00:15:12,420
And I want this voice for these characters.

241
00:15:12,420 --> 00:15:14,720
And then you would basically be able to sort of lay it out.

242
00:15:14,720 --> 00:15:15,720
Like that's what would be ideal.

243
00:15:15,720 --> 00:15:18,640
And I'm sure something like that's coming.

244
00:15:18,640 --> 00:15:22,040
But that's not where we are right now.

245
00:15:22,040 --> 00:15:29,320
Well, I did use ElevenLabs to do an audiobook for a novella that I published on Substack.

246
00:15:29,320 --> 00:15:35,640
But I just ended up reading the whole thing myself and then just using ElevenLabs to change

247
00:15:35,640 --> 00:15:36,640
the voice.

248
00:15:36,640 --> 00:15:37,640
Because there's a female protagonist.

249
00:15:37,640 --> 00:15:39,280
So I made it into a female narrator.

250
00:15:39,280 --> 00:15:40,280
Oh, nice.

251
00:15:40,280 --> 00:15:41,280
Yeah.

252
00:15:41,280 --> 00:15:46,840
And I had to do that to get the right intonations and also the right pronunciations of the characters'

253
00:15:46,840 --> 00:15:47,840
names.

254
00:15:47,840 --> 00:15:48,840
Yeah.

255
00:15:48,840 --> 00:15:50,480
And I think ElevenLabs right now has a voice-to-voice option.

256
00:15:50,480 --> 00:15:51,480
Did you do it through that?

257
00:15:51,480 --> 00:15:53,280
Or was it just text to voice?

258
00:15:53,280 --> 00:15:54,880
No, it was voice to voice.

259
00:15:54,880 --> 00:15:55,880
I had to read it.

260
00:15:55,880 --> 00:15:56,880
Voice to voice.

261
00:15:56,880 --> 00:15:57,880
Yeah.

262
00:15:57,880 --> 00:15:58,880
Nice.

263
00:15:58,880 --> 00:15:59,880
Yeah.

264
00:15:59,880 --> 00:16:01,480
So it's a crazy amount of work.

265
00:16:01,480 --> 00:16:05,320
I mean, you know, I'm basically creating the audiobook myself.

266
00:16:05,320 --> 00:16:07,900
You know, my reading is not perfect on the first go.

267
00:16:07,900 --> 00:16:12,240
So I would have to do a long recording session and edit that session and then upload that

268
00:16:12,240 --> 00:16:14,520
file to ElevenLabs.

269
00:16:14,520 --> 00:16:17,240
And yeah, it costs money.

270
00:16:17,240 --> 00:16:18,240
It's not free.

271
00:16:18,240 --> 00:16:19,240
Yeah.

272
00:16:19,240 --> 00:16:20,240
And don't you have to have a pro account?

273
00:16:20,240 --> 00:16:23,400
You have to have like the most expensive account to do it?

274
00:16:23,400 --> 00:16:25,680
No, I don't think so.

275
00:16:25,680 --> 00:16:29,200
No, I think I had the cheapest account.

276
00:16:29,200 --> 00:16:30,200
Oh, wow.

277
00:16:30,200 --> 00:16:31,200
Yeah.

278
00:16:31,200 --> 00:16:33,560
So how long were your generations?

279
00:16:33,560 --> 00:16:36,840
Like over a paragraph or under a paragraph?

280
00:16:36,840 --> 00:16:37,840
I forget.

281
00:16:37,840 --> 00:16:40,800
I think it was a word count, you know, maximum word count.

282
00:16:40,800 --> 00:16:42,920
And I just don't remember the number.

283
00:16:42,920 --> 00:16:44,840
It was more than a paragraph for sure.

284
00:16:44,840 --> 00:16:49,560
It was maybe a minute or two.

285
00:16:49,560 --> 00:16:50,560
That's interesting.

286
00:16:50,560 --> 00:16:54,080
Yeah, because I found with really long ones, the audio quality just drops, right?

287
00:16:54,080 --> 00:16:56,840
The volume drops and the audio quality drops off.

288
00:16:56,840 --> 00:17:00,080
And so I had more success with shorter generations.

289
00:17:00,080 --> 00:17:02,360
I would just keep them shorter.

290
00:17:02,360 --> 00:17:07,080
Like for all of my Substack articles, I kind of have like an audio narration and it is

291
00:17:07,080 --> 00:17:10,520
my voice, but it's an AI version of my voice.

292
00:17:10,520 --> 00:17:13,880
And so I have to do it in chunks and then edit them all together.
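
As a loose illustration of that chunk-and-stitch workflow, under the assumption that shorter generations hold quality better: split on paragraph breaks, cap each chunk, synthesize the pieces, and join the audio afterward. synthesize() again stands in for your TTS call, and MAX_CHARS is an arbitrary guess, not a documented ElevenLabs limit.

    MAX_CHARS = 1200  # arbitrary cap; short chunks seem to avoid the quality drift

    def chunk_paragraphs(text: str, max_chars: int = MAX_CHARS) -> list[str]:
        """Greedily pack paragraphs into chunks under the size cap.
        (A single oversized paragraph still becomes its own chunk.)"""
        chunks, current = [], ""
        for para in text.split("\n\n"):
            if current and len(current) + len(para) + 2 > max_chars:
                chunks.append(current)
                current = para
            else:
                current = f"{current}\n\n{para}" if current else para
        if current:
            chunks.append(current)
        return chunks

    def narrate(text: str, synthesize) -> list[bytes]:
        """Return one audio segment per chunk; stitch them in your audio editor,
        which also lets you regenerate any single chunk that came out wrong."""
        return [synthesize(chunk) for chunk in chunk_paragraphs(text)]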

293
00:17:13,880 --> 00:17:17,880
Yeah, pardon me, which is a lot of work.

294
00:17:17,880 --> 00:17:22,720
And it seems like the sort of work that would be good to hand off to an AI and the AI is

295
00:17:22,720 --> 00:17:24,960
just not there yet.

296
00:17:24,960 --> 00:17:31,400
And that's, I think, sort of a microscopic example of just basically people's experience

297
00:17:31,400 --> 00:17:35,300
of AI writ large, which is that, man, this stuff is really amazing.

298
00:17:35,300 --> 00:17:39,840
It seems like it should be able to do all kinds of stuff that it's just not very good

299
00:17:39,840 --> 00:17:40,840
at.

300
00:17:40,840 --> 00:17:44,920
Yeah, I think we'll eventually get there, but it won't take years and years.

301
00:17:44,920 --> 00:17:49,220
I mean, because there are capabilities that we have now that we didn't have exactly

302
00:17:49,220 --> 00:17:50,760
one year ago.

303
00:17:50,760 --> 00:17:53,440
So I think that we will eventually get there.

304
00:17:53,440 --> 00:17:59,860
But you know, this AI thing has been pretty exciting, but we also have to be like realistic

305
00:17:59,860 --> 00:18:01,300
about it.

306
00:18:01,300 --> 00:18:06,600
And I think that the place where this will probably land is that it will

307
00:18:06,600 --> 00:18:09,920
be another subgenre.

308
00:18:09,920 --> 00:18:14,920
So you know, like when you were like 14, 15, I think all boys did this, you're sitting

309
00:18:14,920 --> 00:18:20,360
around and you're trying to think of bizarre movie and story combinations like, man, what

310
00:18:20,360 --> 00:18:23,760
if Batman, you know, fought Darth Vader or something?

311
00:18:23,760 --> 00:18:26,560
Well, you'll be able to do all those scenarios now.

312
00:18:26,560 --> 00:18:29,280
And I think it will just be another thing that people do.

313
00:18:29,280 --> 00:18:33,800
I don't believe that we're headed towards a future where people are going to sit around

314
00:18:33,800 --> 00:18:37,680
watching AI movies and doing AI stories.

315
00:18:37,680 --> 00:18:39,720
I think it'll be thrown into the mix.

316
00:18:39,720 --> 00:18:43,880
It will be yet another thing that folks are doing.

317
00:18:43,880 --> 00:18:46,840
But you know, people really do have to write stories.

318
00:18:46,840 --> 00:18:53,640
I do think that AI can be used to enhance and make that process easier so that more

319
00:18:53,640 --> 00:18:56,120
people can do that.

320
00:18:56,120 --> 00:19:01,880
But all of this reminds me a lot of what happened with desktop publishing and drafting in the

321
00:19:01,880 --> 00:19:02,880
90s.

322
00:19:02,880 --> 00:19:06,400
I remember there were a lot of people that were in the publishing business and even in

323
00:19:06,400 --> 00:19:12,920
school where if you were writing your papers using a word processor, I had certain instructors

324
00:19:12,920 --> 00:19:14,360
who would insist that you type it.

325
00:19:14,360 --> 00:19:15,720
I'm like, well, what difference does it make?

326
00:19:15,720 --> 00:19:20,080
It's like, well, not if you just press a button and it'll do your paper.

327
00:19:20,080 --> 00:19:22,080
I'm like, no, that's not what's happening.

328
00:19:22,080 --> 00:19:26,520
And I remember with desktop publishing, they used to have these service bureaus.

329
00:19:26,520 --> 00:19:30,680
I don't know if you remember service bureaus, but they were like these places you'd go

330
00:19:30,680 --> 00:19:31,840
and they had everything there.

331
00:19:31,840 --> 00:19:38,580
They had computers, they had printers, and if you needed a plotter,

332
00:19:38,580 --> 00:19:41,160
they had those and you could rent the time.

333
00:19:41,160 --> 00:19:44,520
And they were very popular in the early-to-mid 90s.

334
00:19:44,520 --> 00:19:45,520
I'm thinking,

335
00:19:45,520 --> 00:19:50,320
gosh, this was '94, '95-ish.

336
00:19:50,320 --> 00:19:54,880
And I remember at that time, a lot of people just dumped out of the drafting, you know,

337
00:19:54,880 --> 00:19:59,200
and art business because they said, well, you know, it's all being taken over by Photoshop

338
00:19:59,200 --> 00:20:01,600
and you know, all of these Adobe products.

339
00:20:01,600 --> 00:20:02,920
And they just kind of gave up on their career.

340
00:20:02,920 --> 00:20:07,240
And the thing is, whether you're a writer or whatever you're

341
00:20:07,240 --> 00:20:10,800
involved in, you have to embrace these technologies and integrate them in.

342
00:20:10,800 --> 00:20:13,240
And if you look, that's what people have always done.

343
00:20:13,240 --> 00:20:18,160
So you know, like the great artistic masters, people think that they just drew

344
00:20:18,160 --> 00:20:20,440
that stuff straight by hand.

345
00:20:20,440 --> 00:20:25,280
And what I found out is that what a lot of them did was they used all kinds of complicated

346
00:20:25,280 --> 00:20:30,640
optics in order to create projections that they would then trace over.

347
00:20:30,640 --> 00:20:33,480
So have you ever heard of a camera obscura?

348
00:20:33,480 --> 00:20:34,480
Oh, yeah.

349
00:20:34,480 --> 00:20:35,480
Yeah.

350
00:20:35,480 --> 00:20:40,560
So oftentimes, works were created by using camera obscura, where you just have this box,

351
00:20:40,560 --> 00:20:44,220
you have a pinhole, you have the subject outside and the artist inside.

352
00:20:44,220 --> 00:20:49,440
And although the image is inverted, you just trace over what it is that you're seeing.

353
00:20:49,440 --> 00:20:52,280
And then that would give you a foundation upon which to build.

354
00:20:52,280 --> 00:20:55,240
Now, some people think of this as cheating.

355
00:20:55,240 --> 00:20:56,600
I don't think it's cheating.

356
00:20:56,600 --> 00:20:59,340
It's just you utilizing the technologies that were available.

357
00:20:59,340 --> 00:21:05,240
So as far back as you can look, writers, artists have always used whatever the latest tech

358
00:21:05,240 --> 00:21:12,040
is to, you know, further their craft, whether it's the stage, whether

359
00:21:12,040 --> 00:21:16,240
it's music or art or, in our case, writing.

360
00:21:16,240 --> 00:21:20,640
So I think we just have to figure out ways of how do we integrate these technologies

361
00:21:20,640 --> 00:21:22,620
into what it is that we're doing.

362
00:21:22,620 --> 00:21:27,400
But I think that the losing game is all of these people that are really into no,

363
00:21:27,400 --> 00:21:31,200
no AI, you know, no AI art, you know, no AI anything.

364
00:21:31,200 --> 00:21:33,840
And I'm like, well, you're missing out.

365
00:21:33,840 --> 00:21:39,060
Because I treat AI like a buddy that sits on my desk and I say, hey, does this scene

366
00:21:39,060 --> 00:21:40,240
make any sense to you?

367
00:21:40,240 --> 00:21:43,600
And I'll put the scene in and ask, how can I make this scene better in terms of tone,

368
00:21:43,600 --> 00:21:45,480
in terms of what's going on?

369
00:21:45,480 --> 00:21:50,000
And so first I make sure that it, in quotation marks, understands the scene.

370
00:21:50,000 --> 00:21:52,120
And then I start to improve it.

371
00:21:52,120 --> 00:21:57,780
So I don't think it's a matter of, you know, cutting and pasting from your LLM.

372
00:21:57,780 --> 00:22:00,580
It's working with the LLM to make your writing better.

373
00:22:00,580 --> 00:22:04,560
And some of the suggestions I've gotten are not exactly what I want, but they

374
00:22:04,560 --> 00:22:07,240
give me ideas to write something else.

375
00:22:07,240 --> 00:22:12,520
And that's kind of how I think about a lot of these tools and creativity.

376
00:22:12,520 --> 00:22:20,060
I call it like a gumbo starter for, you know, your creativity or your writing.

377
00:22:20,060 --> 00:22:23,760
So anybody that's ever made gumbo knows: I'm terrible at making a roux, which to me is like the

378
00:22:23,760 --> 00:22:25,340
hardest part of making gumbo.

379
00:22:25,340 --> 00:22:27,600
So I usually use a starter mix, right?

380
00:22:27,600 --> 00:22:32,540
Well, that's kind of how I think about like generative AI art, LLMs, things like that.

381
00:22:32,540 --> 00:22:36,560
They can sort of be a booster to your creative process.

382
00:22:36,560 --> 00:22:38,800
They can help it along.

383
00:22:38,800 --> 00:22:46,800
And between the AI tools that I use, there's one I use in Word, it's

384
00:22:46,800 --> 00:22:50,320
called, hang on a second, I actually can't remember.

385
00:22:50,320 --> 00:22:53,460
It's called ProWritingAid.

386
00:22:53,460 --> 00:22:55,800
So ProWritingAid is what I use.

387
00:22:55,800 --> 00:23:00,200
And that's completely replaced the need for an editor for me.

388
00:23:00,200 --> 00:23:04,560
And then for everything else, I use a custom GPT that's been trained on things that I've

389
00:23:04,560 --> 00:23:05,560
written.

390
00:23:05,560 --> 00:23:09,120
I've been trying to capture my voice in the LLM.

391
00:23:09,120 --> 00:23:11,280
So I trained it on all this stuff.

392
00:23:11,280 --> 00:23:16,800
Poems I wrote, blog articles that I wrote in the past, old screenplays.

393
00:23:16,800 --> 00:23:23,840
I dumped it all into the custom GPT so that when I'm going through scenes, I can say,

394
00:23:23,840 --> 00:23:27,000
hey, how can I make this scene better?

395
00:23:27,000 --> 00:23:28,000
And it'll give me suggestions.

396
00:23:28,000 --> 00:23:30,840
And then I take some of those suggestions and some of me.

397
00:23:30,840 --> 00:23:39,840
And that's how I've been working since I integrated some of these tools into my work.
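
For what it's worth, you can approximate the setup he describes without a custom GPT by putting writing samples into the system prompt of a plain chat-completion call. A minimal sketch, assuming the openai Python SDK (1.x), an OPENAI_API_KEY in the environment, and a my_writing/ folder of text files; the model name is an arbitrary choice, not his.

    from pathlib import Path
    from openai import OpenAI

    client = OpenAI()  # reads OPENAI_API_KEY from the environment

    # Past poems, blog posts, old screenplays: the "voice" material.
    voice_samples = "\n\n---\n\n".join(
        p.read_text(encoding="utf-8") for p in Path("my_writing").glob("*.txt")
    )

    def suggest_improvements(scene: str) -> str:
        """Ask for scene suggestions grounded in the author's own samples."""
        resp = client.chat.completions.create(
            model="gpt-4o",  # assumption; use whatever model you prefer
            messages=[
                {"role": "system",
                 "content": "You are a fiction-writing assistant. Match the "
                            "author's voice as shown in these samples:\n\n"
                            + voice_samples},
                {"role": "user",
                 "content": "How can I make this scene better in terms of "
                            "tone and pacing?\n\n" + scene},
            ],
        )
        return resp.choices[0].message.content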

398
00:23:39,840 --> 00:23:45,160
And with regard to art, I used to just go to DeviantArt and I would just sort of randomly

399
00:23:45,160 --> 00:23:49,320
look around at different artists, try to get ideas.

400
00:23:49,320 --> 00:23:54,000
Now I can actually describe exactly what it is that I want to see.

401
00:23:54,000 --> 00:23:58,800
I can generate that using something like Lexica, which I use often.

402
00:23:58,800 --> 00:24:00,480
And then I just keep those images around.

403
00:24:00,480 --> 00:24:02,760
I'll make a bunch of different versions.

404
00:24:02,760 --> 00:24:10,560
And between the AI art and the music and just getting myself in that zone, the story just

405
00:24:10,560 --> 00:24:11,560
starts to come.

406
00:24:11,560 --> 00:24:16,040
It's like literally just turning a faucet on and the story starts coming.

407
00:24:16,040 --> 00:24:25,040
I've noticed that the AI image generators came on the scene a little bit ahead of Chat

408
00:24:25,040 --> 00:24:26,040
GPT.

409
00:24:26,040 --> 00:24:30,160
And GPT-2, GPT-3, and 3.5.

410
00:24:30,160 --> 00:24:34,500
When they put out ChatGPT, that's when a lot of people first woke

411
00:24:34,500 --> 00:24:40,400
up to transformer-based LLMs.

412
00:24:40,400 --> 00:24:44,720
And so there was the pushback from the artists.

413
00:24:44,720 --> 00:24:53,280
And mostly I think it was pushback from aspiring artists against AI image generation.

414
00:24:53,280 --> 00:24:58,680
That came a little bit before the current AI pushback, which I think is now oriented

415
00:24:58,680 --> 00:25:04,520
more toward text-based or text generation.

416
00:25:04,520 --> 00:25:11,960
And I remember seeing really rapid improvement in the quality of AI image generation over

417
00:25:11,960 --> 00:25:15,700
the course of 2022 and 2023.

418
00:25:15,700 --> 00:25:17,400
And I was using it a lot for a while.

419
00:25:17,400 --> 00:25:21,440
And I just realized I don't use those things very often anymore.

420
00:25:21,440 --> 00:25:24,280
I'm just not all that interested.

421
00:25:24,280 --> 00:25:28,840
I noticed you're talking about your custom GPTs and whatnot.

422
00:25:28,840 --> 00:25:34,800
And I just realized that we have a new cultural split over technology, I think, that will

423
00:25:34,800 --> 00:25:35,800
be emerging.

424
00:25:35,800 --> 00:25:41,200
There's the PC versus Mac or iPhone versus Android.

425
00:25:41,200 --> 00:25:42,960
And you're one of those OpenAI people.

426
00:25:42,960 --> 00:25:45,480
And I'm over on Claude at Anthropic.

427
00:25:45,480 --> 00:25:47,720
I don't much care for OpenAI.

428
00:25:47,720 --> 00:25:51,880
I don't like just the off-the-shelf voice of ChatGPT.

429
00:25:51,880 --> 00:25:53,880
I much prefer Claude.

430
00:25:53,880 --> 00:25:54,880
Yeah.

431
00:25:54,880 --> 00:25:58,440
Also, the device I was thinking of is called a camera lucida.

432
00:25:58,440 --> 00:26:04,600
So this is like a portable optical tool that was used, I think it was patented back in

433
00:26:04,600 --> 00:26:06,360
1806.

434
00:26:06,360 --> 00:26:11,600
And it was used by the old masters in order to create that artwork where they were tracing

435
00:26:11,600 --> 00:26:14,920
over and that tracing would give them the foundation.

436
00:26:14,920 --> 00:26:17,280
And then from there, they would just move on.

437
00:26:17,280 --> 00:26:22,320
But it just shows how, if you really look at it, artists have always integrated new

438
00:26:22,320 --> 00:26:25,520
technologies and techniques into their work.

439
00:26:25,520 --> 00:26:28,520
And I think that we should probably continue to do that.

440
00:26:28,520 --> 00:26:33,720
And yes, I do consider writing to be an art form.

441
00:26:33,720 --> 00:26:35,440
You follow Brian Chau?

442
00:26:35,440 --> 00:26:38,920
No, no, I've never heard of him.

443
00:26:38,920 --> 00:26:41,200
Well, it's one of those worlds.

444
00:26:41,200 --> 00:26:45,160
There's so many people who are online saying good stuff.

445
00:26:45,160 --> 00:26:46,960
You can't follow them all.

446
00:26:46,960 --> 00:26:55,360
But he was talking about basically the sort of provincial and small-minded response to

447
00:26:55,360 --> 00:27:04,440
AI-generated text, and his opinion is that the ideas are what's important.

448
00:27:04,440 --> 00:27:10,980
And if all you have is sort of your style of expressing the ideas and that style can

449
00:27:10,980 --> 00:27:15,920
be duplicated algorithmically, well, then you don't really have much to offer.

450
00:27:15,920 --> 00:27:21,400
And he's not very sympathetic to people who are basically filling up moats and defending

451
00:27:21,400 --> 00:27:26,920
their territory against transgression by artificial intelligence.

452
00:27:26,920 --> 00:27:29,040
And I'm not sure I entirely agree with that.

453
00:27:29,040 --> 00:27:32,040
But I wrote about this just recently.

454
00:27:32,040 --> 00:27:37,000
I'm not tempted to have an AI write anything for me because the stuff that it puts out

455
00:27:37,000 --> 00:27:40,520
is just sort of bland and generic and not very interesting.

456
00:27:40,520 --> 00:27:44,400
And I've noticed that if I'm on Substack and I come across something that

457
00:27:44,400 --> 00:27:48,200
feels like it's AI generated to me, I just move on to something else.

458
00:27:48,200 --> 00:27:55,120
So I'm not remotely tempted to have an AI just create something for me that I then put

459
00:27:55,120 --> 00:27:56,320
up on the web.

460
00:27:56,320 --> 00:27:58,160
That just kind of seems like a waste of time.

461
00:27:58,160 --> 00:28:00,200
Yeah, it's a complete waste of time.

462
00:28:00,200 --> 00:28:02,800
And that's not really the way that you use it.

463
00:28:02,800 --> 00:28:07,640
The way that I think about how you use AI in the same way that you would have a friend

464
00:28:07,640 --> 00:28:11,800
that's hanging around at your house, you say, hey, does this read this thing for me and

465
00:28:11,800 --> 00:28:13,640
tell me, does this make any sense?

466
00:28:13,640 --> 00:28:14,840
And how can I improve it?

467
00:28:14,840 --> 00:28:17,480
I look at it as being no different from that.

468
00:28:17,480 --> 00:28:21,400
Yeah, it's something to help you improve what it is that you're doing.

469
00:28:21,400 --> 00:28:24,920
It's not something to replace what you're doing.

470
00:28:24,920 --> 00:28:28,280
I mean, now look, there are going to be a lot of people, and there are

471
00:28:28,280 --> 00:28:34,320
now, that are taking AI-generated books and just putting them up on Kindle.

472
00:28:34,320 --> 00:28:38,440
And it's kind of like their thing.

473
00:28:38,440 --> 00:28:40,560
I just don't think that that's the way to go.

474
00:28:40,560 --> 00:28:42,360
That's not a proper use of AI.

475
00:28:42,360 --> 00:28:46,360
AI should be sparking your imagination.

476
00:28:46,360 --> 00:28:52,000
It should be helping you be more creative and a better writer, not doing the work for

477
00:28:52,000 --> 00:28:53,000
you.

478
00:28:53,000 --> 00:28:54,000
That's not the idea.

479
00:28:54,000 --> 00:28:58,960
Now, if capabilities continue to improve, though, a few years from now, we might be

480
00:28:58,960 --> 00:29:03,120
having this conversation and we might be saying, you know, I notice I'm not reading

481
00:29:03,120 --> 00:29:09,280
any human written stuff anymore, that pretty much everything I read is completely AI generated.

482
00:29:09,280 --> 00:29:11,120
And at that point, it's a very different story.

483
00:29:11,120 --> 00:29:17,120
I mean, then, you know, once AI is doing things like that better than the best humans,

484
00:29:17,120 --> 00:29:18,840
we live in a very different world.

485
00:29:18,840 --> 00:29:20,320
I think we're very formulaic.

486
00:29:20,320 --> 00:29:28,100
I mean, you know, I think sports stories have been written by AI since the 90s.

487
00:29:28,100 --> 00:29:32,280
Because with sports stories, like when you're reporting on baseball, it's very, very formulaic in

488
00:29:32,280 --> 00:29:33,420
terms of what happens.

489
00:29:33,420 --> 00:29:35,640
There's not like a zillion things that can happen.

490
00:29:35,640 --> 00:29:39,980
There's only, let's just say, I don't know, maybe a thousand different things

491
00:29:39,980 --> 00:29:41,280
that could possibly happen.

492
00:29:41,280 --> 00:29:43,880
So it's not an infinite number of options.

493
00:29:43,880 --> 00:29:50,420
And I remember reading articles in 1999 where they were saying that a lot of sports

494
00:29:50,420 --> 00:29:53,000
articles are just automatically generated.

495
00:29:53,000 --> 00:29:58,080
These were really super, you know, primitive as compared to what we have today with LLMs

496
00:29:58,080 --> 00:30:00,960
and I'm not sure what the model was at that time.

497
00:30:00,960 --> 00:30:04,600
I think they were just using pre-programmed rules.

498
00:30:04,600 --> 00:30:09,800
But you know, if a score is made, the score is made by a person.

499
00:30:09,800 --> 00:30:14,520
If something happens, it's done in a particular way, and all that

500
00:30:14,520 --> 00:30:15,600
can be scripted out.

501
00:30:15,600 --> 00:30:21,680
So there's a lot of AI generated content out there that's been out there in different forms.

502
00:30:21,680 --> 00:30:25,000
But I just think it's going to get increasingly difficult to have

503
00:30:25,000 --> 00:30:30,760
this sort of, you know, no AI stance, especially when it's being integrated into absolutely

504
00:30:30,760 --> 00:30:31,760
everything.

505
00:30:31,760 --> 00:30:34,920
I mean, it's being integrated into Office.

506
00:30:34,920 --> 00:30:39,240
It's already integrated into many Adobe products.

507
00:30:39,240 --> 00:30:45,040
You know, it just seems like it's going to be difficult to have a no AI stance

508
00:30:45,040 --> 00:30:49,160
realistically, especially if it's integrated everywhere and into everything.

509
00:30:49,160 --> 00:30:50,160
Yeah.

510
00:30:50,160 --> 00:30:54,280
I mean, there are certain conversations I see repeated online again and again.

511
00:30:54,280 --> 00:30:57,320
And I don't, you know, insert myself into them.

512
00:30:57,320 --> 00:31:01,120
I just kind of stand back and watch people carry out the sort of rote moves.

513
00:31:01,120 --> 00:31:06,520
But, you know, lots of people, they just post on social media declaring they will never

514
00:31:06,520 --> 00:31:08,560
use AI for anything.

515
00:31:08,560 --> 00:31:13,240
And invariably somebody else posts, well, not that you know of, but, you know, you are

516
00:31:13,240 --> 00:31:17,440
interacting with these systems already and that the number of them that you will be interacting

517
00:31:17,440 --> 00:31:20,160
with without even knowing it will increase in the future.

518
00:31:20,160 --> 00:31:21,160
Yeah.

519
00:31:21,160 --> 00:31:27,920
And I'm also not sure what people think they're achieving by having like a no AI stance.

520
00:31:27,920 --> 00:31:31,440
I remember I was having a debate, and have you ever had a debate with someone where you realize

521
00:31:31,440 --> 00:31:36,920
a couple of seconds in that, all right, this is probably not worth it.

522
00:31:36,920 --> 00:31:39,880
His entire argument was that it's missing something.

523
00:31:39,880 --> 00:31:40,880
It doesn't have soul.

524
00:31:40,880 --> 00:31:45,000
And I'm like, oh, God, we're going to have that conversation.

525
00:31:45,000 --> 00:31:48,400
But I don't know.

526
00:31:48,400 --> 00:31:54,520
I think part of my frustration with all of this, all of the talk around AI, is

527
00:31:54,520 --> 00:31:58,860
that everyone's talking with this degree of certainty that they shouldn't have.

528
00:31:58,860 --> 00:32:00,600
This is a new thing.

529
00:32:00,600 --> 00:32:02,680
I don't know how this is going to turn out.

530
00:32:02,680 --> 00:32:05,280
I mean, we're all riding on this big wave.

531
00:32:05,280 --> 00:32:07,560
It's like, who knows where this thing's going?

532
00:32:07,560 --> 00:32:13,140
You know, and then there's other times where I almost feel like we're flying down the road.

533
00:32:13,140 --> 00:32:15,240
The headlights aren't working.

534
00:32:15,240 --> 00:32:18,780
You know, there might be a turn coming up, but we don't know.

535
00:32:18,780 --> 00:32:22,080
You look to the guy next to you, say, hey, should we slow down?

536
00:32:22,080 --> 00:32:24,920
And the answer is like, the Chinese aren't going to slow down.

537
00:32:24,920 --> 00:32:28,480
We're just barreling down the road.

538
00:32:28,480 --> 00:32:30,280
You know, you got no headlights.

539
00:32:30,280 --> 00:32:32,040
You can barely see.

540
00:32:32,040 --> 00:32:34,700
And sometimes it feels like that's where we are.

541
00:32:34,700 --> 00:32:41,080
But I think that's what it feels like when you're in the middle of like a transformational

542
00:32:41,080 --> 00:32:44,720
time period like the one that we're living in right now.

543
00:32:44,720 --> 00:32:49,200
Well, the dynamic you just described, I think, is a terminal race condition.

544
00:32:49,200 --> 00:32:50,200
Yeah.

545
00:32:50,200 --> 00:32:51,200
Yeah.

546
00:32:51,200 --> 00:32:54,480
Where you want to stop, you want to slow down, you want to proceed with more caution, but

547
00:32:54,480 --> 00:32:59,480
you're in competition with somebody who has, you know, the same competitive motive to cut

548
00:32:59,480 --> 00:33:02,680
every corner and to just push the pedal to the metal.

549
00:33:02,680 --> 00:33:05,200
Yeah, I think it was Mark Zuckerberg that had that argument.

550
00:33:05,200 --> 00:33:06,520
He's like, well, the Chinese are going to do it.

551
00:33:06,520 --> 00:33:08,840
I'm like, so basically that's going to be the standard now.

552
00:33:08,840 --> 00:33:11,960
Like, you know, it's like, are we going to stop drinking the poison?

553
00:33:11,960 --> 00:33:14,680
Well, the Chinese aren't going to stop drinking.

554
00:33:14,680 --> 00:33:20,000
Yeah, we live through that in the 20th century with nuclear weapons.

555
00:33:20,000 --> 00:33:24,880
Yeah, we're going to spend how much to build these things that do what?

556
00:33:24,880 --> 00:33:27,680
And it's like, well, we have to because the Soviets are doing it.

557
00:33:27,680 --> 00:33:29,000
Yeah, that was the idea.

558
00:33:29,000 --> 00:33:30,480
I mean, it's still that way now.

559
00:33:30,480 --> 00:33:36,800
Like, I would be willing to bet that we're going to have a lot of changes in our law

560
00:33:36,800 --> 00:33:38,600
in order to address drone overflights.

561
00:33:38,600 --> 00:33:43,040
So I don't know if you've been keeping up with what's been going on in New Jersey, but

562
00:33:43,040 --> 00:33:45,440
they have these drones and then there's confusion.

563
00:33:45,440 --> 00:33:47,840
Is it UAPs or is it drones?

564
00:33:47,840 --> 00:33:53,160
And there's a limitation on what the military can and can't do because of the Posse Comitatus

565
00:33:53,160 --> 00:33:54,160
Act.

566
00:33:54,160 --> 00:33:59,800
So I'm pretty sure that we have some law changes coming to allow the military to

567
00:33:59,800 --> 00:34:03,320
be able to respond. And I thought that they had this ability already.

568
00:34:03,320 --> 00:34:07,020
But I started researching and found out, no, it's actually rather complicated.

569
00:34:07,020 --> 00:34:13,080
And also, from other information security folks that I know who are military: part of the

570
00:34:13,080 --> 00:34:18,920
reason why sometimes they will allow these overflights, and this infuriated a lot of people

571
00:34:18,920 --> 00:34:24,880
when the Chinese balloon was allowed to overfly the United States, is that although they're

572
00:34:24,880 --> 00:34:30,240
capturing data on you, you're also capturing

573
00:34:30,240 --> 00:34:31,880
data on them.

574
00:34:31,880 --> 00:34:39,680
So it's a two way data capture opportunity for both the aggressor and the person that's

575
00:34:39,680 --> 00:34:41,980
trying to defend against the aggressor.

576
00:34:41,980 --> 00:34:46,580
Unfortunately, these things aren't explained in the press all the time.

577
00:34:46,580 --> 00:34:53,080
But my understanding is that that's why they will sometimes allow surveillance overflights

578
00:34:53,080 --> 00:34:57,960
of these balloons because they want to do data capture.

579
00:34:57,960 --> 00:35:04,600
I would be interested in hearing how AI is changing your work in cybersecurity.

580
00:35:04,600 --> 00:35:14,400
Well, I can tell you that a lot of the tools that we're using leverage AI, and

581
00:35:14,400 --> 00:35:17,100
I'm using it in order to understand logs.

582
00:35:17,100 --> 00:35:22,120
So a lot of times when you have to pore through a log file, I'll just dump it into the AI

583
00:35:22,120 --> 00:35:25,720
and say, hey, give me a summary of exactly what this means.

584
00:35:25,720 --> 00:35:28,400
And it's really helpful for things like that.
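
That log-triage step reduces to something like the sketch below, again assuming the openai SDK. The model name and the crude truncation limit are stand-ins; in practice you would chunk large logs rather than truncate them.

    from openai import OpenAI

    client = OpenAI()
    MAX_LOG_CHARS = 30_000  # crude guard so a huge log doesn't blow the context window

    def summarize_log(path: str) -> str:
        """Dump a log file into the model and ask for an analyst-style summary."""
        with open(path, encoding="utf-8", errors="replace") as f:
            log_text = f.read()[:MAX_LOG_CHARS]
        resp = client.chat.completions.create(
            model="gpt-4o",  # assumption
            messages=[
                {"role": "system",
                 "content": "You are assisting a security analyst. Summarize what "
                            "this log shows: notable errors, authentication "
                            "failures, and anything anomalous."},
                {"role": "user", "content": log_text},
            ],
        )
        return resp.choices[0].message.content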

585
00:35:28,400 --> 00:35:34,680
It's also helpful for things like email security.

586
00:35:34,680 --> 00:35:42,800
So we use certain tools that, if they come across an email they can't determine

587
00:35:42,800 --> 00:35:47,160
a verdict for, will kick it over to us for human review.

588
00:35:47,160 --> 00:35:51,600
And then we'll go through it, run it through Joe Sandbox and other tools like VirusTotal,

589
00:35:51,600 --> 00:35:52,880
in order to see what's wrong with it.

590
00:35:52,880 --> 00:35:57,500
And then if we find something wrong with it, we can report back or we can score it and

591
00:35:57,500 --> 00:35:59,240
come up with our own verdict.

592
00:35:59,240 --> 00:36:04,160
And with all of the different users everywhere doing that, you're also helping to train their

593
00:36:04,160 --> 00:36:06,760
AI model on how to better identify things.
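
The routing he describes boils down to a verdict pipeline along these lines. Everything here is hypothetical glue: detonate() and reputation() stand in for whatever your Joe Sandbox and VirusTotal integrations actually expose, and the thresholds are made up.

    from dataclasses import dataclass
    from enum import Enum
    from typing import Callable, Optional

    class Verdict(Enum):
        CLEAN = "clean"
        MALICIOUS = "malicious"
        HUMAN_REVIEW = "human_review"

    @dataclass
    class Email:
        message_id: str
        ml_score: Optional[float]  # None when the tool can't reach a verdict

    def triage(email: Email,
               detonate: Callable[[str], bool],
               reputation: Callable[[str], float]) -> Verdict:
        """Auto-verdict when the tool is confident; otherwise escalate, letting
        sandbox detonation and reputation checks inform the analyst's call."""
        if email.ml_score is None:
            # The analyst path: detonate in the sandbox, check reputation,
            # score it, and feed the verdict back to help train the model.
            if detonate(email.message_id) or reputation(email.message_id) > 0.5:
                return Verdict.MALICIOUS
            return Verdict.HUMAN_REVIEW
        return Verdict.MALICIOUS if email.ml_score > 0.8 else Verdict.CLEAN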

594
00:36:06,760 --> 00:36:14,840
So in terms of AI threats, I have noticed that phishing emails have vastly improved

595
00:36:14,840 --> 00:36:19,560
in terms of like spelling issues and grammar issues.

596
00:36:19,560 --> 00:36:24,480
The only thing that it can't seem to get perfect is certain cultural things.

597
00:36:24,480 --> 00:36:30,680
So I saw a scam email a while back that talked about something called the US Department of

598
00:36:30,680 --> 00:36:31,680
Income.

599
00:36:31,680 --> 00:36:34,160
And I'm like, well, we don't have a US Department of Income.

600
00:36:34,160 --> 00:36:36,480
We have the IRS.

601
00:36:36,480 --> 00:36:41,440
So they're just not clear on what our departments are, which is odd since that's public information.

602
00:36:41,440 --> 00:36:52,880
But yeah, we haven't seen, or let me say I haven't seen, too many specifically AI-related threats,

603
00:36:52,880 --> 00:36:57,280
but we have done a ton of research on potential future threats.

604
00:36:57,280 --> 00:37:02,200
So one of the things I did, I was involved in a project at my work where we wanted to

605
00:37:02,200 --> 00:37:09,520
show, hey, could someone simulate the voice of someone in leadership and then call into support

606
00:37:09,520 --> 00:37:14,040
to try to get them to do something, like, say, turn off 2FA.

607
00:37:14,040 --> 00:37:19,680
And so I created a couple of different scenarios and they played them to the board and they

608
00:37:19,680 --> 00:37:21,120
were slightly horrified.

609
00:37:21,120 --> 00:37:28,420
And so we started to take some measures and put in some tools and policies to try to mitigate

610
00:37:28,420 --> 00:37:29,420
against this.

611
00:37:29,420 --> 00:37:30,760
But again, it's a moving target.

612
00:37:30,760 --> 00:37:36,700
I also did, I think, another demo where the president of the company is explaining

613
00:37:36,700 --> 00:37:44,640
to employees why he wants them to follow local law enforcement dictates to move away from

614
00:37:44,640 --> 00:37:47,600
the coast because the creature is now attacking San Francisco.

615
00:37:47,600 --> 00:37:51,640
So I did Godzilla attacking San Francisco.

616
00:37:51,640 --> 00:37:54,480
And yeah, that was a fun project.

617
00:37:54,480 --> 00:37:57,640
It was fun to kind of terrify the board like that.

618
00:37:57,640 --> 00:38:00,200
But this stuff is really easy to do.

619
00:38:00,200 --> 00:38:04,920
And all of the voice samples that I used were just captured from YouTube; he's all over YouTube.

620
00:38:04,920 --> 00:38:11,240
So I mean, I just captured his voice from YouTube videos, from investor calls, things

621
00:38:11,240 --> 00:38:12,240
like that.

622
00:38:12,240 --> 00:38:13,240
And that was enough.

623
00:38:13,240 --> 00:38:14,360
You don't need a whole lot.

624
00:38:14,360 --> 00:38:19,200
I think I used seven samples of his voice and they were relatively high quality.

625
00:38:19,200 --> 00:38:23,960
And I was able to replicate his voice and make leadership say whatever I wanted them to say.

626
00:38:23,960 --> 00:38:26,240
It's not difficult at all.

627
00:38:26,240 --> 00:38:31,840
And I think there was a case where someone, there was like a fake kidnapping and ransom

628
00:38:31,840 --> 00:38:36,760
case, someone thought that their child was being held for ransom and they were just simulating

629
00:38:36,760 --> 00:38:37,760
the voice.

630
00:38:37,760 --> 00:38:43,200
So I suspect that a new cultural norm will emerge soon where we will either not share

631
00:38:43,200 --> 00:38:47,480
our names or we will have, I already have this set up with my family, but we will have

632
00:38:47,480 --> 00:38:54,240
certain phrases so that we can distinguish between ourselves and an artificially generated

633
00:38:54,240 --> 00:38:56,880
version of ourselves.
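
A fixed family phrase can be overheard and replayed, so a slightly stronger version of the same idea is challenge-response: one side invents a fresh challenge, the other answers with something derived from a secret agreed in person. Here's a toy sketch in Python with an obviously hypothetical secret; nobody computes an HMAC by hand mid-call, but it shows why a per-call answer beats a static phrase.

    # Toy challenge-response on top of a family shared secret, so the spoken
    # proof is different on every call and can't simply be replayed.
    import hmac
    import hashlib
    import secrets

    SHARED_SECRET = b"hypothetical-family-secret"  # agreed in person, never said on a call

    def make_challenge() -> str:
        # The worried side invents and reads out a fresh random challenge.
        return secrets.token_hex(4)

    def response_for(challenge: str) -> str:
        # The other side answers with a short code derived from the secret.
        digest = hmac.new(SHARED_SECRET, challenge.encode(), hashlib.sha256)
        return digest.hexdigest()[:6]

    challenge = make_challenge()
    print("challenge:", challenge)
    print("expected response:", response_for(challenge))

In practice a family would just agree on a question-and-answer pair, but the structure is the same: the verifier supplies something fresh, and the answer depends on something a voice clone doesn't have.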

634
00:38:56,880 --> 00:39:02,080
You know, I've been podcasting since 2006 and I've done well over a thousand podcasts

635
00:39:02,080 --> 00:39:03,560
for various different shows.

636
00:39:03,560 --> 00:39:05,400
I've been a guest on different shows.

637
00:39:05,400 --> 00:39:08,080
And so my voice is out there.

638
00:39:08,080 --> 00:39:12,160
I know that anybody who wants to can clone it at any time.

639
00:39:12,160 --> 00:39:14,380
There's nothing I can do about that.

640
00:39:14,380 --> 00:39:17,320
You mentioned email.

641
00:39:17,320 --> 00:39:20,080
I've had the same email address since 1995.

642
00:39:20,080 --> 00:39:32,200
It now forwards to my Gmail address, and I have 302,572 unread emails in my

643
00:39:32,200 --> 00:39:34,520
inbox.

644
00:39:34,520 --> 00:39:36,240
There's so much stuff in my inbox.

645
00:39:36,240 --> 00:39:40,600
I don't really like go through it and, you know, read everything.

646
00:39:40,600 --> 00:39:45,720
I just sort of glance through it looking for familiar names, basically.

647
00:39:45,720 --> 00:39:51,840
Google is okay at like putting this little yellow flag on things that it thinks might

648
00:39:51,840 --> 00:39:58,200
be legitimate or, you know, worth my interest, but it's moved from being, hey, this is suspicious

649
00:39:58,200 --> 00:40:02,460
to hey, out of this huge tsunami of crap, here's a couple of things that you should

650
00:40:02,460 --> 00:40:04,760
probably open and look at.

651
00:40:04,760 --> 00:40:09,960
And I notice that, like, once in a blue moon, I'll go into my spam folder and I'll

652
00:40:09,960 --> 00:40:14,840
find things in there that really shouldn't have been, you know, tagged as spam.

653
00:40:14,840 --> 00:40:20,920
But as far as I can tell, Google's getting pretty good at filtering this stuff for me.

654
00:40:20,920 --> 00:40:25,160
And I never see like scam emails anymore.

655
00:40:25,160 --> 00:40:30,520
But what I see a lot of are thirst traps, you know, phishing, like catfishing, they call

656
00:40:30,520 --> 00:40:32,080
it or pig fattening.

657
00:40:32,080 --> 00:40:35,360
The pig butchering.

658
00:40:35,360 --> 00:40:39,440
Coming in via my phone, you know, through text messages.

659
00:40:39,440 --> 00:40:45,740
And in one respect, you know, it's obviously annoying and infuriating.

660
00:40:45,740 --> 00:40:50,440
A lot of times the people who are doing this, you know, it's like a sweatshop sort of thing.

661
00:40:50,440 --> 00:40:55,200
They've got a board with, you know, 40 phones strapped down to it.

662
00:40:55,200 --> 00:41:00,520
And they're just sending these automated messages, you know, these scripted messages to huge

663
00:41:00,520 --> 00:41:04,720
lists of, you know, of numbers that they purchased somewhere.

664
00:41:04,720 --> 00:41:09,040
But the people actually doing the work are, you know, a lot of times they're trafficked,

665
00:41:09,040 --> 00:41:11,400
they're not the scammers themselves.

666
00:41:11,400 --> 00:41:13,000
The scammers are their bosses.

667
00:41:13,000 --> 00:41:15,520
Yeah, I've seen those setups.

668
00:41:15,520 --> 00:41:17,360
It's the most cyberpunk thing you've ever seen.

669
00:41:17,360 --> 00:41:21,360
It's guys sitting around, you know, like you say, in front of boards of phones. That's exactly how

670
00:41:21,360 --> 00:41:22,480
it happens.

671
00:41:22,480 --> 00:41:24,960
And they're running these scams.

672
00:41:24,960 --> 00:41:29,640
There are also situations; I was aware of one where this one individual in

673
00:41:29,640 --> 00:41:31,540
India was running two businesses.

674
00:41:31,540 --> 00:41:36,320
One was a legit, like service desk type business.

675
00:41:36,320 --> 00:41:39,960
But the other one was an illegitimate business, and they were in the same building, same floor,

676
00:41:39,960 --> 00:41:43,200
but divided between the two sides of the building.

677
00:41:43,200 --> 00:41:48,040
So the illegitimate business was on one side, the legit was on the other.

678
00:41:48,040 --> 00:41:49,960
And this kind of stuff apparently is common.

679
00:41:49,960 --> 00:41:55,760
I had a boss who used to work with Interpol to actually go and, you know, knock down doors.

680
00:41:55,760 --> 00:42:01,160
And he said that when you get to these places in Eastern Europe where they're operating,

681
00:42:01,160 --> 00:42:08,640
he says it's a business by every visual standard, you have to badge in, they have HR, there's

682
00:42:08,640 --> 00:42:10,560
catered lunch.

683
00:42:10,560 --> 00:42:15,000
In every way, it looks like a startup company, but it's a scam company.

684
00:42:15,000 --> 00:42:16,760
The company just runs scams.

685
00:42:16,760 --> 00:42:20,200
So, there's this movie called The Beekeeper.

686
00:42:20,200 --> 00:42:21,520
I don't know if you saw it.

687
00:42:21,520 --> 00:42:24,480
I wish I could say that was an exaggeration.

688
00:42:24,480 --> 00:42:26,680
It's kind of what those places are like.

689
00:42:26,680 --> 00:42:30,040
Probably not as glitzy, but it's pretty darn close.

690
00:42:30,040 --> 00:42:37,800
I haven't seen The Beekeeper, but I know it's a Jason Statham revenge flick and that it

691
00:42:37,800 --> 00:42:43,960
does incorporate a lot of the topics, the subject matter, that we're discussing here.

692
00:42:43,960 --> 00:42:52,080
But a point I was trying to get around to was that with these catfishing schemes, the

693
00:42:52,080 --> 00:42:57,720
fact that they keep popping up on my phone, I get several every single day,

694
00:42:57,720 --> 00:43:03,640
it's actually kind of encouraging, because I can see at a glance what's happening, which

695
00:43:03,640 --> 00:43:05,800
means that they're not that sophisticated yet.

696
00:43:05,800 --> 00:43:09,360
Eventually, there's going to come a time when they will fool me.

697
00:43:09,360 --> 00:43:12,680
Yeah, yeah.

698
00:43:12,680 --> 00:43:15,840
I think it's just going to be a back and forth kind of thing.

699
00:43:15,840 --> 00:43:23,920
It almost feels like Spy vs. Spy, where the protection methods will improve and then

700
00:43:23,920 --> 00:43:27,680
their attack techniques will change.

701
00:43:27,680 --> 00:43:29,800
And we'll just kind of go back and forth.

702
00:43:29,800 --> 00:43:36,800
But yeah, it's going to be challenging and it is challenging.

703
00:43:36,800 --> 00:43:40,600
How is it that we're ultimately going to deal with it?

704
00:43:40,600 --> 00:43:43,520
I don't think that law is the way to do this.

705
00:43:43,520 --> 00:43:48,000
I think that there are things that industry can do in order to address it, but law just

706
00:43:48,000 --> 00:43:49,960
moves way too slow.

707
00:43:49,960 --> 00:43:56,560
And I think that a lot of the issues that we have are because different things are moving

708
00:43:56,560 --> 00:43:57,920
at different rates.

709
00:43:57,920 --> 00:44:02,160
So I kind of imagine it as a giant phonograph record that's spinning.

710
00:44:02,160 --> 00:44:03,160
Technology is almost at the center.

711
00:44:03,160 --> 00:44:05,480
It's the thing that's moving the fastest, right?

712
00:44:05,480 --> 00:44:07,280
Then you have the general public.

713
00:44:07,280 --> 00:44:08,400
They're sort of like in the middle.

714
00:44:08,400 --> 00:44:10,320
They're not too slow.

715
00:44:10,320 --> 00:44:14,760
They are aware of some things, but they're certainly not moving as fast as, say, those

716
00:44:14,760 --> 00:44:17,780
that are closer to the center of the phonograph record.

717
00:44:17,780 --> 00:44:21,040
And then you've got the law, which is way out on the edge.

718
00:44:21,040 --> 00:44:23,080
And they're the slowest thing moving.

719
00:44:23,080 --> 00:44:25,880
And so all of these things are moving.

720
00:44:25,880 --> 00:44:30,840
Industry, politics, media, all of these things are moving at different rates.

721
00:44:30,840 --> 00:44:33,120
And we keep expecting them to align.

722
00:44:33,120 --> 00:44:36,240
And they never quite do.

723
00:44:36,240 --> 00:44:40,720
And I think that the other issue is that we used to treat all of these things separately.

724
00:44:40,720 --> 00:44:43,480
So there was the realm of finance.

725
00:44:43,480 --> 00:44:45,880
There was the realm of technology.

726
00:44:45,880 --> 00:44:47,760
There was the realm of politics, et cetera.

727
00:44:47,760 --> 00:44:49,920
And we had all of these separate realms.

728
00:44:49,920 --> 00:44:54,040
And now it feels like all of these things have converged, like you've taken all

729
00:44:54,040 --> 00:44:58,080
of the different colors of Play-Doh and mashed them together.

730
00:44:58,080 --> 00:45:02,240
And it's very difficult to disentangle these things, where it's like, well, finance and

731
00:45:02,240 --> 00:45:07,400
technology and politics are now all inextricably bound to one another.

732
00:45:07,400 --> 00:45:13,480
And I think that we still think of them as separate things, but they're not.

733
00:45:13,480 --> 00:45:18,340
They've kind of merged into this new thing that we don't really have a name for or really

734
00:45:18,340 --> 00:45:20,480
haven't quite recognized.

735
00:45:20,480 --> 00:45:25,140
And even in that binding, that's also changing.

736
00:45:25,140 --> 00:45:30,680
It's interesting to watch Hollywood undergo this very slow collapse.

737
00:45:30,680 --> 00:45:37,720
It's like watching a star turn dark around the edges, and it's just slowly starting to

738
00:45:37,720 --> 00:45:38,720
collapse.

739
00:45:38,720 --> 00:45:41,480
It's just really odd watching all of these things happen.

740
00:45:41,480 --> 00:45:43,820
And it seems like it's happening faster than ever.

741
00:45:43,820 --> 00:45:49,600
I was talking to a friend of mine about, doesn't this tech era feel like it's moving infinitely

742
00:45:49,600 --> 00:45:54,200
faster than what we remember in the 1990s?

743
00:45:54,200 --> 00:45:56,120
Timescales used to mean something.

744
00:45:56,120 --> 00:46:01,400
Like if we said three to five years, we had a fairly good idea about what's going to happen

745
00:46:01,400 --> 00:46:02,400
in three to five years.

746
00:46:02,400 --> 00:46:06,000
I have absolutely no idea what's going to happen in the next five years.

747
00:46:06,000 --> 00:46:09,360
I wouldn't even dare to guess.

748
00:46:09,360 --> 00:46:14,100
And so all of these timescales that used to mean something now don't really mean anything.

749
00:46:14,100 --> 00:46:19,080
It's all been compressed by just the speed of technology.

750
00:46:19,080 --> 00:46:23,240
So a lot of times I'll be saying, I'll catch myself in the middle of saying something like,

751
00:46:23,240 --> 00:46:24,240
you know what?

752
00:46:24,240 --> 00:46:25,240
I don't know.

753
00:46:25,240 --> 00:46:27,720
I don't know if that may be a possibility in the future.

754
00:46:27,720 --> 00:46:29,640
So that's just where we live now.

755
00:46:29,640 --> 00:46:35,360
We live in this place where the recognizable timescales that we used to plan with, we used to

756
00:46:35,360 --> 00:46:37,040
say, okay, three years, five years, 10 years.

757
00:46:37,040 --> 00:46:40,160
And we had a good idea as to what could be achieved in that time.

758
00:46:40,160 --> 00:46:42,880
Can you imagine anything in 10 years?

759
00:46:42,880 --> 00:46:45,720
Because I can't.

760
00:46:45,720 --> 00:46:50,320
In very broad strokes, but certainly not in the specifics, and it's the specifics that

761
00:46:50,320 --> 00:46:55,160
people don't anticipate that turn out to be really important, you know, that define eras.

762
00:46:55,160 --> 00:46:58,280
I mean, those are the black swan events.

763
00:46:58,280 --> 00:47:00,280
You are in the Pacific time zone.

764
00:47:00,280 --> 00:47:01,280
Are you in California?

765
00:47:01,280 --> 00:47:03,360
Yes, I am.

766
00:47:03,360 --> 00:47:09,720
So in your state, you know, a lot of the big AI companies are headquartered there.

767
00:47:09,720 --> 00:47:15,620
And the heads of those companies collaborated with the California legislature to come up

768
00:47:15,620 --> 00:47:21,240
with an AI safety bill that was passed and then your governor vetoed it.

769
00:47:21,240 --> 00:47:25,320
What's your experience of being a Californian and watching that happen?

770
00:47:25,320 --> 00:47:32,480
I mean, California has got a lot of different issues.

771
00:47:32,480 --> 00:47:40,960
I actually worked for the state of California for a while, and what I found

772
00:47:40,960 --> 00:47:46,400
was that for every single role, at least where I was working, there were two people

773
00:47:46,400 --> 00:47:47,940
working every role.

774
00:47:47,940 --> 00:47:54,600
There was the state worker and then there was the contractor that did the actual work.

775
00:47:54,600 --> 00:48:02,120
And everything that I had ever been told about government waste, you know, well, now granted,

776
00:48:02,120 --> 00:48:04,640
the private sector isn't any more efficient.

777
00:48:04,640 --> 00:48:06,400
The private sector has its issues, too.

778
00:48:06,400 --> 00:48:11,680
But my goodness, the amount of waste that I saw working in that position for the state,

779
00:48:11,680 --> 00:48:16,200
like risk accepting everything, for example, and not touching anything, because people

780
00:48:16,200 --> 00:48:17,760
are terrified to touch the code base.

781
00:48:17,760 --> 00:48:22,040
They're terrified to patch anything because it could break the entire system.

782
00:48:22,040 --> 00:48:25,440
And you've got a code base that's over 20 years old.

783
00:48:25,440 --> 00:48:28,160
You've got systems in place that have been in place for years.

784
00:48:28,160 --> 00:48:34,120
And so instead of solving problems, I found they just risk accepted everything, which

785
00:48:34,120 --> 00:48:36,600
is not security.

786
00:48:36,600 --> 00:48:40,920
And you know, I was so upset by what I saw there.

787
00:48:40,920 --> 00:48:43,120
I eventually left, but before that,

788
00:48:43,120 --> 00:48:47,560
I just wrote up everything that needed to be fixed and how to fix it and gave it to

789
00:48:47,560 --> 00:48:51,200
all of the folks in the different departments.

790
00:48:51,200 --> 00:48:55,480
And then I wrote a letter, I think, at the time to Kamala Harris, strangely enough, to

791
00:48:55,480 --> 00:48:59,840
say this organization has some really serious issues.

792
00:48:59,840 --> 00:49:04,760
And, you know, it deals with a lot of data from Californians.

793
00:49:04,760 --> 00:49:06,520
You know, something needs to be done.

794
00:49:06,520 --> 00:49:09,000
I got sort of a boilerplate letter back.

795
00:49:09,000 --> 00:49:12,280
Never heard anything else about it.

796
00:49:12,280 --> 00:49:18,600
But yeah, I'm just not so sure that things like an A.I. safety bill are going to do it.

797
00:49:18,600 --> 00:49:25,420
What we need, and what we sort of have, are more private sector solutions similar

798
00:49:25,420 --> 00:49:27,900
to bioethics.

799
00:49:27,900 --> 00:49:33,000
So when genetic engineering and these technologies started to emerge, people realized

800
00:49:33,000 --> 00:49:39,800
quickly that we needed to set up, like, an international bioethics, you know, infrastructure to sort

801
00:49:39,800 --> 00:49:42,540
of deal with some of these issues and come to some agreements.

802
00:49:42,540 --> 00:49:46,620
We need something like that for A.I.

803
00:49:46,620 --> 00:49:49,320
And I think that in a lot of ways, we're sort of headed there.

804
00:49:49,320 --> 00:49:54,800
But it's unclear to me that the

805
00:49:54,800 --> 00:50:01,400
law is going to be flexible and agile enough to keep up with where the technology is going.

806
00:50:01,400 --> 00:50:07,200
Are you familiar with the concept of accelerationism?

807
00:50:07,200 --> 00:50:08,560
I am.

808
00:50:08,560 --> 00:50:14,320
And when I kind of looked into it, it was an immediate turn-off to me because it just seems

809
00:50:14,320 --> 00:50:18,440
very anti-human.

810
00:50:18,440 --> 00:50:23,240
In my opinion, the notion that if you can do something, you should do it,

811
00:50:23,240 --> 00:50:29,920
that we should be accelerating things towards some type of, you know, techno-

812
00:50:29,920 --> 00:50:31,800
economic termination point.

813
00:50:31,800 --> 00:50:37,400
I mean, a lot of damage can be done to real people in the real world along the way.

814
00:50:37,400 --> 00:50:40,200
So I don't know.

815
00:50:40,200 --> 00:50:41,840
I came across this.

816
00:50:41,840 --> 00:50:46,320
I think it's e/acc, is how it's referred to.

817
00:50:46,320 --> 00:50:52,000
Well, acc, accelerationism by itself, was originally a thing.

818
00:50:52,000 --> 00:50:56,880
The main figure associated with that was a guy named Nick Land, who wrote in a very impenetrable

819
00:50:56,880 --> 00:50:57,880
style.

820
00:50:57,880 --> 00:51:01,720
But, I mean, he ended up in a very anti-human place, saying basically, it doesn't

821
00:51:01,720 --> 00:51:06,420
matter if humans survive, we just need to push forward to this informational singularity

822
00:51:06,420 --> 00:51:11,880
and send the AI off to the stars or whatever to colonize the galaxy.

823
00:51:11,880 --> 00:51:12,880
Yeah.

824
00:51:12,880 --> 00:51:21,020
E/acc, effective accelerationism, is sort of the continuation of effective altruism.

825
00:51:21,020 --> 00:51:28,440
Effective altruism was discredited with the FTX debacle, with, you know, Sam Bankman-

826
00:51:28,440 --> 00:51:35,200
Fried basically being sort of the poster child for effective altruism.

827
00:51:35,200 --> 00:51:37,940
And, you know, he turns out to be a criminal.

828
00:51:37,940 --> 00:51:43,240
So a lot of the same people who were persuaded by that school of thought have sort of moved

829
00:51:43,240 --> 00:51:49,800
over to effective accelerationism, which basically says for the good of humanity, we need to,

830
00:51:49,800 --> 00:51:55,560
you know, push the techno-capital lever, you know, as far as it goes and as fast as it

831
00:51:55,560 --> 00:51:56,560
goes.

832
00:51:56,560 --> 00:52:03,760
And major figures there include Marc Andreessen, who has made some very tone-deaf statements

833
00:52:03,760 --> 00:52:07,920
about people who are not involved in tech, you know, basically that they don't matter

834
00:52:07,920 --> 00:52:14,280
and that he's grateful for video games and OxyContin to keep them quiet and occupied

835
00:52:14,280 --> 00:52:17,000
and sort of out of the way.

836
00:52:17,000 --> 00:52:18,480
So yeah, go ahead.

837
00:52:18,480 --> 00:52:21,760
Yeah, I mean, it's a little history lesson there.

838
00:52:21,760 --> 00:52:24,240
But what's your experience of it?

839
00:52:24,240 --> 00:52:25,640
It's just a complete turn off.

840
00:52:25,640 --> 00:52:27,360
I'm not interested in it at all.

841
00:52:27,360 --> 00:52:30,360
I encountered it.

842
00:52:30,360 --> 00:52:35,200
I think the first time was on the Lex Fridman podcast, when I first heard about it.

843
00:52:35,200 --> 00:52:38,920
And then a friend of mine texted me and it was so strange because I think that when he

844
00:52:38,920 --> 00:52:44,800
texted me, by his reaction, he thought I would be pro e/acc.

845
00:52:44,800 --> 00:52:46,600
And he said, what do you think about this?

846
00:52:46,600 --> 00:52:47,600
And I had already seen.

847
00:52:47,600 --> 00:52:48,600
I said, yeah, I've seen it.

848
00:52:48,600 --> 00:52:49,600
He's like, what do you think?

849
00:52:49,600 --> 00:52:53,720
I said, I think it's the most anti human ideology I've ever seen.

850
00:52:53,720 --> 00:52:59,320
And I could just tell that there was like this huge sigh of relief from him, even through

851
00:52:59,320 --> 00:53:02,920
text. Like, dude. I said, what did you think I was going to say?

852
00:53:02,920 --> 00:53:06,280
And he said, I thought you'd be in support of it.

853
00:53:06,280 --> 00:53:08,840
And I'm like, no.

854
00:53:08,840 --> 00:53:11,320
So that was the first time I came across it.

855
00:53:11,320 --> 00:53:18,880
But now, I mean, you know, Douglas Rushkoff pretty much killed the techno-utopian in me.

856
00:53:18,880 --> 00:53:25,360
I still think that technology is probably one of the best tools we have to improve the

857
00:53:25,360 --> 00:53:27,040
human condition.

858
00:53:27,040 --> 00:53:30,080
I don't think that people really hate technology.

859
00:53:30,080 --> 00:53:34,480
What they hate are these business models that are wrapped around the technology.

860
00:53:34,480 --> 00:53:42,520
I think that people would love to use this tech if it didn't track us and collect our

861
00:53:42,520 --> 00:53:43,880
data, things like that.

862
00:53:43,880 --> 00:53:46,720
It's not the technology, it's the business models.

863
00:53:46,720 --> 00:53:51,000
And many of the business models are pretty exploitative and kind of shitty.

864
00:53:51,000 --> 00:53:53,800
And I think that's the thing that people really, really resent.

865
00:53:53,800 --> 00:53:59,400
But it does make me wonder, I might explore this in a story: what would

866
00:53:59,400 --> 00:54:03,680
a technological civilization like ours at our level look like?

867
00:54:03,680 --> 00:54:10,400
But without all the extractive data, all the extractive business models, what if

868
00:54:10,400 --> 00:54:14,960
they were things that actually, you know, added to the human project and didn't

869
00:54:14,960 --> 00:54:16,440
just extract things?

870
00:54:16,440 --> 00:54:22,200
Because it seems like we're moving to a future with this e/acc.

871
00:54:22,200 --> 00:54:25,840
I think it's E slash A-C-C, I think.

872
00:54:25,840 --> 00:54:26,840
Yeah.

873
00:54:26,840 --> 00:54:29,040
E/acc is how it's usually spelled out.

874
00:54:29,040 --> 00:54:30,040
Yeah.

875
00:54:30,040 --> 00:54:34,920
We're moving to this place where the only value that humans will serve is to generate

876
00:54:34,920 --> 00:54:35,920
training data.

877
00:54:35,920 --> 00:54:39,920
And I just don't think that's what we are as beings.

878
00:54:39,920 --> 00:54:43,800
We're not things to be used to generate training data.

879
00:54:43,800 --> 00:54:45,520
You know, we're human beings.

880
00:54:45,520 --> 00:54:48,800
And I just don't understand where this is going.

881
00:54:48,800 --> 00:54:51,840
Like I don't understand what world you would end up with other than something close to

882
00:54:51,840 --> 00:54:52,840
the Borg.

883
00:54:52,840 --> 00:54:59,560
I just don't see how their ideology would be in any way beneficial to

884
00:54:59,560 --> 00:55:01,920
humanity.

885
00:55:01,920 --> 00:55:07,640
When the first Avatar film came out, I think in 2009, I remember reading something from

886
00:55:07,640 --> 00:55:13,680
a, I guess you'd call them, a techno-utopian or a technophile.

887
00:55:13,680 --> 00:55:20,600
Basically, somebody who is really contemptuous of spirituality and like, you know, deep green

888
00:55:20,600 --> 00:55:22,520
sentiments, ecological sentiments.

889
00:55:22,520 --> 00:55:27,560
But they were saying, yeah, this whole scenario in Avatar of this planet

890
00:55:27,560 --> 00:55:31,720
where you have all these tall, beautiful, strong people who live in harmony with nature

891
00:55:31,720 --> 00:55:36,880
and they all have ponytails that they can plug into the ponytails on animals, you know,

892
00:55:36,880 --> 00:55:42,720
and sort of interface with them and they can plug into this great tree and become one with

893
00:55:42,720 --> 00:55:44,400
the mind of the planet.

894
00:55:44,400 --> 00:55:50,760
But this was all created in the wake of a technological singularity where some super

895
00:55:50,760 --> 00:55:57,200
intelligence basically just created this sort of spiritual playground and populated it with

896
00:55:57,200 --> 00:56:02,920
these naive entities who think, you know, that everything they're doing is spiritual

897
00:56:02,920 --> 00:56:04,960
and organic and ecological.

898
00:56:04,960 --> 00:56:10,160
When in fact, it's all just a big construct that was created as sort of a paradise, you

899
00:56:10,160 --> 00:56:12,040
know, an artificial paradise.

900
00:56:12,040 --> 00:56:16,560
And that's one, you know, one sort of trajectory.

901
00:56:16,560 --> 00:56:22,560
Like, I know people who are genuinely anti-technology, I mean, it's not

902
00:56:22,560 --> 00:56:24,200
that they hate capitalism.

903
00:56:24,200 --> 00:56:27,200
They do hate capitalism, but they also hate machinery.

904
00:56:27,200 --> 00:56:28,200
They hate cars.

905
00:56:28,200 --> 00:56:29,200
They hate planes.

906
00:56:29,200 --> 00:56:30,200
They hate computers.

907
00:56:30,200 --> 00:56:33,300
I mean, they just love biology.

908
00:56:33,300 --> 00:56:38,280
And for somebody with that mentality, you know, with sophisticated enough technology,

909
00:56:38,280 --> 00:56:44,060
you could create a paradise for them, which seems to answer to all of their preferences

910
00:56:44,060 --> 00:56:48,840
and obscures from them the fact that it was provided by technology, the very technology

911
00:56:48,840 --> 00:56:49,840
that they hate.

912
00:56:49,840 --> 00:56:56,360
Yeah, I mean, I do think that spirituality for humans is kind of an unavoidable thing.

913
00:56:56,360 --> 00:56:59,920
And I think the reason it's unavoidable is because we die.

914
00:56:59,920 --> 00:57:05,240
And yeah, and because we die, there's this question that's there, which is, hey, you

915
00:57:05,240 --> 00:57:09,080
know, everything that I am, you know, does that just end when I die?

916
00:57:09,080 --> 00:57:12,200
Or is there some type of after death state?

917
00:57:12,200 --> 00:57:17,600
And I think that as long as that question remains unanswered, the issue of spirituality

918
00:57:17,600 --> 00:57:20,600
is kind of inescapable.

919
00:57:20,600 --> 00:57:24,640
And I sort of wonder sometimes for people that don't have any kind of spirituality or

920
00:57:24,640 --> 00:57:30,800
belief at all, things must be pretty miserable because you must spend like a lot of time

921
00:57:30,800 --> 00:57:36,120
trying to, you know, get away from the idea of spirituality.

922
00:57:36,120 --> 00:57:40,000
And I've just come to the conclusion that it's for humans, it's inescapable.

923
00:57:40,000 --> 00:57:49,520
So long as the possibility of death exists. But the other issue that

924
00:57:49,520 --> 00:57:52,520
you had brought up there.

925
00:57:52,520 --> 00:57:53,680
I don't know that it was an issue.

926
00:57:53,680 --> 00:57:57,120
I was just reproducing somebody's argument that I had encountered a while back.

927
00:57:57,120 --> 00:58:00,200
I mean, you say spirituality is inescapable.

928
00:58:00,200 --> 00:58:02,760
And this is sort of a tired trope.

929
00:58:02,760 --> 00:58:09,080
But you know, the people who are the most anti-religion and pro-technology tend to construct

930
00:58:09,080 --> 00:58:14,040
these very religious seeming narratives about the future, you know, about the technological

931
00:58:14,040 --> 00:58:19,440
singularity and uploading their consciousness and, you know, immortality through technology.

932
00:58:19,440 --> 00:58:27,320
And, you know, it's basically replicating the psychological palliative of religion,

933
00:58:27,320 --> 00:58:31,440
you know, while holding the actual concept of religion at arm's length.

934
00:58:31,440 --> 00:58:33,720
Yeah, I would have to agree.

935
00:58:33,720 --> 00:58:40,840
There's a book by a guy named John C. Lennox that goes into this and shows how a lot of

936
00:58:40,840 --> 00:58:46,360
techno-utopianism really is just a surrogate for religious belief.

937
00:58:46,360 --> 00:58:47,360
It just is.

938
00:58:47,360 --> 00:58:52,040
There's even an after-death state that's described, you know, treating

939
00:58:52,040 --> 00:58:56,680
human consciousness as data that can be moved from one thing to another and that you would

940
00:58:56,680 --> 00:59:04,080
just sort of live perpetually in this, you know, cloud

941
00:59:04,080 --> 00:59:05,640
or some type of virtual state.

942
00:59:05,640 --> 00:59:06,640
Interesting.

943
00:59:06,640 --> 00:59:14,480
But I think that if you want to really get into a good exploration of that, John C. Lennox

944
00:59:14,480 --> 00:59:19,800
in his book 2084 did a pretty good job at really going through what that

945
00:59:19,800 --> 00:59:21,560
argument is really about.

946
00:59:21,560 --> 00:59:23,040
All right.

947
00:59:23,040 --> 00:59:27,440
Another book for the very tall list of unread books that I'd like to get to someday.

948
00:59:27,440 --> 00:59:28,440
Yeah.

949
00:59:28,440 --> 00:59:29,440
All right.

950
00:59:29,440 --> 00:59:31,520
Well, we've been on for about an hour, so we should wrap it up.

951
00:59:31,520 --> 00:59:37,360
But before I go, we've been talking a lot about, you know, artificial intelligence and technology

952
00:59:37,360 --> 00:59:42,320
and, you know, systems that involve economics and sociology and technology.

953
00:59:42,320 --> 00:59:44,440
But I'd really like to get back to science fiction.

954
00:59:44,440 --> 00:59:48,480
What are some of your like foundational science fiction texts?

955
00:59:48,480 --> 00:59:50,960
What are the big books for you?

956
00:59:50,960 --> 00:59:57,040
Well, for me, it kind of all starts with Dune, when I read Dune as a kid.

957
00:59:57,040 --> 01:00:01,680
And I've read Dune so many times, like throughout my life.

958
01:00:01,680 --> 01:00:04,800
It's had a huge influence on my thinking.

959
01:00:04,800 --> 01:00:09,920
I learned a lot about politics by reading Dune.

960
01:00:09,920 --> 01:00:12,040
Also learned a lot about power.

961
01:00:12,040 --> 01:00:20,760
And the thing is that, you know, I'm not so sure that Paul Atreides is a hero, per

962
01:00:20,760 --> 01:00:21,760
se.

963
01:00:21,760 --> 01:00:22,760
Definitely not.

964
01:00:22,760 --> 01:00:26,400
He's taking advantage of a situation.

965
01:00:26,400 --> 01:00:29,680
But then he even sort of breaks from that because his mother has a certain thing that,

966
01:00:29,680 --> 01:00:31,200
you know, she wants to achieve.

967
01:00:31,200 --> 01:00:33,080
He has certain things that he wants to achieve.

968
01:00:33,080 --> 01:00:35,800
But I actually learned a lot from Dune.

969
01:00:35,800 --> 01:00:39,640
I haven't read the ones that were written by his son.

970
01:00:39,640 --> 01:00:41,140
I've kind of skimmed them.

971
01:00:41,140 --> 01:00:43,000
But I've read all of the core Dune novels.

972
01:00:43,000 --> 01:00:48,340
And yeah, a tremendous amount of influence has come from Dune.

973
01:00:48,340 --> 01:00:57,120
Also, like, you know, Old Man's War, John Scalzi's books, among the many books that I

974
01:00:57,120 --> 01:00:58,480
read during the pandemic.

975
01:00:58,480 --> 01:01:03,200
Adrian Tchaikovsky's Children of Time, Children of Ruin.

976
01:01:03,200 --> 01:01:05,520
I really enjoyed those.

977
01:01:05,520 --> 01:01:10,560
And then there's other books that I read, like, you know, The Art of War, you know,

978
01:01:10,560 --> 01:01:11,560
The Fourth Turning.

979
01:01:11,560 --> 01:01:16,460
Yeah, I read books other than science fiction.

980
01:01:16,460 --> 01:01:21,480
One good book that I read a while back was by this guy named Ernest Becker,

981
01:01:21,480 --> 01:01:26,240
called The Birth and Death of Meaning.

982
01:01:26,240 --> 01:01:27,240
Really good book.

983
01:01:27,240 --> 01:01:30,900
So I try to read like a lot of different kinds of things.

984
01:01:30,900 --> 01:01:34,880
But James S. A. Corey, and I know that that's a pen name.

985
01:01:34,880 --> 01:01:38,500
But I've read a bunch of those.

986
01:01:38,500 --> 01:01:44,360
Recently I read Andy Futuro's No Dogs in Philly, which is a really gritty cyberpunk

987
01:01:44,360 --> 01:01:45,360
novel.

988
01:01:45,360 --> 01:01:46,880
I actually enjoyed it a lot.

989
01:01:46,880 --> 01:01:51,760
I'm going to get the second one.

990
01:01:51,760 --> 01:01:56,440
Also Nick Webb, his Legacy Fleet series I enjoyed a lot.

991
01:01:56,440 --> 01:01:59,960
The first book is kind of, like, a little rough to get through.

992
01:01:59,960 --> 01:02:03,000
But once you get through that first book, it just takes off.

993
01:02:03,000 --> 01:02:08,920
I don't know if you've ever heard of Nick Webb, but check out his Legacy Fleet series;

994
01:02:08,920 --> 01:02:10,080
it's actually pretty good.

995
01:02:10,080 --> 01:02:15,080
So Cloud Atlas, you know, I enjoyed that book a lot.

996
01:02:15,080 --> 01:02:21,680
So I've got a pretty broad set of things I like.

997
01:02:21,680 --> 01:02:25,580
Sometimes I'll read autobiographies as well.

998
01:02:25,580 --> 01:02:31,360
So the last one I read was Becoming Superman by J. Michael Straczynski, who's also a sci-fi

999
01:02:31,360 --> 01:02:32,360
writer.

1000
01:02:32,360 --> 01:02:34,360
He's the guy that created Babylon 5.

1001
01:02:34,360 --> 01:02:35,360
Oh, yeah.

1002
01:02:35,360 --> 01:02:37,960
Yeah, he's written a lot of comics.

1003
01:02:37,960 --> 01:02:42,200
That book really pulled me out of a pretty deep depression that I was going through at

1004
01:02:42,200 --> 01:02:43,200
the time.

1005
01:02:43,200 --> 01:02:45,200
I had been laid off.

1006
01:02:45,200 --> 01:02:49,800
It was one of those types of situations where I was laid off about a year and a half out

1007
01:02:49,800 --> 01:02:51,000
from fully vesting.

1008
01:02:51,000 --> 01:02:55,800
So there were a lot of plans that I had for that stock money

1009
01:02:55,800 --> 01:02:57,920
that just never happened.

1010
01:02:57,920 --> 01:03:00,760
And I was just doing a lot of reevaluation.

1011
01:03:00,760 --> 01:03:04,800
And once I read that book, it had such a profound effect on me because I'm like, man, if this

1012
01:03:04,800 --> 01:03:10,880
guy could survive everything that he did, you know, there's no good reason why I can't

1013
01:03:10,880 --> 01:03:12,120
survive what I'm going through.

1014
01:03:12,120 --> 01:03:15,920
And it just gave me a lot of inspiration.

1015
01:03:15,920 --> 01:03:18,240
But I try to read a lot of different kinds of things.

1016
01:03:18,240 --> 01:03:23,520
Right now, I'm reading this book called Chronicles from the Future, the amazing story of Paul

1017
01:03:23,520 --> 01:03:25,760
Amadeus Dienach.

1018
01:03:25,760 --> 01:03:28,360
And that's kind of a weird book.

1019
01:03:28,360 --> 01:03:30,560
Definitely check that one out.

1020
01:03:30,560 --> 01:03:34,680
Yeah, so I try to read a lot of different kinds of things.

1021
01:03:34,680 --> 01:03:38,360
Have you been watching Dune: Prophecy?

1022
01:03:38,360 --> 01:03:39,360
I'm saving them.

1023
01:03:39,360 --> 01:03:43,160
I'm letting them build up and then we're going to just marathon them.

1024
01:03:43,160 --> 01:03:45,320
But I've seen a couple of clips.

1025
01:03:45,320 --> 01:03:48,780
But yeah, I'm kind of holding off until we can watch them all.

1026
01:03:48,780 --> 01:03:51,640
But I look forward to it.

1027
01:03:51,640 --> 01:03:54,880
Let me encourage you to calibrate your expectations.

1028
01:03:54,880 --> 01:03:55,880
Really?

1029
01:03:55,880 --> 01:03:59,040
Yeah, I've stopped watching it.

1030
01:03:59,040 --> 01:04:01,720
I watched the first two episodes and I didn't get through the third.

1031
01:04:01,720 --> 01:04:05,360
I tried watching it on three different nights and I was like, I just don't care.

1032
01:04:05,360 --> 01:04:06,480
I don't care what happens.

1033
01:04:06,480 --> 01:04:07,960
I don't care about these characters.

1034
01:04:07,960 --> 01:04:09,360
Oh, wow.

1035
01:04:09,360 --> 01:04:10,360
Or any of this.

1036
01:04:10,360 --> 01:04:11,360
Yeah.

1037
01:04:11,360 --> 01:04:13,000
Oh, one of them I forgot to mention.

1038
01:04:13,000 --> 01:04:15,560
Walter Jon Williams, Hardwired.

1039
01:04:15,560 --> 01:04:22,880
Yeah, and Richard Paul Russo's Destroying Angel.

1040
01:04:22,880 --> 01:04:24,760
That had a huge effect on me in the 90s.

1041
01:04:24,760 --> 01:04:27,160
I mean, that book just melted my brain down.

1042
01:04:27,160 --> 01:04:31,160
If you've ever read Destroying Angel, that's very cyberpunk.

1043
01:04:31,160 --> 01:04:35,120
It's sort of set in a cyberpunk San Francisco.

1044
01:04:35,120 --> 01:04:37,680
And man, it's pretty intense.

1045
01:04:37,680 --> 01:04:42,680
Well, I listened to several audiobooks by Walter Jon Williams.

1046
01:04:42,680 --> 01:04:43,720
What's it called?

1047
01:04:43,720 --> 01:04:48,880
It's about this empire, this multi-species empire where the dominant species basically

1048
01:04:48,880 --> 01:04:54,280
went extinct and as soon as the last one died, all the other species sort of went to war

1049
01:04:54,280 --> 01:04:55,280
with each other.

1050
01:04:55,280 --> 01:04:58,280
Gosh, what was that called?

1051
01:04:58,280 --> 01:05:00,520
It's like at least six books.

1052
01:05:00,520 --> 01:05:02,480
It's two trilogies and maybe more.

1053
01:05:02,480 --> 01:05:07,440
I've spent a lot of time in the mind of Walter Jon Williams in the last couple of years.

1054
01:05:07,440 --> 01:05:08,440
That was pretty cool.

1055
01:05:08,440 --> 01:05:09,440
Yeah.

1056
01:05:09,440 --> 01:05:11,160
Do you like David Mitchell?

1057
01:05:11,160 --> 01:05:13,840
He's the author of Cloud Atlas.

1058
01:05:13,840 --> 01:05:15,800
I have never read any David Mitchell.

1059
01:05:15,800 --> 01:05:16,800
Yeah.

1060
01:05:16,800 --> 01:05:22,200
Now that guy, the thing that's trippy about that book is that he completely changes his

1061
01:05:22,200 --> 01:05:24,240
writing style from story to story.

1062
01:05:24,240 --> 01:05:27,000
And there comes this point where you're like, wait a minute, and you like thumb back in

1063
01:05:27,000 --> 01:05:28,000
the book.

1064
01:05:28,000 --> 01:05:29,000
You're like, what's going on?

1065
01:05:29,000 --> 01:05:33,040
And it takes you a couple of seconds to figure out why is this voice like this?

1066
01:05:33,040 --> 01:05:36,400
He completely changes his writing voice from story to story.

1067
01:05:36,400 --> 01:05:38,520
It's quite amazing.

1068
01:05:38,520 --> 01:05:39,520
Yeah.

1069
01:05:39,520 --> 01:05:44,480
The big names in sci-fi for me, they tend to be older ones, I guess, like Ursula K.

1070
01:05:44,480 --> 01:05:47,480
Le Guin.

1071
01:05:47,480 --> 01:05:51,920
I liked, you know, I used to read a lot of Larry Niven books.

1072
01:05:51,920 --> 01:05:54,960
They're very sort of pulpy sci-fi adventure in space.

1073
01:05:54,960 --> 01:05:58,160
And you know, I have a love for that.

1074
01:05:58,160 --> 01:06:02,400
So I also love Iain M. Banks, you know, the Culture novels and other space opera stuff

1075
01:06:02,400 --> 01:06:03,400
that he does.

1076
01:06:03,400 --> 01:06:08,800
I haven't read any of his literary fiction and maybe never will.

1077
01:06:08,800 --> 01:06:13,920
The Culture stuff is just so perfect for, you know, my personality and interests.

1078
01:06:13,920 --> 01:06:18,080
Yeah, I've heard of the Culture novels and I do want to read them, but I want to read

1079
01:06:18,080 --> 01:06:24,400
them after I'm done with mine because I don't want to be influenced by them.

1080
01:06:24,400 --> 01:06:27,520
So yeah, I do plan on reading them.

1081
01:06:27,520 --> 01:06:30,440
Because I've been told by people that have read my book, they're like, this is a lot

1082
01:06:30,440 --> 01:06:35,560
like the Culture novels, because I do have, like, this expansive, you know, human diaspora

1083
01:06:35,560 --> 01:06:38,280
that has sort of spread itself across the galaxy.

1084
01:06:38,280 --> 01:06:44,320
And you know, there's been a lot of cultural and genetic drift because cultures have had

1085
01:06:44,320 --> 01:06:47,320
to adapt themselves to those different planets.

1086
01:06:47,320 --> 01:06:52,880
And so those genetic adaptations tend to build over time and, you know, you

1087
01:06:52,880 --> 01:06:56,200
just end up with very different looking humans.

1088
01:06:56,200 --> 01:07:00,600
But if people are comparing you to Iain M. Banks, take it as a compliment.

1089
01:07:00,600 --> 01:07:05,360
Yeah, I mean, I don't think I'm there on that level yet, but I'm certainly

1090
01:07:05,360 --> 01:07:07,200
shooting for it.

1091
01:07:07,200 --> 01:07:09,920
Neal Stephenson is interesting.

1092
01:07:09,920 --> 01:07:11,920
I read Snow Crash.

1093
01:07:11,920 --> 01:07:18,000
Yeah, that's an interesting guy.

1094
01:07:18,000 --> 01:07:19,840
Books are a little long.

1095
01:07:19,840 --> 01:07:20,840
I haven't read his latest thing.

1096
01:07:20,840 --> 01:07:24,200
No, I did read Termination Shock.

1097
01:07:24,200 --> 01:07:27,280
So I'm kind of looking through my book list here.

1098
01:07:27,280 --> 01:07:30,960
Yeah, Neal Stephenson was pretty good.

1099
01:07:30,960 --> 01:07:33,280
Do you like John Scalzi?

1100
01:07:33,280 --> 01:07:37,840
You know, I've heard of Old Man's War, but I haven't read any Scalzi.

1101
01:07:37,840 --> 01:07:42,680
Yeah, Old Man's War was pretty good.

1102
01:07:42,680 --> 01:07:45,680
I read, God, I think I read them all.

1103
01:07:45,680 --> 01:07:49,960
The Collapsing Empire was the one that was pretty good as well.

1104
01:07:49,960 --> 01:07:52,160
It's a different series, but I think there are three different ones.

1105
01:07:52,160 --> 01:07:55,320
There's Collapsing Empire and the, what was it?

1106
01:07:55,320 --> 01:07:57,440
I think it's The Last Emperox and a couple of others.

1107
01:07:57,440 --> 01:08:00,720
But that's a series that you might want to check out.

1108
01:08:00,720 --> 01:08:01,760
All right.

1109
01:08:01,760 --> 01:08:10,960
Right now I'm listening to the audiobooks of the, what are they called?

1110
01:08:10,960 --> 01:08:11,960
The Draka.

1111
01:08:11,960 --> 01:08:18,040
It basically starts out as an alternate history

1112
01:08:18,040 --> 01:08:26,760
where elements from, you know, the Confederacy end up taking over South Africa, along with

1113
01:08:26,760 --> 01:08:30,800
some people from Nordic countries and some Brits.

1114
01:08:30,800 --> 01:08:38,880
And they create this English-speaking, you know, overtly racist empire, and they become

1115
01:08:38,880 --> 01:08:40,160
expansionist.

1116
01:08:40,160 --> 01:08:47,000
And you know, basically the European countries go to war with one another and fight

1117
01:08:47,000 --> 01:08:50,920
themselves into, you know, utter weakness.

1118
01:08:50,920 --> 01:08:54,440
And then the Draka just come in and, you know, take over.

1119
01:08:54,440 --> 01:09:00,680
And basically their position is we cannot countenance any, you know, any competing

1120
01:09:00,680 --> 01:09:02,180
political or social system.

1121
01:09:02,180 --> 01:09:05,160
So world domination is their objective.

1122
01:09:05,160 --> 01:09:07,440
Well, what is the name of this?

1123
01:09:07,440 --> 01:09:11,480
The first book is called Marching Through Georgia.

1124
01:09:11,480 --> 01:09:13,840
And the second one is Under the Yoke.

1125
01:09:13,840 --> 01:09:15,160
That's the one that I'm in right now.

1126
01:09:15,160 --> 01:09:17,400
But I know that there's three more.

1127
01:09:17,400 --> 01:09:23,640
And right now it's basically, you know, where I am in the book, it's in the late 1940s,

1128
01:09:23,640 --> 01:09:24,640
this alternate history.

1129
01:09:24,640 --> 01:09:28,080
But I know that it advances into the future, into space.

1130
01:09:28,080 --> 01:09:34,040
And eventually, somebody who is a descendant of these Draka, who is genetically engineered

1131
01:09:34,040 --> 01:09:41,760
and just really ruthless and malevolent, comes over to our world and is

1132
01:09:41,760 --> 01:09:46,840
trying to build a portal from their world to ours so that they can, you know, basically invade

1133
01:09:46,840 --> 01:09:51,200
and impose their cultural system on us.

1134
01:09:51,200 --> 01:09:52,200
Yeah.

1135
01:09:52,200 --> 01:09:56,680
So I'm really looking forward to those, although I'm told that those later books are not as

1136
01:09:56,680 --> 01:09:59,360
engaging as the first ones.

1137
01:09:59,360 --> 01:10:03,800
But the first ones really are. I mean, it's one of those things where as soon as I finish

1138
01:10:03,800 --> 01:10:05,600
the book, I'm on to the next one.

1139
01:10:05,600 --> 01:10:08,200
There's no temptation to go to any other series right now.

1140
01:10:08,200 --> 01:10:09,200
Oh, wow.

1141
01:10:09,200 --> 01:10:10,880
Yeah, I have to check that out.

1142
01:10:10,880 --> 01:10:13,200
That's how the Scalzi books were for me.

1143
01:10:13,200 --> 01:10:19,600
I mean, you know, it was the pandemic, of course, but I was doing anything to just keep my mind

1144
01:10:19,600 --> 01:10:21,040
occupied at that time.

1145
01:10:21,040 --> 01:10:23,080
But yeah, I'll have to check those books out.

1146
01:10:23,080 --> 01:10:24,080
All right.

1147
01:10:24,080 --> 01:10:26,520
Well, hey, Kenneth, it was good talking to you.

1148
01:10:26,520 --> 01:10:28,760
Yeah, it's good talking to you as well.

1149
01:10:28,760 --> 01:10:31,560
And you know, I'll keep following you on Substack.

1150
01:10:31,560 --> 01:10:36,800
And anyone that's interested in my books, you can go to books2read.com forward

1151
01:10:36,800 --> 01:10:38,800
slash Kenneth E.

1152
01:10:38,800 --> 01:10:43,800
Harrell and if you want to see my Substack, you can just go to Substack at Kenneth E.

1153
01:10:43,800 --> 01:10:47,120
Harrell and I will post a link.

1154
01:10:47,120 --> 01:10:48,120
All right.

1155
01:10:48,120 --> 01:10:49,400
It was good talking to you.

1156
01:10:49,400 --> 01:10:50,400
Yeah.

1157
01:10:50,400 --> 01:10:51,400
Take care.

1158
01:10:51,400 --> 01:10:53,200
All right.

1159
01:10:53,200 --> 01:10:54,200
That was Kenneth.

1160
01:10:54,200 --> 01:10:59,160
My New Year's resolution for 2024 was to publish to Substack twice a week

1161
01:10:59,160 --> 01:11:01,280
on Tuesday and Thursday.

1162
01:11:01,280 --> 01:11:06,320
And with a couple misses, I kept that up all year until November when I cut down to just

1163
01:11:06,320 --> 01:11:13,360
Tuesday so that I could focus on writing fiction for National Novel Writing Month or NaNoWriMo.

1164
01:11:13,360 --> 01:11:18,800
But now that it's December and I'm writing twice a week again, the Thursday post

1165
01:11:18,800 --> 01:11:20,840
is feeling kind of forced.

1166
01:11:20,840 --> 01:11:25,720
So I think I am going to stick with a once-a-week publishing schedule for 2025,

1167
01:11:25,720 --> 01:11:31,240
but add in a weekly podcast conversation of some type.

1168
01:11:31,240 --> 01:11:37,200
Now, the most unappealing part of podcasting, at least for me, is the producer's role, which

1169
01:11:37,200 --> 01:11:39,840
is to say keeping the pipeline full.

1170
01:11:39,840 --> 01:11:46,520
So if you are a person on Substack who is interested in science fiction or A.I. or future

1171
01:11:46,520 --> 01:11:52,240
technology or anything that would sound right, you know, that would seem to fit in the context

1172
01:11:52,240 --> 01:11:58,020
of a show or a blog called Gen X Science Fiction and Futurism, feel free to contact me.

1173
01:11:58,020 --> 01:12:02,200
We can talk, and if you've got a book or something else you'd like to promote, well,

1174
01:12:02,200 --> 01:12:05,080
you know, I'll be linking to your Substack.

1175
01:12:05,080 --> 01:12:08,800
And I have to say, I am pretty Substack centric these days.

1176
01:12:08,800 --> 01:12:11,160
Got kicked off of Facebook years ago.

1177
01:12:11,160 --> 01:12:19,040
I tried to establish, like, self-promotion habits on Instagram and X and things like that.

1178
01:12:19,040 --> 01:12:21,760
And I just don't use those platforms.

1179
01:12:21,760 --> 01:12:23,140
I don't care about them.

1180
01:12:23,140 --> 01:12:25,020
They're not engaging to me.

1181
01:12:25,020 --> 01:12:29,320
They're not just naturally enjoyable and I just tend to neglect them.

1182
01:12:29,320 --> 01:12:34,040
Substack, that's pretty much where I'm at in terms of online activity these days.

1183
01:12:34,040 --> 01:12:36,160
So I like the place.

1184
01:12:36,160 --> 01:12:37,720
I'm going to focus on it.

1185
01:12:37,720 --> 01:12:42,160
And I'm also going to use it as my primary means of networking and finding people to

1186
01:12:42,160 --> 01:12:44,040
talk to for this podcast.

1187
01:12:44,040 --> 01:12:45,400
All right.

1188
01:12:45,400 --> 01:12:46,400
I'm out.

1189
01:12:46,400 --> 01:12:55,080
Have a great day.

