1
00:00:00,000 --> 00:00:14,720
Sometimes It Takes a Rocket Scientist with Dr. Pamela Nenges.

2
00:00:30,000 --> 00:00:48,320
Hi, this is Dr. Pamela Nenges with Sometimes It Takes a Rocket Scientist and Dr. Bart Bartholomew

3
00:00:48,320 --> 00:00:50,880
is joining us today for part two.

4
00:00:50,880 --> 00:00:51,880
Enjoy.

5
00:00:51,880 --> 00:00:58,360
One of the things I mentioned to you earlier, and this is a concern I have, and I assume

6
00:00:58,360 --> 00:01:08,600
that a lot of people do, that our key high-level government agencies seem to be losing their

7
00:01:08,600 --> 00:01:13,480
cultural foundations in terms of their structure and leadership.

8
00:01:13,480 --> 00:01:19,120
I think this is a significant, I'm not going to say threat, I'm going to say challenge

9
00:01:19,120 --> 00:01:21,840
to the way we do things in the U.S.

10
00:01:21,840 --> 00:01:28,400
I would like to see more emphasis on knowledge management and respect for the history and

11
00:01:28,400 --> 00:01:32,200
the foundations for what the agencies have done and what they need to do.

12
00:01:32,200 --> 00:01:33,200
Absolutely.

13
00:01:33,200 --> 00:01:36,320
And knowledge management, I'm glad you brought up that term.

14
00:01:36,320 --> 00:01:42,000
At my age, by the way, this is joking, okay, people come to me for wisdom.

15
00:01:42,000 --> 00:01:47,400
Whenever you get white hair, you've been around a long time, you haven't been shot yet, or

16
00:01:47,400 --> 00:01:52,320
something, they go, hey, tell me about the old, like you are, tell me about the old days,

17
00:01:52,320 --> 00:01:53,960
what did you guys do?

18
00:01:53,960 --> 00:01:57,720
Knowledge management is absolutely a form of wisdom, okay?

19
00:01:57,720 --> 00:02:01,600
And it's no longer hard to do.

20
00:02:01,600 --> 00:02:04,760
I mean, my gosh, look at what we got.

21
00:02:04,760 --> 00:02:12,920
And the other day, I simulated using ChatGPT, a simple AI system, okay, a dialogue

22
00:02:12,920 --> 00:02:18,160
between 10 people that we crafted, created, okay?

23
00:02:18,160 --> 00:02:23,560
And these were like, could be you, I interviewed these people, I talked about their personal

24
00:02:23,560 --> 00:02:28,160
characteristics, their lives, their history, and their technology.

25
00:02:28,160 --> 00:02:33,360
And then we had a ChatGPT session on a particular subject, okay?

26
00:02:33,360 --> 00:02:40,240
Now the subject has to be pretty well defined, otherwise, you know, within one hour, we had

27
00:02:40,240 --> 00:02:45,880
another session, which built on the first session, and we were coming up with phenomenal

28
00:02:45,880 --> 00:02:46,880
answers.

29
00:02:46,880 --> 00:02:50,280
And we never involved the human being in the entire thing.

30
00:02:50,280 --> 00:02:55,320
Now if we involve real human beings, let's say I put you in ChatGPT, you go find everything

31
00:02:55,320 --> 00:03:00,240
about you, you know, including the speeches you gave in your podcasts, they not only simulate

32
00:03:00,240 --> 00:03:06,400
your knowledge, they simulate your behavior too, like, oh, she's a pushy gal, or you know,

33
00:03:06,400 --> 00:03:09,000
she's a quiet gal, or whatever.

34
00:03:09,000 --> 00:03:12,840
They'll be able to do that without bothering too many people.

35
00:03:12,840 --> 00:03:18,160
And that really is knowledge management to some extent, that's based on the past, but

36
00:03:18,160 --> 00:03:21,200
we can use it to look at where do you go?

37
00:03:21,200 --> 00:03:26,280
You know, one question we asked is, what do you think the most crucial technology developments

38
00:03:26,280 --> 00:03:37,000
are needed to do, we were talking about, and I hate this term, the global power conflict,

39
00:03:37,000 --> 00:03:41,760
this is our new term, GPC, I think, global powers conflict.

40
00:03:41,760 --> 00:03:47,600
I hate the word conflict, because I truly would rather see the global powers collaboration.

41
00:03:47,600 --> 00:03:54,880
Now that's a hard thing to get, China, Russia, India, United States, you know, how do you

42
00:03:54,880 --> 00:03:56,440
get them to collaborate?

43
00:03:56,440 --> 00:04:03,840
But we're using ChatGPT and AI to simulate sessions between leaders in those countries,

44
00:04:03,840 --> 00:04:08,520
to see where the problems are, see where it breaks down, you know, see where China walks

45
00:04:08,520 --> 00:04:10,840
away from the table and stuff like that.

46
00:04:10,840 --> 00:04:16,720
Now it's not real, but it's a simulation, you know, and you used AI before anybody,

47
00:04:16,720 --> 00:04:19,160
and now we're using it, you know, with the capability.

48
00:04:19,160 --> 00:04:21,160
Well, there were other people using AI.

49
00:04:21,160 --> 00:04:23,160
Well, not much.

50
00:04:23,160 --> 00:04:24,920
Harvey and Drexler.

51
00:04:24,920 --> 00:04:26,920
Oh, yeah, well, of course.

52
00:04:26,920 --> 00:04:27,920
Yeah.

53
00:04:27,920 --> 00:04:28,920
But I had a different approach.

54
00:04:28,920 --> 00:04:32,240
And it's interesting, there's a guy at MIT now calling it liquid neurons.

55
00:04:32,240 --> 00:04:36,040
And of course, if you look back and looked at my work, you go, oh, she was kind of doing

56
00:04:36,040 --> 00:04:37,640
that 25 years ago.

57
00:04:37,640 --> 00:04:40,760
You were, you were, you were ahead of the curve, for sure.

58
00:04:40,760 --> 00:04:41,760
Yeah.

59
00:04:41,760 --> 00:04:44,880
Now, the other part of this is that you're talking about, you know, global stuff.

60
00:04:44,880 --> 00:04:51,920
Number one, you're talking about creating basically a digital dossier on how people

61
00:04:51,920 --> 00:04:53,120
function.

62
00:04:53,120 --> 00:04:57,520
So we're looking and this is an important thing that people don't understand is AI is

63
00:04:57,520 --> 00:05:00,000
being used to study respondent behavior.

64
00:05:00,000 --> 00:05:02,640
In other words, how they respond to things.

65
00:05:02,640 --> 00:05:03,640
Exactly.

66
00:05:03,640 --> 00:05:07,680
And that's a concern for me, because where it's highly productive for us to say, we're

67
00:05:07,680 --> 00:05:12,080
going to look at a concept in battle management for dealing with world powers.

68
00:05:12,080 --> 00:05:13,080
Yeah.

69
00:05:13,080 --> 00:05:15,520
But also, there are some human rights issues.

70
00:05:15,520 --> 00:05:16,520
Yeah, sure.

71
00:05:16,520 --> 00:05:17,520
Yeah.

72
00:05:17,520 --> 00:05:21,080
And that's, that's, that's, I think, a concern.

73
00:05:21,080 --> 00:05:29,960
But a bigger concern is we need to find a way to get, what's the term, world leaders

74
00:05:29,960 --> 00:05:31,920
to behave like adults again.

75
00:05:31,920 --> 00:05:32,920
Exactly.

76
00:05:32,920 --> 00:05:33,920
Yes.

77
00:05:33,920 --> 00:05:37,320
They're like little kids in the playground, fighting each other.

78
00:05:37,320 --> 00:05:42,720
You know, I'll nuke you if you, if you're not happy with me, you know, that is, frankly,

79
00:05:42,720 --> 00:05:44,520
Pam, I'm worried.

80
00:05:44,520 --> 00:05:51,120
Now I'm old and maybe it's the time to get worried before you, you know, you go away.

81
00:05:51,120 --> 00:05:52,760
But I don't like what I see.

82
00:05:52,760 --> 00:05:59,920
I see more division in this country, more division in the world, more and more rogues.

83
00:05:59,920 --> 00:06:02,160
And that scares the living daylight out of me.

84
00:06:02,160 --> 00:06:08,120
You know, Korea, even Iran, you know, I mean, I, I, I don't, I don't know how bad they are,

85
00:06:08,120 --> 00:06:09,120
but it worries me.

86
00:06:09,120 --> 00:06:11,200
And a lot of these guys have a nuclear program.

87
00:06:11,200 --> 00:06:12,200
Oh, yeah.

88
00:06:12,200 --> 00:06:13,200
Everybody does.

89
00:06:13,200 --> 00:06:16,800
But you have to understand North Korea has a nuclear program because Russia has been

90
00:06:16,800 --> 00:06:17,800
helping them.

91
00:06:17,800 --> 00:06:18,800
Oh, I know.

92
00:06:18,800 --> 00:06:19,800
I know.

93
00:06:19,800 --> 00:06:20,800
Yeah.

94
00:06:20,800 --> 00:06:22,640
But so we have issues of proliferation.

95
00:06:22,640 --> 00:06:27,120
We have issues with people like Vladimir Putin, who are using nuclear menacing as a tool,

96
00:06:27,120 --> 00:06:29,120
which I find I just.

97
00:06:29,120 --> 00:06:30,120
Absolutely.

98
00:06:30,120 --> 00:06:34,400
And we, but we don't seem to have the ability to respond to this kind of behavior.

99
00:06:34,400 --> 00:06:38,760
What I find really extraordinary is that we don't have people standing up in the West going,

100
00:06:38,760 --> 00:06:39,760
you just cut it out.

101
00:06:39,760 --> 00:06:40,760
We're not going to do this.

102
00:06:40,760 --> 00:06:41,760
Right, right.

103
00:06:41,760 --> 00:06:42,760
Exactly.

104
00:06:42,760 --> 00:06:45,720
No, I mean, it seems so logical in our personal lives.

105
00:06:45,720 --> 00:06:49,880
If we had a family that was dysfunctional or, you know, or just friends or whatever, we

106
00:06:49,880 --> 00:06:50,880
do that, right?

107
00:06:50,880 --> 00:06:51,880
We'd say, wait a minute here.

108
00:06:51,880 --> 00:06:53,240
This is not where to go.

109
00:06:53,240 --> 00:06:54,840
You know, we got to start.

110
00:06:54,840 --> 00:06:55,840
What do we have in common?

111
00:06:55,840 --> 00:06:57,720
What, you know, what is it?

112
00:06:57,720 --> 00:07:00,560
Now of course, in common is survival, right?

113
00:07:00,560 --> 00:07:04,760
Because on the nuclear issue, I don't know if you read it.

114
00:07:04,760 --> 00:07:06,200
It's called Nuclear War.

115
00:07:06,200 --> 00:07:07,880
It just came out.

116
00:07:07,880 --> 00:07:12,480
I don't know if you can see it, but the name of the book is Nuclear War.

117
00:07:12,480 --> 00:07:13,800
I just bought it.

118
00:07:13,800 --> 00:07:18,920
And in the first chapter, which is about five pages long, the world goes away because of

119
00:07:18,920 --> 00:07:21,320
one errant hydrogen bomb hit on the Pentagon.

120
00:07:21,320 --> 00:07:22,320
Okay.

121
00:07:22,320 --> 00:07:25,640
Errant being, you know, some nut set it off.

122
00:07:25,640 --> 00:07:30,200
The world goes away and about, well, it doesn't go away, but you know, it collapses very,

123
00:07:30,200 --> 00:07:31,200
very quickly.

124
00:07:31,200 --> 00:07:36,120
So the nuclear threat is terrifying to me because there are some crazy people that may

125
00:07:36,120 --> 00:07:37,480
have access to one.

126
00:07:37,480 --> 00:07:39,360
All it takes is one, you know?

127
00:07:39,360 --> 00:07:44,100
Well, we have a proliferation issue right now, and largely it's due to the lack of leadership

128
00:07:44,100 --> 00:07:48,560
in the Western world and our ability to collaborate with other partners.

129
00:07:48,560 --> 00:07:52,080
And obviously the Middle East is a significant issue.

130
00:07:52,080 --> 00:07:53,080
Absolutely.

131
00:07:53,080 --> 00:07:56,520
And Iran has every intention of acquiring a nuclear warhead.

132
00:07:56,520 --> 00:08:01,280
And we make a mistake of not making them friends, which we should have done 20 years ago.

133
00:08:01,280 --> 00:08:02,280
Absolutely.

134
00:08:02,280 --> 00:08:06,280
Because they're an educated, competent, highly sophisticated society.

135
00:08:06,280 --> 00:08:07,280
Yeah.

136
00:08:07,280 --> 00:08:08,280
Yeah, absolutely.

137
00:08:08,280 --> 00:08:14,960
And now, and now, which is gobsmacking to me, they're now friends with Saudi Arabia,

138
00:08:14,960 --> 00:08:19,520
who had been their arch enemy for 25 years.

139
00:08:19,520 --> 00:08:26,320
And the United States fought a surrogate war against their people in Saudi Arabia.

140
00:08:26,320 --> 00:08:31,560
I mean, to me, it's just stunning that we don't seem to have people.

141
00:08:31,560 --> 00:08:34,000
It's like Washington is half asleep.

142
00:08:34,000 --> 00:08:35,000
Yeah.

143
00:08:35,000 --> 00:08:38,640
And I don't know how we fix that.

144
00:08:38,640 --> 00:08:39,640
We need to figure out.

145
00:08:39,640 --> 00:08:43,120
I was just going to say, I was going to say the opposite of that.

146
00:08:43,120 --> 00:08:47,960
I really do credit Ronald Reagan for saying, look, we don't want a nuclear war with the

147
00:08:47,960 --> 00:08:49,520
Soviet Union.

148
00:08:49,520 --> 00:08:51,720
What can we do to prevent that?

149
00:08:51,720 --> 00:08:56,520
And of course, some of it was strength, but some of it was compromise to some extent.

150
00:08:56,520 --> 00:09:00,120
You know, yeah, the NASC program was part of Star Wars.

151
00:09:00,120 --> 00:09:03,600
And you know, he wanted to show, you know, we could do it.

152
00:09:03,600 --> 00:09:09,120
But on the other hand, it was like, you know, the olive branch and the weapon, you know,

153
00:09:09,120 --> 00:09:11,600
he was offering Gorbachev an olive branch too.

154
00:09:11,600 --> 00:09:12,600
Carrot and stick.

155
00:09:12,600 --> 00:09:13,600
Carrot and stick, yeah.

156
00:09:13,600 --> 00:09:14,600
So I'm worried too, Pam.

157
00:09:14,600 --> 00:09:22,200
You got more years to live than me, but I'm worried too.

158
00:09:22,200 --> 00:09:26,840
Well, here's what I want to say very quickly, because you don't hear about this in like

159
00:09:26,840 --> 00:09:29,400
CNN.

160
00:09:29,400 --> 00:09:33,920
This Strategic Defense Initiative, Star Wars, trained a whole generation of scientists and

161
00:09:33,920 --> 00:09:34,920
engineers.

162
00:09:34,920 --> 00:09:35,920
Yeah.

163
00:09:35,920 --> 00:09:38,520
It basically funded the infrastructure we currently have.

164
00:09:38,520 --> 00:09:39,520
Yeah.

165
00:09:39,520 --> 00:09:44,040
And my generation benefited tremendously from it.

166
00:09:44,040 --> 00:09:46,840
Now we don't see that with the millennials and the Z's.

167
00:09:46,840 --> 00:09:48,460
We don't see these great programs.

168
00:09:48,460 --> 00:09:51,040
We don't see these great efforts.

169
00:09:51,040 --> 00:09:56,920
We have people from Wall Street to IBM deciding what technology is in this country, and we

170
00:09:56,920 --> 00:09:57,920
can't do that.

171
00:09:57,920 --> 00:10:03,360
We have to have leadership that recognizes the importance of technology in the structure

172
00:10:03,360 --> 00:10:10,120
of our economy, our security, our defense, and what benefits our society.

173
00:10:10,120 --> 00:10:13,960
And the other part of this, I think, and I just want to say, amen to that.

174
00:10:13,960 --> 00:10:14,960
Amen.

175
00:10:14,960 --> 00:10:18,720
I completely agree with what you said.

176
00:10:18,720 --> 00:10:23,320
And the other part of this is that I'm a weirdo aerospace engineer because I'm actually an

177
00:10:23,320 --> 00:10:25,800
engineering physicist.

178
00:10:25,800 --> 00:10:29,080
And I've been to nuclear weapons school.

179
00:10:29,080 --> 00:10:32,440
I was trained by Dr. William Highland Chambers as well as you.

180
00:10:32,440 --> 00:10:35,600
He founded the Nuclear Emergency Search Team.

181
00:10:35,600 --> 00:10:38,000
I worked in the intelligence community.

182
00:10:38,000 --> 00:10:43,080
And most Americans today, including some listening to this now, don't realize the Department

183
00:10:43,080 --> 00:10:47,000
of Energy, they're not the guys who put the stars on your refrigerators.

184
00:10:47,000 --> 00:10:49,480
They're a defense and intelligence agency.

185
00:10:49,480 --> 00:10:52,800
Yes, yes, right.

186
00:10:52,800 --> 00:10:56,840
And apparently someone needs to tell Congress.

187
00:10:56,840 --> 00:10:57,840
Yeah.

188
00:10:57,840 --> 00:11:05,920
You know, we're so, I think part of that is a symptom of the bigger issue of how divided

189
00:11:05,920 --> 00:11:06,920
we are.

190
00:11:06,920 --> 00:11:11,000
You know, the reason they're asleep, I do believe that part of the reason that they're

191
00:11:11,000 --> 00:11:17,080
asleep is they spend so much time fighting each other that very little collaboration

192
00:11:17,080 --> 00:11:18,080
is there.

193
00:11:18,080 --> 00:11:19,520
Now, I know there are good people in Congress.

194
00:11:19,520 --> 00:11:23,520
I know some people that are wonderful people.

195
00:11:23,520 --> 00:11:30,000
But then when you get to the top of the pile and you get one party just screaming, the

196
00:11:30,000 --> 00:11:35,080
other guys are a bunch of nuts and vice versa, it's kind of hard to get anything done.

197
00:11:35,080 --> 00:11:39,160
It would be like a marriage that is completely dysfunctional.

198
00:11:39,160 --> 00:11:41,280
And usually that ends up in divorce.

199
00:11:41,280 --> 00:11:43,280
But in America, we can't afford a divorce.

200
00:11:43,280 --> 00:11:46,240
You know, we have to... I don't know what it's going to take.

201
00:11:46,240 --> 00:11:47,240
I really don't.

202
00:11:47,240 --> 00:11:49,440
And here we get this upcoming election.

203
00:11:49,440 --> 00:11:50,600
I don't want to get into that.

204
00:11:50,600 --> 00:11:55,800
But you know, and there's no difference from what happened four years ago or eight years

205
00:11:55,800 --> 00:11:56,800
ago.

206
00:11:56,800 --> 00:11:58,840
We haven't learned a lesson yet.

207
00:11:58,840 --> 00:12:00,920
We're still mad at each other, you know?

208
00:12:00,920 --> 00:12:03,600
Well, I'm concerned when they're saying, oh, it's the end of democracy.

209
00:12:03,600 --> 00:12:04,600
It's the end of the country.

210
00:12:04,600 --> 00:12:05,600
No, it's not.

211
00:12:05,600 --> 00:12:10,960
Yeah, and the United States is not going to come to an end because we got another president

212
00:12:10,960 --> 00:12:13,520
who may or may not be terribly competent.

213
00:12:13,520 --> 00:12:15,120
It's not the issue.

214
00:12:15,120 --> 00:12:19,840
We are a country that is fairly substantial internationally still.

215
00:12:19,840 --> 00:12:24,480
We're still one of the probably two superpowers on the planet.

216
00:12:24,480 --> 00:12:28,800
The United States is not going to come to an end.

217
00:12:28,800 --> 00:12:33,160
We've got to stop this preadolescent crap.

218
00:12:33,160 --> 00:12:34,960
Yeah, it is.

219
00:12:34,960 --> 00:12:35,960
It's childish.

220
00:12:35,960 --> 00:12:36,960
It's absolutely childish.

221
00:12:36,960 --> 00:12:39,960
Yeah, or very old age.

222
00:12:39,960 --> 00:12:46,880
It's like childish behavior with old people, you know, or something like that.

223
00:12:46,880 --> 00:12:52,120
Join us next time on Sometimes It Takes a Rocket Scientist.

224
00:12:52,120 --> 00:13:14,680
For part three of Dr. Pamela Nenges' interview with Dr. Bart Bartholomew.

