1
00:00:00,000 --> 00:00:06,200
Welcome to the Azure Security Podcast,

2
00:00:06,200 --> 00:00:09,360
where we discuss topics relating to security, privacy,

3
00:00:09,360 --> 00:00:13,720
reliability, and compliance on the Microsoft Cloud Platform.

4
00:00:13,720 --> 00:00:17,200
Hey everybody, welcome to Episode 50.

5
00:00:17,200 --> 00:00:19,440
Yes, we've made it to 50 episodes.

6
00:00:19,440 --> 00:00:20,800
Not quite two years,

7
00:00:20,800 --> 00:00:22,720
we have to wait for two more episodes.

8
00:00:22,720 --> 00:00:24,360
But yeah, Episode 50,

9
00:00:24,360 --> 00:00:27,000
this has been a real fun endeavor actually.

10
00:00:27,000 --> 00:00:28,120
We've had a lot of fun,

11
00:00:28,120 --> 00:00:32,160
we had some fantastic guests in the previous episodes.

12
00:00:32,160 --> 00:00:35,200
I really enjoyed doing this and I think based on

13
00:00:35,200 --> 00:00:37,600
the feedback that we get from people who listen,

14
00:00:37,600 --> 00:00:39,960
it's doing a good service out there as well.

15
00:00:39,960 --> 00:00:42,960
So yeah, 50 episodes, who would have thought it?

16
00:00:42,960 --> 00:00:45,200
This week, we actually have the full gang here.

17
00:00:45,200 --> 00:00:46,480
It's myself, Michael.

18
00:00:46,480 --> 00:00:49,160
We have Gladys, we have Sarah and Mark.

19
00:00:49,160 --> 00:00:51,560
We don't have a guest this week, it's a little bit different.

20
00:00:51,560 --> 00:00:53,480
Mark is going to talk about

21
00:00:53,480 --> 00:00:56,240
the Microsoft Cybersecurity Reference Architecture.

22
00:00:56,240 --> 00:00:58,000
Because it's just the four of us,

23
00:00:58,000 --> 00:00:59,800
we actually have no news.

24
00:00:59,800 --> 00:01:02,240
Well, in actual fact, we weren't going to have any news,

25
00:01:02,240 --> 00:01:05,240
but something popped up that I really can't help talking about,

26
00:01:05,240 --> 00:01:08,320
mainly because it's in a product that's near and dear to my heart,

27
00:01:08,320 --> 00:01:10,880
using a feature that is near and dear to my heart.

28
00:01:10,880 --> 00:01:14,840
That is that Cosmos DB Always Encrypted

29
00:01:14,840 --> 00:01:16,320
is now generally available.

30
00:01:16,320 --> 00:01:18,600
So Always Encrypted is client-side encryption,

31
00:01:18,600 --> 00:01:22,960
basically the same technology that's used by SQL Server Always Encrypted.

32
00:01:22,960 --> 00:01:24,920
So the client has the keys,

33
00:01:24,920 --> 00:01:27,600
the encryption and decryption are done at the client,

34
00:01:27,600 --> 00:01:34,480
you can do a subset of queries against encrypted data without decrypting it.

35
00:01:34,480 --> 00:01:36,760
So you're probably familiar with encryption of data at rest,

36
00:01:36,760 --> 00:01:38,080
encryption of data on the wire,

37
00:01:38,080 --> 00:01:40,560
but this is encryption of data while it's in use.

38
00:01:40,560 --> 00:01:43,360
So it's fantastic technology, I'm a huge fan of it.

39
00:01:43,360 --> 00:01:46,000
It puts no strain on the server at all

40
00:01:46,000 --> 00:01:48,080
because all the crypto is done by the client.

41
00:01:48,080 --> 00:01:51,880
Cosmos DB also doesn't have the keys either.

42
00:01:51,880 --> 00:01:54,480
So in the case of, for example,

43
00:01:54,480 --> 00:01:58,720
heaven forbid but compromise of Cosmos DB or your installation of it,

44
00:01:58,720 --> 00:02:00,560
the attacker doesn't have the keys.

45
00:02:00,560 --> 00:02:03,360
So great to see that, congratulations to the Cosmos DB teams.

46
00:02:03,360 --> 00:02:04,480
And in fact, in a few weeks,

47
00:02:04,480 --> 00:02:08,160
we'll actually have the Cosmos DB folks here to talk in depth

48
00:02:08,160 --> 00:02:09,760
about some of the other stuff that's coming out

49
00:02:09,760 --> 00:02:12,320
as well as talk about Always Encrypted.

50
00:02:12,320 --> 00:02:16,240
So now that we've got my little news item out of the way,

51
00:02:16,240 --> 00:02:19,120
let's turn our attention to Mark,

52
00:02:19,120 --> 00:02:22,720
who's going to talk to us about the Microsoft Cybersecurity Reference Architecture,

53
00:02:22,720 --> 00:02:25,120
or otherwise known as the MCRA.

54
00:02:25,120 --> 00:02:27,920
I would say welcome, Mark, but it's kind of not really needed.

55
00:02:28,720 --> 00:02:30,480
So Mark, can you just sort of lead us through

56
00:02:30,480 --> 00:02:33,040
and get us started with MCRA?

57
00:02:33,040 --> 00:02:36,160
So yeah, I want to actually talk about MCRA,

58
00:02:36,160 --> 00:02:37,680
the Microsoft Cybersecurity Reference Architecture,

59
00:02:37,680 --> 00:02:40,000
but also the CAF Secure methodology

60
00:02:40,000 --> 00:02:42,880
because they're kind of two parts of the same whole.

61
00:02:43,680 --> 00:02:46,640
So we've been spending a lot of time in recent months

62
00:02:46,640 --> 00:02:50,640
and quite frankly years working out a lot of the details,

63
00:02:50,640 --> 00:02:53,600
based on working with customers, our own IT organization, etc.

64
00:02:54,160 --> 00:02:58,640
on what does a good end-to-end security program look like

65
00:02:58,640 --> 00:03:01,600
and what does a good end-to-end security architecture look like.

66
00:03:02,160 --> 00:03:04,400
And as it turns out, it's not just one diagram.

67
00:03:04,960 --> 00:03:08,640
So the original version of that reference architecture

68
00:03:09,520 --> 00:03:12,400
probably, I don't know, five, six years ago, the thing started,

69
00:03:13,280 --> 00:03:14,720
you know, it was just a single diagram

70
00:03:14,720 --> 00:03:16,880
that had all of Microsoft's capabilities on it

71
00:03:16,880 --> 00:03:19,680
and kind of how they logically connected together

72
00:03:19,680 --> 00:03:21,280
and grouped together, etc.

73
00:03:21,280 --> 00:03:23,520
And so that's sort of like that core capabilities one,

74
00:03:23,520 --> 00:03:25,760
that one's still there, still getting updated.

75
00:03:25,760 --> 00:03:27,840
And then what we found over time

76
00:03:27,840 --> 00:03:29,520
is what we had to develop sort of,

77
00:03:29,520 --> 00:03:32,480
okay, well, how does the SOC or the security operations team

78
00:03:33,120 --> 00:03:35,520
and all the tools related to XDR and SIM

79
00:03:35,520 --> 00:03:37,120
and threat intelligence and all that,

80
00:03:37,120 --> 00:03:39,760
how did those all connect with SOAR and UEBA

81
00:03:39,760 --> 00:03:41,200
and all those kind of things.

82
00:03:41,200 --> 00:03:44,960
SOAR is Security Orchestration, Automation and Response,

83
00:03:45,520 --> 00:03:48,320
and UEBA is Behavior Analytics,

84
00:03:48,320 --> 00:03:50,640
User and Entity Behavior Analytics.

85
00:03:50,640 --> 00:03:52,880
But like, how does all that work together?

86
00:03:52,880 --> 00:03:58,000
How does modern user access using Zero Trust principles,

87
00:03:58,000 --> 00:03:59,760
how does that all connect together?

88
00:04:00,880 --> 00:04:04,560
And what are all the native security capabilities in Azure?

89
00:04:04,560 --> 00:04:06,240
So we ended up answering a lot of questions

90
00:04:06,240 --> 00:04:09,120
with what became a bunch of complicated diagrams.

91
00:04:09,120 --> 00:04:11,760
And so the Cybersecurity Reference Architectures

92
00:04:11,760 --> 00:04:14,080
is actually plural, I sometimes forget to say the S,

93
00:04:14,800 --> 00:04:16,880
with a bunch of those diagrams pulled together

94
00:04:16,880 --> 00:04:18,800
and a few other kind of key highlights.

95
00:04:20,000 --> 00:04:23,200
And then the CAF Secure Methodology is sort of that,

96
00:04:23,760 --> 00:04:26,880
slightly higher level, that's more of like a CIO, CISO,

97
00:04:26,880 --> 00:04:29,120
and directors kind of look at the world,

98
00:04:29,120 --> 00:04:31,680
as opposed to sort of an architect, technical manager,

99
00:04:32,560 --> 00:04:33,920
technical director kind of view.

100
00:04:34,480 --> 00:04:37,440
So the CAF Secure Methodology is really about that,

101
00:04:37,440 --> 00:04:39,520
what does an end-to-end security program look like

102
00:04:39,520 --> 00:04:44,000
and how does it plug into the bigger picture business

103
00:04:44,000 --> 00:04:46,160
and cloud adoption and digital transformation,

104
00:04:46,160 --> 00:04:48,000
all the other stuff that's happening

105
00:04:48,000 --> 00:04:51,280
that security is trying to keep up with and keep safe.

106
00:04:51,280 --> 00:04:53,280
So I've read quite a, over the years,

107
00:04:53,280 --> 00:04:55,760
read quite a bit of the MCRA documentation.

108
00:04:56,400 --> 00:04:58,160
So what was the genesis of this?

109
00:04:58,160 --> 00:05:01,520
I mean, how did it get started and how do people use it?

110
00:05:03,840 --> 00:05:05,440
That's actually a little bit of a funny story.

111
00:05:06,640 --> 00:05:08,800
So Ann Johnson, huge fan of hers,

112
00:05:08,800 --> 00:05:10,080
and she was my boss at the time,

113
00:05:10,880 --> 00:05:14,720
was speaking in front of a big portion of our field

114
00:05:14,720 --> 00:05:17,920
at one of our sort of get-together internal conference events.

115
00:05:18,480 --> 00:05:21,040
And she, well, she was talking,

116
00:05:21,040 --> 00:05:23,360
she said, see that person over there, Mark Simos,

117
00:05:23,360 --> 00:05:24,880
waved, and she's like,

118
00:05:24,880 --> 00:05:28,240
he's going to build a cybersecurity reference architecture

119
00:05:28,240 --> 00:05:30,480
that will show how all our security technology

120
00:05:30,480 --> 00:05:33,760
works and connects with everything that y'all are working with.

121
00:05:33,760 --> 00:05:35,200
And I'm like taking a note.

122
00:05:37,120 --> 00:05:39,600
And so that was sort of the genesis of it.

123
00:05:39,600 --> 00:05:43,520
And it ultimately answered that first most important question

124
00:05:43,520 --> 00:05:48,720
as Microsoft built and got on this crazy train of cybersecurity

125
00:05:48,720 --> 00:05:52,000
and the massive portfolio that we've built up since.

126
00:05:52,720 --> 00:05:55,440
I think we put like a billion a year into it historically,

127
00:05:55,440 --> 00:05:59,280
and we're looking to put another $5 billion in the next,

128
00:05:59,280 --> 00:06:02,160
no, $20 billion over the next five years, or something like that.

129
00:06:02,160 --> 00:06:04,480
Anyway, it's multiple billions of dollars of investment a year.

130
00:06:04,480 --> 00:06:05,280
It's insane.

131
00:06:06,160 --> 00:06:09,120
But the first question that answered at the beginning of that

132
00:06:09,120 --> 00:06:09,920
is like, what do we have?

133
00:06:10,480 --> 00:06:12,400
Right? And like, what does that actually look like?

134
00:06:12,400 --> 00:06:14,880
And what are the different capabilities that are in there?

135
00:06:14,880 --> 00:06:15,600
How do they relate?

136
00:06:16,160 --> 00:06:18,480
And so that was really sort of the genesis of it,

137
00:06:18,480 --> 00:06:20,080
is answering that question.

138
00:06:20,080 --> 00:06:22,000
And then more questions came up over time.

139
00:06:22,560 --> 00:06:25,440
So I've been watching several of the videos,

140
00:06:25,440 --> 00:06:28,480
and there's a lot of interesting information in there.

141
00:06:29,200 --> 00:06:32,240
Can you list some of the topics addressed?

142
00:06:33,040 --> 00:06:33,920
Oh, yeah.

143
00:06:33,920 --> 00:06:36,560
So the MCRA, for those that haven't seen it,

144
00:06:36,560 --> 00:06:39,120
is kind of, there's like a main menu slide

145
00:06:39,120 --> 00:06:42,000
using PowerPoint zooms in the first two slides.

146
00:06:42,000 --> 00:06:46,240
So there's two different pages of kind of these interactive things

147
00:06:46,240 --> 00:06:46,960
you can click on.

148
00:06:47,600 --> 00:06:50,160
And so what we did is we decided that each of those sections

149
00:06:50,160 --> 00:06:52,800
they point to, one to three slides,

150
00:06:52,800 --> 00:06:54,640
one to five slides kind of thing.

151
00:06:54,640 --> 00:06:57,520
We were going to record a video around each of those.

152
00:06:57,520 --> 00:06:59,360
Those are the MCRA videos.

153
00:06:59,920 --> 00:07:02,880
And so we cover, let's see, what are the top ones?

154
00:07:02,880 --> 00:07:04,960
There's the big one in the middle is the people.

155
00:07:05,920 --> 00:07:08,400
So that's like, what are the roles and responsibilities?

156
00:07:08,400 --> 00:07:10,240
How are those evolving in today's world?

157
00:07:10,240 --> 00:07:14,560
In terms of what are the jobs and how are they changing?

158
00:07:14,560 --> 00:07:19,360
And we got the new kind of thing emerging of posture management

159
00:07:20,160 --> 00:07:21,360
that's super important.

160
00:07:21,360 --> 00:07:24,480
That really complements the SOC as sort of an operational function

161
00:07:24,480 --> 00:07:29,360
of security, but focused on prevention rather than focused on reaction,

162
00:07:29,360 --> 00:07:31,840
detect, respond, recover kind of things.

163
00:07:32,400 --> 00:07:37,040
And so kind of talking about that and the roles and how does security

164
00:07:37,040 --> 00:07:40,000
fit within DevOps and OT and all these other things.

165
00:07:40,000 --> 00:07:42,560
So the people one is definitely one of the more popular ones.

166
00:07:43,360 --> 00:07:45,120
The capabilities one, the original one,

167
00:07:45,920 --> 00:07:48,560
zero trust user access is a big one.

168
00:07:49,440 --> 00:07:51,280
Using Azure AD Conditional Access,

169
00:07:51,280 --> 00:07:54,160
how do we achieve that sort of zero trust approach

170
00:07:54,160 --> 00:07:57,440
to validating devices, validating users before giving you

171
00:07:57,440 --> 00:07:59,600
access to those highly valuable resources?

172
00:08:00,400 --> 00:08:03,280
Azure native controls is a big popular one there.

173
00:08:04,240 --> 00:08:07,360
That kind of covers all of the different stuff in there

174
00:08:07,360 --> 00:08:09,280
that is built in Azure.

175
00:08:09,280 --> 00:08:12,400
And then the other things that also help secure your access to Azure,

176
00:08:12,400 --> 00:08:15,520
the workstations you use, the accounts you use, etc.

177
00:08:16,880 --> 00:08:19,760
There's one on the multi-cloud and cross-platform.

178
00:08:20,320 --> 00:08:23,520
So like a lot of people think Microsoft just secures Microsoft.

179
00:08:24,320 --> 00:08:25,520
And that's not the case.

180
00:08:25,520 --> 00:08:29,520
We secure AWS, GCP, Linux, you name it.

181
00:08:29,520 --> 00:08:31,600
I mean, we're a security company period.

182
00:08:32,560 --> 00:08:34,080
And so kind of highlighting that.

183
00:08:34,080 --> 00:08:37,440
SASE, a lot of people have been asking a lot of questions about SASE.

184
00:08:37,440 --> 00:08:39,840
What is it? How does it fit? How does it compare to zero trust?

185
00:08:40,720 --> 00:08:43,520
OT, how do I secure my operational technology,

186
00:08:43,520 --> 00:08:45,360
my industrial control systems or ICS?

187
00:08:46,880 --> 00:08:52,480
SCADA type of technology, and security operations, which I mentioned earlier.

188
00:08:53,040 --> 00:08:55,840
And then we have some other diagrams that we threw in there

189
00:08:55,840 --> 00:08:59,680
that people tend to like around how our threat intelligence works,

190
00:08:59,680 --> 00:09:03,520
how the different SOC tools and components connect,

191
00:09:04,240 --> 00:09:06,560
how the SOC components actually,

192
00:09:06,560 --> 00:09:10,080
you know, security operations components actually integrate with access control.

193
00:09:10,080 --> 00:09:14,480
So you can make sure people don't log in and get access to valuable resources

194
00:09:14,480 --> 00:09:16,160
when their device is infected.

195
00:09:16,160 --> 00:09:19,120
A little bit of the privileged access story too, you know,

196
00:09:19,120 --> 00:09:22,320
that's super important to protect the admins, ransomware,

197
00:09:22,880 --> 00:09:25,760
which is just devastating and highly profitable.

198
00:09:25,760 --> 00:09:27,760
It keeps growing because of that.

199
00:09:27,760 --> 00:09:30,720
And, you know, the whole zero trust piece around,

200
00:09:30,720 --> 00:09:33,840
you know, the rapid modernization plan, what do I do first, next, later?

201
00:09:33,840 --> 00:09:35,920
What does that look like architecturally?

202
00:09:35,920 --> 00:09:38,080
So there's a bunch of different sequences there.

203
00:09:38,080 --> 00:09:43,280
I always love the people and process portion that you have in there.

204
00:09:43,280 --> 00:09:47,680
And it always amazes me how many organizations

205
00:09:49,680 --> 00:09:54,960
maybe address the people through training, but they don't address the process.

206
00:09:54,960 --> 00:10:00,320
Do you want to talk a little bit about this and how it changes

207
00:10:00,320 --> 00:10:05,840
when modernization is happening in a continuous basis?

208
00:10:05,840 --> 00:10:08,320
The thing that I've seen is because, you know,

209
00:10:08,320 --> 00:10:12,160
security has always been sort of a technical discipline

210
00:10:12,160 --> 00:10:15,600
or it started out as one, you know, there's a lot of technologists,

211
00:10:15,600 --> 00:10:18,560
a lot of people that just think in the frame of technology, right?

212
00:10:18,560 --> 00:10:21,360
You know, it's a problem, so therefore I'm going to apply a technical

213
00:10:21,360 --> 00:10:25,120
solution. It's that old proverb of, if I have a hammer,

214
00:10:25,120 --> 00:10:26,960
then every problem looks like a nail.

215
00:10:26,960 --> 00:10:29,520
But, you know, we have that challenge because, you know,

216
00:10:29,520 --> 00:10:33,840
people are just familiar with the technology, they're recruited

217
00:10:33,840 --> 00:10:35,680
and hired based on their technical skills.

218
00:10:35,680 --> 00:10:39,680
So we have this like massive technology bias in the security industry.

219
00:10:39,680 --> 00:10:43,600
But the truth is we're protecting systems that, you know,

220
00:10:43,600 --> 00:10:45,440
ultimately they're run by people.

221
00:10:45,440 --> 00:10:47,680
The value that they bring is to people.

222
00:10:47,680 --> 00:10:49,120
You know, that's what we're doing.

223
00:10:49,120 --> 00:10:52,480
You know, that's why the business is there,

224
00:10:52,480 --> 00:10:54,560
to serve customers that are choosing to pay.

225
00:10:55,280 --> 00:10:57,360
So there's a lot of people involved in the process.

226
00:10:57,360 --> 00:11:00,560
I mean, you even look at, you know, the role of cybersecurity

227
00:11:01,840 --> 00:11:05,600
in geopolitics and whatnot, you know, ultimately that's a very

228
00:11:05,600 --> 00:11:07,920
people-centric discipline of politics, right?

229
00:11:07,920 --> 00:11:11,360
Like, I mean, people are just woven throughout it.

230
00:11:11,360 --> 00:11:14,640
And so, you know, we've, for us to sort of succeed, you know,

231
00:11:14,640 --> 00:11:18,400
we can't be applying a technical solution to a process problem

232
00:11:18,400 --> 00:11:19,520
or a people problem.

233
00:11:19,520 --> 00:11:22,800
We have to be applying the right solution to the right problem,

234
00:11:22,800 --> 00:11:26,160
and we have to make sure that the teams, the security people on the teams,

235
00:11:26,160 --> 00:11:27,680
are getting what they need.

236
00:11:27,680 --> 00:11:30,080
And they have the process to work together.

237
00:11:30,080 --> 00:11:33,920
I mean, almost all the organizations that I've worked with,

238
00:11:34,480 --> 00:11:38,480
if they're not now, they were, you know, within the past probably three to five years,

239
00:11:39,040 --> 00:11:41,200
broken into a bunch of individual silos,

240
00:11:41,200 --> 00:11:43,200
and they have no processes to communicate across them.

241
00:11:43,760 --> 00:11:47,360
Yeah, we've seen a lot of people that get a lot of value out of that sort of,

242
00:11:47,360 --> 00:11:49,440
you know, this is how the roles are changing.

243
00:11:49,440 --> 00:11:53,360
Well, I've actually heard a bunch of stories from some of our folks in the field

244
00:11:53,360 --> 00:11:57,280
that, you know, they use that single people slide and end up talking with the CIO

245
00:11:57,280 --> 00:12:01,200
and CISO about strategy and people and roles for like two hours and never looked at the

246
00:12:01,200 --> 00:12:03,360
technology and never talked about a single Microsoft product.

247
00:12:04,080 --> 00:12:09,760
Just because that was what the organization needed, to help overcome that bias.

248
00:12:09,760 --> 00:12:14,880
I know that you recorded some of those videos like six months ago,

249
00:12:14,880 --> 00:12:20,640
but I always remember the video talking about the people and process,

250
00:12:21,520 --> 00:12:27,040
where you were mentioning about accountability and embedding security,

251
00:12:28,080 --> 00:12:32,960
because that enables the processes to be updated.

252
00:12:32,960 --> 00:12:35,360
Do you want to talk a little bit more about that?

253
00:12:36,080 --> 00:12:41,920
We ended up doing, I think, two versions of videos for those sort of people things.

254
00:12:41,920 --> 00:12:46,400
One was for more of a technology audience in the MCRA one, and then we had,

255
00:12:47,440 --> 00:12:52,240
we covered those slides in a slightly different view at that, you know, full-on strategic level.

256
00:12:52,240 --> 00:12:55,920
That's more of like CIO or CISO's sort of level.

257
00:12:55,920 --> 00:12:59,280
And so we ended up covering that same slide in two very different ways,

258
00:12:59,280 --> 00:13:02,400
not very different, but overlapping, but different from the,

259
00:13:03,040 --> 00:13:04,640
what language we used, etc.

260
00:13:04,640 --> 00:13:08,880
Yeah, we found that it was, it's really critical to understand, you know, at the very top of it,

261
00:13:08,880 --> 00:13:11,920
we talk about, you know, ultimately, this is a risk, right?

262
00:13:11,920 --> 00:13:12,880
Security is a risk.

263
00:13:12,880 --> 00:13:17,680
It's something that, you know, the board and management are used to managing risks

264
00:13:17,680 --> 00:13:20,480
and overseeing risk management, right?

265
00:13:20,480 --> 00:13:22,720
And so that's really the root of all of this,

266
00:13:22,720 --> 00:13:25,760
and that's the huge frame that we should all be thinking about is,

267
00:13:25,760 --> 00:13:27,680
you know, what's important to the organization,

268
00:13:28,320 --> 00:13:31,840
and what are the risks that the organization faces.

269
00:13:31,840 --> 00:13:34,240
And we don't want to be coming at them, you know,

270
00:13:34,240 --> 00:13:37,520
these people that manage risks all the time, right?

271
00:13:37,520 --> 00:13:39,840
And coming at them with a whole different language,

272
00:13:39,840 --> 00:13:41,600
a whole different way of doing things,

273
00:13:41,600 --> 00:13:44,720
and just expect them to magically understand it.

274
00:13:44,720 --> 00:13:47,520
The best thing for us to do as security professionals is learn

275
00:13:48,240 --> 00:13:50,560
what risk management framework they're using,

276
00:13:50,560 --> 00:13:51,760
what language they're using,

277
00:13:51,760 --> 00:13:53,520
how do they assess risks?

278
00:13:53,520 --> 00:13:56,000
Is it, you know, a certain scoring system,

279
00:13:56,000 --> 00:13:58,480
or is it, you know, entered into a risk register

280
00:13:58,480 --> 00:14:00,720
where it's officially then managed and accountable

281
00:14:00,720 --> 00:14:02,640
by the management team, etc.

282
00:14:02,640 --> 00:14:06,080
Like understanding that as a senior security leader is critical.

283
00:14:06,080 --> 00:14:08,320
So you can plug into those processes,

284
00:14:08,320 --> 00:14:10,960
and then all of a sudden the funding, the attention, etc.,

285
00:14:10,960 --> 00:14:13,840
that you need for security starts to magically appear

286
00:14:13,840 --> 00:14:16,640
because it is, you know, being managed like any other risk

287
00:14:16,640 --> 00:14:18,640
in the organization that people know how to manage.

288
00:14:19,600 --> 00:14:23,280
But really kind of having that language of translation we found

289
00:14:23,920 --> 00:14:25,280
is super critical,

290
00:14:25,280 --> 00:14:28,160
and that's something that we emphasized a lot

291
00:14:28,160 --> 00:14:31,360
in the CAF secure videos to make sure that people understand,

292
00:14:31,920 --> 00:14:34,480
hey, listen, this is unique in a special discipline,

293
00:14:34,480 --> 00:14:37,040
but it's also a risk discipline

294
00:14:37,040 --> 00:14:40,480
that needs to be managed similar to how people are trained

295
00:14:40,480 --> 00:14:43,920
and used to and have processes and have funding to manage.

296
00:14:43,920 --> 00:14:46,560
Just on that topic of funding, I mean, is the,

297
00:14:46,560 --> 00:14:50,320
I'm not going to say is the guidance around how to get funding,

298
00:14:51,040 --> 00:14:54,800
but is there sort of hints and tips on how you can use this

299
00:14:54,800 --> 00:14:58,320
to explain, you know, how you can get funding for security programs?

300
00:14:58,320 --> 00:15:01,520
Yeah, I mean, I think the start of it is speaking the language

301
00:15:01,520 --> 00:15:04,720
of the people that control the purse strings is the first rule, right?

302
00:15:04,720 --> 00:15:07,600
So that you understand them and what's on their plate

303
00:15:07,600 --> 00:15:08,640
and what's important to them.

304
00:15:08,640 --> 00:15:11,920
And, you know, second, that you can speak to them in that language

305
00:15:11,920 --> 00:15:14,880
and communicate your ideas and your concerns

306
00:15:14,880 --> 00:15:16,240
in a way that they'll understand.

307
00:15:16,240 --> 00:15:18,400
That's the number one thing we found

308
00:15:18,400 --> 00:15:20,880
is really learning that language, learning the dynamics,

309
00:15:20,880 --> 00:15:22,320
learning the risks and priorities.

310
00:15:22,960 --> 00:15:25,360
That is the number one piece.

311
00:15:25,360 --> 00:15:27,440
And of course, building the relationships, etc.,

312
00:15:27,440 --> 00:15:29,360
because, you know, people make decisions.

313
00:15:29,360 --> 00:15:32,240
And so that's it. And when we talk to CISOs,

314
00:15:32,240 --> 00:15:34,960
oftentimes getting money actually isn't that hard.

315
00:15:34,960 --> 00:15:37,280
It's feeling confident that they could spend the money

316
00:15:37,280 --> 00:15:40,480
and then deliver on the results that come tied to that money.

317
00:15:41,200 --> 00:15:44,240
Tends to be a concern that we hear pretty often

318
00:15:44,240 --> 00:15:46,480
that like if they ask for money, they're going to get it

319
00:15:46,480 --> 00:15:49,600
in today's environment anyway, you know, times could change.

320
00:15:50,800 --> 00:15:53,520
Well, it definitely wasn't that way 10 years ago, 15 years ago.

321
00:15:53,520 --> 00:15:55,120
You know, that's one of the key things

322
00:15:55,120 --> 00:15:57,440
is to sort of foundationally establish that.

323
00:15:57,440 --> 00:16:01,280
And some of the advice that we do give to sort of, you know,

324
00:16:01,280 --> 00:16:04,080
CISOs and sort of the folks around them

325
00:16:04,080 --> 00:16:07,440
and the business leaders around them is, you know,

326
00:16:07,440 --> 00:16:09,760
start thinking about how do you make sure

327
00:16:09,760 --> 00:16:12,000
that you're not actually distracting the security team

328
00:16:12,000 --> 00:16:14,400
with ongoing requests for funding?

329
00:16:14,400 --> 00:16:16,960
Like how do you make sure that their budget is steady?

330
00:16:16,960 --> 00:16:19,680
And the paradigm that we sort of think of this through

331
00:16:19,680 --> 00:16:20,960
is like maintenance.

332
00:16:20,960 --> 00:16:24,720
Like we don't, you know, if I managed a fleet of planes or cars,

333
00:16:24,720 --> 00:16:26,720
I'm going to have downtime.

334
00:16:26,720 --> 00:16:28,720
I'm going to expect that I have to go and do maintenance

335
00:16:28,720 --> 00:16:30,080
and check things and change the oil

336
00:16:30,080 --> 00:16:31,920
and all the various other things on it.

337
00:16:31,920 --> 00:16:33,920
And that's just a part of owning it.

338
00:16:33,920 --> 00:16:36,400
And technology is actually quite similar.

339
00:16:36,400 --> 00:16:38,720
Like we need to be able to have some planned downtime

340
00:16:38,720 --> 00:16:41,760
in the schedule to patch it and to take care of those things

341
00:16:41,760 --> 00:16:44,320
because if you ignore it, sooner or later,

342
00:16:44,320 --> 00:16:46,000
it's going to break down, it's going to hurt.

343
00:16:46,000 --> 00:16:48,640
And that's pretty much the same dynamic we see in security

344
00:16:48,640 --> 00:16:51,600
is if you don't patch for, you know, months on end,

345
00:16:51,600 --> 00:16:53,520
it's just like not changing the oil.

346
00:16:53,520 --> 00:16:55,280
Sooner or later, the system is going to break down,

347
00:16:55,280 --> 00:16:57,920
you're going to get owned, and it's going to all happen suddenly.

348
00:16:57,920 --> 00:17:00,240
And then all of a sudden, your schedule is disrupted anyway.

349
00:17:00,240 --> 00:17:03,280
So it's much better to sort of proactively plan that

350
00:17:03,280 --> 00:17:05,760
and integrate that into your schedules,

351
00:17:05,760 --> 00:17:07,120
your budgets and all that.

352
00:17:07,120 --> 00:17:09,520
And so that's one of the ways that we're sort of recommending

353
00:17:09,520 --> 00:17:11,280
that organizations think about it.

354
00:17:11,280 --> 00:17:14,480
So Mark, well, you and I both know that a couple of years ago,

355
00:17:14,480 --> 00:17:18,400
we spent some time doing the security compass,

356
00:17:18,400 --> 00:17:22,320
which was, oh yeah, that was poor Mark,

357
00:17:22,320 --> 00:17:25,760
poor Mark had to spend like two days recording with me.

358
00:17:25,760 --> 00:17:30,560
But, so if people looked at that from a few years ago

359
00:17:30,560 --> 00:17:34,000
and they were to ask the question, how is this different?

360
00:17:34,000 --> 00:17:36,400
And why should I go and have a look at the CAF?

361
00:17:37,200 --> 00:17:38,320
What would you tell them?

362
00:17:38,320 --> 00:17:42,240
What is the distinction between those two pieces of work

363
00:17:42,240 --> 00:17:42,880
that you did?

364
00:17:43,520 --> 00:17:47,600
The Azure Security Compass, which is a name we actually don't use,

365
00:17:47,600 --> 00:17:49,440
I think the official website is now called

366
00:17:49,440 --> 00:17:51,280
the Microsoft Security Best Practices.

367
00:17:51,280 --> 00:17:57,280
That was a look at how do we establish very clear best practices.

368
00:17:57,280 --> 00:18:00,720
The main burning concern at the time, the burning question,

369
00:18:00,720 --> 00:18:03,680
was how do I secure my Azure infrastructure and resources?

370
00:18:03,680 --> 00:18:06,560
And so that was sort of like that first structured view

371
00:18:06,560 --> 00:18:10,880
of how do we do that as sort of like a standalone,

372
00:18:10,880 --> 00:18:12,720
you know, multi-module workshop.

373
00:18:12,720 --> 00:18:14,720
That really sort of answered that question

374
00:18:14,720 --> 00:18:17,520
with a very heavy bias towards the Azure resources.

375
00:18:17,520 --> 00:18:22,720
The thing that's changed in addition to sort of the wisdom

376
00:18:22,720 --> 00:18:25,200
and knowledge and understanding that we've picked up since

377
00:18:26,080 --> 00:18:30,560
is the CAF is much more focused on an end-to-end view

378
00:18:30,560 --> 00:18:32,080
of the entire security program.

379
00:18:32,800 --> 00:18:35,520
So it's not really cut down for just the things

380
00:18:35,520 --> 00:18:37,600
that affect your Azure infrastructure.

381
00:18:37,600 --> 00:18:39,520
It's actually focused on, you know,

382
00:18:39,520 --> 00:18:41,280
how do you actually structure your program right?

383
00:18:41,280 --> 00:18:42,800
What are the security disciplines, you know,

384
00:18:42,800 --> 00:18:46,000
your programs of record that you're going to run over,

385
00:18:46,000 --> 00:18:49,280
you know, the course of multiple years or possibly decades?

386
00:18:49,920 --> 00:18:52,720
And, you know, so it's really answering that programmatic

387
00:18:52,720 --> 00:18:53,920
question from the CAF.

388
00:18:54,480 --> 00:18:56,880
We are actually in the process that's, you know,

389
00:18:56,880 --> 00:19:00,080
taking a while of sort of refreshing that.

390
00:19:00,080 --> 00:19:02,640
And so we're taking a look at how do we, you know,

391
00:19:02,640 --> 00:19:04,720
have a deeper end-to-end view,

392
00:19:04,720 --> 00:19:06,480
because, you know, the Azure Security Compass

393
00:19:06,480 --> 00:19:08,960
got pretty detailed on the technology.

394
00:19:08,960 --> 00:19:12,000
How do we do that sort of on an end-to-end workshop basis?

395
00:19:12,000 --> 00:19:13,760
And so we're actually looking at that now,

396
00:19:13,760 --> 00:19:16,880
kind of framed out using that CAF structure.

397
00:19:16,880 --> 00:19:19,680
In the CAF itself, the disciplines are actually aligned

398
00:19:19,680 --> 00:19:20,800
to the Open Group.

399
00:19:21,680 --> 00:19:24,400
And the Open Group is where the Jericho Forum started,

400
00:19:24,400 --> 00:19:26,480
actually defined the UNIX standards back in the day.

401
00:19:27,360 --> 00:19:29,280
But it's an open standards organization.

402
00:19:31,120 --> 00:19:34,960
And so we borrowed the Zero Trust Security components

403
00:19:34,960 --> 00:19:37,520
from them, because the way that it approached it

404
00:19:37,520 --> 00:19:40,160
was actually, you know, one of the most useful ways

405
00:19:40,160 --> 00:19:41,520
to sort of structure that.

406
00:19:41,520 --> 00:19:44,720
And so that's really, you know, that sort of end-to-end program view.

407
00:19:45,280 --> 00:19:48,160
And just for full disclosure, I am a co-chair

408
00:19:48,160 --> 00:19:50,800
of the Zero Trust Architecture Forum at the Open Group,

409
00:19:50,800 --> 00:19:52,160
just so that folks know.

410
00:19:52,160 --> 00:19:56,160
Pretty easy to understand how those ideas ended up

411
00:19:56,160 --> 00:19:57,200
on either side of the world.

412
00:19:57,200 --> 00:19:58,960
So one thing we hear a lot about

413
00:19:58,960 --> 00:20:03,520
is SASE, Secure Access Service Edge, and Zero Trust.

414
00:20:04,160 --> 00:20:08,080
So how does all of this relate to those two,

415
00:20:08,080 --> 00:20:09,520
let's just call them initiatives?

416
00:20:09,520 --> 00:20:11,520
Let me start by framing out Zero Trust,

417
00:20:11,520 --> 00:20:13,120
because there's like two different perceptions

418
00:20:13,120 --> 00:20:16,800
that we've seen out in the world at large of Zero Trust.

419
00:20:16,800 --> 00:20:19,440
One is that it's modernizing your access control.

420
00:20:19,440 --> 00:20:21,680
If you ask an identity person, it's going to be identity-centric.

421
00:20:21,680 --> 00:20:23,760
If you ask a network person, it's going to be network-centric.

422
00:20:23,760 --> 00:20:26,480
But, you know, there's a whole set of people

423
00:20:26,480 --> 00:20:29,280
that think of Zero Trust in terms of, you know,

424
00:20:29,280 --> 00:20:32,320
how do I modernize access control using either a heavy network

425
00:20:32,320 --> 00:20:35,120
or a heavy identity or a combination of technologies.

426
00:20:35,120 --> 00:20:39,840
And then the other, you know, 40 or 60%, depending on how you measure it,

427
00:20:39,840 --> 00:20:44,880
folks sort of look at Zero Trust as an end-to-end security strategy

428
00:20:44,880 --> 00:20:46,880
that has multiple different components,

429
00:20:46,880 --> 00:20:49,280
including access control, security operations,

430
00:20:49,280 --> 00:20:54,080
data center modernization, OT security, and, you know,

431
00:20:54,080 --> 00:20:55,840
all those kind of pieces.

432
00:20:55,840 --> 00:21:00,720
And so Microsoft is very much a proponent of Zero Trust

433
00:21:00,720 --> 00:21:03,280
as a larger big picture strategy.

434
00:21:03,280 --> 00:21:08,560
And the reason for that is that the same things that cause this modernization

435
00:21:08,560 --> 00:21:11,520
of access control like, hey, we can't count on the perimeter,

436
00:21:11,520 --> 00:21:16,720
you know, phishing attacks and cloud stuff outside the perimeter

437
00:21:16,720 --> 00:21:19,280
and mobile devices outside the perimeter and work from home

438
00:21:19,280 --> 00:21:20,720
and all this other kind of stuff,

439
00:21:20,720 --> 00:21:24,000
all those same forces that are making us go, oh my gosh,

440
00:21:24,000 --> 00:21:28,400
I need to reestablish access control in a way that works outside my perimeter.

441
00:21:28,400 --> 00:21:31,520
All that stuff is also influencing how we do security operations.

442
00:21:31,520 --> 00:21:35,600
I have to detect attacks and respond and recover

443
00:21:35,600 --> 00:21:39,280
to them outside the perimeter or inside the perimeter.

444
00:21:39,280 --> 00:21:43,280
I have to be able to do all these other kinds of things inside and outside.

445
00:21:43,280 --> 00:21:46,880
So the same forces are also transforming pretty much all of security.

446
00:21:46,880 --> 00:21:50,480
And of course, the, you know, SaaS services and the cloud-based

447
00:21:50,480 --> 00:21:52,880
security tools.

448
00:21:52,880 --> 00:21:55,840
So, you know, all these forces are basically doing that.

449
00:21:55,840 --> 00:21:58,880
And so that led us to say, you know, Zero Trust is more than just access control.

450
00:21:58,880 --> 00:22:02,640
It's the whole kit and caboodle.

451
00:22:02,640 --> 00:22:05,280
So SASE is a very similar thing.

452
00:22:05,280 --> 00:22:08,800
And a lot of people sort of think of them as the same or similar.

453
00:22:08,800 --> 00:22:12,320
But, you know, it started out fairly high level.

454
00:22:12,320 --> 00:22:18,560
But what we've seen is that it started to come into focus in the past six to 12 months

455
00:22:18,560 --> 00:22:21,600
as Gartner has released more and more guidance on it

456
00:22:21,600 --> 00:22:24,560
and more and more products are, you know, sort of kind of aligning to it.

457
00:22:24,560 --> 00:22:26,400
It's become more clear what SASE is.

458
00:22:26,400 --> 00:22:31,040
The way that we're seeing SASE evolve is it tends to be aligned much more closely

459
00:22:31,040 --> 00:22:33,280
to that access control use case.

460
00:22:34,160 --> 00:22:41,920
And it tends to be very focused on the network pieces, especially as they relate to identity.

461
00:22:41,920 --> 00:22:45,760
Because the service edge, the way that SASE names it there as a service edge

462
00:22:45,760 --> 00:22:50,080
is basically the edge of your enterprise, the edge of your technical estate.

463
00:22:50,080 --> 00:22:52,800
We used to call it the edge of the network, but you know, it's, you know,

464
00:22:52,800 --> 00:22:55,120
what we have is no longer defined by network.

465
00:22:55,120 --> 00:22:57,520
So I called it a technical estate for lack of a better term.

466
00:22:58,160 --> 00:23:02,800
And so that service edge essentially is what SASE focuses on, you know,

467
00:23:02,800 --> 00:23:06,720
sort of from a cloud, a CASB perspective, from the physical networks,

468
00:23:06,720 --> 00:23:10,000
like a secure web gateway and firewall-as-a-service perspective,

469
00:23:10,000 --> 00:23:14,160
all these different types of technologies, you know, that define your,

470
00:23:14,160 --> 00:23:16,800
the edge of what you control and own and care about.

471
00:23:17,680 --> 00:23:20,640
It's basically how do we make sure there's secure access to it, you know,

472
00:23:20,640 --> 00:23:23,840
that people can have a good performance for it, you know,

473
00:23:23,840 --> 00:23:26,560
and they can, you know, get it, it's cached around the world.

474
00:23:26,560 --> 00:23:30,880
They include SD-WAN, a software-defined wide area network, as part of SASE.

475
00:23:31,520 --> 00:23:34,240
But also how do you do it securely through those, you know,

476
00:23:34,240 --> 00:23:36,320
security capabilities, like I mentioned.

477
00:23:36,320 --> 00:23:40,720
So on the security side, SASE tends to be a subset.

478
00:23:41,360 --> 00:23:44,080
It goes beyond security because it includes performance as well.

479
00:23:44,080 --> 00:23:47,600
And that sort of, you know, the WAN based kind of things that would not normally be,

480
00:23:47,600 --> 00:23:50,400
you know, sort of a CISO or Zero Trust responsibility.

481
00:23:50,400 --> 00:23:55,200
But that's kind of where it fits is it's that, you know, how do I secure access control with this,

482
00:23:55,200 --> 00:23:59,840
you know, heavy focus on, you know, wherever your edge happens to be network or otherwise.

483
00:23:59,840 --> 00:24:00,880
Does that make sense?

484
00:24:00,880 --> 00:24:02,000
It does.

485
00:24:02,000 --> 00:24:04,400
And this is going to sound like a really cynical response.

486
00:24:05,360 --> 00:24:08,240
But I mean, how long does it take to sort of get all this?

487
00:24:08,240 --> 00:24:10,480
Like to understand it, understand that big picture and like,

488
00:24:10,480 --> 00:24:14,480
okay, SASE's over here and Zero Trust's over here and MCRA is over here.

489
00:24:14,480 --> 00:24:16,160
And here's what the Venn diagram looks like.

490
00:24:16,160 --> 00:24:20,480
And here are the tools that map into each of those areas and here's research that's come from

491
00:24:20,480 --> 00:24:24,560
Forrester and Gartner and here's research that's come from Microsoft and here's stuff that's

492
00:24:24,560 --> 00:24:26,240
groundbreaking from Cisco.

493
00:24:26,800 --> 00:24:30,080
I mean, how long does it take to get to understand that?

494
00:24:30,080 --> 00:24:34,800
I mean, is there a place that you can go that's like the, you know, this stuff,

495
00:24:34,800 --> 00:24:37,760
the security stuff 101 to sort of understand?

496
00:24:37,760 --> 00:24:40,240
Like there's so much and there's so much information.

497
00:24:40,800 --> 00:24:44,480
And I mean, do we feel like people understand this stuff?

498
00:24:44,480 --> 00:24:48,160
I know it sounds really cynical, but I sort of want to look at it from a really practical

499
00:24:48,160 --> 00:24:49,440
and pragmatic perspective.

500
00:24:50,000 --> 00:24:55,680
It's a really good question and it's tricky because there's so much to it.

501
00:24:56,240 --> 00:25:01,040
And quite frankly, there's so much money in our industry that everybody's trying to

502
00:25:01,040 --> 00:25:07,120
claim a piece of it and own sort of part of that intellectual space by framing it out

503
00:25:07,120 --> 00:25:08,320
in a way that favors them.

504
00:25:08,800 --> 00:25:12,640
And this is everybody from, you know, product vendors and marketing to analyst

505
00:25:12,640 --> 00:25:18,400
houses to, I mean, everybody is trying to help with this because it's a serious problem

506
00:25:18,400 --> 00:25:20,240
because security itself is complicated.

507
00:25:20,800 --> 00:25:23,440
And then everybody's trying to simplify it in a way that favors them.

508
00:25:24,080 --> 00:25:29,120
And so you end up having this sort of too many simplifications thing that ends up creating

509
00:25:29,120 --> 00:25:30,240
more complexity in the middle.

510
00:25:30,800 --> 00:25:35,440
So one of the things that we did do to try and help combat this is there's actually kind

511
00:25:35,440 --> 00:25:38,320
of a third type of recording that we did around the MCRAs.

512
00:25:38,320 --> 00:25:41,600
We didn't do it for every single module, but we did, I think two or three of them.

513
00:25:41,600 --> 00:25:47,040
We'll include the links in the show notes where we deliberately focus the talk track

514
00:25:47,680 --> 00:25:49,520
on someone new to industry.

515
00:25:49,520 --> 00:25:54,000
So that it would be easy for them to understand, like, say I'm an IT person that's a career

516
00:25:54,000 --> 00:25:57,120
changer or I'm learning about technology and security because I want to be in the cyber

517
00:25:57,120 --> 00:25:58,960
security, you know, high pay band.

518
00:25:59,760 --> 00:26:03,120
So we actually recorded some of those and they're called interactive guides.

519
00:26:03,120 --> 00:26:07,200
And they're sort of, you know, like an online computer based training type of thing where

520
00:26:07,200 --> 00:26:11,280
you can click, you know, between each section and go back and all that kind of stuff.

521
00:26:11,280 --> 00:26:14,480
So we'll include a link to those because we actually tried to tackle some of that.

522
00:26:14,480 --> 00:26:19,040
I think we started with the people, the capabilities and then one other, I've forgotten

523
00:26:19,040 --> 00:26:24,400
which one, just to sort of make it a little bit easier for non geeks like us to sort of

524
00:26:24,400 --> 00:26:25,200
ramp up on it.

525
00:26:25,200 --> 00:26:28,240
Yeah, I think, again, I've already mentioned this, but I've looked at quite a bit of the

526
00:26:28,240 --> 00:26:30,800
documentation over the last few months.

527
00:26:30,800 --> 00:26:31,760
It's fantastic stuff.

528
00:26:31,760 --> 00:26:34,480
But at the end of the day, you've sort of got to grok it, you know, you have to understand,

529
00:26:34,480 --> 00:26:35,760
okay, how do I apply this?

530
00:26:36,320 --> 00:26:36,880
Yep.

531
00:26:36,880 --> 00:26:42,240
Great documentation can be fantastic, but if it's not actionable, then it's basically worthless.

532
00:26:43,120 --> 00:26:46,080
And that's actually where we did a lot of investment into the RaMP.

533
00:26:46,720 --> 00:26:51,200
So the Rapid Modernization Plan, because we recognize that dynamic and we want people to

534
00:26:51,200 --> 00:26:54,160
have that sort of, okay, I just want to implement, right?

535
00:26:54,160 --> 00:26:55,920
Because that's one way to simplify it down.

536
00:26:56,720 --> 00:27:01,760
And, you know, said, okay, do this first, then this, then this, and then your second stage is

537
00:27:01,760 --> 00:27:03,280
this, and there's two steps in it.

538
00:27:03,280 --> 00:27:04,960
Your third stage has three steps in it.

539
00:27:04,960 --> 00:27:07,040
And just basically go do this.

540
00:27:07,040 --> 00:27:11,840
And then we're in the process of actually documenting each of those initiatives with

541
00:27:11,840 --> 00:27:15,920
very specific, hey, if you're doing it with Microsoft technology, here's exactly how to do it

542
00:27:15,920 --> 00:27:17,760
and walk through that on our docs page.

543
00:27:18,480 --> 00:27:23,040
So we're in the process of actually turning that zero trust ramp into basically documentation

544
00:27:23,040 --> 00:27:25,440
that says, okay, let's just, here's how to do it.

545
00:27:25,440 --> 00:27:27,040
And that cuts through a lot of the noise.

546
00:27:27,040 --> 00:27:29,840
So you have all the context, which is great, but here's what to do.

547
00:27:29,840 --> 00:27:33,280
I agree with what Michael was saying on the complexity.

548
00:27:33,280 --> 00:27:41,360
And when I look back, many customers in the past used to look or acquire functionality

549
00:27:41,360 --> 00:27:47,680
and infrastructure to address a particular issue in the environment.

550
00:27:47,680 --> 00:27:53,760
And unfortunately, with this confusion with SASE, with Zero Trust, people start thinking,

551
00:27:53,760 --> 00:27:59,280
okay, do I have to invest in different infrastructure to deal with each one of these?

552
00:27:59,280 --> 00:28:05,360
And you can see this because when people talk about ransomware, they think that

553
00:28:05,360 --> 00:28:09,120
it's something different from every other attack out there.

554
00:28:09,120 --> 00:28:13,120
Yes, there's a few things that are different, but they don't necessarily

555
00:28:14,000 --> 00:28:17,760
start in a different way than other types of attacks.

556
00:28:17,760 --> 00:28:20,240
Do you want to comment a little bit about this?

557
00:28:20,240 --> 00:28:21,520
Yeah, absolutely.

558
00:28:21,520 --> 00:28:26,080
Yeah, the easiest answer to ransomware, I always start with this because it helps

559
00:28:26,080 --> 00:28:31,600
create clarity really fast, is that defending against ransomware is pretty much the same as defending

560
00:28:31,600 --> 00:28:37,200
it against anything else, because ultimately the attackers will do anything to be able to

561
00:28:37,200 --> 00:28:38,640
ransom you technically.

562
00:28:38,640 --> 00:28:40,480
They don't really care.

563
00:28:40,480 --> 00:28:44,640
They've copied techniques from the APTs of credential theft and pass-the-hash and

564
00:28:44,640 --> 00:28:46,400
pass-the-ticket and pass-the-everything-else.

565
00:28:47,360 --> 00:28:51,920
The phishing techniques, they've created a few new ones that tend to be a hallmark of ransomware,

566
00:28:51,920 --> 00:28:57,200
like buying up commodity malware access, essentially taking a botnet or some other

567
00:28:57,200 --> 00:29:02,960
low-grade attack infection, and then buying access to that and then using that as an entry

568
00:29:02,960 --> 00:29:04,320
point for ransomware.

569
00:29:04,320 --> 00:29:10,720
But pretty much ransomware is just as flexible as an APT would be because they're just trying

570
00:29:10,720 --> 00:29:11,840
to get control of the environment.

571
00:29:11,840 --> 00:29:13,360
They're just making money in a different way.

572
00:29:13,360 --> 00:29:15,280
They're not stealing secrets or stealing data.

573
00:29:15,920 --> 00:29:20,720
Sometimes they are, but they're extorting you by turning your systems off and saying,

574
00:29:20,720 --> 00:29:22,960
you can't get back to them without paying me.

575
00:29:22,960 --> 00:29:27,760
And that also illustrates the root of the complexity is at the end of the day,

576
00:29:28,480 --> 00:29:36,000
security is trying to stop an intelligent human from finding a crack in your system.

577
00:29:37,600 --> 00:29:42,800
Because they're an intelligent human, they're going to find something in your complex

578
00:29:43,520 --> 00:29:49,520
technical estate somewhere, an unpatched this, a misconfigured that, a poor operational practice,

579
00:29:49,520 --> 00:29:51,920
logging in with the domain admin to an unsecured box, whatever.

580
00:29:52,560 --> 00:29:55,840
They're going to find something and abuse it.

581
00:29:55,840 --> 00:30:03,680
And you have to basically continually be burning down the risks in this huge 30-plus-year-old

582
00:30:03,680 --> 00:30:07,760
technical estate that was built when security wasn't even important.

583
00:30:08,560 --> 00:30:12,640
And so the reality is, security is by its nature very complicated,

584
00:30:12,640 --> 00:30:20,000
and it's forcing us to clean up the hygiene of IT that's been building up and lingering for decades.

585
00:30:20,000 --> 00:30:21,520
Sorry, I'm not sure if I answered the question.

586
00:30:21,520 --> 00:30:23,360
I kind of wandered off there for a bit.

587
00:30:24,000 --> 00:30:28,720
Actually, I like what you were mentioning about trying to stop.

588
00:30:29,520 --> 00:30:35,840
The other day, I was having a conversation with somebody about the definition of security.

589
00:30:35,840 --> 00:30:40,720
For them, when they were saying, okay, this infrastructure is secure,

590
00:30:40,720 --> 00:30:45,520
for them, it meant that nothing could penetrate it.

591
00:30:45,520 --> 00:30:50,400
And I was like, I think the focus of Microsoft is what we said,

592
00:30:50,400 --> 00:30:54,400
assume compromise, detect fast and recover fast.

593
00:30:54,400 --> 00:30:55,040
Right?

594
00:30:55,040 --> 00:30:59,360
So I like the fact that you mentioned about us trying to stop.

595
00:30:59,360 --> 00:31:05,120
But the main thing that I was trying to convey in the previous question is that,

596
00:31:05,120 --> 00:31:11,520
for the most part, the infrastructure used for ransomware, for Zero Trust, for

597
00:31:12,560 --> 00:31:15,680
SOC modernization is almost the same.

598
00:31:15,680 --> 00:31:21,440
There's some capability that maybe SASE uses that is different from Zero Trust,

599
00:31:21,440 --> 00:31:25,360
but it's not a complete separate infrastructure.

600
00:31:25,360 --> 00:31:31,600
As long as organizations start interconnecting and working in an

601
00:31:31,600 --> 00:31:36,560
assume compromise type of approach, detect fast, recover fast,

602
00:31:36,560 --> 00:31:41,280
I think they're able to deal with many of these security issues that all these

603
00:31:41,920 --> 00:31:45,360
SASE, zero trust, et cetera, are trying to deal with.

604
00:31:45,360 --> 00:31:47,120
Yeah, that's actually an excellent point.

605
00:31:48,560 --> 00:31:52,080
And that's a hallmark of Microsoft's approach, because we're not a,

606
00:31:52,080 --> 00:31:55,760
here's one simple tool that can solve one simple problem.

607
00:31:55,760 --> 00:31:58,640
And it's all simple until you actually combine it with all your other simple tools.

608
00:31:58,640 --> 00:32:01,600
And then you have a hundred simple tools and it's not simple anymore.

609
00:32:02,240 --> 00:32:06,160
You know, we're actually doing that hard engineering work to, you know what,

610
00:32:06,160 --> 00:32:09,760
instead of giving you six or eight or 10 different types of detection tools,

611
00:32:10,320 --> 00:32:13,760
we're going to give you, we're going to do those kinds of six or eight or 10 or 12

612
00:32:13,760 --> 00:32:17,600
different detections, but we're going to put it in one or two portals, right?

613
00:32:17,600 --> 00:32:20,880
We're going to consolidate that and we're going to connect in, we're going to link it.

614
00:32:20,880 --> 00:32:22,320
So you don't have to do that work.

615
00:32:22,320 --> 00:32:24,080
You can just turn the tools on and go.

616
00:32:24,080 --> 00:32:29,120
And we're going to connect it to your Azure AD and then you can just use that and go.

617
00:32:29,760 --> 00:32:32,560
You know, there's a lot to it because we're kind of explaining how it all works.

618
00:32:32,560 --> 00:32:34,960
You know, a lot of technical people want to understand that.

619
00:32:35,440 --> 00:32:41,200
But the actual operational execution is a huge focus for us to simplify at Microsoft.

620
00:32:41,600 --> 00:32:45,600
Because we don't want people to have to do, you know, we have what,

621
00:32:45,600 --> 00:32:48,480
tens of thousands, hundreds of thousands of customers or whatever it is.

622
00:32:48,480 --> 00:32:55,040
We don't want 10 or 100,000 different individual analysts and engineers having to figure out

623
00:32:55,040 --> 00:32:56,640
how to do the same manual task.

624
00:32:57,040 --> 00:33:03,120
We want to do that once, automate it, and then our entire customer base can benefit from it.

625
00:33:03,600 --> 00:33:08,160
And so that's a huge, huge focus to help make it simpler and easier.

626
00:33:08,160 --> 00:33:12,560
A huge reason why we invest so much is to try and solve those problems and solve them once.

627
00:33:12,560 --> 00:33:15,520
So that goes directly to a question.

628
00:33:15,520 --> 00:33:22,880
I have seen a lot of organizations trying to develop a solution to share threat intelligence.

629
00:33:22,880 --> 00:33:29,840
And I think they're missing the point that Microsoft already has one of the best

630
00:33:29,840 --> 00:33:34,160
and most complete threat intelligence information out there.

631
00:33:34,160 --> 00:33:40,640
And we are reusing it to simplify detection and response to the different attacks.

632
00:33:40,640 --> 00:33:42,960
Can you comment a little bit more about that?

633
00:33:42,960 --> 00:33:44,240
Oh yeah, 100%.

634
00:33:44,240 --> 00:33:46,400
I saw some studies out there.

635
00:33:46,400 --> 00:33:50,960
I don't remember where they were that effectively when you go and take all these different paid

636
00:33:50,960 --> 00:33:56,080
intelligence feeds and this, that, and the other, you always end up with a lot of overlap.

637
00:33:56,080 --> 00:34:00,560
You kind of have diminishing returns as you get these new different sources out there

638
00:34:00,560 --> 00:34:04,160
from different vendors and whatnot, because they end up reusing the same sources behind

639
00:34:04,160 --> 00:34:05,440
the scenes, et cetera.

640
00:34:05,440 --> 00:34:08,960
And so that's sort of like one of the dynamics a lot of customers are facing.

641
00:34:08,960 --> 00:34:12,320
And of course, it's very difficult to integrate multiple different feeds to a lot of different

642
00:34:12,320 --> 00:34:14,320
tools, et cetera.

643
00:34:14,320 --> 00:34:18,640
And so that's a good example of where Microsoft has invested to kind of make that problem

644
00:34:18,640 --> 00:34:19,640
go away.

645
00:34:19,640 --> 00:34:24,640
We get 24 trillion signals a day, I think is the latest statistic.

646
00:34:24,640 --> 00:34:26,160
Yeah, somewhere around that.

647
00:34:26,160 --> 00:34:29,120
And then of course, we do all the machine learning and reasoning over it and this and that,

648
00:34:29,120 --> 00:34:33,600
turn that into like normalized baselines and the anomalies that stick out and behavior

649
00:34:33,600 --> 00:34:38,800
analytics that say these are the behavior anomalies that stick out, et cetera.

650
00:34:38,800 --> 00:34:45,120
But ultimately, we have a massive set of threat intelligence that's built right into

651
00:34:45,120 --> 00:34:46,120
the tools.

652
00:34:46,120 --> 00:34:48,120
Like you don't have to manage it, you don't have to integrate it, you don't have to configure

653
00:34:48,120 --> 00:34:49,120
it, it's just there.

654
00:34:49,120 --> 00:34:51,880
And that's one of the things that we do is build that in there.

655
00:34:51,880 --> 00:34:57,760
So it's sort of like an ambient set of knowledge that's just built in to all the tools and

656
00:34:57,760 --> 00:34:59,520
then you don't have to go through that.

657
00:34:59,520 --> 00:35:01,400
Every organization has unique needs, right?

658
00:35:01,400 --> 00:35:06,040
So if you're in FinServe, maybe the generic threat intelligence, once you sort of integrate

659
00:35:06,040 --> 00:35:10,640
that in or use our tools, okay, that's fine and good, but I need to get to maturity level

660
00:35:10,640 --> 00:35:11,640
four.

661
00:35:11,640 --> 00:35:13,960
I'm at three and you got me to three easily, great.

662
00:35:13,960 --> 00:35:14,960
But now I need four.

663
00:35:14,960 --> 00:35:18,520
There's always going to be a need for sort of localized stuff, what you've been attacked

664
00:35:18,520 --> 00:35:23,880
with, what your industry has been attacked with that might not be the exact same thing

665
00:35:23,880 --> 00:35:26,280
as every other industry out there.

666
00:35:26,280 --> 00:35:31,840
So there's always going to be sort of a little bit of uniqueness, sort of the, I sort of jokingly

667
00:35:31,840 --> 00:35:36,160
say it's like the frosting on the cake that says happy birthday, Bobby.

668
00:35:36,160 --> 00:35:39,920
Most of the cake is identical to every kid's cake, but this one says happy birthday, Bobby.

669
00:35:39,920 --> 00:35:47,160
So there is a genuine need to do some personalization on threat intelligence, but the bulk of it

670
00:35:47,160 --> 00:35:52,240
is going to end up being fairly common across companies, industries, et cetera.

671
00:35:52,240 --> 00:35:57,800
And so we've done a lot of work at Microsoft to kind of make that problem go away and make

672
00:35:57,800 --> 00:36:02,400
the bulk of the problem go away.

673
00:36:02,400 --> 00:36:03,720
So that's an excellent point.

674
00:36:03,720 --> 00:36:10,400
Yeah, I like the example that you showed in the threat intelligence MCRA video.

675
00:36:10,400 --> 00:36:14,760
I don't remember which attack you used, but it was some...

676
00:36:14,760 --> 00:36:16,880
Emotet, if I recall correctly.

677
00:36:16,880 --> 00:36:17,880
Yes.

678
00:36:17,880 --> 00:36:18,880
Yes.

679
00:36:18,880 --> 00:36:25,760
So somebody got an email with Emotet and they didn't even realize it because the

680
00:36:25,760 --> 00:36:32,880
threat intelligence was using the backend to perform a lot of different investigations

681
00:36:32,880 --> 00:36:34,920
and response.

682
00:36:34,920 --> 00:36:40,080
This is the value that we're providing with embedded threat intelligence and signals

683
00:36:40,080 --> 00:36:42,080
across services.

684
00:36:42,080 --> 00:36:44,600
So I was happy about that.

685
00:36:44,600 --> 00:36:45,600
Oh yeah.

686
00:36:45,600 --> 00:36:46,600
It's such a powerful story.

687
00:36:46,600 --> 00:36:51,680
I mean, we can't do this every time to every single thing, but we basically caught the first

688
00:36:51,680 --> 00:36:55,360
variant that existed of this particular trojan.

689
00:36:55,360 --> 00:36:58,680
It was a Gmail account on a Windows app.

690
00:36:58,680 --> 00:37:00,680
So it was like right at the edge of sort of the stuff.

691
00:37:00,680 --> 00:37:04,800
It wasn't even the enterprise things, but we picked it up, did the ML, the detonation,

692
00:37:04,800 --> 00:37:10,400
et cetera, and deleted it before the person could ever run it within 400 milliseconds.

693
00:37:10,400 --> 00:37:14,600
And so this person was automatically protected, didn't even get interrupted, and they were

694
00:37:14,600 --> 00:37:21,720
the would-be victim, the very first victim of this brand new variant of a banking trojan,

695
00:37:21,720 --> 00:37:24,800
but the ML and all that just took care of it automatically.

696
00:37:24,800 --> 00:37:29,280
And I'm like, that is the pinnacle of where we're trying to get to in as many scenarios

697
00:37:29,280 --> 00:37:30,280
as we can.

698
00:37:30,280 --> 00:37:34,640
So I have to ask, just because of my background, is there a sort of a secure software development

699
00:37:34,640 --> 00:37:36,560
aspect to any of this?

700
00:37:36,560 --> 00:37:37,560
Yes.

701
00:37:37,560 --> 00:37:41,080
So within the MCRA, it's a little bit of a limited exposure.

702
00:37:41,080 --> 00:37:43,440
We've got the SDL links and all those kinds of things.

703
00:37:43,440 --> 00:37:47,440
We've got the GitHub Advanced Security highlighted in a couple of places.

704
00:37:47,440 --> 00:37:51,960
So there's definitely some mentions within the cyber reference architecture, but the

705
00:37:51,960 --> 00:37:57,600
thing about the DevSecOps and security development lifecycle sort of space is it tends to be

706
00:37:57,600 --> 00:38:00,000
very heavy on people and process.

707
00:38:00,000 --> 00:38:03,840
There's definitely tools involved 100%, but there's a lot of people and process and training

708
00:38:03,840 --> 00:38:06,160
and awareness elements of it.

709
00:38:06,160 --> 00:38:11,760
And so what we actually did in the CAF secure methodology, because there's five security

710
00:38:11,760 --> 00:38:18,280
disciplines, access control, security operations, asset protection, security governance, and

711
00:38:18,280 --> 00:38:20,960
then the last one is innovation security.

712
00:38:20,960 --> 00:38:25,560
We named it innovation security instead of DevSecOps on purpose because we also want to

713
00:38:25,560 --> 00:38:29,160
be able to accommodate not only

714
00:38:29,160 --> 00:38:35,280
DevSecOps, but also the emerging discipline of citizen developers, things like PowerApps

715
00:38:35,280 --> 00:38:40,160
and low code, no code types of apps that are really starting to ramp up in volume.

716
00:38:40,160 --> 00:38:42,440
They're small, but they're growing very quickly.

717
00:38:42,440 --> 00:38:48,480
And so what we did there is we actually put in two different pages in the CAF secure methodology.

718
00:38:48,480 --> 00:38:53,200
The first one is innovation security, what it is, how it works, really highlighting that

719
00:38:53,200 --> 00:38:55,680
we need to bring together the different teams.

720
00:38:55,680 --> 00:38:58,360
We need the business and the developer folks.

721
00:38:58,360 --> 00:39:02,360
We need to bring the ops together, that's your DevOps kind of combo.

722
00:39:02,360 --> 00:39:06,200
But we also need to have that safety and security built in as well.

723
00:39:06,200 --> 00:39:08,800
It's like having a car without seatbelts or brakes.

724
00:39:08,800 --> 00:39:11,120
You're not going to drive it very fast.

725
00:39:11,120 --> 00:39:14,800
You want to have that assurance and that comfort of safety and of course the reality of it as

726
00:39:14,800 --> 00:39:19,880
well to feel comfortable to have that speed and agility to go and capture new business

727
00:39:19,880 --> 00:39:21,680
opportunities, et cetera.

728
00:39:21,680 --> 00:39:25,960
So we built innovation security kind of focused on those key themes.

729
00:39:25,960 --> 00:39:30,040
And when you do the security, don't just think about developers and developer processes, which

730
00:39:30,040 --> 00:39:32,160
most AppSec people will think of.

731
00:39:32,160 --> 00:39:38,200
And don't just think about infrastructure and the actual build servers and the workstations

732
00:39:38,200 --> 00:39:41,760
and all that, that an infrastructure security person would think about, but think about

733
00:39:41,760 --> 00:39:43,120
both of them.

734
00:39:43,120 --> 00:39:46,920
And so we introduced a lot of themes like that and addressed a lot of those elements

735
00:39:46,920 --> 00:39:54,480
there and how do you sort of bring all the goodness of the SDL and SDLC type of things

736
00:39:54,480 --> 00:39:56,960
into the age and speed of DevSecOps.

737
00:39:56,960 --> 00:39:59,000
And so that's the first page.

738
00:39:59,000 --> 00:40:06,760
And then the second page, we focused on specifically what kinds of technical controls to put into

739
00:40:06,760 --> 00:40:11,720
your DevSecOps process to sort of support those principles and those high level ideals.

740
00:40:11,720 --> 00:40:18,160
And we actually worked closely with Victoria Almazova, a very talented person that has

741
00:40:18,160 --> 00:40:22,560
been working on DevSecOps for a long time, and she contributed heavily to that section as

742
00:40:22,560 --> 00:40:23,560
well.

743
00:40:23,560 --> 00:40:27,680
Funny you should mention about DevSecOps and infrastructure people and what have you.

744
00:40:27,680 --> 00:40:31,600
One thing I've been doing a lot recently with customers is talking to them.

745
00:40:31,600 --> 00:40:35,160
And this isn't really security related, but just something I think is really important

746
00:40:35,160 --> 00:40:44,160
is talking to customers about their ops people understanding CI/CD pipeline tooling.

747
00:40:44,160 --> 00:40:49,720
So for example, I've been giving a lot of chats to IT folks about Visual Studio Code

748
00:40:49,720 --> 00:40:54,520
and infrastructure as code and using Azure DevOps or GitHub.

749
00:40:54,520 --> 00:40:59,080
All of a sudden they've got these new words added to their lexicon around pull requests

750
00:40:59,080 --> 00:41:03,040
and diffs and pipelines and all that sort of stuff.

751
00:41:03,040 --> 00:41:08,360
And the joke is, we're sort of talking about tooling that is very familiar to developers

752
00:41:08,360 --> 00:41:12,840
that may be relatively new to IT folks, but you've got to know it.

753
00:41:12,840 --> 00:41:17,440
And if you don't know how to feel comfortable in front of Visual Studio Code, for example,

754
00:41:17,440 --> 00:41:24,880
integrating with GitHub, making edits to Terraform files or ARM templates or Bicep files and

755
00:41:24,880 --> 00:41:28,640
creating a pull request, you kind of have to know all that stuff.

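[Editor's note: the branch/edit/diff/pull-request loop described here can be sketched with plain git. File, branch, and commit names below are invented for illustration; the final GitHub step is shown only as a comment since it needs a real remote.]

```shell
#!/bin/sh
# Minimal sketch of the infrastructure-as-code edit workflow:
# branch, edit a (pretend) Terraform file, review the diff, commit,
# then open a pull request for review.
set -e
repo=$(mktemp -d)
cd "$repo"
git init -q .
git config user.email "ops@example.com"   # placeholder identity
git config user.name  "Ops Person"
printf 'location = "eastus"\n' > main.tf  # hypothetical Terraform file
git add main.tf
git commit -qm "initial infrastructure"
git checkout -qb fix/location             # work on a branch, not on main
printf 'location = "westeurope"\n' > main.tf
git diff --stat                           # the "diff" step: review the change
git commit -aqm "change resource location"
git log --oneline -1                      # confirm the commit landed
# gh pr create --title "Change resource location"  # the "pull request" step
```

The point of the sketch is that the whole loop is a handful of commands; the unfamiliar part for IT folks is usually the vocabulary, not the mechanics.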
756
00:41:28,640 --> 00:41:34,560
Again, it's not really security related, but this is a great example of this intersection

757
00:41:34,560 --> 00:41:37,440
of dev and ops.

758
00:41:37,440 --> 00:41:41,120
We're adding the "Sec" in there for security.

759
00:41:41,120 --> 00:41:43,880
So yeah, I've been spending a lot of time with customers talking about that sort of

760
00:41:43,880 --> 00:41:45,080
stuff of late.

761
00:41:45,080 --> 00:41:50,120
Yeah, it affects security as well because security can't function unless they understand

762
00:41:50,120 --> 00:41:51,920
the processes they're trying to secure.

763
00:41:51,920 --> 00:41:56,040
The last thing you want to do is say, okay, we patched the server, well, great, the CI

764
00:41:56,040 --> 00:42:01,000
CD process just went and unpatched it automatically on the next build.

765
00:42:01,000 --> 00:42:03,680
You don't want to be in that situation.

766
00:42:03,680 --> 00:42:06,160
And so it's really important to sort of understand and embrace it.

767
00:42:06,160 --> 00:42:11,000
I've learned that the DevSecOps space is almost like a microcosm of everything else you have

768
00:42:11,000 --> 00:42:16,640
to do in the security program, just focused on one workload at a time at high speed, kind

769
00:42:16,640 --> 00:42:21,000
of learning that environment around you and applying the principles, everything from the

770
00:42:21,000 --> 00:42:26,480
app sec to the infrastructure elements and the identity and access control and the monitoring,

771
00:42:26,480 --> 00:42:31,520
all that stuff applies to the DevSecOps space in miniature at speed.

772
00:42:31,520 --> 00:42:33,960
And so it's a really fascinating space.

773
00:42:33,960 --> 00:42:35,360
Yeah, I love it.

774
00:42:35,360 --> 00:42:40,080
Again, these are tools that I've been using for a long time.

775
00:42:40,080 --> 00:42:44,080
And a lot of IT folks are kind of a little bit scared when all of a sudden they're sitting

776
00:42:44,080 --> 00:42:48,280
in front of Visual Studio Code with a whole bunch of infrastructure as code in front

777
00:42:48,280 --> 00:42:52,160
of them and they don't understand how to do things like pull requests and that sort of

778
00:42:52,160 --> 00:42:53,160
stuff.

779
00:42:53,160 --> 00:42:57,080
Yeah, I actually had a personal version of that because I've been doing a lot more work

780
00:42:57,080 --> 00:42:59,320
in the doc site lately, right?

781
00:42:59,320 --> 00:43:03,080
And so our doc site at Microsoft, for those that don't know, is actually built on GitHub.

782
00:43:03,080 --> 00:43:08,200
And so we have pull requests and all those kind of things and branches and repos and

783
00:43:08,200 --> 00:43:09,200
all that.

784
00:43:09,200 --> 00:43:12,840
And I had to learn at least the basics of that, which is very overcomplicated in my

785
00:43:12,840 --> 00:43:13,840
opinion.

786
00:43:13,840 --> 00:43:18,080
But I had to learn the basics of that just to be able to sort of edit and do docs work

787
00:43:18,080 --> 00:43:22,680
and get some stuff into CAF and the MCRA site and all these others.

788
00:43:22,680 --> 00:43:26,520
And by the way, for those that don't know, anyone can submit docs changes at Microsoft.

789
00:43:26,520 --> 00:43:30,200
So if you find something that's wrong or you want to add something, you can just hit

790
00:43:30,200 --> 00:43:32,800
the little pencil edit icon and do that.

791
00:43:32,800 --> 00:43:34,200
But yeah, I live that experience.

792
00:43:34,200 --> 00:43:36,600
Yeah, actually, you bring up a really important point there.

793
00:43:36,600 --> 00:43:39,880
docs.microsoft.com, like you say, is all hosted in GitHub.

794
00:43:39,880 --> 00:43:46,520
If you see an error, just go in and either A, create an issue that can be tracked, or

795
00:43:46,520 --> 00:43:52,200
actually go and make the edit yourself and create a pull request and someone can review

796
00:43:52,200 --> 00:43:53,200
it.

797
00:43:53,200 --> 00:43:58,560
I'm always making edits to docs.microsoft.com, especially when the documentation

798
00:43:58,560 --> 00:44:05,240
refers to cryptographic elements like SSL rather than TLS, or it uses a certificate

799
00:44:05,240 --> 00:44:10,320
for signing because you don't use a certificate for signing, you use the private key for signing.

800
00:44:10,320 --> 00:44:11,880
Just silly things like that.

801
00:44:11,880 --> 00:44:16,120
I mean, it may seem silly, but little things like that just make the documents better.

802
00:44:16,120 --> 00:44:19,040
So yeah, that's an interesting point.

803
00:44:19,040 --> 00:44:21,720
I didn't even think about that, the fact that you probably have this little baptism of fire

804
00:44:21,720 --> 00:44:22,720
as well.

805
00:44:22,720 --> 00:44:23,720
Yeah.

806
00:44:23,720 --> 00:44:25,320
And as an architect, I was extremely frustrated.

807
00:44:25,320 --> 00:44:28,160
I'm like, my God, there's a simpler way of doing this.

808
00:44:28,160 --> 00:44:29,160
Well, there is.

809
00:44:29,160 --> 00:44:32,280
No, there is, but it's not as tightly controlled.

810
00:44:32,280 --> 00:44:33,280
That's the problem.

811
00:44:33,280 --> 00:44:37,120
You want to make sure that the right edits are being made by the right people and stuff

812
00:44:37,120 --> 00:44:38,120
to make sure it's true.

813
00:44:38,120 --> 00:44:39,240
It has to go to the nth degree of detail.

814
00:44:39,240 --> 00:44:40,240
Yeah, you're right.

815
00:44:40,240 --> 00:44:45,000
I mean, I'm like, can I get a simpler version of GitHub, like just overlaid

816
00:44:45,000 --> 00:44:48,000
where you don't expose all the complexity, please?

817
00:44:48,000 --> 00:44:49,000
No.

818
00:44:49,000 --> 00:44:55,520
So I know I don't need to tell you this, but every time we do a podcast, we always ask,

819
00:44:55,520 --> 00:44:57,680
do you have a final thought?

820
00:44:57,680 --> 00:45:01,480
So what's the final thought you'd like to leave our listeners with?

821
00:45:01,480 --> 00:45:08,520
The big thing I would recommend is don't get too intimidated by the complexity and also

822
00:45:08,520 --> 00:45:13,920
make sure to take advantage of all the sort of supporting elements within the cyber reference

823
00:45:13,920 --> 00:45:17,240
architecture and the CAF secure that we put in there.

824
00:45:17,240 --> 00:45:19,920
We've documented a lot of this and the reasoning for it.

825
00:45:19,920 --> 00:45:22,080
So take the time to read through that.

826
00:45:22,080 --> 00:45:26,280
With the slide deck itself, there are the videos that are associated with

827
00:45:26,280 --> 00:45:30,960
it, for the MCRA as well as for the CAF secure methodology.

828
00:45:30,960 --> 00:45:36,720
There are extensive slide notes within the MCRA and you can also hover over all the different

829
00:45:36,720 --> 00:45:40,440
product names and different boxes all over the slide deck.

830
00:45:40,440 --> 00:45:43,900
And it's got a little short description of what the heck the thing is.

831
00:45:43,900 --> 00:45:46,000
So it's definitely a good learning tool there.

832
00:45:46,000 --> 00:45:52,040
So don't get too intimidated by it and just take your time, learn it one piece at a time.

833
00:45:52,040 --> 00:45:54,120
And if you have any feedback, happy to take it.

834
00:45:54,120 --> 00:45:58,080
I'm out there on Twitter and LinkedIn, pretty easy to find with a fairly unique name.

835
00:45:58,080 --> 00:45:59,920
Yeah, I think that's a really good point.

836
00:45:59,920 --> 00:46:05,040
In fact, whenever I'm consuming any large amount of data or documentation,

837
00:46:05,040 --> 00:46:10,520
same with the MCRA, I'll set aside like an hour a day and just bite off an hour and then

838
00:46:10,520 --> 00:46:15,280
next day bite off another 45 minutes, next day bite off 50 minutes, you know, that's

839
00:46:15,280 --> 00:46:16,280
the old adage, right?

840
00:46:16,280 --> 00:46:19,880
It's like, how do you eat an elephant? You know, one bite at a time.

841
00:46:19,880 --> 00:46:22,320
So yeah, and it's good, you know, it's good stuff.

842
00:46:22,320 --> 00:46:23,320
Thanks a lot for that, Mark.

843
00:46:23,320 --> 00:46:25,120
That was really insightful.

844
00:46:25,120 --> 00:46:26,320
Thanks for having me on the show.

845
00:46:26,320 --> 00:46:27,320
Yeah, that is right.

846
00:46:27,320 --> 00:46:29,040
We'll see you in a couple of weeks, right?

847
00:46:29,040 --> 00:46:31,360
Well, let's bring this thing to an end.

848
00:46:31,360 --> 00:46:34,000
I would say thank you, Mark, but you're going to be here in a couple of weeks anyway.

849
00:46:34,000 --> 00:46:37,480
So, but to all our listeners, thank you very much for listening.

850
00:46:37,480 --> 00:46:39,760
Stay safe out there and we'll see you next time.

851
00:46:39,760 --> 00:46:42,800
Thanks for listening to the Azure Security Podcast.

852
00:46:42,800 --> 00:46:49,640
You can find show notes and other resources at our website azsecuritypodcast.net.

853
00:46:49,640 --> 00:46:54,800
If you have any questions, please find us on Twitter @AzureSecPod.

854
00:46:54,800 --> 00:47:10,840
Music is from ccmixter.com and licensed under the Creative Commons license.

