1
00:00:00,000 --> 00:00:09,200
Welcome to the Azure Security Podcast,

2
00:00:09,200 --> 00:00:12,360
where we discuss topics relating to security, privacy,

3
00:00:12,360 --> 00:00:16,680
reliability, and compliance on the Microsoft Cloud Platform.

4
00:00:16,680 --> 00:00:20,120
Hey everybody, welcome to Episode 55.

5
00:00:20,120 --> 00:00:22,400
This week is myself, Michael and Sarah.

6
00:00:22,400 --> 00:00:24,800
Gladys and Mark are taking a vacation.

7
00:00:24,800 --> 00:00:26,280
We also have a guest this week,

8
00:00:26,280 --> 00:00:28,840
Matt Sosman, who's here to talk to us about

9
00:00:28,840 --> 00:00:31,080
some of the practicalities of Zero Trust.

10
00:00:31,080 --> 00:00:32,520
But before we get to Matt,

11
00:00:32,520 --> 00:00:34,440
let's take a quick lap around the news.

12
00:00:34,440 --> 00:00:36,120
Sarah, why don't you kick things off?

13
00:00:36,120 --> 00:00:37,800
At the time we're recording this,

14
00:00:37,800 --> 00:00:40,320
it was RSA last week.

15
00:00:40,320 --> 00:00:42,960
I didn't go to RSA this time,

16
00:00:42,960 --> 00:00:46,280
but as you would expect at RSA,

17
00:00:46,280 --> 00:00:48,600
there's quite a few announcements.

18
00:00:48,600 --> 00:00:50,400
The big one, of course,

19
00:00:50,400 --> 00:00:53,960
is we announced Microsoft Entra.

20
00:00:53,960 --> 00:00:59,920
Entra is our suite of identity-based security solutions.

21
00:00:59,920 --> 00:01:02,880
So that includes good old Azure AD,

22
00:01:02,880 --> 00:01:06,320
Microsoft Entra permissions management.

23
00:01:06,320 --> 00:01:09,720
Now that's our Cloud Infrastructure Entitlement Management,

24
00:01:09,720 --> 00:01:10,680
or CIEM.

25
00:01:10,680 --> 00:01:13,160
I'm not sure how we're going to say that,

26
00:01:13,160 --> 00:01:14,400
because that's CIEM,

27
00:01:14,400 --> 00:01:16,920
but then we have SIEM with an S as well.

28
00:01:16,920 --> 00:01:19,840
So I know some in the US,

29
00:01:19,840 --> 00:01:22,880
often people say SIM for the one that starts with an S.

30
00:01:22,880 --> 00:01:24,400
I pronounce them the same.

31
00:01:24,400 --> 00:01:26,760
So I guess we'll find out how you pronounce those.

32
00:01:26,760 --> 00:01:29,760
That was, for those of you who are aware,

33
00:01:29,760 --> 00:01:32,040
Microsoft Entra permissions management

34
00:01:32,040 --> 00:01:34,640
is what we're now calling what was CloudKnox,

35
00:01:34,640 --> 00:01:38,400
which we acquired not too long ago.

36
00:01:38,400 --> 00:01:40,440
And then the other product in there

37
00:01:40,440 --> 00:01:43,000
is Microsoft Entra Verified ID.

38
00:01:43,000 --> 00:01:44,640
So a lot of stuff,

39
00:01:44,640 --> 00:01:48,520
a lot of cool things that we've put into one suite of products.

40
00:01:48,520 --> 00:01:51,160
So there's a lot of,

41
00:01:51,160 --> 00:01:54,240
we'll have links of course in the show notes as usual,

42
00:01:54,240 --> 00:01:56,440
but go check out Entra,

43
00:01:56,440 --> 00:01:58,200
because that is our,

44
00:01:58,200 --> 00:02:00,920
we call it our modern identity and access solutions.

45
00:02:00,920 --> 00:02:03,200
And I'm sure we'll have some people,

46
00:02:03,200 --> 00:02:05,160
in fact, we'll have to go find some people, Michael,

47
00:02:05,160 --> 00:02:07,920
to come on and have a chat to us about that.

48
00:02:07,920 --> 00:02:09,080
For some other announcements,

49
00:02:09,080 --> 00:02:12,120
then we've also got Defender for Cloud.

50
00:02:12,120 --> 00:02:17,320
They have announced that they've now got GA for GCP onboarding.

51
00:02:17,320 --> 00:02:19,120
So if you're using GCP

52
00:02:19,120 --> 00:02:23,320
and you want to use Defender for Cloud to monitor your VMs over there,

53
00:02:23,320 --> 00:02:24,440
you can do that.

54
00:02:24,440 --> 00:02:26,880
And you could do it at the organization level

55
00:02:26,880 --> 00:02:28,560
rather than having to do each project,

56
00:02:28,560 --> 00:02:30,480
which is much nicer.

57
00:02:30,480 --> 00:02:33,720
We've got just-in-time access for AWS VMs.

58
00:02:33,720 --> 00:02:37,920
So it's not just an Azure thing anymore.

59
00:02:37,920 --> 00:02:40,400
I'm not gonna list all of these,

60
00:02:40,400 --> 00:02:45,520
but Defender for Cosmos DB is also GA now.

61
00:02:45,520 --> 00:02:48,880
And I'm gonna jump in and steal Michael's thunder.

62
00:02:48,880 --> 00:02:53,480
Defender for Cloud is also offering protection

63
00:02:53,480 --> 00:03:00,280
for SQL servers running on AWS EC2, GCP Compute Engine,

64
00:03:00,280 --> 00:03:02,040
and a couple of other ones as well.

65
00:03:02,040 --> 00:03:04,880
So sorry, Michael, I know I shouldn't touch SQL

66
00:03:04,880 --> 00:03:07,720
because that's your baby.

67
00:03:07,720 --> 00:03:09,560
So, yeah, again, I'll link in the show notes.

68
00:03:09,560 --> 00:03:10,360
I'm not gonna...

69
00:03:10,360 --> 00:03:14,160
And then, of course, I've got to finish with my baby Sentinel.

70
00:03:14,160 --> 00:03:16,240
Not as many things to talk about,

71
00:03:16,240 --> 00:03:19,360
but our main thing is that we're really

72
00:03:19,360 --> 00:03:21,720
expanding our solutions marketplace.

73
00:03:21,720 --> 00:03:24,160
That's the thing that comes under Content Hub.

74
00:03:24,160 --> 00:03:26,680
So we've got now...

75
00:03:26,680 --> 00:03:29,280
There's more than 175 solutions.

76
00:03:29,280 --> 00:03:30,480
If you don't know the difference,

77
00:03:30,480 --> 00:03:33,120
I'll just recap because I couldn't not...

78
00:03:33,120 --> 00:03:34,800
If you don't know the difference between a solution

79
00:03:34,800 --> 00:03:36,360
and a data connector,

80
00:03:36,360 --> 00:03:39,160
we used to very much focus on data connectors in Sentinel,

81
00:03:39,160 --> 00:03:41,160
but now we focus on solutions.

82
00:03:41,160 --> 00:03:44,240
So a solution is a data connector.

83
00:03:44,240 --> 00:03:47,520
It can also be workbooks, notebooks,

84
00:03:47,520 --> 00:03:50,040
analytics rules, hunting queries.

85
00:03:50,040 --> 00:03:52,240
A solution is like the whole package of things

86
00:03:52,240 --> 00:03:55,400
that you need to work with a particular data source.

87
00:03:55,400 --> 00:03:58,200
And going forward, we'll definitely be talking

88
00:03:58,200 --> 00:04:00,760
about solutions rather than connectors.

89
00:04:00,760 --> 00:04:02,400
So go and check that out.

90
00:04:02,400 --> 00:04:03,360
We're also...

91
00:04:03,360 --> 00:04:05,760
Another thing that's gonna be really cool

92
00:04:05,760 --> 00:04:08,120
is we're gonna have a unified GitHub community

93
00:04:08,120 --> 00:04:10,760
for our SIEM and SOAR and XDR

94
00:04:10,760 --> 00:04:13,440
because you may know, if you spend time in our GitHub,

95
00:04:13,440 --> 00:04:16,560
that Sentinel is currently separate from Defender.

96
00:04:16,560 --> 00:04:18,560
And of course, they are all...

97
00:04:18,560 --> 00:04:19,880
They all work together

98
00:04:19,880 --> 00:04:22,080
and there are overlaps between the two.

99
00:04:22,080 --> 00:04:25,640
So we're actually gonna unify all of that

100
00:04:25,640 --> 00:04:28,880
into a new GitHub community, which is really nice.

101
00:04:28,880 --> 00:04:31,040
And I'm gonna stop there.

102
00:04:31,040 --> 00:04:32,960
That's my news for this time.

103
00:04:32,960 --> 00:04:35,040
Over to you, Michael.

104
00:04:35,040 --> 00:04:37,040
So in case you missed it,

105
00:04:37,040 --> 00:04:41,440
so Sarah just mentioned that SQL Server is my baby.

106
00:04:41,440 --> 00:04:43,640
So even though we announced this,

107
00:04:43,640 --> 00:04:45,440
I announced it briefly in episode 54.

108
00:04:45,440 --> 00:04:48,320
Yes, I've moved over now from working directly

109
00:04:48,320 --> 00:04:52,040
with our customers to the Azure database platform.

110
00:04:52,040 --> 00:04:54,440
So I'm working on SQL Server,

111
00:04:54,440 --> 00:04:58,000
Azure SQL DB, Cosmos DB on the back end,

112
00:04:58,000 --> 00:04:59,760
security, compliance, governance,

113
00:04:59,760 --> 00:05:00,640
all that sort of good stuff.

114
00:05:00,640 --> 00:05:03,120
So really nice low level engineering stuff.

115
00:05:03,120 --> 00:05:05,440
So I get to work on code and designs

116
00:05:05,440 --> 00:05:07,120
and threat models and that sort of stuff,

117
00:05:07,120 --> 00:05:09,000
which is right in my wheelhouse.

118
00:05:09,000 --> 00:05:11,240
I'm super excited for that move.

119
00:05:11,240 --> 00:05:12,400
It's also interesting, Sarah,

120
00:05:12,400 --> 00:05:13,760
that you mentioned Microsoft Entra.

121
00:05:13,760 --> 00:05:16,840
So something else that's not been talked about,

122
00:05:16,840 --> 00:05:17,920
but I'll mention it briefly.

123
00:05:17,920 --> 00:05:20,600
So myself and two colleagues,

124
00:05:20,600 --> 00:05:23,080
Heinrich Gantenbein and Simone Curzi,

125
00:05:23,080 --> 00:05:25,400
are working on a book for Microsoft Press right now,

126
00:05:25,400 --> 00:05:26,480
tentatively entitled,

127
00:05:26,480 --> 00:05:29,160
Designing and Developing Secure Azure Solutions.

128
00:05:29,160 --> 00:05:31,560
We're on the last chapter right now,

129
00:05:31,560 --> 00:05:33,360
which is actually on identity.

130
00:05:33,360 --> 00:05:35,240
I left it to the last chapter on purpose

131
00:05:35,240 --> 00:05:37,760
because I'm absolutely terrified of the topic.

132
00:05:37,760 --> 00:05:39,160
But as we're sort of writing this thing,

133
00:05:39,160 --> 00:05:41,600
next thing you know, this whole suite of products

134
00:05:41,600 --> 00:05:43,800
comes out that sort of wraps products

135
00:05:43,800 --> 00:05:44,640
that we're talking about,

136
00:05:44,640 --> 00:05:46,360
like Azure Active Directory.

137
00:05:46,360 --> 00:05:47,200
So we've had to go through

138
00:05:47,200 --> 00:05:48,720
and make some little edits to the document.

139
00:05:48,720 --> 00:05:50,800
But yeah, so we're really excited about this book.

140
00:05:50,800 --> 00:05:53,160
Scott Guthrie is writing the foreword for the book.

141
00:05:53,160 --> 00:05:56,120
We're gonna be covering design, development,

142
00:05:56,120 --> 00:06:01,120
compliance, cryptography, networking for developers.

143
00:06:01,320 --> 00:06:02,920
We're gonna cover the whole gamut.

144
00:06:02,920 --> 00:06:04,480
So I'm really excited for the book to come out.

145
00:06:04,480 --> 00:06:05,640
Not sure the exact date yet,

146
00:06:05,640 --> 00:06:07,680
but yeah, we're on the last chapter right now

147
00:06:07,680 --> 00:06:08,760
in terms of the drafts anyway.

148
00:06:08,760 --> 00:06:10,480
So I'm really excited about that.

149
00:06:10,480 --> 00:06:12,320
So on the topic of news,

150
00:06:12,320 --> 00:06:13,800
so I have a couple of items.

151
00:06:13,800 --> 00:06:16,080
Yes, thank you Sarah for stealing that one from me

152
00:06:16,080 --> 00:06:19,240
about the SQL Server databases on AWS.

153
00:06:19,240 --> 00:06:22,320
But anyway, first one is we now have some new server roles

154
00:06:22,320 --> 00:06:25,360
for Azure SQL Database and SQL Server 2022.

155
00:06:25,360 --> 00:06:27,440
These are in public preview.

156
00:06:27,440 --> 00:06:30,080
They allow you to do certain levels of administration

157
00:06:30,080 --> 00:06:32,880
within the SQL Database without being an admin.

158
00:06:32,880 --> 00:06:34,400
And that is really cool

159
00:06:34,400 --> 00:06:37,920
because that is a great example of least privilege, right?

160
00:06:37,920 --> 00:06:39,520
I mean, historically there are some tasks

161
00:06:39,520 --> 00:06:41,400
that require you to basically be the sysadmin,

162
00:06:41,400 --> 00:06:43,640
come on in and do all these things.

163
00:06:43,640 --> 00:06:44,800
Well, the problem is a sys admin

164
00:06:44,800 --> 00:06:47,800
has basically unfettered access to absolutely everything.

165
00:06:47,800 --> 00:06:50,080
And that can be really problematic for some customers.

166
00:06:50,080 --> 00:06:53,400
So now we have some new server roles

167
00:06:53,400 --> 00:06:56,320
that will allow you to grant people specific rights

168
00:06:56,320 --> 00:06:58,280
to do certain administrative tasks

169
00:06:58,280 --> 00:07:00,240
without being able to do everything else.

170
00:07:00,240 --> 00:07:04,520
So really granular access and that's really great to see.

171
00:07:04,520 --> 00:07:07,480
Another one, which is another one from SQL Server

172
00:07:07,480 --> 00:07:12,480
is we now have common criteria EAL4 certification

173
00:07:12,520 --> 00:07:15,880
for the SQL Server 2019 series of products.

174
00:07:15,880 --> 00:07:20,000
So EAL4 is a big deal for certain customers,

175
00:07:20,000 --> 00:07:23,520
especially those in governments or in the military.

176
00:07:23,520 --> 00:07:26,480
They require certain levels of assurance.

177
00:07:26,480 --> 00:07:28,040
Notice we say assurance.

178
00:07:28,040 --> 00:07:30,360
There's more to security than just security.

179
00:07:30,360 --> 00:07:33,640
There's basically the trustworthiness of the system.

180
00:07:33,640 --> 00:07:36,680
And so we've now attained the EAL4 certification.

181
00:07:36,680 --> 00:07:40,720
Again, I'll provide notes for that in the show notes.

182
00:07:40,720 --> 00:07:43,000
We now also have, funnily enough,

183
00:07:43,000 --> 00:07:45,720
another SQL related news item.

184
00:07:45,720 --> 00:07:49,280
We now have import and export of SQL database

185
00:07:49,280 --> 00:07:50,680
over Private Link.

186
00:07:50,680 --> 00:07:52,240
So this is actually a big deal

187
00:07:52,240 --> 00:07:55,800
because right now the import and export service

188
00:07:55,800 --> 00:07:58,400
requires you to basically trust Azure services,

189
00:07:58,400 --> 00:07:59,640
which basically means you trust absolutely

190
00:07:59,640 --> 00:08:01,120
all the Azure services.

191
00:08:01,120 --> 00:08:04,120
And some customers just don't like that.

192
00:08:04,120 --> 00:08:06,640
So in public preview right now,

193
00:08:06,640 --> 00:08:08,240
we have private link support.

194
00:08:08,240 --> 00:08:11,200
And again, as I mentioned on some prior podcasts,

195
00:08:11,200 --> 00:08:13,240
you'll see more and more PaaS offerings

196
00:08:13,240 --> 00:08:15,280
moved to using private link

197
00:08:15,280 --> 00:08:18,720
for their sort of network isolation story.

198
00:08:18,720 --> 00:08:21,160
One other one is Azure Bastion now allows you

199
00:08:21,160 --> 00:08:25,040
to use IP based connections rather than just DNS names.

200
00:08:25,040 --> 00:08:28,080
This is just to make it easy for some types of customers

201
00:08:28,080 --> 00:08:31,640
who prefer to use IP addresses versus DNS names.

202
00:08:31,640 --> 00:08:34,240
Historically, it would only work with DNS names.

203
00:08:34,240 --> 00:08:36,880
And the last one, talking of naming,

204
00:08:36,880 --> 00:08:39,680
is Azure Container Apps now has full support

205
00:08:39,680 --> 00:08:42,160
for custom domains and TLS certificates.

206
00:08:42,160 --> 00:08:45,040
So prior to now,

207
00:08:45,040 --> 00:08:48,240
you had to use sort of Azure wildcard certificates.

208
00:08:48,240 --> 00:08:49,240
Well, now you don't have to.

209
00:08:49,240 --> 00:08:50,840
You can have your own custom domain

210
00:08:50,840 --> 00:08:53,880
and you can have your own TLS certificate in there.

211
00:08:53,880 --> 00:08:56,040
So that sort of wraps up the news.

212
00:08:56,040 --> 00:08:58,840
So now let's turn our attention to our guest.

213
00:08:58,840 --> 00:09:01,680
Thank you so much for joining us this week, Matt.

214
00:09:01,680 --> 00:09:03,880
Do you want to give us a quick overview

215
00:09:03,880 --> 00:09:06,120
of kind of what you do at Microsoft?

216
00:09:06,120 --> 00:09:07,840
Yeah, thanks again for having me.

217
00:09:07,840 --> 00:09:12,720
So, so Matt Sosman, I've been at Microsoft 10 years,

218
00:09:12,720 --> 00:09:14,200
multiple roles,

219
00:09:14,200 --> 00:09:16,040
working in our consulting services division,

220
00:09:16,040 --> 00:09:18,160
worked in marketing for a while,

221
00:09:18,160 --> 00:09:20,720
worked in our field as a field seller.

222
00:09:20,720 --> 00:09:23,560
And now I'm over on the identity engineering team.

223
00:09:23,560 --> 00:09:28,280
And so my primary focus is the strategy for ISV partners

224
00:09:28,280 --> 00:09:29,880
when it comes to zero trust.

225
00:09:29,880 --> 00:09:31,640
So we'll talk a little bit about that today

226
00:09:31,640 --> 00:09:33,400
and what that means.

227
00:09:33,400 --> 00:09:36,760
But I really try to think about zero trust all day long

228
00:09:36,760 --> 00:09:39,160
and see how we can go out and help our customers

229
00:09:39,160 --> 00:09:40,440
get out there and get secured.

230
00:09:40,440 --> 00:09:41,800
So that's a little bit about me.

231
00:09:41,800 --> 00:09:44,960
We've had two other people talking about zero trust

232
00:09:44,960 --> 00:09:47,720
and I always say the very same thing

233
00:09:47,720 --> 00:09:49,640
which I'm going to say right here,

234
00:09:49,640 --> 00:09:52,760
which is a lot of people see zero trust

235
00:09:52,760 --> 00:09:54,600
and they think, oh, that's just architecture, right?

236
00:09:54,600 --> 00:09:56,560
That's just some marketing speak,

237
00:09:56,560 --> 00:10:00,680
you know, some marketing speak for buying Microsoft products.

238
00:10:00,680 --> 00:10:01,680
I don't agree with that.

239
00:10:01,680 --> 00:10:03,400
I've seen a lot of customers actually take

240
00:10:03,400 --> 00:10:06,240
a very, very practical view of zero trust.

241
00:10:06,240 --> 00:10:09,680
I've just been working recently with a bank, a large bank,

242
00:10:09,680 --> 00:10:14,000
looking at a zero trust sort of plan of attack

243
00:10:14,000 --> 00:10:15,880
for the next few years.

244
00:10:15,880 --> 00:10:19,240
And they're looking at very specific areas around zero trust

245
00:10:19,240 --> 00:10:20,640
and then adopting technologies,

246
00:10:20,640 --> 00:10:23,480
some of the technologies they already have.

247
00:10:23,480 --> 00:10:25,760
But one of the key things is doing like a gap analysis.

248
00:10:25,760 --> 00:10:27,920
Let's find out where you're missing some defenses

249
00:10:27,920 --> 00:10:31,440
that can help sort of flesh out the zero trust story.

250
00:10:31,440 --> 00:10:33,880
So kind of what's your perspective on zero trust

251
00:10:33,880 --> 00:10:36,920
and how are you seeing customers practically use it?

252
00:10:36,920 --> 00:10:38,520
Yeah, it's a great question.

253
00:10:38,520 --> 00:10:41,440
You know, it's funny when you go out and talk to people,

254
00:10:41,440 --> 00:10:43,560
everybody has a kind of a different definition

255
00:10:43,560 --> 00:10:45,720
of zero trust and how they think about it.

256
00:10:46,800 --> 00:10:48,880
And different vendors out there have their own definition.

257
00:10:48,880 --> 00:10:51,200
But for me, when I look at it,

258
00:10:51,200 --> 00:10:54,560
it's more around having a structure and a rhyme

259
00:10:54,560 --> 00:10:57,040
and a reason to your security program

260
00:10:57,040 --> 00:11:01,160
and making sure that it's set up in such a way

261
00:11:01,160 --> 00:11:04,040
that it actually enables the business.

262
00:11:04,040 --> 00:11:05,560
You know, oftentimes I've been doing IT

263
00:11:05,560 --> 00:11:07,360
for going on 20 years now.

264
00:11:07,360 --> 00:11:11,480
And oftentimes as an IT pro, we forget about,

265
00:11:11,480 --> 00:11:13,280
you know, what we're actually trying to do for the business.

266
00:11:13,280 --> 00:11:15,240
And so, you know, with zero trust,

267
00:11:15,240 --> 00:11:17,400
it's around enabling users to be more productive,

268
00:11:17,400 --> 00:11:19,560
but also what are the business outcomes, right?

269
00:11:19,560 --> 00:11:20,760
In other words, what are you trying to do

270
00:11:20,760 --> 00:11:22,080
and why are you trying to do it?

271
00:11:22,080 --> 00:11:25,760
Let's not just lock down that SQL database because we can,

272
00:11:25,760 --> 00:11:27,800
what does it actually mean to do it?

273
00:11:27,800 --> 00:11:29,800
And how does that help us in the outcomes

274
00:11:29,800 --> 00:11:32,200
we're actually trying to drive as an example?

275
00:11:32,200 --> 00:11:35,240
And so, you know, so when I think about my definition

276
00:11:35,240 --> 00:11:37,080
around it, it's how do we take those,

277
00:11:37,080 --> 00:11:40,720
the foundational principles to zero trust, right?

278
00:11:40,720 --> 00:11:43,800
Verify explicitly, use least-privilege access

279
00:11:43,800 --> 00:11:45,280
and assume breach.

280
00:11:45,280 --> 00:11:48,920
And how do we translate that into business terms

281
00:11:48,920 --> 00:11:51,120
that can allow the business to take advantage

282
00:11:51,120 --> 00:11:53,680
of the technology and not the other way around?

283
00:11:53,680 --> 00:11:56,200
And so one of the ways we do that

284
00:11:56,200 --> 00:11:59,560
is through meeting the customer where they're at.

285
00:11:59,560 --> 00:12:00,800
And what that means,

286
00:12:00,800 --> 00:12:03,480
and we'll probably get more into this here in a little bit,

287
00:12:03,480 --> 00:12:06,720
but helping them understand how do we take

288
00:12:06,720 --> 00:12:09,240
that heterogeneous architecture you have

289
00:12:09,240 --> 00:12:11,880
and how do we help connect the dots?

290
00:12:11,880 --> 00:12:14,320
So if you have technology from vendor A

291
00:12:14,320 --> 00:12:16,400
and technology from vendor B,

292
00:12:16,400 --> 00:12:18,440
how do we bring both of those together

293
00:12:18,440 --> 00:12:20,520
that can help your overall zero trust posture

294
00:12:20,520 --> 00:12:22,320
and help that program move forward?

295
00:12:22,320 --> 00:12:24,360
So that's kind of my overall approach to it

296
00:12:24,360 --> 00:12:25,520
and how I think about it.

297
00:12:25,520 --> 00:12:26,640
That makes complete sense.

298
00:12:26,640 --> 00:12:29,520
And I'd like to sort of throw sort of my hat into the ring

299
00:12:29,520 --> 00:12:32,840
here and explain how we did it with this one bank

300
00:12:32,840 --> 00:12:35,520
that really, really resonated with the customer.

301
00:12:35,520 --> 00:12:37,960
And I'd like to sort of get your feeling.

302
00:12:37,960 --> 00:12:39,240
So as you mentioned,

303
00:12:39,240 --> 00:12:41,680
there are three pillars to zero trust,

304
00:12:41,680 --> 00:12:44,520
at least the way we talk about it at Microsoft.

305
00:12:44,520 --> 00:12:48,880
Number one is assume breach.

306
00:12:48,880 --> 00:12:50,160
By the way, these are in no particular order,

307
00:12:50,160 --> 00:12:51,160
but number one is assume breach.

308
00:12:51,160 --> 00:12:53,040
Number two is verify explicitly.

309
00:12:53,040 --> 00:12:55,640
And number three is least privilege.

310
00:12:55,640 --> 00:12:58,920
And then we have the sort of six major categories.

311
00:12:58,920 --> 00:13:01,160
So we have identity, endpoints,

312
00:13:01,160 --> 00:13:04,800
applications, network, infrastructure and data.

313
00:13:04,800 --> 00:13:06,440
And so what I did for this customer was I said,

314
00:13:06,440 --> 00:13:08,120
okay, let's look at these six categories

315
00:13:08,120 --> 00:13:09,840
and let's look at the three pillars of zero trust

316
00:13:09,840 --> 00:13:11,960
and let's map what current products you have

317
00:13:11,960 --> 00:13:15,440
that sort of meet the goals of this intersection.

318
00:13:15,440 --> 00:13:17,120
And let's see where there are gaps.

319
00:13:17,120 --> 00:13:19,800
And so let me just pick on just on two, for example,

320
00:13:19,800 --> 00:13:22,920
verify explicitly and identity.

321
00:13:22,920 --> 00:13:24,520
So what they were using there was

322
00:13:24,520 --> 00:13:27,880
Azure Active Directory and Conditional Access.

323
00:13:27,880 --> 00:13:30,320
And that made absolute sense to me, right?

324
00:13:30,320 --> 00:13:33,520
So you've got conditional access policies coming into play

325
00:13:33,520 --> 00:13:35,280
and perhaps they may force the use

326
00:13:35,280 --> 00:13:37,280
of multi-factor authentication for certain accounts

327
00:13:37,280 --> 00:13:40,040
or based on some signal that's coming in to say,

328
00:13:40,040 --> 00:13:41,240
hey, this guy's, you know,

329
00:13:41,240 --> 00:13:43,400
well, this person's logging in from a, you know,

330
00:13:43,400 --> 00:13:45,120
an interesting IP address.

331
00:13:45,120 --> 00:13:47,800
We need to prompt for a second factor, that kind of stuff.

332
00:13:47,800 --> 00:13:49,440
Another one they have, which is really interesting

333
00:13:49,440 --> 00:13:52,480
is in the area of assume breach and data,

334
00:13:52,480 --> 00:13:55,720
they were using Microsoft Information Protection.

335
00:13:55,720 --> 00:13:57,440
The nice thing about that is, you know,

336
00:13:57,440 --> 00:13:59,200
if you assume breach and the attack is on the network

337
00:13:59,200 --> 00:14:00,600
and the attacker has access to everything,

338
00:14:00,600 --> 00:14:03,080
then the attacker only has access to the ciphertext, right?

339
00:14:03,080 --> 00:14:05,840
They don't have access to the plain text messages

340
00:14:05,840 --> 00:14:07,680
because everything is covered by a MIP policy,

341
00:14:07,680 --> 00:14:09,960
a Microsoft Information Protection policy.

342
00:14:09,960 --> 00:14:12,160
And what was really cool here is we took,

343
00:14:12,160 --> 00:14:14,240
what essentially ends up being these 18 sections,

344
00:14:14,240 --> 00:14:17,320
again, it's assume breach, verify explicitly

345
00:14:17,320 --> 00:14:19,680
and least privilege.

346
00:14:19,680 --> 00:14:22,280
And then that was actually on the Y axis.

347
00:14:22,280 --> 00:14:25,320
And then on the X axis, we had identity, endpoints,

348
00:14:25,320 --> 00:14:28,120
applications, network, infrastructure and data.

349
00:14:28,120 --> 00:14:30,120
And we filled in each of those 18 sections

350
00:14:30,120 --> 00:14:32,320
and said, what do you currently have here?

351
00:14:32,320 --> 00:14:33,720
And then are there any gaps?

352
00:14:33,720 --> 00:14:35,360
And we actually found like four gaps

353
00:14:35,360 --> 00:14:36,920
that had absolutely nothing whatsoever.

354
00:14:36,920 --> 00:14:39,440
And one of them was, for example,

355
00:14:39,440 --> 00:14:42,240
least privilege and network.

356
00:14:42,240 --> 00:14:45,840
They actually had nothing to sort of restrict

357
00:14:45,840 --> 00:14:48,000
network access in, you know, a least-privilege manner.

358
00:14:48,000 --> 00:14:49,760
And another one was actually

359
00:14:49,760 --> 00:14:52,480
least privilege again and infrastructure.

360
00:14:52,480 --> 00:14:53,320
So it was really interesting

361
00:14:53,320 --> 00:14:55,600
because this really gave the customer a nice insight

362
00:14:55,600 --> 00:14:59,120
into what they currently had in terms of services

363
00:14:59,120 --> 00:15:03,840
from both Microsoft and other vendors

364
00:15:03,840 --> 00:15:05,640
to see what they currently had,

365
00:15:05,640 --> 00:15:07,280
what their inventory looked like today

366
00:15:07,280 --> 00:15:09,400
and what they should do moving forward.

367
00:15:09,400 --> 00:15:12,760
And that made absolute sense to the customer

368
00:15:12,760 --> 00:15:16,360
because it's not sort of ethereal anymore, right?

369
00:15:16,360 --> 00:15:18,240
You're sort of saying, here are these three parts

370
00:15:18,240 --> 00:15:20,480
of zero trust and here are these six, you know,

371
00:15:20,480 --> 00:15:22,560
things that we sort of focus on.

372
00:15:22,560 --> 00:15:25,120
What products or services do you have

373
00:15:25,120 --> 00:15:26,520
in each of those areas?

374
00:15:26,520 --> 00:15:27,840
One that they had, which turns out one thing

375
00:15:27,840 --> 00:15:31,920
they had a lot of for assume breach, least privilege

376
00:15:31,920 --> 00:15:33,760
and verify explicitly was like things like

377
00:15:33,760 --> 00:15:35,200
Microsoft Endpoint Protection.

378
00:15:36,280 --> 00:15:38,080
They made heavy use of that.

379
00:15:38,080 --> 00:15:40,560
So, I mean, is that the sort of thing that,

380
00:15:40,560 --> 00:15:43,280
the sort of thinking that you're seeing some customers use

381
00:15:43,280 --> 00:15:45,360
or am I just totally off track

382
00:15:45,360 --> 00:15:47,400
and you've seen customers do other things?

383
00:15:47,400 --> 00:15:49,680
And by the way, I'm totally open to being told

384
00:15:49,680 --> 00:15:51,760
that, hey, what I did wasn't great.

385
00:15:51,760 --> 00:15:52,600
That's my take.

386
00:15:52,600 --> 00:15:53,840
I'm here to learn as well.

387
00:15:53,840 --> 00:15:55,760
Personally, I don't think there's any right or wrong way, right?

388
00:15:55,760 --> 00:15:57,920
It's, you know, every customer is going to be different.

389
00:15:57,920 --> 00:16:00,600
Every organization, every group is going to be different.

390
00:16:00,600 --> 00:16:02,960
And so they have to do what they need to do

391
00:16:02,960 --> 00:16:04,440
that works best for them.

392
00:16:04,440 --> 00:16:06,280
And so, you know, what you outlined,

393
00:16:06,280 --> 00:16:07,880
that sounds like a great approach

394
00:16:07,880 --> 00:16:11,120
for what their needs were and what they had to do.

395
00:16:11,120 --> 00:16:13,440
What I find interesting about it is,

396
00:16:13,440 --> 00:16:14,560
you kind of lose this,

397
00:16:14,560 --> 00:16:16,200
when you start peeling back the onion

398
00:16:16,200 --> 00:16:18,320
and you get into the technology,

399
00:16:18,320 --> 00:16:19,360
the technology is really cool.

400
00:16:19,360 --> 00:16:21,600
I mean, I can geek out on that all day long,

401
00:16:21,600 --> 00:16:24,960
but what's even cooler about it is when we can distill that

402
00:16:24,960 --> 00:16:28,520
into terms that, dare I say, a business executive,

403
00:16:28,520 --> 00:16:31,080
someone that's non-technical at the organization,

404
00:16:31,080 --> 00:16:33,760
i.e. the CIO, can actually understand

405
00:16:33,760 --> 00:16:36,360
to go tell the rest of the C-suite,

406
00:16:36,360 --> 00:16:37,880
that's when the magic starts to happen.

407
00:16:37,880 --> 00:16:38,960
I'll give you a great example.

408
00:16:38,960 --> 00:16:40,520
You talked about MIP a little bit.

409
00:16:40,520 --> 00:16:41,840
So, you know, over the pandemic,

410
00:16:41,840 --> 00:16:45,200
we saw the boom in work from home and work from anywhere.

411
00:16:45,200 --> 00:16:47,000
Yeah, people are starting to come back to the office,

412
00:16:47,000 --> 00:16:48,880
but we do see organizations out there

413
00:16:48,880 --> 00:16:52,040
that they're actually thinking about going 100% remote

414
00:16:52,040 --> 00:16:55,480
or the majority of their workforce being remote.

415
00:16:55,480 --> 00:16:57,280
And this is where previously,

416
00:16:57,280 --> 00:16:58,360
they never would have entertained it

417
00:16:58,360 --> 00:17:00,320
because the security team, you know,

418
00:17:00,320 --> 00:17:01,800
it's like, oh, that's too insecure.

419
00:17:01,800 --> 00:17:03,720
How would we ever even do this?

420
00:17:03,720 --> 00:17:05,520
But when you start to look at something like MIP,

421
00:17:05,520 --> 00:17:06,680
it encrypts the data.

422
00:17:06,680 --> 00:17:09,160
And so, I can send you whatever I want.

423
00:17:09,160 --> 00:17:11,960
I could send you highly proprietary information,

424
00:17:11,960 --> 00:17:14,400
but it's encrypted, AES-256.

425
00:17:14,400 --> 00:17:16,040
Now, you have to have my identity to open it,

426
00:17:16,040 --> 00:17:17,400
and I would have to explicitly assign you

427
00:17:17,400 --> 00:17:19,400
permissions to open it.

428
00:17:19,400 --> 00:17:21,480
But the cool part about it is I could revoke that data.

429
00:17:21,480 --> 00:17:23,320
So, if I accidentally send you something

430
00:17:23,320 --> 00:17:24,640
and you have permissions to it,

431
00:17:24,640 --> 00:17:27,280
I can actually revoke it in flight with MIP.

432
00:17:28,200 --> 00:17:30,080
And there's all sorts of other controls around it,

433
00:17:30,080 --> 00:17:32,800
auditing, and you can even block screenshots.

434
00:17:32,800 --> 00:17:34,640
Like, I mean, there's so many details there.

435
00:17:34,640 --> 00:17:35,840
Well, when you start to distill that,

436
00:17:35,840 --> 00:17:38,240
you take that to one of these executives.

437
00:17:38,240 --> 00:17:41,760
What we saw in a particular customer example I had

438
00:17:41,760 --> 00:17:42,840
in the last six months,

439
00:17:42,840 --> 00:17:45,960
where they're thinking about enabling more of their workforce

440
00:17:45,960 --> 00:17:48,440
to actually be permanently remote,

441
00:17:48,440 --> 00:17:51,160
they had a concern around, well, how do we prevent data

442
00:17:51,160 --> 00:17:53,560
from being accessed on personal devices, right?

443
00:17:53,560 --> 00:17:55,880
You're at home, you've got a personal computer right next

444
00:17:55,880 --> 00:17:56,720
to your work computer.

445
00:17:56,720 --> 00:17:58,920
How are you preventing data spillage?

446
00:17:58,920 --> 00:18:00,680
Well, in comes MIP.

447
00:18:00,680 --> 00:18:03,840
And now, all of a sudden, the ideas start flying around.

448
00:18:03,840 --> 00:18:06,880
Here's how we can, we might be able to make this work

449
00:18:06,880 --> 00:18:07,840
kind of thing, right?

450
00:18:07,840 --> 00:18:09,800
And that's where it gets exciting.

451
00:18:09,800 --> 00:18:11,920
And then you start going above and beyond

452
00:18:11,920 --> 00:18:13,640
the other technologies.

453
00:18:13,640 --> 00:18:16,240
And that's where the use cases really start

454
00:18:16,240 --> 00:18:17,240
coming out of the woodwork.

455
00:18:17,240 --> 00:18:20,560
And the business starts seeing the potential

456
00:18:20,560 --> 00:18:23,440
and going back to kind of my introduction there.

457
00:18:23,440 --> 00:18:24,760
It's all about what are you trying to do

458
00:18:24,760 --> 00:18:25,760
and why are you trying to do it?

459
00:18:25,760 --> 00:18:27,760
But in business terms, how does this,

460
00:18:27,760 --> 00:18:29,360
what's the outcome for the business?

461
00:18:29,360 --> 00:18:31,640
How does it actually move us forward?

462
00:18:31,640 --> 00:18:35,880
Well, if I can enable my data to be securely accessed

463
00:18:35,880 --> 00:18:38,840
anywhere anytime, well, that enables me to, you know,

464
00:18:38,840 --> 00:18:40,920
work remotely like in this case.

465
00:18:40,920 --> 00:18:43,760
Now, all of a sudden, that enables all sorts of new outcomes,

466
00:18:43,760 --> 00:18:44,080
right?

467
00:18:44,080 --> 00:18:48,560
Think about contractors, think about vendors, even customers.

468
00:18:48,560 --> 00:18:51,120
So what you did, I wouldn't say it's wrong, right?

469
00:18:51,120 --> 00:18:53,320
It's absolutely right for those circumstances

470
00:18:53,320 --> 00:18:55,240
of that customer, but everybody's different.

471
00:18:55,240 --> 00:18:58,080
But it's a matter of getting creative, I think is my point.

472
00:18:58,080 --> 00:19:01,280
And that requires an understanding of what

473
00:19:01,280 --> 00:19:03,080
are you trying to actually do?

474
00:19:03,080 --> 00:19:04,920
Because there's so many times out there

475
00:19:04,920 --> 00:19:06,480
where you get into these conversations.

476
00:19:06,480 --> 00:19:09,480
And we're so deep in the technology,

477
00:19:09,480 --> 00:19:11,920
we kind of lose sight of the pot of gold

478
00:19:11,920 --> 00:19:13,680
at the end of the rainbow, right?

479
00:19:13,680 --> 00:19:16,520
We almost had to take a step back, regroup,

480
00:19:16,520 --> 00:19:19,120
and actually understand what's our objective.

481
00:19:19,120 --> 00:19:20,600
And if we get that clarity, then we

482
00:19:20,600 --> 00:19:23,200
could start to figure out, OK, now how do we use that technology

483
00:19:23,200 --> 00:19:24,080
to get there?

484
00:19:24,080 --> 00:19:27,480
So in your case, you did exactly that with MIP

485
00:19:27,480 --> 00:19:28,760
and some of those other technologies.

486
00:19:28,760 --> 00:19:31,560
So that was kind of a long, long-winded answer.

487
00:19:31,560 --> 00:19:34,440
But that's kind of how I think about it.

488
00:19:34,440 --> 00:19:36,680
And everybody's different too.

489
00:19:36,680 --> 00:19:39,160
And so that's where, if you don't have an understanding

490
00:19:39,160 --> 00:19:41,800
of the technology, it's not necessarily a bad thing.

491
00:19:41,800 --> 00:19:44,800
But that's where you need to get help.

492
00:19:44,800 --> 00:19:47,040
And you need to understand that, hey, maybe we

493
00:19:47,040 --> 00:19:49,960
need to bring in some outside help, a consultancy firm,

494
00:19:49,960 --> 00:19:53,680
or whoever that can come in to help you understand

495
00:19:53,680 --> 00:19:55,880
the technology that you already own.

496
00:19:55,880 --> 00:19:57,400
Now, yeah, I work for Microsoft, right?

497
00:19:57,400 --> 00:20:01,320
But let's pretend for a moment, you're my customer, Michael.

498
00:20:01,320 --> 00:20:03,760
And Sarah works at your company as well.

499
00:20:03,760 --> 00:20:05,760
And obviously, you use Microsoft technologies,

500
00:20:05,760 --> 00:20:07,800
but you're going to use third-party security vendors

501
00:20:07,800 --> 00:20:08,840
as well, right?

502
00:20:08,840 --> 00:20:13,400
So now it's about having a deep understanding

503
00:20:13,400 --> 00:20:15,480
of how all those technologies work,

504
00:20:15,480 --> 00:20:18,360
and then figuring out how do we piece them together

505
00:20:18,360 --> 00:20:19,920
so we can achieve those business outcomes.

506
00:20:19,920 --> 00:20:23,120
And that's often the challenge.

507
00:20:23,120 --> 00:20:26,080
And again, that's kind of what I'm looking at here at Microsoft,

508
00:20:26,080 --> 00:20:29,000
is how do we help a customer stitch that together?

509
00:20:29,000 --> 00:20:31,120
Let me pause here, I'm kind of going on a tangent here.

510
00:20:31,120 --> 00:20:33,040
But that's how I'm thinking through it.

511
00:20:33,040 --> 00:20:36,400
Yeah, so back to this bank scenario.

512
00:20:36,400 --> 00:20:41,280
So at Microsoft, we have this zero-trust maturity

513
00:20:41,280 --> 00:20:42,920
sort of assessment.

514
00:20:42,920 --> 00:20:45,440
You talk about sort of the pragmatics of it.

515
00:20:45,440 --> 00:20:47,280
So what we did with this customer is we actually went

516
00:20:47,280 --> 00:20:49,240
through this maturity assessment.

517
00:20:49,240 --> 00:20:50,560
It's relatively short.

518
00:20:50,560 --> 00:20:55,600
And the whole point is to understand sort of how mature

519
00:20:55,600 --> 00:20:58,760
you are in these different areas, like what do you use

520
00:20:58,760 --> 00:20:59,680
for endpoint protection?

521
00:20:59,680 --> 00:21:01,040
Where is the endpoint protection?

522
00:21:01,040 --> 00:21:03,400
What do you use for information protection?

523
00:21:03,400 --> 00:21:05,920
There's a whole bunch of stuff.

524
00:21:05,920 --> 00:21:09,520
And then from that, one of the outcomes of that

525
00:21:09,520 --> 00:21:12,400
is a whole bunch of user stories.

526
00:21:12,400 --> 00:21:16,240
So for example, you could load them into Azure DevOps or Jira

527
00:21:16,240 --> 00:21:21,560
or whatever your system is for managing stories or work items.

528
00:21:21,560 --> 00:21:23,720
Then you assign those to people and people go and investigate

529
00:21:23,720 --> 00:21:25,960
to find out what they have and where the gaps are

530
00:21:25,960 --> 00:21:26,960
and what have you.

531
00:21:26,960 --> 00:21:29,440
And that's how we ended up coming up with this grid, right?

532
00:21:29,440 --> 00:21:32,280
It shows the three areas of zero trust

533
00:21:32,280 --> 00:21:34,680
and then these six categories.

534
00:21:34,680 --> 00:21:37,200
Because that then helps take these stories

535
00:21:37,200 --> 00:21:40,760
and assign them a point on this grid.

536
00:21:40,760 --> 00:21:43,120
So that way we could always find out who the right person was

537
00:21:43,120 --> 00:21:44,960
to assign them to.
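
The routing described here, from user stories to a point on the zero-trust grid to an owner, can be sketched in a few lines. This is an illustrative Python sketch only; the pillar and category names and the team names below are placeholders, since the episode doesn't enumerate the actual grid labels.

```python
from dataclasses import dataclass

@dataclass
class Story:
    title: str
    pillar: str    # one of the three zero-trust areas (placeholder names)
    category: str  # one of the six categories (placeholder names)

# Hypothetical owner lookup keyed by grid cell (pillar, category).
owners = {
    ("verify explicitly", "identity"): "iam-team",
    ("assume breach", "endpoints"): "endpoint-team",
}

def route(story: Story) -> str:
    """Find the right team for a story from its position on the grid."""
    return owners.get((story.pillar, story.category), "triage-backlog")

s = Story("Review endpoint protection coverage", "assume breach", "endpoints")
print(route(s))  # endpoint-team
```

Stories that don't land on a known grid cell fall back to a triage backlog, which mirrors the idea of investigating gaps before assigning them.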

538
00:21:44,960 --> 00:21:48,320
And again, that really helped drive the customer.

539
00:21:48,320 --> 00:21:50,160
Don't get me wrong, these customers,

540
00:21:50,160 --> 00:21:52,240
this customer is incredibly savvy.

541
00:21:52,240 --> 00:21:54,760
They just wanted some guidance on just the structure

542
00:21:54,760 --> 00:21:56,080
of zero trust.

543
00:21:56,080 --> 00:21:58,480
They like the idea of the three pillars of zero trust.

544
00:21:58,480 --> 00:22:01,200
They like the idea of assuming breach.

545
00:22:01,200 --> 00:22:05,600
And that's a whole mental flip, assuming breach.

546
00:22:05,600 --> 00:22:08,000
I'll give you another example of that.

547
00:22:08,000 --> 00:22:10,760
I was working with a large bank in South America.

548
00:22:10,760 --> 00:22:13,160
I'm not going to name, even name the country.

549
00:22:13,160 --> 00:22:15,120
There's a very large bank in South America

550
00:22:15,120 --> 00:22:19,240
and they were going live with an online presence.

551
00:22:19,240 --> 00:22:23,520
And I was reviewing the security architecture of their design

552
00:22:23,520 --> 00:22:28,000
and we got to a point where they were storing passwords.

553
00:22:28,000 --> 00:22:31,560
And I'm like, so what are you actually storing?

554
00:22:31,560 --> 00:22:32,880
And they said, well, we're encrypting.

555
00:22:32,880 --> 00:22:34,400
We're encrypting the passwords.

556
00:22:34,400 --> 00:22:36,600
So where's the key stored to decrypt that?

557
00:22:36,600 --> 00:22:38,480
Well, it's over there.

558
00:22:38,480 --> 00:22:41,320
It doesn't matter where over there is, it's over there.

559
00:22:41,320 --> 00:22:43,680
I said, but what happens if the attacker gets that key?

560
00:22:43,680 --> 00:22:45,160
Well, they can decrypt everything.

561
00:22:45,160 --> 00:22:47,480
Well, OK, so now we need to stop and think about this

562
00:22:47,480 --> 00:22:51,040
for a minute because let's assume breach.

563
00:22:51,040 --> 00:22:52,560
Let's just think about that for a minute.

564
00:22:52,560 --> 00:22:54,280
Assume breach, the attacker is on the network

565
00:22:54,280 --> 00:22:57,680
and has unfettered access to absolutely everything.

566
00:22:57,680 --> 00:23:02,240
Well, now the attacker can decrypt every single password.

567
00:23:02,240 --> 00:23:04,400
And the customer nodded.

568
00:23:04,400 --> 00:23:07,200
They didn't like the fact they had to nod, but they nodded.

569
00:23:07,200 --> 00:23:09,280
I said, yeah, we need to think a little bit harder than that

570
00:23:09,280 --> 00:23:12,480
because you're a large bank and what you're doing

571
00:23:12,480 --> 00:23:16,840
is significant and will be attacked, guaranteed.

572
00:23:16,840 --> 00:23:19,480
So let's assume breach.

573
00:23:19,480 --> 00:23:22,120
So I said, do you actually need to know the password

574
00:23:22,120 --> 00:23:25,680
or do you need to know the user owns the password?

575
00:23:25,680 --> 00:23:28,640
And they said, well, we just need to know the user owns the password.

576
00:23:28,640 --> 00:23:30,840
Well, rather than encrypting it, we can do better.

577
00:23:30,840 --> 00:23:32,920
I don't want to get into all the crypto wonkiness here,

578
00:23:32,920 --> 00:23:34,480
but I'm going to.

579
00:23:34,480 --> 00:23:37,840
We can store what's called a salted, iterated hash

580
00:23:37,840 --> 00:23:40,080
of the password.

581
00:23:40,080 --> 00:23:42,480
The nice thing about that is you can quickly verify

582
00:23:42,480 --> 00:23:44,400
the user possesses the password, but not actually

583
00:23:44,400 --> 00:23:46,120
store the password.

584
00:23:46,120 --> 00:23:49,200
So now let's actually look at it from an assumed-breach perspective.

585
00:23:49,200 --> 00:23:50,840
So if the attacker gets on the network

586
00:23:50,840 --> 00:23:53,720
and has absolutely unfettered access to absolutely everything,

587
00:23:53,720 --> 00:23:56,360
let's just say it's 10 million people,

588
00:23:56,360 --> 00:23:58,880
the data that the attacker gets is something

589
00:23:58,880 --> 00:24:00,520
the attacker cannot use.

590
00:24:00,520 --> 00:24:01,480
It's not the password.

591
00:24:01,480 --> 00:24:02,320
You can't decrypt it.

592
00:24:02,320 --> 00:24:03,720
There is no password to decrypt.

593
00:24:03,720 --> 00:24:06,840
You're just basically storing something that verifies

594
00:24:06,840 --> 00:24:08,840
the user actually owns the password.

595
00:24:08,840 --> 00:24:09,960
That's all you're doing.

596
00:24:09,960 --> 00:24:12,800
It's nothing the attacker can replay.
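
The salted, iterated hash described here can be sketched with the Python standard library's PBKDF2. This is an illustrative sketch of the general technique, not the bank's actual implementation; the iteration count is an arbitrary example value.

```python
import hashlib
import hmac
import os

def hash_password(password: str, iterations: int = 100_000) -> tuple[bytes, bytes]:
    """Return (salt, digest) for storage; the password itself is never stored."""
    salt = os.urandom(16)  # unique random salt per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password: str, salt: bytes, stored: bytes,
                    iterations: int = 100_000) -> bool:
    """Recompute the iterated hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, stored)

salt, stored = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, stored))  # True
print(verify_password("wrong guess", salt, stored))                   # False
```

An attacker who dumps the database gets only salts and digests: there is no key to steal and no ciphertext to decrypt, which is exactly the assume-breach property discussed above.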

597
00:24:12,800 --> 00:24:14,600
So they had just a few weeks

598
00:24:14,600 --> 00:24:16,200
to go until they were finally shipping,

599
00:24:16,200 --> 00:24:17,800
and they made this small change.

600
00:24:17,800 --> 00:24:19,040
And it was a relatively small change.

601
00:24:19,040 --> 00:24:21,120
And obviously we had to go through all the appropriate checks

602
00:24:21,120 --> 00:24:24,360
and tests to make sure that it didn't cause regressions.

603
00:24:24,360 --> 00:24:27,360
But they ended up shipping with this change,

604
00:24:27,360 --> 00:24:30,000
or they went live with this change, I should say.

605
00:24:30,000 --> 00:24:32,240
And we made a point of telling upper management

606
00:24:32,240 --> 00:24:34,440
that the change that was made, even though it was relatively

607
00:24:34,440 --> 00:24:38,400
late, supported the zero trust idea of assumed breach.

608
00:24:38,400 --> 00:24:39,640
If the attacker is on the network,

609
00:24:39,640 --> 00:24:42,640
the attacker has nothing that they can use, nothing whatsoever.

610
00:24:42,640 --> 00:24:46,000
Doesn't matter by any stretch of anyone's imagination,

611
00:24:46,000 --> 00:24:48,480
there's just nothing you can use there.

612
00:24:48,480 --> 00:24:50,360
And the board loved that, especially

613
00:24:50,360 --> 00:24:52,600
when it was a very small engineering change.

614
00:24:52,600 --> 00:24:55,000
But the main change wasn't just the engineering change.

615
00:24:55,000 --> 00:24:57,640
It was actually basically a mental change.

616
00:24:57,640 --> 00:25:01,400
It was just thinking about the design of the system.

617
00:25:01,400 --> 00:25:03,360
Let's just assume the attacker is on the network

618
00:25:03,360 --> 00:25:06,160
and has access to absolutely everything.

619
00:25:06,160 --> 00:25:07,760
Now what?

620
00:25:07,760 --> 00:25:11,840
And if the answer is, well, we're in deep trouble,

621
00:25:11,840 --> 00:25:13,560
that's not the right answer.

622
00:25:13,560 --> 00:25:16,400
You need to think longer and harder than just,

623
00:25:16,400 --> 00:25:18,240
well, we're in trouble.

624
00:25:18,240 --> 00:25:22,560
So I like that idea of putting new concepts into people's heads

625
00:25:22,560 --> 00:25:25,280
because it will help you design systems in a much more

626
00:25:25,280 --> 00:25:26,640
secure manner.

627
00:25:26,640 --> 00:25:28,440
Yeah, that's a great point.

628
00:25:28,440 --> 00:25:31,800
That's kind of what I meant by my comment around,

629
00:25:31,800 --> 00:25:35,440
if you were my customer, I may make a recommendation,

630
00:25:35,440 --> 00:25:38,000
bring in a consultancy firm to help you think through that.

631
00:25:38,000 --> 00:25:41,200
Because it is a huge mental mind shift

632
00:25:41,200 --> 00:25:43,640
to think about what if an attacker is actually

633
00:25:43,640 --> 00:25:46,440
on the network, what could be compromised right now

634
00:25:46,440 --> 00:25:47,840
that maybe we don't know about?

635
00:25:47,840 --> 00:25:50,840
And when you start to think about in large enterprise,

636
00:25:50,840 --> 00:25:53,280
many of these legacy line of business apps,

637
00:25:53,280 --> 00:25:56,680
they were written years ago, and the person who wrote them,

638
00:25:56,680 --> 00:25:57,720
they're long gone.

639
00:25:57,720 --> 00:26:00,800
It's just kind of, the lights are still being kept on,

640
00:26:00,800 --> 00:26:03,000
but nobody's really maintaining it.

641
00:26:03,000 --> 00:26:05,040
Well, when we start to think about assumed breach,

642
00:26:05,040 --> 00:26:06,360
it means a lot of different things.

643
00:26:06,360 --> 00:26:09,240
But one of it means, let's go back and start to review,

644
00:26:09,240 --> 00:26:12,160
are there passwords stored in the database?

645
00:26:12,160 --> 00:26:16,840
How are these legacy apps functioning and architected?

646
00:26:16,840 --> 00:26:19,680
Do we need to start looking at maybe third party solutions

647
00:26:19,680 --> 00:26:22,480
that can help secure some of these apps that are legacy,

648
00:26:22,480 --> 00:26:24,680
that maybe the developer no longer supports

649
00:26:24,680 --> 00:26:27,360
or anything like that, but there's solutions out there

650
00:26:27,360 --> 00:26:30,560
that can help to protect those and specialize in it.

651
00:26:30,560 --> 00:26:33,240
So it causes you to start to think about a lot.

652
00:26:33,240 --> 00:26:34,960
Another thing around assumed breach

653
00:26:34,960 --> 00:26:39,320
that I come across frequently is from a SOC perspective,

654
00:26:39,320 --> 00:26:42,240
are you actually monitoring the things that are important?

655
00:26:42,240 --> 00:26:45,960
So oftentimes we think about, oh, just send all the logs.

656
00:26:45,960 --> 00:26:49,400
Okay, that's, yes, that's helpful,

657
00:26:49,400 --> 00:26:52,600
but somebody has to put in the logic,

658
00:26:52,600 --> 00:26:55,120
and somebody has to write the workflows

659
00:26:55,120 --> 00:26:58,120
and all of the other automation pieces around,

660
00:26:58,120 --> 00:27:00,440
what do you do with that data coming from the logs?

661
00:27:00,440 --> 00:27:02,840
Not only that, but you also have to think about manpower,

662
00:27:02,840 --> 00:27:05,000
and you have to think about how is your SOC structured?

663
00:27:05,000 --> 00:27:08,640
And the phrase that I keep hearing time and time again

664
00:27:08,640 --> 00:27:10,920
is if nobody's behind the wheel,

665
00:27:10,920 --> 00:27:13,680
or you have a couple of people behind the wheel,

666
00:27:13,680 --> 00:27:15,320
and you have a lot of wheels,

667
00:27:15,320 --> 00:27:17,120
you're probably gonna get alert fatigue,

668
00:27:17,120 --> 00:27:19,200
and you're gonna have gaps in visibility.

669
00:27:19,200 --> 00:27:22,280
So now how do we become more efficient?

670
00:27:22,280 --> 00:27:25,200
And that's another lens of assumed breach,

671
00:27:25,200 --> 00:27:27,240
and how do we actually monitor

672
00:27:27,240 --> 00:27:28,920
when there is something going on?

673
00:27:28,920 --> 00:27:31,800
And making sure we have the right solutions in place

674
00:27:31,800 --> 00:27:32,880
from a technology perspective,

675
00:27:32,880 --> 00:27:34,760
make sure we have the right processes in place

676
00:27:34,760 --> 00:27:35,880
to respond to it.

677
00:27:36,800 --> 00:27:39,880
And it goes way beyond just looking at data in a log.

678
00:27:39,880 --> 00:27:42,560
And so that's kind of that other mental mind shift,

679
00:27:42,560 --> 00:27:45,640
for years we've been used to just send all of the

680
00:27:45,640 --> 00:27:48,600
syslog data to the SIEM, and the SIEM will deal with it.

681
00:27:48,600 --> 00:27:50,080
May not always be the case.

682
00:27:50,080 --> 00:27:52,000
And so you definitely wanna go through

683
00:27:52,000 --> 00:27:54,280
and make sure that things are properly set up

684
00:27:54,280 --> 00:27:56,280
and that you've accounted for that.

685
00:27:56,280 --> 00:27:59,760
Mind if I share a quick story here about Zero Trust?

686
00:27:59,760 --> 00:28:01,200
Yeah, go for it.

687
00:28:01,200 --> 00:28:04,920
So back when the pandemic first started,

688
00:28:04,920 --> 00:28:08,040
and it was like the first week of March 2020,

689
00:28:08,040 --> 00:28:10,480
and the world sent everybody to work from home,

690
00:28:10,480 --> 00:28:13,080
we started having customers call us,

691
00:28:13,080 --> 00:28:16,200
phone was ringing off the hook, how do we do this?

692
00:28:16,200 --> 00:28:18,520
There was one customer, they were a bank,

693
00:28:18,520 --> 00:28:20,600
they had this culture,

694
00:28:20,600 --> 00:28:24,280
they never actually sent people to work from home.

695
00:28:24,280 --> 00:28:28,320
So they actually all had desktop PCs, very old school,

696
00:28:28,320 --> 00:28:30,600
but they had desktop PCs,

697
00:28:30,600 --> 00:28:33,040
and okay, now we all gotta work from home.

698
00:28:33,040 --> 00:28:34,600
And this was not a large enterprise,

699
00:28:34,600 --> 00:28:36,080
this was more of a mid-sized organization,

700
00:28:36,080 --> 00:28:38,200
but it was a big struggle for them.

701
00:28:38,200 --> 00:28:39,680
So they went out and told their employees,

702
00:28:39,680 --> 00:28:42,200
just go out and here's a stipend,

703
00:28:42,200 --> 00:28:45,080
go out there and buy the laptop of choice.

704
00:28:45,080 --> 00:28:48,240
We started having people buy iPads and Macs and PCs

705
00:28:48,240 --> 00:28:49,240
and who knows what.

706
00:28:50,160 --> 00:28:52,040
And then they started figuring out,

707
00:28:52,040 --> 00:28:56,160
well, with the pandemic came the Great Reshuffle

708
00:28:56,160 --> 00:28:57,360
and the Great Resignation,

709
00:28:57,360 --> 00:29:00,240
and some of these employees started leaving the organization.

710
00:29:00,240 --> 00:29:01,960
Well, here they have a personal device

711
00:29:01,960 --> 00:29:06,320
that they are using to access company data on.

712
00:29:06,320 --> 00:29:07,480
They now left the organization,

713
00:29:07,480 --> 00:29:10,240
the company data is still there.

714
00:29:10,240 --> 00:29:11,960
So you've left the organization,

715
00:29:11,960 --> 00:29:13,360
but you still have data on the device.

716
00:29:13,360 --> 00:29:14,560
Now, had they been using MIP?

717
00:29:14,560 --> 00:29:18,080
Sure, that would have helped provide some protection.

718
00:29:18,080 --> 00:29:19,400
So make a long story short,

719
00:29:19,400 --> 00:29:21,400
we started working through some scenarios with them

720
00:29:21,400 --> 00:29:23,560
and what we came to the conclusion was,

721
00:29:23,560 --> 00:29:24,880
let the employee go out there

722
00:29:24,880 --> 00:29:27,840
and purchase whatever device they want with that stipend,

723
00:29:27,840 --> 00:29:30,960
but let's employ something like Azure Virtual Desktop

724
00:29:30,960 --> 00:29:32,560
or some kind of VDI,

725
00:29:32,560 --> 00:29:35,800
where everything you're accessing is within that workspace.

726
00:29:35,800 --> 00:29:39,400
And you can't spill data outside that workspace.

727
00:29:39,400 --> 00:29:42,120
So if I'm on my personal iPad

728
00:29:42,120 --> 00:29:45,080
and I try to copy a file from that Azure Virtual Desktop

729
00:29:45,080 --> 00:29:48,000
to my local iPad, policy will block it.

730
00:29:48,000 --> 00:29:49,840
So we started looking at that with that customer

731
00:29:49,840 --> 00:29:50,800
and make a long story short,

732
00:29:50,800 --> 00:29:53,880
I just talked to them about three or four months ago.

733
00:29:53,880 --> 00:29:56,040
They're doing quite well from a security perspective

734
00:29:56,040 --> 00:29:57,640
because everything is happening within

735
00:29:57,640 --> 00:30:00,280
that Azure Virtual Desktop kind of instance, if you will,

736
00:30:00,280 --> 00:30:02,560
everything's being monitored

737
00:30:02,560 --> 00:30:04,040
and they've got policy

738
00:30:04,040 --> 00:30:05,960
where they can actually control the flow of data.

739
00:30:05,960 --> 00:30:08,040
And if an employee leaves the organization,

740
00:30:08,040 --> 00:30:10,800
they follow their standard process of killing email access, of course,

741
00:30:10,800 --> 00:30:14,360
but then killing access to that Azure Virtual Desktop instance.

742
00:30:14,360 --> 00:30:17,000
So that's another way of looking at zero trust

743
00:30:17,000 --> 00:30:21,800
is kind of containerizing how your employees access the data.

744
00:30:21,800 --> 00:30:25,000
One more short story for you is kind of the opposite.

745
00:30:25,000 --> 00:30:27,320
How do we give the employee the ability to do

746
00:30:27,320 --> 00:30:28,400
whatever they want?

747
00:30:28,400 --> 00:30:30,680
And so we had another organization during the pandemic

748
00:30:30,680 --> 00:30:33,640
that went out, they purchased Microsoft 365

749
00:30:33,640 --> 00:30:36,280
and they got all the great productivity tools.

750
00:30:36,280 --> 00:30:38,360
One of the really cool parts about security

751
00:30:38,360 --> 00:30:40,640
that Microsoft has, like MIP as an example,

752
00:30:40,640 --> 00:30:42,440
it's actually integrated with all the tools.

753
00:30:42,440 --> 00:30:43,280
For this organization,

754
00:30:43,280 --> 00:30:45,280
they never had to actually think about

755
00:30:45,280 --> 00:30:49,760
how do we get MIP encryption to work in Office?

756
00:30:49,760 --> 00:30:51,320
It just works, right?

757
00:30:51,320 --> 00:30:52,560
I mean, sure, there's policy,

758
00:30:52,560 --> 00:30:54,120
you have to configure and that kind of thing,

759
00:30:54,120 --> 00:30:56,440
but you're not having to go out and make code changes

760
00:30:56,440 --> 00:30:59,640
or anything like that, API calls and that kind of thing.

761
00:30:59,640 --> 00:31:01,640
So what I'm trying to say is, again,

762
00:31:01,640 --> 00:31:04,440
you have to kind of look at what are you trying

763
00:31:04,440 --> 00:31:05,560
to actually get done

764
00:31:05,560 --> 00:31:07,520
and how does it actually help the business?

765
00:31:07,520 --> 00:31:08,640
And then from there,

766
00:31:08,640 --> 00:31:10,040
try to figure out the right technology.

767
00:31:10,040 --> 00:31:12,280
Now, what we often hear is,

768
00:31:12,280 --> 00:31:14,440
man, I'm not using everything that's from Microsoft, right?

769
00:31:14,440 --> 00:31:18,600
Sure, I've got maybe Sentinel or I've got MIP as an example,

770
00:31:18,600 --> 00:31:22,320
but I might be using a third-party EDR platform.

771
00:31:22,320 --> 00:31:27,080
I may be using a third-party identity provider.

772
00:31:27,080 --> 00:31:29,080
How do I get all of those to work together?

773
00:31:29,080 --> 00:31:31,680
And so that's where, here at Microsoft,

774
00:31:31,680 --> 00:31:33,680
we've got a program called MISA,

775
00:31:33,680 --> 00:31:36,280
the Microsoft Intelligent Security Association,

776
00:31:36,280 --> 00:31:39,280
where we've gone out and we've worked with these third parties

777
00:31:39,280 --> 00:31:43,880
to perform native integrations with our first-party products

778
00:31:43,880 --> 00:31:48,280
and the analogy is that we're able to tie the knot together.

779
00:31:48,280 --> 00:31:51,480
So if you're using a third-party,

780
00:31:51,480 --> 00:31:53,880
whatever, third-party EDR platform,

781
00:31:53,880 --> 00:31:54,880
how do we make sure it works

782
00:31:54,880 --> 00:31:57,480
with the rest of your Microsoft tool sets?

783
00:31:57,480 --> 00:32:00,080
Well, through that MISA program, we've gone out and done that work.

784
00:32:00,080 --> 00:32:02,080
And so that can help benefit those customers out there

785
00:32:02,080 --> 00:32:04,480
that have that heterogeneous architecture

786
00:32:04,480 --> 00:32:05,480
without having to worry about,

787
00:32:05,480 --> 00:32:08,480
oh, do I need to go out and flesh out everything for vendor A

788
00:32:08,480 --> 00:32:10,480
to vendor B and that kind of thing?

789
00:32:10,480 --> 00:32:12,080
Let's take a look at it,

790
00:32:12,080 --> 00:32:14,080
because more than likely what you already own

791
00:32:14,080 --> 00:32:15,480
is included in the MISA program,

792
00:32:15,480 --> 00:32:17,480
where we actually already have pre-built integrations

793
00:32:17,480 --> 00:32:19,480
that just need to be enabled and configured.

794
00:32:19,480 --> 00:32:22,080
So again, that's kind of another way to look at zero trust

795
00:32:22,080 --> 00:32:24,480
and that last mile,

796
00:32:24,480 --> 00:32:27,080
but that's often your biggest risk.

797
00:32:27,080 --> 00:32:29,080
And so, just to wrap up my tangent here.

798
00:32:29,080 --> 00:32:32,680
You could do everything that you can to try to protect it

799
00:32:32,680 --> 00:32:34,680
and try to get these policies configured,

800
00:32:34,680 --> 00:32:37,080
but that last mile is still going to be a problem for you.

801
00:32:37,080 --> 00:32:40,080
Whether that last mile is third-party solutions,

802
00:32:40,080 --> 00:32:42,080
try to integrate with Microsoft first-party,

803
00:32:42,080 --> 00:32:44,080
or it's the end user.

804
00:32:44,080 --> 00:32:48,080
Like the example I gave about them using personal devices.

805
00:32:48,080 --> 00:32:52,080
And so as IT, that's something that we often don't really think about.

806
00:32:52,080 --> 00:32:54,080
And so I think kind of the moral of all these stories

807
00:32:54,080 --> 00:32:56,080
is bringing it back to zero trust

808
00:32:56,080 --> 00:32:58,080
is a mindset change.

809
00:32:58,080 --> 00:33:01,080
And it's not a product, you don't go out and buy it.

810
00:33:01,080 --> 00:33:03,080
It's not an end state.

811
00:33:03,080 --> 00:33:06,080
It's not a, hey, how long will it take us to achieve zero trust?

812
00:33:06,080 --> 00:33:09,080
It's what's our structure and our framework?

813
00:33:09,080 --> 00:33:11,080
What's the journey look like that we're going to take?

814
00:33:11,080 --> 00:33:13,080
And how are we going to do it?

815
00:33:13,080 --> 00:33:15,080
And what changes need to be made in the organization

816
00:33:15,080 --> 00:33:16,080
to allow us to do it,

817
00:33:16,080 --> 00:33:19,080
not just with technology, but organizationally as well.

818
00:33:19,080 --> 00:33:24,080
So lots to think about, but I'll pause there.

819
00:33:24,080 --> 00:33:27,080
I'm going to have to jump in here because you talked about SOCs

820
00:33:27,080 --> 00:33:32,080
as in security operations centers.

821
00:33:32,080 --> 00:33:34,080
Matt, what's your take?

822
00:33:34,080 --> 00:33:36,080
Because you know I love to talk about Sentinel.

823
00:33:36,080 --> 00:33:38,080
Everybody knows that.

824
00:33:38,080 --> 00:33:41,080
Going back more to that, because one of my favorite phrases,

825
00:33:41,080 --> 00:33:43,080
and I used to have stickers with it on,

826
00:33:43,080 --> 00:33:45,080
and I need to get them reprinted,

827
00:33:45,080 --> 00:33:47,080
is that collection is not detection.

828
00:33:47,080 --> 00:33:53,080
You know, just collecting logs is not in itself going to do anything.

829
00:33:53,080 --> 00:33:55,080
So what are your thoughts on that,

830
00:33:55,080 --> 00:34:00,080
assuming breach and collecting logs in that security operations environment?

831
00:34:00,080 --> 00:34:03,080
I mean, if we rewind 10, even 20 years, right,

832
00:34:03,080 --> 00:34:06,080
and we look at sending log data to a SIEM,

833
00:34:06,080 --> 00:34:09,080
SIM, SIEM, you know, whatever we want to call it, right,

834
00:34:09,080 --> 00:34:12,080
but send it to that centralized platform that collects all the log data.

835
00:34:12,080 --> 00:34:14,080
Somebody still has to look at that.

836
00:34:14,080 --> 00:34:16,080
And so we started to think about years ago,

837
00:34:16,080 --> 00:34:18,080
well, let's hire the best talent.

838
00:34:18,080 --> 00:34:21,080
Let's send them to Vegas every year, go to the conferences from the vendors,

839
00:34:21,080 --> 00:34:23,080
throw all the certifications at them,

840
00:34:23,080 --> 00:34:26,080
and then they can go through the log data and try to figure it out.

841
00:34:26,080 --> 00:34:28,080
Well, fast forward, that didn't work out too well,

842
00:34:28,080 --> 00:34:33,080
because no human can possibly go through the massive volume of these logs, right?

843
00:34:33,080 --> 00:34:34,080
So that doesn't work.

844
00:34:34,080 --> 00:34:37,080
So now we need machine learning to look at patterns.

845
00:34:37,080 --> 00:34:40,080
We need, you know, artificial intelligence to be able to make decisions.

846
00:34:40,080 --> 00:34:47,080
But we also still need humans that possess the skill to build out the policy.

847
00:34:47,080 --> 00:34:50,080
So like in Sentinel terms, analytic rules.

848
00:34:50,080 --> 00:34:54,080
So I've got all this data that's coming in from this, you know, security tool,

849
00:34:54,080 --> 00:34:56,080
all this log data.

850
00:34:56,080 --> 00:34:59,080
Sentinel can, you know, obviously, and Sarah, you could talk way more about this than I can,

851
00:34:59,080 --> 00:35:01,080
but it's intelligent.

852
00:35:01,080 --> 00:35:05,080
However, there's probably some additional context around that log data

853
00:35:05,080 --> 00:35:08,080
that I would need to build some rules in so that Sentinel's aware of it.

854
00:35:08,080 --> 00:35:11,080
So my point is like somebody has to do the work.

855
00:35:11,080 --> 00:35:15,080
And then on top of that, somebody has to be behind the wheel to actually monitor.

856
00:35:15,080 --> 00:35:19,080
And so I once met a managed service provider just a couple of years ago.

857
00:35:19,080 --> 00:35:23,080
They did all of this work to send, and they weren't using Sentinel.

858
00:35:23,080 --> 00:35:26,080
They were using another SIEM, but they did all this work to send all of their log data

859
00:35:26,080 --> 00:35:29,080
from all their tools to a SIEM.

860
00:35:29,080 --> 00:35:32,080
But they didn't have anybody actually monitoring it.

861
00:35:32,080 --> 00:35:35,080
They were relying on just email alerts.

862
00:35:35,080 --> 00:35:38,080
It's like, well, it's a little bit more to it than that, right?

863
00:35:38,080 --> 00:35:42,080
And so you don't want to also get that false sense of security, right?

864
00:35:42,080 --> 00:35:44,080
Oh, we'll just send it all to the SIEM and be done with it.

865
00:35:44,080 --> 00:35:49,080
But you know, there's a lot more that you need to invest in that, in time and energy,

866
00:35:49,080 --> 00:35:52,080
to be able to build this out in such a way, but then have that process

867
00:35:52,080 --> 00:35:57,080
to keep it evergreen, where you're constantly iterating on it and you're documenting it, right?

868
00:35:57,080 --> 00:36:00,080
And then what's the reporting structure around it, right?

869
00:36:00,080 --> 00:36:03,080
How often are you sitting down, running the reports and sitting down

870
00:36:03,080 --> 00:36:07,080
if you're a managed service provider running with your clients or sitting down with the SOC

871
00:36:07,080 --> 00:36:12,080
and maybe the CISO and reviewing what's happened over this past quarter according to the data

872
00:36:12,080 --> 00:36:15,080
and trying to do better, right?

873
00:36:15,080 --> 00:36:20,080
So I mean, there's a lot that goes in there that, again, that assume breach mindset

874
00:36:20,080 --> 00:36:23,080
and the whole zero trust mindset, it starts to unpack all of that.

875
00:36:23,080 --> 00:36:26,080
But, you know, Sarah, I would love to get your thoughts on that,

876
00:36:26,080 --> 00:36:30,080
but that's how I start to think about just from a SOC perspective.

877
00:36:30,080 --> 00:36:32,080
We got to do it differently than we have been.

878
00:36:32,080 --> 00:36:34,080
I think that's my point.

879
00:36:34,080 --> 00:36:38,080
Yeah, I mean, we probably don't have enough time on the podcast right now

880
00:36:38,080 --> 00:36:42,080
for me to talk about all of my thoughts, but I'll tell you a story as well.

881
00:36:42,080 --> 00:36:45,080
Seeing as we're in story mode this week.

882
00:36:45,080 --> 00:36:52,080
I once saw a couple, and again, it wasn't Sentinel, it was a different SIEM.

883
00:36:52,080 --> 00:36:55,080
But in some of my previous consulting life,

884
00:36:55,080 --> 00:37:00,080
they had some organizations that were working to a particular standard

885
00:37:00,080 --> 00:37:03,080
that was part of their industry vertical.

886
00:37:03,080 --> 00:37:08,080
And the standard basically said that they needed to log to a central place.

887
00:37:08,080 --> 00:37:12,080
And so they logged to a central place: they put a SIEM in place

888
00:37:12,080 --> 00:37:16,080
and they were sending logs to it, but they didn't actually look at it.

889
00:37:16,080 --> 00:37:21,080
But because the standard they were following

890
00:37:21,080 --> 00:37:27,080
just said they had to log it, they had met that standard and could tick a box.

891
00:37:27,080 --> 00:37:30,080
Was it really very useful from a security perspective?

892
00:37:30,080 --> 00:37:32,080
I would argue definitely not.

893
00:37:32,080 --> 00:37:34,080
But these are the things we see, right?

894
00:37:34,080 --> 00:37:39,080
And definitely we can have a whole different discussion on standards

895
00:37:39,080 --> 00:37:44,080
and whether adhering to standards... well, compliance is not security.

896
00:37:44,080 --> 00:37:46,080
That's probably an episode in itself.

897
00:37:46,080 --> 00:37:49,080
But yeah, I know exactly what you mean.

898
00:37:49,080 --> 00:37:52,080
Yeah, you know, again, like when I think of my,

899
00:37:52,080 --> 00:37:56,080
when you talk about, you know, how do I approach Zero Trust differently than maybe others?

900
00:37:56,080 --> 00:37:58,080
I think everybody has their own approach to it, right?

901
00:37:58,080 --> 00:38:01,080
But again, like my approach, I look at it from a practical standpoint,

902
00:38:01,080 --> 00:38:03,080
but more from a business standpoint.

903
00:38:03,080 --> 00:38:07,080
And how do we actually make this work for the organization?

904
00:38:07,080 --> 00:38:09,080
How does it help us move things forward?

905
00:38:09,080 --> 00:38:16,080
And in many cases, we've seen Zero Trust do wonderful things at a business level, right?

906
00:38:16,080 --> 00:38:19,080
I.e. enable a company to outsource more.

907
00:38:19,080 --> 00:38:24,080
If their end goal was, let's outsource more, but security was stopping it.

908
00:38:24,080 --> 00:38:27,080
We've seen Zero Trust actually empower that and enable that and drive that.

909
00:38:27,080 --> 00:38:33,080
We've seen Zero Trust do things like help the company acquire more customers,

910
00:38:33,080 --> 00:38:37,080
because now they're able to, you know, interact with customers in a digital way

911
00:38:37,080 --> 00:38:40,080
and Zero Trust comes in and provides a security layer for that.

912
00:38:40,080 --> 00:38:45,080
So we've seen all sorts of great use cases, but you do have to take a step back and think about,

913
00:38:45,080 --> 00:38:48,080
you know, what are we trying to do and why?

914
00:38:48,080 --> 00:38:50,080
Really, why is the most important part?

915
00:38:50,080 --> 00:38:51,080
You probably want to start with that.

916
00:38:51,080 --> 00:38:54,080
But it does require almost a rewrite of the book.

917
00:38:54,080 --> 00:38:58,080
And it's tough for me to say, because I've been an IT professional most of my career,

918
00:38:58,080 --> 00:39:02,080
but you do need to make sure that leadership is on board

919
00:39:02,080 --> 00:39:05,080
and making sure that not only are you making technical changes,

920
00:39:05,080 --> 00:39:10,080
but also making organizational changes, i.e. strategy and mindset shift

921
00:39:10,080 --> 00:39:16,080
and cultural shift and all of those things, if you really want Zero Trust to have an impact.

922
00:39:16,080 --> 00:39:17,080
So it's interesting you bring that up.

923
00:39:17,080 --> 00:39:21,080
So Sarah and I actually have had a conversation over the last week or so.

924
00:39:21,080 --> 00:39:26,080
Like if you look at the audit logs that come out of SQL Server or Azure SQL DB,

925
00:39:26,080 --> 00:39:32,080
they're pretty hard to understand unless you actually know what each of the line items means.

926
00:39:32,080 --> 00:39:37,080
But, you know, when you fold in some of the kind of intelligence that's built into Sentinel,

927
00:39:37,080 --> 00:39:43,080
the fact that it's got functionality built into it to actually make sense of SQL logs,

928
00:39:43,080 --> 00:39:47,080
again, it just makes so much more sense about what's going on in your Azure SQL DB

929
00:39:47,080 --> 00:39:51,080
or your SQL MI or your SQL Server on-prem or in a VM,

930
00:39:51,080 --> 00:39:55,080
so you actually understand precisely what's going on.

931
00:39:55,080 --> 00:40:00,080
Make no mistake, those things can produce absolutely massive series of logs.

932
00:40:00,080 --> 00:40:05,080
And so, yeah, it's great to see tools like Sentinel making sense of all that.

933
00:40:05,080 --> 00:40:11,080
Alright Matt, so one question that we ask all our guests is if you had just one thought to leave our listeners with,

934
00:40:11,080 --> 00:40:13,080
what would it be?

935
00:40:13,080 --> 00:40:15,080
You know, I'm going to be really simple.

936
00:40:15,080 --> 00:40:18,080
Great question. Just do the basics.

937
00:40:18,080 --> 00:40:26,080
If you go out and you read the breach reports from Verizon and Microsoft and IBM and whoever,

938
00:40:26,080 --> 00:40:33,080
usually what you'll find in the root cause of these data breaches is the basics were missed.

939
00:40:33,080 --> 00:40:37,080
Identity hygiene or patching of systems.

940
00:40:37,080 --> 00:40:41,080
Again, it's hard for me to say as an IT pro, but you want to almost take a step back

941
00:40:41,080 --> 00:40:45,080
and figure out where we're missing on the basics.

942
00:40:45,080 --> 00:40:50,080
And let's go back in and make sure we are covering the basics first.

943
00:40:50,080 --> 00:40:52,080
And I mean basics, right?

944
00:40:52,080 --> 00:40:57,080
If it's making sure you've got multi-factor authentication deployed for every user,

945
00:40:57,080 --> 00:40:59,080
go through and double check kind of thing, right?

946
00:40:59,080 --> 00:41:04,080
Or your patching program, or whatever it may be, how passwords are stored in a database.

947
00:41:04,080 --> 00:41:09,080
So that's kind of the final thought I'll leave is just make sure you're doing the basics.

948
00:41:09,080 --> 00:41:12,080
And as you go through that journey, I

949
00:41:12,080 --> 00:41:17,080
guarantee you will start to figure out additional things that you can add to that roadmap

950
00:41:17,080 --> 00:41:20,080
and then start building up your zero trust strategy.

951
00:41:20,080 --> 00:41:21,080
So that's what I'll leave you with.

952
00:41:21,080 --> 00:41:24,080
Yeah, thanks for that. I think, yeah, people need to, you know,

953
00:41:24,080 --> 00:41:26,080
perhaps not take a cynical approach to zero trust.

954
00:41:26,080 --> 00:41:27,080
It's not a marketing term.

955
00:41:27,080 --> 00:41:29,080
But by the same token, it's not a product.

956
00:41:29,080 --> 00:41:32,080
It's not a product Microsoft will sell to you.

957
00:41:32,080 --> 00:41:33,080
It's a series of practices.

958
00:41:33,080 --> 00:41:34,080
It's a series of technologies.

959
00:41:34,080 --> 00:41:35,080
It's a series of tools.

960
00:41:35,080 --> 00:41:39,080
It's a series of mindsets with a goal of sort of protecting environments

961
00:41:39,080 --> 00:41:43,080
in this sort of modern world, you know, within which we live.

962
00:41:43,080 --> 00:41:46,080
So again, thank you so much for taking the time to see us this week.

963
00:41:46,080 --> 00:41:51,080
And to all our listeners out there, stay safe and we'll see you next time.

964
00:41:51,080 --> 00:41:54,080
Thanks for listening to the Azure Security Podcast.

965
00:41:54,080 --> 00:42:01,080
You can find show notes and other resources at our website azsecuritypodcast.net.

966
00:42:01,080 --> 00:42:06,080
If you have any questions, please find us on Twitter at azuresecpod.

967
00:42:06,080 --> 00:42:32,080
Background music is from ccmixter.com and licensed under the Creative Commons license.

