1
00:00:00,000 --> 00:00:06,200
Welcome to the Azure Security Podcast,

2
00:00:06,200 --> 00:00:09,360
where we discuss topics relating to security, privacy,

3
00:00:09,360 --> 00:00:13,760
reliability, and compliance on the Microsoft Cloud Platform.

4
00:00:13,760 --> 00:00:16,800
Hey everybody, welcome to Episode 39.

5
00:00:16,800 --> 00:00:18,000
This week we have a full house.

6
00:00:18,000 --> 00:00:21,120
We have myself, Michael, Sarah, Mark, and Gladys.

7
00:00:21,120 --> 00:00:23,400
Our guest this week is Mark McIntyre,

8
00:00:23,400 --> 00:00:26,760
who's here to talk to us about the Microsoft Digital Defense Report.

9
00:00:26,760 --> 00:00:28,080
But before we get to Mark,

10
00:00:28,080 --> 00:00:29,200
let's take a look at the news.

11
00:00:29,200 --> 00:00:30,960
Sarah, why don't you kick things off?

12
00:00:30,960 --> 00:00:35,680
Sure. I'm going to talk about the usual things that I talk about,

13
00:00:35,680 --> 00:00:40,720
which is Security Center and a bit of Sentinel, of course, my baby.

14
00:00:40,720 --> 00:00:42,720
The first thing I'll talk about,

15
00:00:42,720 --> 00:00:44,400
let's go through some Sentinel stuff.

16
00:00:44,400 --> 00:00:47,680
We released just this week when we were recording this,

17
00:00:47,680 --> 00:00:52,400
Playbook templates, which means now you don't have to go into the GitHub repo.

18
00:00:52,400 --> 00:00:57,040
You can go straight into the UI and you can find Playbooks.

19
00:00:57,040 --> 00:00:59,760
There's a template there that's been prebuilt, tested,

20
00:00:59,760 --> 00:01:01,600
and you can just deploy it straight from the UI,

21
00:01:01,600 --> 00:01:04,640
which is really cool and we'll be adding more of those.

22
00:01:04,640 --> 00:01:08,440
Also, we've released our DHCP normalization schema

23
00:01:08,440 --> 00:01:11,920
for the Azure Sentinel Information Model, or ASIM.

24
00:01:11,920 --> 00:01:16,200
So go and have a look at that if you're wanting to look at how we are

25
00:01:16,200 --> 00:01:20,320
continuing on our normalization journey.

26
00:01:20,320 --> 00:01:22,560
Then in Azure Security Center,

27
00:01:22,560 --> 00:01:27,600
we've got the Microsoft Threat and Vulnerability Management

28
00:01:27,600 --> 00:01:30,240
added as a vulnerability assessment solution.

29
00:01:30,240 --> 00:01:33,040
So that's extending our integration

30
00:01:33,040 --> 00:01:36,080
between Defender for Servers and Defender for Endpoint.

31
00:01:36,080 --> 00:01:40,800
You can now auto-enable vulnerability assessment solutions as well.

32
00:01:40,800 --> 00:01:45,680
So if you're using Qualys or you're using the Microsoft version,

33
00:01:45,680 --> 00:01:47,840
you can now have that auto-enabled.

34
00:01:47,840 --> 00:01:49,760
You don't have to go in and turn it on manually,

35
00:01:49,760 --> 00:01:54,640
which of course is important if you want everything to be monitored.

36
00:01:54,640 --> 00:01:56,400
I'm going to leave it there,

37
00:01:56,400 --> 00:01:58,720
otherwise I'll talk about Sentinel forever.

38
00:01:58,720 --> 00:02:01,840
Hey, before you run away, what's normalization?

39
00:02:02,960 --> 00:02:11,040
So normalization is the process of basically standardizing different data sources

40
00:02:11,040 --> 00:02:13,680
in a way that ASIM understands it.

41
00:02:13,680 --> 00:02:19,520
So say you get a log source in and it's got a number of IP addresses

42
00:02:19,520 --> 00:02:25,760
and maybe it's called IP address in the data column it arrives in.

43
00:02:25,760 --> 00:02:28,160
Another data source might bring in IP addresses as well,

44
00:02:28,160 --> 00:02:30,000
but it might give it a slightly different name.

45
00:02:30,000 --> 00:02:31,280
It might just be IP.

46
00:02:32,080 --> 00:02:33,840
It's still the same piece of data,

47
00:02:34,480 --> 00:02:36,640
but without normalization,

48
00:02:38,080 --> 00:02:42,720
your SIEM won't necessarily understand that that's the same piece of data.

49
00:02:42,720 --> 00:02:44,960
And the reason you want to normalize everything

50
00:02:44,960 --> 00:02:48,080
is so that we can write collateral for the SIEM

51
00:02:48,080 --> 00:02:50,560
that's source agnostic.

52
00:02:50,560 --> 00:02:54,560
So it means that if you've normalized all your data,

53
00:02:54,560 --> 00:02:57,680
Sentinel will know when it comes to this source,

54
00:02:57,680 --> 00:03:01,280
this is an IP or it's a host or it's a message.

55
00:03:01,280 --> 00:03:04,640
And that means that it simplifies writing rules

56
00:03:04,640 --> 00:03:07,040
and other collateral for a SIEM.

57
00:03:07,040 --> 00:03:12,400
It's an important thing and it just makes everybody's lives a lot easier.

58
00:03:12,400 --> 00:03:14,320
So we love a bit of normalization.
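
To make that concrete, here's a minimal Python sketch of the idea — not ASIM's actual implementation, and the alias list is hypothetical, though SrcIpAddr is in the spirit of ASIM's normalized field names:

```python
# Illustrative normalization layer: rename source-specific field
# names onto one canonical column so downstream rules can be
# source agnostic.
FIELD_ALIASES = {
    "SrcIpAddr": {"IP", "IPAddress", "ip_address", "SourceIP"},
}

def normalize(record: dict) -> dict:
    """Return a copy of the record with known aliases renamed."""
    out = {}
    for key, value in record.items():
        canonical = next(
            (name for name, aliases in FIELD_ALIASES.items() if key in aliases),
            key,  # unknown fields pass through unchanged
        )
        out[canonical] = value
    return out

# Two log sources naming the same piece of data differently:
a = normalize({"IPAddress": "10.0.0.1", "Action": "allow"})
b = normalize({"IP": "10.0.0.1", "Action": "deny"})
print(a["SrcIpAddr"], b["SrcIpAddr"])  # both land in the same column
```

A detection rule written once against SrcIpAddr now matches both sources, which is the source-agnostic benefit Sarah describes.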

59
00:03:14,320 --> 00:03:17,600
Yeah, I mean, I kind of thought I knew what it was,

60
00:03:17,600 --> 00:03:18,800
but apparently I really didn't.

61
00:03:18,800 --> 00:03:20,160
So thanks for that.

62
00:03:21,600 --> 00:03:22,640
So on the news front,

63
00:03:22,640 --> 00:03:25,680
a couple of things sort of caught my interest the last couple of weeks.

64
00:03:25,680 --> 00:03:28,000
The first one is, I know it's not directly Azure-related,

65
00:03:28,000 --> 00:03:31,040
but Windows 11, as some of you are probably well aware,

66
00:03:31,040 --> 00:03:34,880
there's a TPM (Trusted Platform Module) 2.0 requirement.

67
00:03:35,760 --> 00:03:37,280
That's caused a lot of people to sort of ask,

68
00:03:37,280 --> 00:03:38,160
why is it required?

69
00:03:38,160 --> 00:03:39,920
You know, what's going on there?

70
00:03:40,880 --> 00:03:42,240
We've actually released a video,

71
00:03:42,240 --> 00:03:44,000
which is a really cool video actually,

72
00:03:44,000 --> 00:03:49,600
of the need for the TPM 2.0 in light of the current really sophisticated attacks.

73
00:03:49,600 --> 00:03:51,040
In fact, we even give demos.

74
00:03:51,040 --> 00:03:54,000
There's a demo of a Windows 10 machine with all the defenses turned off,

75
00:03:54,560 --> 00:03:57,280
where there's all sorts of really low level attacks

76
00:03:57,280 --> 00:03:59,440
being performed against bootloaders and so on.

77
00:03:59,440 --> 00:04:01,680
Then the same attack is done against Windows 11,

78
00:04:01,680 --> 00:04:04,720
a default Windows 11 install, and the attack just fails.

79
00:04:04,720 --> 00:04:06,080
The video is well worth watching.

80
00:04:06,080 --> 00:04:07,440
It's about 12 minutes long.

81
00:04:07,440 --> 00:04:09,840
I can almost guarantee you will learn something from it.

82
00:04:09,840 --> 00:04:12,160
If nothing else, the demos are entertaining.

83
00:04:12,160 --> 00:04:14,400
I do want to call something else out as well,

84
00:04:14,400 --> 00:04:16,720
that was called out in the demo, in the video, sorry.

85
00:04:17,520 --> 00:04:19,760
And that is, these defenses have been around for quite some time.

86
00:04:19,760 --> 00:04:21,280
I mean, they've been available in Windows 10.

87
00:04:21,280 --> 00:04:23,600
They just weren't enabled by default.

88
00:04:23,600 --> 00:04:26,640
Whereas now we're requiring them to be enabled by default for Windows 11.

89
00:04:27,360 --> 00:04:30,480
The other item that caught my attention the last couple of weeks

90
00:04:30,480 --> 00:04:34,080
is OWASP, the Open Web Application Security Project.

91
00:04:34,080 --> 00:04:36,640
They just had their 20th birthday.

92
00:04:36,640 --> 00:04:42,800
And it's funny I was talking to the guy that actually started OWASP a few weeks ago.

93
00:04:42,800 --> 00:04:45,360
I'm like, I'm not sure if that's a good thing or a bad thing,

94
00:04:45,360 --> 00:04:48,160
being 20 years old when your job is application security.

95
00:04:48,160 --> 00:04:52,320
But that being said, the impact that OWASP has made,

96
00:04:52,320 --> 00:04:56,640
both in Microsoft and in Azure, and with our customers,

97
00:04:56,640 --> 00:05:00,960
and with their regulatory requirements, really can't be overstated.

98
00:05:00,960 --> 00:05:06,960
They've made a really big impact on application security.

99
00:05:06,960 --> 00:05:08,160
And for those of you who are not aware,

100
00:05:08,160 --> 00:05:11,840
they've also released their 2021 OWASP Top 10.

101
00:05:11,840 --> 00:05:14,960
And without trying to sound too cynical, this is by far,

102
00:05:14,960 --> 00:05:18,400
in my opinion, their best OWASP Top 10.

103
00:05:18,400 --> 00:05:21,680
I agree with just about everything that's in there.

104
00:05:21,680 --> 00:05:24,240
Not that, you know, my opinion is anything special,

105
00:05:24,240 --> 00:05:27,200
but I've had some concerns with prior Top 10s.

106
00:05:27,200 --> 00:05:31,200
Sort of various items being at different levels of abstraction,

107
00:05:31,200 --> 00:05:32,560
different vulnerability classes,

108
00:05:32,560 --> 00:05:34,960
whereas now they've been quite consistent

109
00:05:34,960 --> 00:05:37,360
in the way they represent vulnerability classes.

110
00:05:37,360 --> 00:05:39,840
They've also, for the very first time,

111
00:05:39,840 --> 00:05:43,360
called out threat modeling as a requirement,

112
00:05:43,360 --> 00:05:46,160
which is fantastic to see because as you're probably well aware,

113
00:05:46,160 --> 00:05:49,200
Microsoft threat modeling is something that we're really big on.

114
00:05:49,200 --> 00:05:52,000
And it's great to see threat modeling being called out

115
00:05:52,000 --> 00:05:55,360
as a requirement around designing secure systems.

116
00:05:55,360 --> 00:05:59,360
Yeah, so the two things that I picked up in the recent weeks

117
00:05:59,360 --> 00:06:02,160
was that I updated Mark's List.

118
00:06:02,160 --> 00:06:04,800
So the set of links that I keep,

119
00:06:04,800 --> 00:06:08,560
that I constantly refer to colleagues, customers, partners,

120
00:06:08,560 --> 00:06:14,000
you name it, and added a few items there around the Ninja training

121
00:06:14,000 --> 00:06:16,000
for all the various different products,

122
00:06:16,000 --> 00:06:19,600
Defender for IoT, Sentinel, Defender for Endpoints, you name it.

123
00:06:19,600 --> 00:06:22,400
There's a really nice set of in-depth training for each of those.

124
00:06:22,400 --> 00:06:23,600
So I added that there.

125
00:06:23,600 --> 00:06:26,400
We also announced on the last podcast,

126
00:06:26,400 --> 00:06:31,360
that the Microsoft Cybersecurity Reference Architecture videos are out,

127
00:06:31,360 --> 00:06:32,800
as is CAF Secure.

128
00:06:32,800 --> 00:06:36,160
So we have these sort of program and components

129
00:06:36,160 --> 00:06:38,560
and disciplines of security kind of reference model

130
00:06:38,560 --> 00:06:41,360
of what good looks like, what success looks like,

131
00:06:41,360 --> 00:06:43,760
and then an architectural reference as well

132
00:06:43,760 --> 00:06:45,760
for that more technical view.

133
00:06:45,760 --> 00:06:49,760
And then the other piece, of course, is the MDDR,

134
00:06:49,760 --> 00:06:51,760
the Microsoft Digital Defense Report,

135
00:06:51,760 --> 00:06:54,800
which we're going to talk about in a lot more depth

136
00:06:54,800 --> 00:06:56,800
with Mark McIntyre.

137
00:06:56,800 --> 00:06:57,800
Hello, everyone.

138
00:06:57,800 --> 00:07:01,800
This is Gladys, and I wanted to let you know about a blog

139
00:07:01,800 --> 00:07:06,800
that talked about a 2.4 terabit-per-second (Tbps) DDoS attack

140
00:07:06,800 --> 00:07:12,800
that Microsoft observed targeting an Azure customer in Europe.

141
00:07:12,800 --> 00:07:16,800
This attack was 140% higher

142
00:07:16,800 --> 00:07:20,800
than any previous network volumetric event

143
00:07:20,800 --> 00:07:22,800
experienced on Azure.
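
As a quick sanity check on that figure: "140% higher" means 2.4 times the previous peak, which (taking the 2.4 Tbps number above) would put the prior record at roughly 1 Tbps. The arithmetic in Python:

```python
# "140% higher" means new_peak = previous_peak * (1 + 1.40).
new_peak_tbps = 2.4
increase = 1.40  # 140% expressed as a fraction
previous_peak_tbps = new_peak_tbps / (1 + increase)
print(round(previous_peak_tbps, 2))  # prints 1.0
```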

144
00:07:22,800 --> 00:07:27,800
The traffic originated from about 70,000 sources

145
00:07:27,800 --> 00:07:29,800
from many different countries.

146
00:07:29,800 --> 00:07:35,800
The blog goes on into explaining how UDP was used

147
00:07:35,800 --> 00:07:37,800
with very short-lived bursts

148
00:07:37,800 --> 00:07:44,800
and how Azure DDoS protection could scale to absorb the volume

149
00:07:44,800 --> 00:07:48,800
and allow the customer to continue business as usual.

150
00:07:48,800 --> 00:07:52,800
The one mitigation that I was pleasantly surprised to learn about

151
00:07:52,800 --> 00:07:56,800
was that Azure dynamically allocated mitigation resources

152
00:07:56,800 --> 00:08:02,800
to the optimal locations closest to the attack sources.

153
00:08:02,800 --> 00:08:09,800
So the traffic that originated in Asia Pacific and U.S.

154
00:08:09,800 --> 00:08:13,800
basically never reached the customer region,

155
00:08:13,800 --> 00:08:16,800
but was instead mitigated at the source countries.

156
00:08:16,800 --> 00:08:22,800
Another thing discussed in this blog is how to enable the DDoS protection.

157
00:08:22,800 --> 00:08:27,800
Every property in Azure is protected by Azure infrastructure

158
00:08:27,800 --> 00:08:31,800
DDoS basic protection at no additional cost.

159
00:08:31,800 --> 00:08:36,800
DDoS Protection Basic helps protect all Azure services

160
00:08:36,800 --> 00:08:40,800
including PaaS services like Azure DNS.

161
00:08:40,800 --> 00:08:42,800
But DDoS Protection Standard offers more.

162
00:08:42,800 --> 00:08:48,800
It has availability guarantees, cost protection, mitigation reports,

163
00:08:48,800 --> 00:08:50,800
and many others.

164
00:08:50,800 --> 00:08:55,800
So I recommend searching for the blog titled

165
00:08:55,800 --> 00:09:01,800
"Business as usual for Azure customers despite 2.4 Tbps

166
00:09:01,800 --> 00:09:05,800
DDoS attack" to read more about it.

167
00:09:05,800 --> 00:09:10,800
Also, if you have listened to our previous podcast,

168
00:09:10,800 --> 00:09:15,800
you have heard me talking about how Microsoft focused on enabling

169
00:09:15,800 --> 00:09:21,800
the interconnection and cross-service collaboration of first and third party services.

170
00:09:21,800 --> 00:09:25,800
Basically, enabling this data integration provides customers

171
00:09:25,800 --> 00:09:31,800
with more comprehensive analysis due to the many sources of data correlated

172
00:09:31,800 --> 00:09:38,800
and it helps speed up resolution since automation can be used to deal

173
00:09:38,800 --> 00:09:42,800
with the issues across those services.

174
00:09:42,800 --> 00:09:50,800
Well, Microsoft has been named a leader in the 2021 Gartner Magic Quadrant

175
00:09:50,800 --> 00:09:52,800
for data integration tools.

176
00:09:52,800 --> 00:09:56,800
It basically shows our continued commitment to

177
00:09:56,800 --> 00:10:02,800
delivering comprehensive and cost-effective data integration solutions.

178
00:10:02,800 --> 00:10:11,800
The blog also talks about the future of analytics since we have the capability

179
00:10:11,800 --> 00:10:19,800
of correlating so much data and using AI and ML to accelerate insight.

180
00:10:19,800 --> 00:10:25,800
In addition, it goes into explaining how Azure Synapse Analytics

181
00:10:25,800 --> 00:10:30,800
makes it possible to ingest, explore, prepare, transform, manage,

182
00:10:30,800 --> 00:10:34,800
and serve data for business intelligence and machine learning

183
00:10:34,800 --> 00:10:37,800
in a centralized and secure environment.

184
00:10:37,800 --> 00:10:41,800
This is one service that I'm spending a lot of time learning about

185
00:10:41,800 --> 00:10:49,800
because I see the automation and the insight that this service enables.

186
00:10:49,800 --> 00:10:54,800
So I recommend adding this service to your future learning roadmap.

187
00:10:54,800 --> 00:10:56,800
Alright, thanks for the news everyone.

188
00:10:56,800 --> 00:10:59,800
As Mark alluded to, our conversation this week

189
00:10:59,800 --> 00:11:02,800
is going to be about the Microsoft Digital Defense Report.

190
00:11:02,800 --> 00:11:05,800
To talk to us this week about that is Mark McIntyre.

191
00:11:05,800 --> 00:11:08,800
Hey, Mark, welcome to the podcast. Thank you so much for joining us.

192
00:11:08,800 --> 00:11:11,800
Why don't you spend a moment and just explain your background to our listeners?

193
00:11:11,800 --> 00:11:17,800
Sure thing. This is almost like getting the band back together with Simos and Michael here.

194
00:11:17,800 --> 00:11:21,800
Actually, you know what? It kind of is. How long have we worked together?

195
00:11:21,800 --> 00:11:24,800
Well, I'm not going to sing, so you won't get that out of me.

196
00:11:24,800 --> 00:11:25,800
Long enough.

197
00:11:25,800 --> 00:11:29,800
Yeah, I've been here about 14 years. Who would have thunk?

198
00:11:29,800 --> 00:11:31,800
I joined the company in 2007.

199
00:11:31,800 --> 00:11:36,800
Prior to that, I had been in the US government and my intention was to do one year in the private sector,

200
00:11:36,800 --> 00:11:40,800
scratch that itch, learn some things, and go back to government service.

201
00:11:40,800 --> 00:11:45,800
And for a variety of reasons, free Starbucks, whatever the reasons,

202
00:11:45,800 --> 00:11:48,800
here I am 14 years later. That's been great.

203
00:11:48,800 --> 00:11:50,800
Last five years, I've been one of our executive security advisors

204
00:11:50,800 --> 00:11:55,800
in the old cybersecurity solutions group, which has now been reorganized

205
00:11:55,800 --> 00:11:58,800
into the modern work and security organization.

206
00:11:58,800 --> 00:12:02,800
And so I work primarily with the United States government CISO community

207
00:12:02,800 --> 00:12:07,800
and with key US state and local government CISO customers as well

208
00:12:07,800 --> 00:12:10,800
on digital transformation and risk management.

209
00:12:10,800 --> 00:12:15,800
And I should give props to Simos for his reference.

210
00:12:15,800 --> 00:12:18,800
No pun intended. His reference to the reference architecture,

211
00:12:18,800 --> 00:12:22,800
because I show that frequently. It's really, really popular.

212
00:12:22,800 --> 00:12:26,800
It's just a great set of material for us to show our partners and customers.

213
00:12:26,800 --> 00:12:30,800
Don't keep saying that too much. We'll never get back on the podcast again.

214
00:12:30,800 --> 00:12:31,800
We'll start charging.

215
00:12:31,800 --> 00:12:32,800
Is that charging for you?

216
00:12:32,800 --> 00:12:34,800
I mean, the first question is pretty obvious, right?

217
00:12:34,800 --> 00:12:38,800
So what is the Microsoft Digital Defense Report and why should anybody care?

218
00:12:38,800 --> 00:12:40,800
Well, I'm a huge fan of this.

219
00:12:40,800 --> 00:12:45,800
You know, when we were planning this podcast, I had a few other topics in mind,

220
00:12:45,800 --> 00:12:48,800
but then when the day landed, I said, this is too good to pass up.

221
00:12:48,800 --> 00:12:49,800
Timing is too good.

222
00:12:49,800 --> 00:12:58,800
So just last week, Microsoft released our latest annual MDDR, the Microsoft Digital Defense Report.

223
00:12:58,800 --> 00:13:03,800
Some of you might remember this as the old Security Intelligence Report

224
00:13:03,800 --> 00:13:08,800
that was published back in the day by the old Microsoft Malware Protection Center.

225
00:13:08,800 --> 00:13:10,800
That was a twice-a-year cadence back then.

226
00:13:10,800 --> 00:13:14,800
But this is a look back on the previous year, you know,

227
00:13:14,800 --> 00:13:19,800
what is Microsoft seeing going on around the global threat landscape?

228
00:13:19,800 --> 00:13:24,800
What types of incidents and emergencies

229
00:13:24,800 --> 00:13:27,800
have our security teams been dealing with the most?

230
00:13:27,800 --> 00:13:32,800
It's a great way, you know, looking back to put a lot of our, you know,

231
00:13:32,800 --> 00:13:38,800
Microsoft's, let's say, global security data estate into an executive level summary like this.

232
00:13:38,800 --> 00:13:42,800
And of course, other companies have, you know, really good reports as well.

233
00:13:42,800 --> 00:13:46,800
But it's just a really good, you know, look back, a good level set.

234
00:13:46,800 --> 00:13:51,800
And of course, no report like this would be complete or would truly be a service

235
00:13:51,800 --> 00:13:56,800
if it didn't actually, you know, also leave readers with things to do, you know,

236
00:13:56,800 --> 00:13:58,800
what's in it for me? What does this mean to me?

237
00:13:58,800 --> 00:14:03,800
You know, what do I have to do so that, you know, the next year before Microsoft comes out with the next one,

238
00:14:03,800 --> 00:14:05,800
I don't end up in the news.

239
00:14:05,800 --> 00:14:09,800
Yeah, I mean, that's one thing I just want to reinforce that I really love about this report

240
00:14:09,800 --> 00:14:12,800
is how much it focuses on the actionable insights.

241
00:14:12,800 --> 00:14:17,800
And it's not just here's a bunch of data, here's a bunch of analysis, it's actually okay.

242
00:14:17,800 --> 00:14:19,800
And what should you do about it?

243
00:14:19,800 --> 00:14:22,800
That's one of the things that I love about it.

244
00:14:22,800 --> 00:14:27,800
Yeah, so actually, I had a look at the reports a few days ago, and you know, it's a big report.

245
00:14:27,800 --> 00:14:29,800
I'll be honest with you, I kind of glanced through it.

246
00:14:29,800 --> 00:14:32,800
I didn't really read the whole thing, at least not yet anyway.

247
00:14:32,800 --> 00:14:33,800
I probably will.

248
00:14:33,800 --> 00:14:36,800
So, you know, could you just sort of share with our listeners what the key findings are,

249
00:14:36,800 --> 00:14:40,800
you know, sort of the quick sort of list of interesting items.

250
00:14:40,800 --> 00:14:41,800
Sure.

251
00:14:41,800 --> 00:14:43,800
So, a couple of big takeaways for me.

252
00:14:43,800 --> 00:14:49,800
You know, first of all, the growing, let's say, sophistication and maturity of the,

253
00:14:49,800 --> 00:14:51,800
let's say, attacker landscape.

254
00:14:51,800 --> 00:15:00,800
You know, this is truly a, you know, sophisticated service driven, bifurcated environment where

255
00:15:00,800 --> 00:15:06,800
attackers, you know, with a credit card or with motive, without perhaps even much technical

256
00:15:06,800 --> 00:15:13,800
acumen, can reach into the underground economy and essentially procure, you know, attack as a service, right?

257
00:15:13,800 --> 00:15:19,800
And so, you know, it's, for me, it's a really important way to drive home the economics,

258
00:15:19,800 --> 00:15:25,800
you know, that if you're a state local organization or educational institution or small medium business,

259
00:15:25,800 --> 00:15:31,800
you know, whatever business you're in, if you don't think that your data is of interest to an attacker,

260
00:15:31,800 --> 00:15:36,800
the economics for the attacker are just too attractive.

261
00:15:36,800 --> 00:15:42,800
And so, you know, from a pragmatic risk management perspective and from an assumed compromise,

262
00:15:42,800 --> 00:15:45,800
you know, perspective, just understand that that's what we're up against.

263
00:15:45,800 --> 00:15:47,800
That was one, you know, one key finding.

264
00:15:47,800 --> 00:15:53,800
The other one, of course, you know, was that we're seeing cyber attacks going into pretty

265
00:15:53,800 --> 00:15:57,800
much, you know, all economic or let's say all sectors, right?

266
00:15:57,800 --> 00:16:03,800
And so, you know, obviously government and critical infrastructure and healthcare,

267
00:16:03,800 --> 00:16:05,800
financial services or retail.

268
00:16:05,800 --> 00:16:10,800
So, for a variety of reasons, the attackers are equal opportunity.

269
00:16:10,800 --> 00:16:14,800
So, again, you know, no one should assume that you're in a little corner of the earth and that your,

270
00:16:14,800 --> 00:16:19,800
you know, data are not attractive for whatever reason, you know, to an attacker.

271
00:16:19,800 --> 00:16:22,800
That reason typically is profit, but not always.

272
00:16:22,800 --> 00:16:24,800
That was a key finding.

273
00:16:24,800 --> 00:16:28,800
Another one that I thought was really interesting in this, there's no rocket science here,

274
00:16:28,800 --> 00:16:30,800
but it should come as no surprise.

275
00:16:30,800 --> 00:16:38,800
You know, we have some content in the, you know, in the blog posts and then the accompanying PDF

276
00:16:38,800 --> 00:16:44,800
that once again, you know, shows a list of compromised, you know, harvested passwords, right?

277
00:16:44,800 --> 00:16:46,800
And it's just, it's the usual.

278
00:16:46,800 --> 00:16:50,800
And in this case, we looked at some operational technology devices.

279
00:16:50,800 --> 00:16:54,800
But, you know, we saw the same things: admin, right?

280
00:16:54,800 --> 00:17:02,800
User, default, administrator, you know, admin1, user1, so a little bit of creativity there.

281
00:17:02,800 --> 00:17:06,800
And so, you know, again, it's, and this is just Microsoft report, you know,

282
00:17:06,800 --> 00:17:11,800
I have to imagine that others in the industry are going to, as they come out with their annual reports,

283
00:17:11,800 --> 00:17:16,800
which I assume we'll get to as the end of the year approaches, so expect to see more and more of this, you know,

284
00:17:16,800 --> 00:17:21,800
and this isn't naming and shaming, this is just, you know, reality.

285
00:17:21,800 --> 00:17:27,800
And this is, we're all humans, we like using simple things and passwords can be quite simple,

286
00:17:27,800 --> 00:17:29,800
but unfortunately, it's just kind of a losing game.
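
The kinds of defaults called out here are easy to screen for. A hypothetical sketch in Python — the password list is illustrative, not the report's exact data:

```python
# Illustrative check: flag credentials that match well-known factory
# defaults, like the ones harvested from OT devices in the report.
COMMON_DEFAULTS = {
    "admin", "password", "user", "default",
    "administrator", "admin1", "user1",
}

def is_default_credential(username: str, password: str) -> bool:
    """True if either value is a well-known default (case-insensitive)."""
    return (username.lower() in COMMON_DEFAULTS
            or password.lower() in COMMON_DEFAULTS)

print(is_default_credential("Admin", "admin1"))         # True
print(is_default_credential("svc-hvac", "kP#9v!xQ2w"))  # False
```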

287
00:17:29,800 --> 00:17:34,800
That hit me as well, you know, not because it wasn't expected, but because it's just, it's still there, right?

288
00:17:34,800 --> 00:17:39,800
And we, for whatever reason, collectively, we're not, you know, doing as good a job as we should

289
00:17:39,800 --> 00:17:45,800
in incentivizing, you know, others or doing our own work to move away from passwords.

290
00:17:45,800 --> 00:17:49,800
I just wanted to key off of what you're saying about the different industries.

291
00:17:49,800 --> 00:17:53,800
I really liked, one of the things I really like to see in the report, not that I like to see the attacks,

292
00:17:53,800 --> 00:17:59,800
but there was a sort of an industry by industry analysis in the IOT section that, you know, kind of went through,

293
00:17:59,800 --> 00:18:06,800
you know, because a lot of good research went in there from Section 52, from the CyberX acquisition from about a year ago,

294
00:18:06,800 --> 00:18:12,800
and some of the other IOT and OT research we've done, you know, it was just really nice to see sort of the way that

295
00:18:12,800 --> 00:18:16,800
the attacks were playing out against different industries, so you can start to think about, you know,

296
00:18:16,800 --> 00:18:19,800
what you should worry about, you know, depending on the industry you're in.

297
00:18:19,800 --> 00:18:26,800
Sure thing. I mean, we're talking about, you know, some industries that are, you know, literally using operational technology,

298
00:18:26,800 --> 00:18:32,800
you know, from, you know, little apps or solutions from companies that may no longer even be in business.

299
00:18:32,800 --> 00:18:39,800
And, of course, the stakes are much higher in some of those, you know, healthcare and oil and gas and such.

300
00:18:39,800 --> 00:18:46,800
Yeah, my new rule of thumb when I think about the OT space is that the equipment might be 50 to 100 years old,

301
00:18:46,800 --> 00:18:49,800
and there's literally some stuff still running on steam.

302
00:18:49,800 --> 00:18:55,800
And then the electronics were modernized anywhere between, say, 30 and 50 years ago, you know, with quote, unquote,

303
00:18:55,800 --> 00:19:00,800
modern electronics at the time, which are, of course, nowhere near usable today.

304
00:19:00,800 --> 00:19:07,800
They barely support IP, let alone, you know, modern authentication protocols are absolutely out of the realm of possibility.

305
00:19:07,800 --> 00:19:16,800
I know it wasn't covered directly in the report, you know, but just this morning I was doing a briefing, virtual of course,

306
00:19:16,800 --> 00:19:21,800
with a critical infrastructure company back east.

307
00:19:21,800 --> 00:19:28,800
We were talking about some of the findings here and, you know, I mentioned, you know, Azure Sphere as an example of where we can help,

308
00:19:28,800 --> 00:19:38,800
in a sense, democratize, lower the cost of entry for, you know, creating what could be, you know, promising net new ecosystems, you know, of devices.

309
00:19:38,800 --> 00:19:44,800
No question it's going to take, you know, innovation, it's going to take, in some cases, you know, just fundamental rebuild.

310
00:19:44,800 --> 00:19:52,800
But, you know, the more we can put data like this out there, the more we can just make people aware that there are some really basic things they need to do.

311
00:19:52,800 --> 00:19:55,800
You know, this will help us, help us secure the ecosystem.

312
00:19:55,800 --> 00:20:01,800
I'm a huge fan of Azure Sphere. I mean, you'd be amazed how many threat models I built with customers where they've had these IoT devices.

313
00:20:01,800 --> 00:20:05,800
And it's like, okay, so how do you authenticate the IoT device? Well, we use TLS.

314
00:20:05,800 --> 00:20:07,800
I'm like, okay, so where are the keys stored?

315
00:20:07,800 --> 00:20:11,800
Well, stored like in some configuration file on this micro OS.

316
00:20:11,800 --> 00:20:14,800
I'm like, well, what are the protections on those, you know, on those files?

317
00:20:14,800 --> 00:20:15,800
And it's like nothing.

318
00:20:15,800 --> 00:20:22,800
But then when you look at something like, you know, Azure Sphere, that idea of protecting sensitive data is just part of the product.

319
00:20:22,800 --> 00:20:25,800
I've done quite a bit of development actually with the Azure Sphere.

320
00:20:25,800 --> 00:20:28,800
There's an Azure Sphere SDK for those of you not aware.

321
00:20:28,800 --> 00:20:35,800
And if you are, you know, either intellectually curious or you actually have a need for it, play around with it, kick the tires,

322
00:20:35,800 --> 00:20:42,800
just load up Visual Studio Code, load up the Azure Sphere SDK and get a device that's cheap and experiment.

323
00:20:42,800 --> 00:20:44,800
It's actually a breath of fresh air.

324
00:20:44,800 --> 00:20:48,800
There's so many great security services that come with Azure Sphere.

325
00:20:48,800 --> 00:20:49,800
It's good to see.

326
00:20:49,800 --> 00:20:54,800
Yeah, I mean, they actually did include in this report the seven properties of highly secured devices,

327
00:20:54,800 --> 00:20:56,800
which is basically what we figured out.

328
00:20:56,800 --> 00:21:02,800
Okay, let's take an analysis of what we learned on, you know, Xbox and Surface and all the different hardware that we've done.

329
00:21:02,800 --> 00:21:06,800
And, you know, how do you actually keep, you know, a device that's going to be out there in the wild and the world secure?

330
00:21:06,800 --> 00:21:15,800
And then, you know, instead of just doing principles, they actually took it to properties, you know, something you could measure and you can ask a question and answer a question of, you know, does it do all these things?

331
00:21:15,800 --> 00:21:18,800
And if so, great, you know, that's a secure device.

332
00:21:18,800 --> 00:21:24,800
Things like error reporting, small trusted computing base, defense in depth, hardware root of trust, et cetera.

333
00:21:24,800 --> 00:21:27,800
And so I'm a huge fan of Azure Sphere.

334
00:21:27,800 --> 00:21:37,800
The one thing that I did learn is that, because of the whole SDK element of it, you do actually have to sort of write an app for the particular device,

335
00:21:37,800 --> 00:21:40,800
which if you're producing a device makes all the sense in the world because you have to anyway.

336
00:21:40,800 --> 00:21:47,800
But the thing that was interesting was like the first case study on that was actually Starbucks.

337
00:21:47,800 --> 00:21:50,800
It's a public case study that's on the Microsoft site.

338
00:21:50,800 --> 00:21:54,800
We can add a link to the show notes because they had standardized on a single device.

339
00:21:54,800 --> 00:21:57,800
They were actually able to put this secure Azure Sphere device.

340
00:21:57,800 --> 00:22:12,800
I think they called it a Guardian module, in front of all of those different pieces of equipment, and write one app that worked, you know, however many thousand times across Starbucks' machines and provided that secure sort of instrumentation.

341
00:22:12,800 --> 00:22:28,800
But it also gave them important business benefits, which is I think how they probably justified the project, in that, hey, they have these, you know, sort of younger employees operating these machines that may not have been fully trained

342
00:22:28,800 --> 00:22:35,800
or may not fully know, like, okay, when this happens, it's probably that the little dispenser for grinds in the back is full.

343
00:22:35,800 --> 00:22:40,800
And so they, you know, ended up sending out trucks for false alarms all the time.

344
00:22:40,800 --> 00:22:41,800
That was costly.

345
00:22:41,800 --> 00:22:47,800
And so the business ended up saving money by having this, you know, kind of modern day instrumentation on these machines.

346
00:22:47,800 --> 00:22:51,800
But they had to have it secure because that's how Starbucks rolls.

347
00:22:51,800 --> 00:22:54,800
And so it's really a sort of an interesting case study there.

348
00:22:54,800 --> 00:23:00,800
I applaud any company that focuses on, you know, making it easier and more secure for me to get coffee.

349
00:23:00,800 --> 00:23:08,800
So it's, you know, it's interesting, because it's funny how conversations are framed by before and after the pandemic.

350
00:23:08,800 --> 00:23:15,800
But right before the pandemic, I remember meeting in a briefing center with the CISO team of a large European power generator,

351
00:23:15,800 --> 00:23:22,800
I'm forgetting exactly which company, they make power equipment, and they were looking at Sphere because, in their mind,

352
00:23:22,800 --> 00:23:33,800
they were determined to essentially recommend that they sell off part of their power generation capability to a local,

353
00:23:33,800 --> 00:23:40,800
to a, like, provincial authority, and just kind of let them use whatever power system they were using as long as they needed to.

354
00:23:40,800 --> 00:23:43,800
They didn't see the benefit in trying to go back to these large systems.

355
00:23:43,800 --> 00:23:45,800
They wanted to start over essentially.

356
00:23:45,800 --> 00:23:52,800
They wanted to create net new, you know, from the ground up, you know, chip-up clean energy, with,

357
00:23:52,800 --> 00:23:55,800
essentially wired, or I guess wireless, from the beginning.

358
00:23:55,800 --> 00:23:57,800
And it's kind of, you know, a cool opportunity.

359
00:23:57,800 --> 00:24:01,800
Again, the technology is there for you.

360
00:24:01,800 --> 00:24:05,800
It's a matter of what business decisions that you want to make coming out of how you use it.

361
00:24:05,800 --> 00:24:12,800
Just quickly, the what's-in-it-for-me, you know, on this report: the very last substantive slide,

362
00:24:12,800 --> 00:24:17,800
I guess infographic. I don't know who designed this slide.

363
00:24:17,800 --> 00:24:23,800
It's the first time I've seen it, at least from Microsoft. It's a bell curve, a cybersecurity bell curve.

364
00:24:23,800 --> 00:24:27,800
And I really found this visually impactful.

365
00:24:27,800 --> 00:24:34,800
And it just says that basic security hygiene still protects against 90% of attacks.

366
00:24:34,800 --> 00:24:44,800
So use anti-malware, apply least privilege, enable multi-factor authentication, keep your software up to date, and protect your data.

367
00:24:44,800 --> 00:24:52,800
And so it was good to see. There's so much innovation out there, so much interesting work going on, important work being done.

368
00:24:52,800 --> 00:24:58,800
But in the end, even after this report that summarizes all of our, you know, what we see in our data over the years,

369
00:24:58,800 --> 00:25:06,800
we close with a slide that is so elemental and so timeless in a way; you know, it's a responsible message and it's still very relevant.

370
00:25:06,800 --> 00:25:12,800
One of the interesting things, I'm actually kind of switching to the earlier part of the report, the ransomware piece,

371
00:25:12,800 --> 00:25:16,800
actually was, I did some of the review work on that before it went out.

372
00:25:16,800 --> 00:25:23,800
And there was some really interesting stuff because, you know, even though the vulnerabilities are remaining constant,

373
00:25:23,800 --> 00:25:28,800
the monetization has really transformed in the past couple of years.

374
00:25:28,800 --> 00:25:33,800
And there's some really good analysis there. You know, some of the things that kind of caught my eye around, you know,

375
00:25:33,800 --> 00:25:41,800
essentially, the amount of money people are paying for ransoms is giving these ransomware gangs budgets that are probably rivaling, you know, nation-states.

376
00:25:41,800 --> 00:25:47,800
And some of the reasons we really don't want people to pay, because it's kind of a tragedy-of-the-commons type of thing where, you know,

377
00:25:47,800 --> 00:25:53,800
maybe individually, it's in your best interest to do it, but, you know, that's going to boomerang back on you and everybody else.

378
00:25:53,800 --> 00:25:59,800
And there's also some data there on like how much stuff costs on the dark markets, including ransomware kits.

379
00:25:59,800 --> 00:26:08,800
Because like one of the things a lot of people don't realize is that the ransomware thing is not just a single dude doing this or lady,

380
00:26:08,800 --> 00:26:15,800
there might be, you know, lady ransomware attackers too. Ultimately, it's an ecosystem in many ways.

381
00:26:15,800 --> 00:26:21,800
And at its most basic, there's the kit providers, you know, that either sell, you know, access to the kit or, you know,

382
00:26:21,800 --> 00:26:26,800
take a cut and sort of help and assist, in sort of a ransomware-as-a-service model.

383
00:26:26,800 --> 00:26:31,800
And then there's the operators. These kits often have multiple tools. They have different attack techniques.

384
00:26:31,800 --> 00:26:33,800
They have different pieces of malware they'll install.

385
00:26:33,800 --> 00:26:38,800
And then the operators don't necessarily stick with just one kit provider.

386
00:26:38,800 --> 00:26:44,800
They might try two or three different kits, you know, as the kit providers offer new compelling features.

387
00:26:44,800 --> 00:26:51,800
And so sometimes they use multiple different kits, even though it's the same actor or same operator that's operating the malware.

388
00:26:51,800 --> 00:27:00,800
And so it's really an interesting sort of view into the complexity of that ecosystem and that you're really facing like an underground or a dark economy,

389
00:27:00,800 --> 00:27:06,800
more so than you are the skills and limitations of one particular person or small group.

390
00:27:06,800 --> 00:27:08,800
It's almost commodified.

391
00:27:08,800 --> 00:27:09,800
Yeah, completely.

392
00:27:09,800 --> 00:27:16,800
In fact, I would suspect that, if anything, the prices that attackers pay will come down.

393
00:27:16,800 --> 00:27:23,800
So, for example, you can, you know, procure a rented list of stolen

394
00:27:23,800 --> 00:27:28,800
username and password pairs for under $1,000, right?

395
00:27:28,800 --> 00:27:32,800
There's even one estimate here, I'm looking at it, of $150 for 400 million.

396
00:27:32,800 --> 00:27:45,800
And so I suspect that as more and more of this information is harvested, the price goes down, because they have a supply-and-demand dynamic as well in the ecosystem.

397
00:27:45,800 --> 00:27:51,800
So I'm going to ask about something that's in the report, but a little bit different.

398
00:27:51,800 --> 00:27:58,800
We've been talking about disinformation and misinformation and spreading doubt.

399
00:27:58,800 --> 00:28:03,800
So it's slightly less technical, but I think it's a really important thing because it does also affect security.

400
00:28:03,800 --> 00:28:13,800
What's your take on that, Mark, and what we should be doing or what organizations should be doing because, you know, it can affect an organization, not just a person.

401
00:28:13,800 --> 00:28:22,800
First of all, just full disclosure, or in the interest of transparency: I actually just recently joined the advisory board of a startup that's working on this issue.

402
00:28:22,800 --> 00:28:28,800
So I want to be very clear that I'm speaking on behalf of Microsoft, what I think is best for us and our customers.

403
00:28:28,800 --> 00:28:34,800
But I joined that other company's board because of this issue. I wasn't even paying attention to this issue, Sarah.

404
00:28:34,800 --> 00:28:36,800
I never thought about it until the last couple of years.

405
00:28:36,800 --> 00:28:38,800
I was always generally aware it was going on.

406
00:28:38,800 --> 00:28:45,800
But I haven't been on Facebook in several years, you know, so LinkedIn is really my only social networking tool or app.

407
00:28:45,800 --> 00:28:49,800
And so the more I'm reading about this, it's terrifying.

408
00:28:49,800 --> 00:28:55,800
And it's indicative of some of the larger findings in this report.

409
00:28:55,800 --> 00:28:59,800
Number one, it's clearly nation-state driven.

410
00:28:59,800 --> 00:29:07,800
I'm not going to pick on any particular governments or government-sponsored actors, but it's clearly done for nation-state purposes, to sow discord and

411
00:29:07,800 --> 00:29:11,800
tear, you know, literally tear populations apart in countries.

412
00:29:11,800 --> 00:29:18,800
But it's also becoming a for-profit economy or for-profit segment, I guess, in the underground economy.

413
00:29:18,800 --> 00:29:27,800
And that sounds very troubling to me because once these attackers see a motive, financial motive, whatever, it's really hard to stop,

414
00:29:27,800 --> 00:29:35,800
because the motive is there, the tools are there, the AI is there now for attackers to use.

415
00:29:35,800 --> 00:29:41,800
And so, like any other criminal issue, it's going to come down to: we're not going to stamp it out.

416
00:29:41,800 --> 00:29:43,800
I think it'd be really hard to stamp it out.

417
00:29:43,800 --> 00:29:52,800
In this case, you know, political, let's say political election security, things like that, it's really going to require helping election

418
00:29:52,800 --> 00:30:01,800
officials and, you know, related personnel understand what they're up against. And this can be a really tough one because of differences

419
00:30:01,800 --> 00:30:06,800
in budgets and differences in perception of how people view, you know, threats and such.

420
00:30:06,800 --> 00:30:17,800
And so I know that, for example, our team does quite a bit of work within Microsoft, with our legal team, you know, training people that run voting infrastructure.

421
00:30:17,800 --> 00:30:19,800
And that's really important work.

422
00:30:19,800 --> 00:30:21,800
So it's going to take a lot of that type of work.

423
00:30:21,800 --> 00:30:27,800
But again, it's sort of an asymmetric, you know, arms race here because the attackers can just keep doing this.

424
00:30:27,800 --> 00:30:35,800
And especially as they reach into the citizenry, people who aren't really incentivized to care about whether something's accurate; if it makes them feel good, they're going to click on it.

425
00:30:35,800 --> 00:30:36,800
It's a very tough issue.

426
00:30:36,800 --> 00:30:45,800
Yeah, I mean, that's one of the ones that worries me a lot, because we've had essentially trustworthy news sources in a lot of countries around the world.

427
00:30:45,800 --> 00:30:51,800
You know, they might be biased here or there, but they weren't, you know, deliberately trying to misinform in a lot of countries.

428
00:30:51,800 --> 00:31:01,800
And so when you have, you know, this sort of switch to social media sources, which are much more, I don't know, I kind of jokingly call it gossip, you know, because it fits that more.

429
00:31:01,800 --> 00:31:07,800
It's just whatever someone says, as opposed to, you know, vetted and validated kinds of statements, etc.

430
00:31:07,800 --> 00:31:14,800
And many of our societies today just don't have the muscle to doubt automatically like we used to.

431
00:31:14,800 --> 00:31:17,800
And so we tend to trust what we see.

432
00:31:17,800 --> 00:31:21,800
So it's definitely something that bothers and concerns me as well.

433
00:31:21,800 --> 00:31:25,800
One thing that really stood out for me was there was an infographic in there.

434
00:31:25,800 --> 00:31:29,800
It's an infographic about email attacks, or email-borne attacks.

435
00:31:29,800 --> 00:31:32,800
I think most people think about phishing and that's about it.

436
00:31:32,800 --> 00:31:40,800
But this infographic actually goes through all different kinds of email-borne malicious attacks, which I learned a lot just from that, you know, just from that alone.

437
00:31:40,800 --> 00:31:43,800
But are there any other kinds of attacks?

438
00:31:43,800 --> 00:31:48,800
I know Mark is always going on about ransomware and discussing ransomware, especially human-operated ransomware.

439
00:31:48,800 --> 00:31:52,800
But I mean, are we seeing other kinds? Yeah, I don't know.

440
00:31:52,800 --> 00:31:54,800
I don't mean to sound that way.

441
00:31:54,800 --> 00:31:59,800
But the point is, we're seeing, you know, we're seeing big increases in certain classes of attack.

442
00:31:59,800 --> 00:32:01,800
And what's sort of driving that?

443
00:32:01,800 --> 00:32:12,800
Most threat actors don't need to be very good, you know, they can just be pretty good, because they'll always find, you know, weaker links somewhere in a supply chain.

444
00:32:12,800 --> 00:32:23,800
The ones that can be very good, the ones that have the, you know, let's say national security imperative, or national interest imperative, they have the patience to do so.

445
00:32:23,800 --> 00:32:28,800
And they can take months or even years, I suppose.

446
00:32:28,800 --> 00:32:32,800
You know, those are the ones that, at a macro level, I guess, really concern me.

447
00:32:32,800 --> 00:32:35,800
Like, if you look at Nobelium, that was really interesting.

448
00:32:35,800 --> 00:32:41,800
It was a shot across the bow in terms of what it revealed about the patience and the stealthiness of an attacker.

449
00:32:41,800 --> 00:32:44,800
I'll be interested in Mark's take on this, given the ransomware angle.

450
00:32:44,800 --> 00:32:47,800
But I think, you know, you have Nobelium on the one side.

451
00:32:47,800 --> 00:32:49,800
Again, just a clarification.

452
00:32:49,800 --> 00:33:00,800
That's Microsoft's term for, essentially, you know, what was also known as Solorigate, the SolarWinds attackers, going back, I guess, about 10, 10 and a half months or so now.

453
00:33:00,800 --> 00:33:03,800
By the way, Microsoft uses the periodic table of elements because those names can't be trademarked.

454
00:33:03,800 --> 00:33:06,800
So it's just a way for us to refer to attackers.

455
00:33:06,800 --> 00:33:20,800
Those concern me because they have outsized impact, they make policymakers, they make our companies certainly think in a different way about all the various things that we have to do sort of in conjunction or in parallel to take on the attackers.

456
00:33:20,800 --> 00:33:34,800
And these, you know, these are real societal issues. But for most of us, for most people, most organizations, companies, what have you, it's the more commodity-type ransomware, you know, pretty good threat actors.

457
00:33:34,800 --> 00:33:36,800
They just want money.

458
00:33:36,800 --> 00:33:40,800
Some of them claim to have a code of ethics.

459
00:33:40,800 --> 00:33:41,800
We shall see.

460
00:33:41,800 --> 00:33:44,800
Because, I guess, honor among thieves.

461
00:33:44,800 --> 00:33:52,800
It wasn't really covered in the report, but, you know, my concern, like, I'd really hate for us to be having this discussion next year

462
00:33:52,800 --> 00:33:57,800
And we're having to talk about an increase in killerware, right?

463
00:33:57,800 --> 00:34:05,800
With attackers that just say, we don't care, we're taking this pipeline down, we're taking this hospital offline, consequences be darned.

464
00:34:05,800 --> 00:34:16,800
You know, that's what really scares me, that we'll just start getting people who are so nihilistic or so dedicated to a certain cause that we can't sway them.

465
00:34:16,800 --> 00:34:28,800
Yeah, that's one that also worries me, you know, as we get into sort of more and more critical infrastructure targets, or as we see them getting into those, which, you know, has increased recently for sure.

466
00:34:28,800 --> 00:34:32,800
That's definitely something that I watch and worry about as well.

467
00:34:32,800 --> 00:34:40,800
And one of the things, just to kind of wrap up on the ransomware topic: there is a link there to our ransomware guidance.

468
00:34:40,800 --> 00:34:50,800
It's just aka.ms/ransomware. And it does actually highlight the order in which we recommend organizations focus on mitigations.

469
00:34:50,800 --> 00:34:58,800
And it is in many ways the opposite of what most people expect, because everyone was like, oh yeah, let's just block first and then I can forget about the rest.

470
00:34:58,800 --> 00:35:00,800
That's all I need: the front of the roadmap.

471
00:35:00,800 --> 00:35:05,800
We actually did it in a different order on purpose because of how hard it is to prevent them from getting in.

472
00:35:05,800 --> 00:35:10,800
And the last thing we want to do is set up a front line that, you know, is only the front line.

473
00:35:10,800 --> 00:35:17,800
And then if they get one way through, then, well, sorry, we don't have a plan, and the whole thing goes to hell in a handbasket.

474
00:35:17,800 --> 00:35:34,800
So the way that we actually ordered it was deliberately to focus on making sure your backups, and your ability to restore them, are your top priority, because you don't want to have to pay the attackers to recover in the worst-case scenario.

475
00:35:34,800 --> 00:35:42,800
That's not to say we don't want you to do the other things: try to prevent, try to detect, keep them out of the admins group and from getting control of them, etc.

476
00:35:42,800 --> 00:35:54,800
But we want to make sure that, like, one of the first things you check is that you don't have to pay them and use their usually terrible and ugly tool to actually recover.

477
00:35:54,800 --> 00:35:59,800
So that's one of the top priorities that we do recommend there in that ransomware space.

478
00:35:59,800 --> 00:36:04,800
Hey, do we ever have people, like, paying the ransom and then the attacker saying, haha, just kidding?

479
00:36:04,800 --> 00:36:09,800
It happens from what I understand. But I don't think it happens a lot.

480
00:36:09,800 --> 00:36:16,800
Now the ransomware gangs, when it's expedient for them, do go offline. Sometimes they share the keys. Sometimes they don't.

481
00:36:16,800 --> 00:36:23,800
There is no guarantee, right? Because I mean, these are, you know, they pretend to have morals and ethics and try to impose them on you to force you to pay.

482
00:36:23,800 --> 00:36:37,800
But really, you're trusting the word of a criminal that's anonymous. And we actually have some chats included in the report that talk about the dialogue, where you can kind of see how rough and hard-nosed they get in the negotiations.

483
00:36:37,800 --> 00:36:42,800
It's not pleasant, because they have access to your financial records and whatnot.

484
00:36:42,800 --> 00:36:47,800
And so they know what's in your accounts, because they've taken over the systems that you use to manage those accounts.

485
00:36:47,800 --> 00:36:53,800
You know, you have to imagine, if you think about the incentives, I guess, this is sort of pure capitalism.

486
00:36:53,800 --> 00:36:58,800
They're just trying to make money, you know, lower risk and costs and maximize gain.

487
00:36:58,800 --> 00:37:10,800
And so, obviously not to be sympathetic there, but their business model wouldn't, you know, wouldn't be so attractive or appealing if they took the payment and didn't deliver.

488
00:37:10,800 --> 00:37:12,800
The next victim is going to know better.

489
00:37:12,800 --> 00:37:18,800
Yeah, I had a quick couple of highlights. A couple of things caught my eye.

490
00:37:18,800 --> 00:37:31,800
Like, there was a really interesting one on sort of the browser search results manipulation, which I had never personally experienced, but there was a nice little screenshot that showed, like, a before and after of how they kind of monetize, you know, attacks that way.

491
00:37:31,800 --> 00:37:43,800
I really, really liked the adversarial machine learning section, which is, like, how are the attackers going to, you know, go after your defensive machine learning, and the different types of attacks and what they look like and how they work.

492
00:37:43,800 --> 00:37:45,800
I thought was kind of interesting there.

493
00:37:45,800 --> 00:37:51,800
And then just the nation state section, like there's definitely insights on a per nation state basis.

494
00:37:51,800 --> 00:38:01,800
But one of the interesting points that I picked up was that, you know, I mean, sometimes they do, they tend to be the ones that do new techniques, right, like supply chain attacks and whatnot.

495
00:38:01,800 --> 00:38:10,800
But generally they tend to also use a lot of the same attack techniques that the commodity attackers of different flavors, ransomware and whatnot, also do.

496
00:38:10,800 --> 00:38:19,800
But the point that was made in the report, which I really liked, is that they're actually resourced and sophisticated enough to figure out which one would work best in the circumstance.

497
00:38:19,800 --> 00:38:26,800
So they're not, you know, trying to do a phishing email when it would be a lot easier to take another technique or vice versa.

498
00:38:26,800 --> 00:38:31,800
So I thought that was a really interesting observation on what kind of sets nation states apart.

499
00:38:31,800 --> 00:38:33,800
But Mark, I'd love to hear your perspective.

500
00:38:33,800 --> 00:38:42,800
For me, it's a little more personal because, you know, in my previous life before coming to Microsoft, I spent some time as a North Korea analyst.

501
00:38:42,800 --> 00:38:49,800
So my eyes always gravitate to anything about North Korea.

502
00:38:49,800 --> 00:38:51,800
So in this case, you know, it's interesting.

503
00:38:51,800 --> 00:39:01,800
This report has a few tidbits, a couple of slides or infographics, on North Korea as one of the sort of five or six actors that we focused on.

504
00:39:01,800 --> 00:39:07,800
To me, it's the most interesting because it embodies so much from what's at stake here.

505
00:39:07,800 --> 00:39:10,800
They have their neighborhood, dangerous neighborhood.

506
00:39:10,800 --> 00:39:22,800
And so they're using, you know, offensive cyber or cybercrime to, let's say, project strength, you know, of course, in the absence of national strength.

507
00:39:22,800 --> 00:39:29,800
And of course, because they're largely cut off from the global economy, they're conducting cyber activities simply to make money.

508
00:39:29,800 --> 00:39:35,800
There are, you know, national security and national economic imperatives that come together.

509
00:39:35,800 --> 00:39:41,800
It's probably the cleanest, crispest example you could imagine.

510
00:39:41,800 --> 00:39:43,800
It's very visceral there in North Korea.

511
00:39:43,800 --> 00:39:48,800
Now, this report, for example, also highlighted Vietnam and Turkey.

512
00:39:48,800 --> 00:39:54,800
And I'm forgetting if it's this one or last year's, but they even had, I think, a reference to South Korea.

513
00:39:54,800 --> 00:39:59,800
This is where it gets more and more interesting because now you're seeing, I'm not going to say second tier, that's not fair.

514
00:39:59,800 --> 00:40:07,800
But you're seeing governments or government affiliated groups or actors clearly operating with some level of government protection.

515
00:40:07,800 --> 00:40:09,800
It's fanning out, right?

516
00:40:09,800 --> 00:40:13,800
Activity isn't just the usual suspects, China and Russia, Iran, North Korea.

517
00:40:13,800 --> 00:40:17,800
Now you're seeing, you know, sort of secondary actors.

518
00:40:17,800 --> 00:40:24,800
That's troubling because there's no reason to believe that next year when we're covering the report, we're not going to see, you know, two or three more examples.

519
00:40:24,800 --> 00:40:30,800
New, you know, newer criminal groups from other parts of the world getting into this for a variety of reasons.

520
00:40:30,800 --> 00:40:41,800
So that makes it very dangerous because, you know, as Microsoft's President Brad Smith says, and Tom Burt, who runs our customer trust work, you know, there really aren't any guardrails right now.

521
00:40:41,800 --> 00:40:43,800
There aren't any real agreed on norms.

522
00:40:43,800 --> 00:40:50,800
And that's very tough, especially for those of us, you know, trying to play defense.

523
00:40:50,800 --> 00:40:55,800
It's very hard to act against actors that don't have guardrails.

524
00:40:55,800 --> 00:41:02,800
Yeah, essentially, there's no equivalent of a Geneva Convention that says this is out of bounds, always, kind of thing.

525
00:41:02,800 --> 00:41:12,800
Again, my concern about the term you're hearing now, killerware, where you might just get any number of groups, disaffected, you know, people, criminal groups, whoever, nation-states,

526
00:41:12,800 --> 00:41:14,800
they might just say, we're just going to take this thing offline.

527
00:41:14,800 --> 00:41:15,800
And that's it.

528
00:41:15,800 --> 00:41:29,800
So getting back to Mark's point earlier about, you know, ransomware: we have to do a much better job, all of us working together, to help make sure that our partners and customers understand how to do backups and, you know, how to protect and modernize their infrastructure.

529
00:41:29,800 --> 00:41:33,800
I have a question that may be a little contentious.

530
00:41:33,800 --> 00:41:35,800
I realize I'm putting you on the spot.

531
00:41:35,800 --> 00:41:36,800
I should probably put both of you on the spot.

532
00:41:36,800 --> 00:41:43,800
What role do we see here for digital currencies being used to pay ransoms and so on?

533
00:41:43,800 --> 00:41:58,800
I mean, are we seeing a spike in ransomware and related criminal activity because of digital currencies, which are to some degree anonymous, as opposed to, you know, wiring someone some money to a bank account?

534
00:41:58,800 --> 00:42:04,800
Yeah, I mean, I'd love to hear what Mark is thinking, but from my perspective, it's absolutely an enabler.

535
00:42:04,800 --> 00:42:09,800
Now, if those disappeared tomorrow, would this model continue?

536
00:42:09,800 --> 00:42:11,800
Very potentially so.

537
00:42:11,800 --> 00:42:17,800
You know, they're bold enough and brazen enough that they might deal with a wire transfer and find some other way to kind of like figure that out.

538
00:42:17,800 --> 00:42:23,800
But the ease of use of that is definitely an enabler for these models, for sure.

539
00:42:23,800 --> 00:42:24,800
Yeah.

540
00:42:24,800 --> 00:42:39,800
I'd like to believe that investigators, regulators, law enforcement, what have you, Interpol, perhaps other groups, you know, may also, I suppose, I guess, ironically, see opportunities to use technology in this case to chase technology.

541
00:42:39,800 --> 00:42:44,800
You know, there might be a way they're establishing, essentially, you know, what's called a paper trail.

542
00:42:44,800 --> 00:42:49,800
There's probably some level of visibility into tracking some of these transactions.

543
00:42:49,800 --> 00:42:54,800
So I'm not by any means an expert on digital currencies.

544
00:42:54,800 --> 00:42:56,800
I still have dollar bills in my wallet.

545
00:42:56,800 --> 00:43:05,800
So, you know, but I have heard some interesting briefings recently from, you know, international police agencies talking about this.

546
00:43:05,800 --> 00:43:09,800
And, you know, they're certainly, I think, learning as they go.

547
00:43:09,800 --> 00:43:12,800
But again, technology is agnostic, right?

548
00:43:12,800 --> 00:43:16,800
It's how people for good and bad enable it and use it.

549
00:43:16,800 --> 00:43:17,800
All right.

550
00:43:17,800 --> 00:43:18,800
Let's start to bring this thing to a close.

551
00:43:18,800 --> 00:43:24,800
So, one thing we ask all our guests, Mark: if you had one final thought to leave our listeners with, what would it be?

552
00:43:24,800 --> 00:43:27,800
This isn't so hard for me because I've been doing this.

553
00:43:27,800 --> 00:43:31,800
I've said this recently on a couple of other panels, online panels and such.

554
00:43:31,800 --> 00:43:46,800
But, you know, despite the threat environment, despite the innovation in the attacker ecosystem and all the advanced attacks and such, despite what's in the news, much of what you can do is there at your own fingertips; it's right there for you.

555
00:43:46,800 --> 00:44:00,800
It's this, as I said earlier, cybersecurity bell curve. You know, it can still help you take a fresh look at practices you're probably already utilizing and modernize them in a way, you know, we think is a good idea.

556
00:44:00,800 --> 00:44:06,800
We're confident that this can make a measurable improvement in your cybersecurity posture.

557
00:44:06,800 --> 00:44:07,800
So, if nothing else,

558
00:44:07,800 --> 00:44:12,800
please, please pay attention, you know, to the hygiene.

559
00:44:12,800 --> 00:44:17,800
Yeah, I know Mark has mentioned hygiene at length as well in the past.

560
00:44:17,800 --> 00:44:20,800
So it's good to see someone else sort of concurring there.

561
00:44:20,800 --> 00:44:22,800
Well, look, hey, Mark, thank you so much for joining us this week.

562
00:44:22,800 --> 00:44:25,800
We really appreciate you taking the time; I know you're really busy.

563
00:44:25,800 --> 00:44:31,800
And yeah, like every episode, I've learned something, and this is an area I don't normally spend a lot of time on.

564
00:44:31,800 --> 00:44:37,800
So I probably learned significantly more here than I would on other podcast recordings.

565
00:44:37,800 --> 00:44:41,800
And so all you listening out there, thank you so much for tuning in this week. Stay safe.

566
00:44:41,800 --> 00:45:10,800
And we'll see you next time.

