1
00:00:00,000 --> 00:00:09,600
Welcome to the Azure Security Podcast where we discuss topics relating to security, privacy,

2
00:00:09,600 --> 00:00:13,280
reliability and compliance on the Microsoft Cloud Platform.

3
00:00:13,280 --> 00:00:17,360
Hey everybody, welcome to episode 80.

4
00:00:17,360 --> 00:00:22,840
This week it's just myself, Michael and Sarah and our guest this week is Matt Zorich who's

5
00:00:22,840 --> 00:00:29,520
here to talk to us about a team at Microsoft called DART and Microsoft Incident Response.

6
00:00:29,520 --> 00:00:33,440
Before we get into the discussion with Matt, let's take a little lap around the news.

7
00:00:33,440 --> 00:00:35,480
Sarah, why don't you kick things off?

8
00:00:35,480 --> 00:00:40,200
Okay, well, you know me and my news, I always love to talk about something related to my

9
00:00:40,200 --> 00:00:42,880
baby, Microsoft Sentinel.

10
00:00:42,880 --> 00:00:50,840
This is tangentially related, which is Azure Monitor logs now have better table level RBAC.

11
00:00:50,840 --> 00:00:58,160
So you may remember if you've tried to use table level RBAC in Azure Monitor, it hasn't

12
00:00:58,160 --> 00:01:02,280
been the most flexible and it's had some restrictions on how you could use it.

13
00:01:02,280 --> 00:01:07,200
But we now have public preview for much better and much more granular RBAC.

14
00:01:07,200 --> 00:01:11,320
And it will also allow you to do it for custom tables, which is something we couldn't do

15
00:01:11,320 --> 00:01:12,320
before.

16
00:01:12,320 --> 00:01:14,560
So we'll have a link in the show notes.

17
00:01:14,560 --> 00:01:18,680
If that's something that you looked at in the past and it didn't quite work for you,

18
00:01:18,680 --> 00:01:24,400
go and have another look because it's definitely much improved on what it was.

19
00:01:24,400 --> 00:01:27,240
And that's it for me this time, Michael.

20
00:01:27,240 --> 00:01:28,240
So over to you.

21
00:01:28,240 --> 00:01:29,240
Yeah, I've got three items.

22
00:01:29,240 --> 00:01:30,600
They're actually all sort of somewhat related.

23
00:01:30,600 --> 00:01:34,440
They're all to do with web application firewalls at some level.

24
00:01:34,440 --> 00:01:38,880
So the first one is in public preview, we now have sensitive data protection for app

25
00:01:38,880 --> 00:01:41,920
gateway web application firewall logs.

26
00:01:41,920 --> 00:01:45,960
So you can enable this and certain things that may be deemed a little bit sensitive.

27
00:01:45,960 --> 00:01:52,360
So for example, JSON argument names, request IP addresses, request header names, they will

28
00:01:52,360 --> 00:01:56,100
be sort of scrubbed with a whole series of asterisks.

29
00:01:56,100 --> 00:02:01,360
So again, things like IP addresses, what could be potentially passwords, those kinds of things,

30
00:02:01,360 --> 00:02:03,720
they can be all sort of erased.

31
00:02:03,720 --> 00:02:08,120
So that way there's less chance of PII or other kinds of sensitive data appearing

32
00:02:08,120 --> 00:02:11,020
in the web application firewall logs.

33
00:02:11,020 --> 00:02:14,800
Next one is also in public preview.

34
00:02:14,800 --> 00:02:20,080
We have the default rule set 2.1 for regional WAF with application gateway.

35
00:02:20,080 --> 00:02:24,560
So there's a whole bunch of rules that we have that are available for the web application

36
00:02:24,560 --> 00:02:27,960
firewall and the different versions.

37
00:02:27,960 --> 00:02:32,440
So now what we've done is just sort of standardized on a whole bunch of different types of rules.

38
00:02:32,440 --> 00:02:36,240
And this one is version 2.1, so includes things like cross-site scripting, includes things

39
00:02:36,240 --> 00:02:41,720
like well-known JavaScript attacks, remote code execution, session fixation, SQL injection

40
00:02:41,720 --> 00:02:45,360
right in my wheelhouse, those kinds of things.

41
00:02:45,360 --> 00:02:46,360
So these are the default rules.

42
00:02:46,360 --> 00:02:48,240
I mean, obviously you can go and change them.

43
00:02:48,240 --> 00:02:53,560
We want to sort of continue sort of raising the bar in terms of core security.

44
00:02:53,560 --> 00:02:56,400
Otherwise, if I said there were three items, that's only two items. I lied.

45
00:02:56,400 --> 00:02:58,540
So there we go.

46
00:02:58,540 --> 00:03:02,420
So with that, let's turn our attention to our guest.

47
00:03:02,420 --> 00:03:06,400
As I mentioned at the beginning, our guest this week is Matt Zorich, who's here to talk

48
00:03:06,400 --> 00:03:11,680
to us about the team at Microsoft called DART and its role in incident response.

49
00:03:11,680 --> 00:03:13,160
Matt, welcome to the podcast.

50
00:03:13,160 --> 00:03:16,040
We'd like you to take a moment and introduce yourself to our listeners.

51
00:03:16,040 --> 00:03:17,800
Yeah, thanks for having me.

52
00:03:17,800 --> 00:03:21,500
Actually, yeah, to clarify, just like a lot of things at Microsoft, we have also

53
00:03:21,500 --> 00:03:23,180
undergone a rebranding.

54
00:03:23,180 --> 00:03:28,160
So we are now Microsoft Incident Response, previously DART.

55
00:03:28,160 --> 00:03:33,760
And yes, I'm a consultant on the Microsoft Incident Response team and I'm based out of

56
00:03:33,760 --> 00:03:40,560
Australia with Sarah, and basically we're Microsoft's customer-facing incident response team.

57
00:03:40,560 --> 00:03:47,360
So if one of our customers is compromised and they feel the need for a full investigation,

58
00:03:47,360 --> 00:03:48,840
then we can be brought in to do that.

59
00:03:48,840 --> 00:03:55,520
And we'll do the investigation and some hardening work and kind of uncover the story of the

60
00:03:55,520 --> 00:04:01,280
attack and try to give the customers some good advice to hopefully not be a return customer.

61
00:04:01,280 --> 00:04:07,880
We always say we're the one team at Microsoft that can say, we don't really want to get

62
00:04:07,880 --> 00:04:09,640
your call ever again.

63
00:04:09,640 --> 00:04:13,680
I find the stuff that DART does fascinating.

64
00:04:13,680 --> 00:04:22,840
And I know you're not able to talk about specifics or mention specific customers for obvious

65
00:04:22,840 --> 00:04:24,000
reasons.

66
00:04:24,000 --> 00:04:30,640
Tell us about, obviously, this is very fluid and changes all the time, but tell us a bit

67
00:04:30,640 --> 00:04:38,000
about some of the trends that you're seeing in incident response land at the moment.

68
00:04:38,000 --> 00:04:42,560
What are the call outs from customers that you're seeing a lot of at the moment?

69
00:04:42,560 --> 00:04:47,360
Because I'm sure if I were a listener, I'd like to hear this from DART, and I'm

70
00:04:47,360 --> 00:04:48,720
going to keep calling you DART.

71
00:04:48,720 --> 00:04:49,880
I know you're not DART.

72
00:04:49,880 --> 00:04:52,600
You're Microsoft Incident Response.

73
00:04:52,600 --> 00:04:58,280
But I am always fascinated by the kind of things that you folks do and what you see.

74
00:04:58,280 --> 00:05:01,120
So yeah, what are the trends?

75
00:05:01,120 --> 00:05:04,680
What are the things you're getting called out for a lot at the moment?

76
00:05:04,680 --> 00:05:06,680
Yeah, you're right.

77
00:05:06,680 --> 00:05:11,200
It's kind of fluid, and it's hard to understand why that's the case.

78
00:05:11,200 --> 00:05:14,040
There's no particular reason why it changes or moves over time.

79
00:05:14,040 --> 00:05:19,640
It just so happens to be the engagements that land with our team or things going on.

80
00:05:19,640 --> 00:05:26,040
Certainly in the last couple of months, we've been engaged in a lot more kind of cloud engagement.

81
00:05:26,040 --> 00:05:27,040
Certainly I have.

82
00:05:27,040 --> 00:05:34,360
So when we do investigations, they may be on-premises focused or they may be cloud,

83
00:05:34,360 --> 00:05:40,080
so kind of Azure and Azure Active Directory and Microsoft 365.

84
00:05:40,080 --> 00:05:46,560
From my last few months, they've certainly skewed towards Microsoft 365 and Azure.

85
00:05:46,560 --> 00:05:50,120
So I'm seeing a lot of those kind of come across my way.

86
00:05:50,120 --> 00:05:53,080
What the reason is, I don't really know.

87
00:05:53,080 --> 00:05:55,680
It could be any number of things.

88
00:05:55,680 --> 00:06:02,400
But, you know, to kind of guess, or try and kind of

89
00:06:02,400 --> 00:06:09,280
hypothesize about it: I think, you know, maybe post-COVID, now we're in this hybrid

90
00:06:09,280 --> 00:06:13,080
work style again.

91
00:06:13,080 --> 00:06:15,040
We still have a lot of people working from home.

92
00:06:15,040 --> 00:06:21,240
So kind of the first entry point they might have to your data is like Microsoft 365.

93
00:06:21,240 --> 00:06:28,560
So maybe attackers are kind of understanding that people are working at home, maybe not

94
00:06:28,560 --> 00:06:32,440
protected in the same way that they would be in a traditional corporate network.

95
00:06:32,440 --> 00:06:38,560
So they're kind of attempting to phish or compromise like a cloud user and then kind of pivot from

96
00:06:38,560 --> 00:06:39,560
there.

97
00:06:39,560 --> 00:06:43,480
But, you know, we see all kinds of attacks still.

98
00:06:43,480 --> 00:06:45,440
So we still see a lot of ransomware.

99
00:06:45,440 --> 00:06:49,240
We see a lot of like data exfiltration and things like that.

100
00:06:49,240 --> 00:06:53,760
But they certainly appear to have skewed a little bit more towards the cloud in the last

101
00:06:53,760 --> 00:06:55,520
few months for whatever reason.

102
00:06:55,520 --> 00:06:59,720
So you said that you're seeing a lot of incidents in cloud.

103
00:06:59,720 --> 00:07:05,840
Can you talk a little bit more about the kind of incidents you're seeing in cloud and M365?

104
00:07:05,840 --> 00:07:11,360
Yeah, yeah, certainly we can talk to kind of the broad tactics that we see.

105
00:07:11,360 --> 00:07:16,400
So look, I think a lot of people know, if you read kind of any incident reports, that

106
00:07:16,400 --> 00:07:20,840
often like large scale compromise begins with a regular user.

107
00:07:20,840 --> 00:07:23,040
And that's certainly what we see reflected.

108
00:07:23,040 --> 00:07:29,400
So it might be like a non-privileged, just a regular user that gets phished or they get

109
00:07:29,400 --> 00:07:34,560
kind of the MFA bombing that you hear about or, you know, they click on a dodgy link or

110
00:07:34,560 --> 00:07:35,560
they have a weak password.

111
00:07:35,560 --> 00:07:37,880
However the attacker kind of gets that foothold.

112
00:07:37,880 --> 00:07:44,240
And then what we're seeing is them kind of use the M365 kind of platform and just search

113
00:07:44,240 --> 00:07:46,400
within a user's mailbox.

114
00:07:46,400 --> 00:07:48,960
So they might search for other credentials.

115
00:07:48,960 --> 00:07:54,760
They might search for VPN information and things like that to kind of progress their

116
00:07:54,760 --> 00:07:55,760
attack.

117
00:07:55,760 --> 00:08:00,760
I think ultimately towards the end of the attacks in the cloud, we're seeing kind of

118
00:08:00,760 --> 00:08:02,500
a mixture of things.

119
00:08:02,500 --> 00:08:04,440
So it can be very financially motivated.

120
00:08:04,440 --> 00:08:10,200
So, you know, if an attacker can take over an Azure Active Directory tenant, they might

121
00:08:10,200 --> 00:08:17,760
then look to pivot into Azure and spin up kind of crypto mining infrastructure and mine

122
00:08:17,760 --> 00:08:19,880
crypto on the customer's dollar.

123
00:08:19,880 --> 00:08:25,880
Obviously the customer gets left with the bill so that can be very quick and very kind

124
00:08:25,880 --> 00:08:27,440
of expensive.

125
00:08:27,440 --> 00:08:33,120
The other one we're seeing is kind of the data exfiltration and extortion type of attacks

126
00:08:33,120 --> 00:08:39,440
where an attacker might get full tenant level access to everything that's in SharePoint

127
00:08:39,440 --> 00:08:45,080
and OneDrive and Teams and everything else and kind of take copies of that data and then

128
00:08:45,080 --> 00:08:51,520
try to extort the company saying, you know, I've got X amount of data, pay us this amount

129
00:08:51,520 --> 00:08:52,880
before we leak it.

130
00:08:52,880 --> 00:08:58,640
So they're kind of the two strains I would say we see: the crypto mining and the

131
00:08:58,640 --> 00:09:03,840
extortion, you know, with that data theft.

132
00:09:03,840 --> 00:09:08,160
For those two strains, are you able to tell us a bit about kind of how you would deal

133
00:09:08,160 --> 00:09:09,160
with them?

134
00:09:09,160 --> 00:09:10,160
Yeah, certainly.

135
00:09:10,160 --> 00:09:16,920
So look, the crypto mining side of things is usually quite easy to detect because generally

136
00:09:16,920 --> 00:09:23,380
the customer gets an alert or something saying you might have a huge bill that's been rung

137
00:09:23,380 --> 00:09:28,880
up because I think the people that are skewed towards crypto

138
00:09:28,880 --> 00:09:33,080
mining know that they're on a very short time frame that they're going to be detected and

139
00:09:33,080 --> 00:09:34,480
they're going to get kicked back out.

140
00:09:34,480 --> 00:09:36,780
So really it's kind of like a smash and grab.

141
00:09:36,780 --> 00:09:41,800
You know, we might only have access to this tenant for a few days.

142
00:09:41,800 --> 00:09:46,880
So we're just going to spin up as many VMs as we can to mine crypto before we're back

143
00:09:46,880 --> 00:09:53,960
out the door. With the extortion and those kind of more sneaky tactics,

144
00:09:53,960 --> 00:09:58,160
what we try to do when we are first engaged with a customer that's kind of in the middle

145
00:09:58,160 --> 00:10:03,960
of that is try to take back control of the tenant from kind of a global administrator

146
00:10:03,960 --> 00:10:05,520
level.

147
00:10:05,520 --> 00:10:08,960
And we call that kind of positive identity control.

148
00:10:08,960 --> 00:10:13,840
So we want to make sure that all the global admins, they're under our control, that there's

149
00:10:13,840 --> 00:10:19,560
no persistence mechanisms left behind so the attacker can get back in.

150
00:10:19,560 --> 00:10:25,480
Once we're kind of happy that the environment's stable and it's in our control, then that's

151
00:10:25,480 --> 00:10:30,960
when we start doing kind of the deep dive and, you know, what

152
00:10:30,960 --> 00:10:32,800
did the actor access?

153
00:10:32,800 --> 00:10:34,880
Did they access data?

154
00:10:34,880 --> 00:10:39,320
You know, have they created accounts to get back in later and things like that.

155
00:10:39,320 --> 00:10:45,000
But yeah, the first step in these kind of compromises is always try to, we call it positive

156
00:10:45,000 --> 00:10:49,760
identity control because someone's always in control of your tenant.

157
00:10:49,760 --> 00:10:55,920
We just, we prefer it's you that's in control of your tenant, not the bad guys.

158
00:10:55,920 --> 00:11:01,080
And we kind of pivot from there and really with these investigations, it's where the

159
00:11:01,080 --> 00:11:05,640
breadcrumbs lead us because no two investigations are the same.

160
00:11:05,640 --> 00:11:11,360
And look, we certainly see the same tactics and the same motivations, but they're constantly

161
00:11:11,360 --> 00:11:12,360
evolving.

162
00:11:12,360 --> 00:11:15,680
There's always novel ways and novel things that they're trying.

163
00:11:15,680 --> 00:11:18,360
So we just try to get all the data.

164
00:11:18,360 --> 00:11:19,520
We talk to the customer.

165
00:11:19,520 --> 00:11:20,760
We try to get logs.

166
00:11:20,760 --> 00:11:25,320
We try to get information and it's a bit like a detective story, right?

167
00:11:25,320 --> 00:11:27,720
You're just trying to piece together what happened.

168
00:11:27,720 --> 00:11:33,520
You're also trying to distinguish, and it's always hard, between threat actor

169
00:11:33,520 --> 00:11:38,200
activity and users just doing silly things as well.

170
00:11:38,200 --> 00:11:42,080
So often we have a lot of questions for the customers to clarify those things.

171
00:11:42,080 --> 00:11:47,960
It's interesting you just say, you know, data exfil and Bitcoin mining or some kind of

172
00:11:47,960 --> 00:11:49,240
digital currency mining.

173
00:11:49,240 --> 00:11:52,480
I mean, that's the result, you know, the end result, right?

174
00:11:52,480 --> 00:11:54,880
Someone obviously got into the system and so on, but it's interesting you just say things

175
00:11:54,880 --> 00:11:57,280
like phishing or clicking on a bad link.

176
00:11:57,280 --> 00:11:59,160
It's almost like what's old is new again.

177
00:11:59,160 --> 00:12:02,160
I mean, how long have we had these kinds of attacks?

178
00:12:02,160 --> 00:12:08,720
And we're still seeing them today as the way the bad guys get into the environment, right?

179
00:12:08,720 --> 00:12:15,480
So am I kind of right in that, or are we seeing updated variations of these

180
00:12:15,480 --> 00:12:20,720
kinds of attacks or is it really just the absolute basics?

181
00:12:20,720 --> 00:12:21,720
It's a bit of both.

182
00:12:21,720 --> 00:12:22,720
Yeah.

183
00:12:22,720 --> 00:12:25,040
Look, you're certainly seeing some of the, yeah, look, people still click on phishing

184
00:12:25,040 --> 00:12:26,040
links.

185
00:12:26,040 --> 00:12:27,440
That's the reality of it.

186
00:12:27,440 --> 00:12:32,640
But we are seeing some more kind of novel and modern attacks, so certainly things like

187
00:12:32,640 --> 00:12:36,000
social engineering are becoming more popular.

188
00:12:36,000 --> 00:12:41,800
So you know, an attacker might email the help desk of a company and pretend to be a, you

189
00:12:41,800 --> 00:12:48,200
know, a user of that company and say, I've got a new mobile because I lost my old one

190
00:12:48,200 --> 00:12:51,200
or my old one broke and here's my new number.

191
00:12:51,200 --> 00:12:53,040
So can you update it in this system?

192
00:12:53,040 --> 00:12:59,880
And then the attacker can use the self-service password reset function to kind of take control

193
00:12:59,880 --> 00:13:00,880
of that account.

194
00:13:00,880 --> 00:13:04,400
We're also seeing, you know, you see things, not just traditional fishing, but you might

195
00:13:04,400 --> 00:13:05,400
see smishing.

196
00:13:05,400 --> 00:13:10,640
So, you know, SMSes are sent to users and credentials are lost that way.

197
00:13:10,640 --> 00:13:12,440
So it's still true.

198
00:13:12,440 --> 00:13:17,080
It's still phishing, I guess, but there are some modern kind of takes on it and things

199
00:13:17,080 --> 00:13:18,080
like that.

200
00:13:18,080 --> 00:13:25,360
But for social engineering, that's the hard one to kind of protect against as well because you're

201
00:13:25,360 --> 00:13:32,160
relying on your help desk and your IT teams, and it's a very soft skill to understand

202
00:13:32,160 --> 00:13:37,760
whether something's not quite right and kind of challenge users and things like that.

203
00:13:37,760 --> 00:13:42,400
So it's, I appreciate it's very hard to stop as well.

204
00:13:42,400 --> 00:13:45,800
So when you said smishing, so using SMS as an attack vector.

205
00:13:45,800 --> 00:13:49,600
Yeah, so is this because I'm thinking of multi-factor stuff, right?

206
00:13:49,600 --> 00:13:54,000
Something about multi-factor authentication using just SMS messages.

207
00:13:54,000 --> 00:13:57,400
Is that the attack we're talking about or are we just talking about in general, hey,

208
00:13:57,400 --> 00:13:58,800
this is Bob from support.

209
00:13:58,800 --> 00:14:02,800
Please log in and click this link and change your password.

210
00:14:02,800 --> 00:14:05,360
Because the big thing in multi-factor authentication is not to use SMS, right?

211
00:14:05,360 --> 00:14:08,040
It's to use an app instead or something like that.

212
00:14:08,040 --> 00:14:09,040
Yeah.

213
00:14:09,040 --> 00:14:14,240
So it tends to be the latter of that, which is, yeah, send some kind of message out, which

214
00:14:14,240 --> 00:14:20,640
is, you know, update your payroll information or this is IT security and can you confirm

215
00:14:20,640 --> 00:14:23,360
your details or whatever they look like.

216
00:14:23,360 --> 00:14:29,200
And then to combine with that, you might get some kind of social engineering to get through

217
00:14:29,200 --> 00:14:30,400
the MFA process.

218
00:14:30,400 --> 00:14:36,480
So they might call them up and say, you know, this is Bob from the help desk or it could

219
00:14:36,480 --> 00:14:41,680
be they often do like what we call like kind of spamming different MFA methods.

220
00:14:41,680 --> 00:14:45,640
So they might send a message and then they might do a prompt and then they might do a

221
00:14:45,640 --> 00:14:50,320
phone call and et cetera, et cetera, until one kind of gets through.

222
00:14:50,320 --> 00:14:55,800
And once they've taken control of that account, they can put their own MFA method in as a

223
00:14:55,800 --> 00:14:56,800
persistence mechanism.

224
00:14:56,800 --> 00:15:01,200
So they only really need to get through just once.

225
00:15:01,200 --> 00:15:04,840
And then, you know, they register their own phone or their own authenticator app.

226
00:15:04,840 --> 00:15:06,760
It's the law of large numbers, right?

227
00:15:06,760 --> 00:15:11,520
It's like if you hit enough people, someone's going to click on the link, right?

228
00:15:11,520 --> 00:15:14,920
A very small percentage of a very large number is still a large number.

229
00:15:14,920 --> 00:15:18,040
So that's what's working to the attacker's advantage, right?

230
00:15:18,040 --> 00:15:19,040
They don't have everyone.

231
00:15:19,040 --> 00:15:23,800
Not everyone has to click on the link or respond to the text message or respond to the email.

232
00:15:23,800 --> 00:15:26,200
All it takes is one person ultimately.

233
00:15:26,200 --> 00:15:27,440
So it's the law of large numbers.

234
00:15:27,440 --> 00:15:32,640
You know, again, a small percentage of a very large number is still a large number of potential

235
00:15:32,640 --> 00:15:33,640
victims.

236
00:15:33,640 --> 00:15:34,640
Yeah, definitely.

237
00:15:34,640 --> 00:15:35,640
That's what I always say.

238
00:15:35,640 --> 00:15:40,440
I always say like being a defender or a blue teamer is really the hardest job in cybersecurity.

239
00:15:40,440 --> 00:15:44,680
That's what I personally believe, because you need to be perfect every day.

240
00:15:44,680 --> 00:15:47,880
The attackers and the bad guys really get as many shots as they want.

241
00:15:47,880 --> 00:15:49,400
It doesn't matter if they miss.

242
00:15:49,400 --> 00:15:52,800
They just like you say, more messages, more emails.

243
00:15:52,800 --> 00:15:54,360
Eventually someone's going to click.

244
00:15:54,360 --> 00:15:58,200
The law of averages says they'll get someone eventually.

245
00:15:58,200 --> 00:16:01,400
And that's like, you know, you laugh at some of these emails and you'd be like, oh, why

246
00:16:01,400 --> 00:16:03,040
would anyone ever click on them?

247
00:16:03,040 --> 00:16:06,280
But it doesn't cost anything to send emails.

248
00:16:06,280 --> 00:16:07,280
You know, someone will click.

249
00:16:07,280 --> 00:16:11,200
It's funny you should bring that up about, you know, essentially being the defender.

250
00:16:11,200 --> 00:16:14,440
The very first book that I ever wrote when I was at Microsoft was called Designing Secure

251
00:16:14,440 --> 00:16:18,480
Web-based Applications for Windows 2000 back in the day.

252
00:16:18,480 --> 00:16:23,520
And in there I actually wrote a small section called The Attacker's Advantage and the Defender's

253
00:16:23,520 --> 00:16:24,840
Dilemma.

254
00:16:24,840 --> 00:16:26,560
And that's exactly the problem, right?

255
00:16:26,560 --> 00:16:28,400
Is the attacker just has to get lucky once.

256
00:16:28,400 --> 00:16:34,000
The defender has to protect 100% of the time and be correct 100% of the time.

257
00:16:34,000 --> 00:16:36,080
The attacker just has to sneak in once, right?

258
00:16:36,080 --> 00:16:39,880
There's one little chink in the armor and there you go.

259
00:16:39,880 --> 00:16:41,400
Yeah, exactly.

260
00:16:41,400 --> 00:16:47,760
And the defenders often have to deal with the reality of business, which might be, you

261
00:16:47,760 --> 00:16:49,720
know, they don't want to change.

262
00:16:49,720 --> 00:16:51,240
They don't want to deploy MFA.

263
00:16:51,240 --> 00:16:55,400
They don't want to do things that inhibit productivity.

264
00:16:55,400 --> 00:16:59,540
And they have these, like the defenders have all these corporate rules to follow and budgets

265
00:16:59,540 --> 00:17:01,560
and things like that.

266
00:17:01,560 --> 00:17:03,920
Attackers don't have any of those.

267
00:17:03,920 --> 00:17:08,520
There's no scope of work. They can really do whatever they like.

268
00:17:08,520 --> 00:17:10,020
So that makes it doubly hard.

269
00:17:10,020 --> 00:17:17,560
I often hear people say, well, I'm not a really big, well-known, you know, Fortune 500 company.

270
00:17:17,560 --> 00:17:20,000
Why would someone try and attack me?

271
00:17:20,000 --> 00:17:23,040
Why would someone try and hack me?

272
00:17:23,040 --> 00:17:26,520
And I just wondered if you had any thoughts on that.

273
00:17:26,520 --> 00:17:33,000
You know, the types of customers that you actually end up helping, are they much bigger,

274
00:17:33,000 --> 00:17:34,920
high profile, well-known brands?

275
00:17:34,920 --> 00:17:40,920
Or do you go to a lot of, you know, smaller customers that might be the type that would

276
00:17:40,920 --> 00:17:43,720
say, hey, why does anyone care about me?

277
00:17:43,720 --> 00:17:48,640
So I think, yeah, definitely no one's immune from cyber attack.

278
00:17:48,640 --> 00:17:49,640
For sure.

279
00:17:49,640 --> 00:17:51,600
I can definitely tell you that.

280
00:17:51,600 --> 00:17:57,400
Like whether it's small business, medium business or massive enterprises, you know,

281
00:17:57,400 --> 00:18:02,120
the attackers might be motivated by different things depending on the size of the organization.

282
00:18:02,120 --> 00:18:08,640
Our team probably skews towards bigger customers, but that's probably more because, you know,

283
00:18:08,640 --> 00:18:14,440
we're Microsoft and we're a big IR team and we kind of play in that market.

284
00:18:14,440 --> 00:18:21,600
But there's plenty of other cybersecurity incident response firms that kind of are engaged

285
00:18:21,600 --> 00:18:25,600
with small, medium businesses as well.

286
00:18:25,600 --> 00:18:29,240
I don't think, like I said, I don't think anyone's immune.

287
00:18:29,240 --> 00:18:33,000
And a lot of the attacks are opportunistic.

288
00:18:33,000 --> 00:18:36,680
So if they, like you said, they send out all these phishing emails or smishing and they

289
00:18:36,680 --> 00:18:43,600
get credentials for a particular company and they're able to access it, that might be the

290
00:18:43,600 --> 00:18:49,400
reason to go after that company rather than being particularly targeted.

291
00:18:49,400 --> 00:18:51,520
So yeah, every range.

292
00:18:51,520 --> 00:18:58,320
And I think the ones you read in the news and things are skewed towards the bigger companies

293
00:18:58,320 --> 00:19:05,520
as well, but you're likely not hearing about the smaller and the medium sized enterprises

294
00:19:05,520 --> 00:19:06,760
that are getting attacked.

295
00:19:06,760 --> 00:19:10,400
It's just kind of not as newsworthy, I guess, unfortunately.

296
00:19:10,400 --> 00:19:15,720
But you look at the stats for things like business email compromise and business email

297
00:19:15,720 --> 00:19:18,660
compromise is not flashy.

298
00:19:18,660 --> 00:19:26,200
It's been around forever and that's simply compromising someone's user account and logging

299
00:19:26,200 --> 00:19:32,240
onto their email and then you might change the invoice on an email or something to

300
00:19:32,240 --> 00:19:39,280
your bank account details. It might be, like, a $20,000 invoice,

301
00:19:39,280 --> 00:19:43,640
which doesn't sound like much in the scheme of things, but the stats for things like business

302
00:19:43,640 --> 00:19:47,400
email compromise, if you look at them, it's wild.

303
00:19:47,400 --> 00:19:55,040
It's billions of dollars in costs every year, and $20,000 to a massive company is a

304
00:19:55,040 --> 00:19:56,040
drop in the ocean.

305
00:19:56,040 --> 00:20:01,720
And that could be devastating to a small or medium business that's kind of month to month.

306
00:20:01,720 --> 00:20:05,680
So yeah, like I say, no one's immune.

307
00:20:05,680 --> 00:20:12,480
What things would you recommend based on what you've seen that people should focus on in

308
00:20:12,480 --> 00:20:17,280
terms of security controls to reduce the likelihood of a breach?

309
00:20:17,280 --> 00:20:23,880
I mean, I know that's what we all do every day, but

310
00:20:23,880 --> 00:20:29,200
I'd be keen to hear from an incident responder.

311
00:20:29,200 --> 00:20:32,000
What do you think the things are that people should really focus on?

312
00:20:32,000 --> 00:20:33,000
Yeah, definitely.

313
00:20:33,000 --> 00:20:35,840
It's going to be a very, very boring answer.

314
00:20:35,840 --> 00:20:38,000
Honestly, it's the basics.

315
00:20:38,000 --> 00:20:42,720
And like I always say, don't get the basics confused with it being easy because they're

316
00:20:42,720 --> 00:20:46,680
two different things, especially in massive, massive organizations.

317
00:20:46,680 --> 00:20:51,920
I'm not sure if you've ever seen the Microsoft Cybersecurity bell curve.

318
00:20:51,920 --> 00:20:53,480
I'm a big fan of that.

319
00:20:53,480 --> 00:20:54,480
You know what?

320
00:20:54,480 --> 00:20:59,760
I have that nowadays, no joke, in pretty much every presentation I do.

321
00:20:59,760 --> 00:21:01,520
We'll link to it in the show notes.

322
00:21:01,520 --> 00:21:03,280
It's in the digital defense report.

323
00:21:03,280 --> 00:21:04,960
Sorry, Matt, go on.

324
00:21:04,960 --> 00:21:11,280
The reality is, as an incident responder, you see that it is actually true.

325
00:21:11,280 --> 00:21:12,280
It's the basics.

326
00:21:12,280 --> 00:21:21,040
So I think it's like 98% of attacks will be stopped by, you know, MFA, least privilege,

327
00:21:21,040 --> 00:21:28,320
patching vulnerable servers and kind of applying zero trust and having like a modern anti-malware,

328
00:21:28,320 --> 00:21:31,520
like Defender for Endpoint or something like that.

329
00:21:31,520 --> 00:21:36,800
And I remember seeing a phrase once, it was like, basically what we're after is brilliance

330
00:21:36,800 --> 00:21:38,720
in the basics.

331
00:21:38,720 --> 00:21:42,960
And that's like a term that's always stuck with me and I've really enjoyed it.

332
00:21:42,960 --> 00:21:47,760
And I use it occasionally. And that's what it is.

333
00:21:47,760 --> 00:21:53,240
It's doing those basics, doing them across a whole organization.

334
00:21:53,240 --> 00:21:57,120
And like I say, like I've been on the other side and worked as like a blue teamer in big

335
00:21:57,120 --> 00:22:02,920
organizations and it is hard, like hundreds of thousands of users or millions of users

336
00:22:02,920 --> 00:22:06,280
and deploying MFA to them.

337
00:22:06,280 --> 00:22:11,560
Like we understand it's difficult, but there's no secret sauce.

338
00:22:11,560 --> 00:22:14,160
If you do do it, it'll reduce your risk.

339
00:22:14,160 --> 00:22:17,720
And that's what we always say is there's no 100% secure.

340
00:22:17,720 --> 00:22:19,640
It's risk mitigation.

341
00:22:19,640 --> 00:22:28,760
And if you've got 100,000 users and you can deploy MFA to 90% of them, that's a huge win.

342
00:22:28,760 --> 00:22:34,840
Even if you can never get the last 10,000 or whatever it is, it's all reducing risk.

343
00:22:34,840 --> 00:22:41,080
What are the common mistakes you see that people make that subsequently end up being

344
00:22:41,080 --> 00:22:42,080
breaches?

345
00:22:42,080 --> 00:22:46,080
I realize this is kind of sort of very closely linked with what I've just asked you.

346
00:22:46,080 --> 00:22:50,880
Yeah, I think the constant ones we see, and this is both on premises and kind of

347
00:22:50,880 --> 00:22:56,920
in Azure Active Directory is probably too much privilege for users that don't really

348
00:22:56,920 --> 00:23:04,120
require it, and kind of not understanding what that privilege can grant users access to.

349
00:23:04,120 --> 00:23:08,320
So you might have like a service account that's in a domain admins group.

350
00:23:08,320 --> 00:23:14,720
We see that all the time or a service account that's in global admins in Azure Active Directory.

351
00:23:14,720 --> 00:23:20,040
Those kind of service accounts and things like that are often not secured the same way

352
00:23:20,040 --> 00:23:22,840
like a human identity would be.

353
00:23:22,840 --> 00:23:25,680
And that's kind of the nature of them.

354
00:23:25,680 --> 00:23:29,320
You can't really put MFA and things on service accounts.

355
00:23:29,320 --> 00:23:35,040
And often credentials for those things are left kind of just hanging around in clear

356
00:23:35,040 --> 00:23:42,520
text, whether that's in scripts or that's the old passwords.txt sitting in your One

357
00:23:42,520 --> 00:23:48,320
Drive or sitting in your SharePoint. Because if you can find them, if you

358
00:23:48,320 --> 00:23:52,120
search for, you know, I've got my passwords list that I go to.

359
00:23:52,120 --> 00:23:58,360
And so if I search for passwords, then an attacker can also search for passwords and uncover

360
00:23:58,360 --> 00:24:00,160
them as well.

361
00:24:00,160 --> 00:24:02,840
So that's definitely the most common one.

362
00:24:02,840 --> 00:24:10,520
And I think the other one is basically not enforcing kind of stricter controls on your

363
00:24:10,520 --> 00:24:15,120
tier zero and your very, very high-privileged accounts.

364
00:24:15,120 --> 00:24:22,160
So like we said earlier, like we appreciate that it's very hard to deploy all these security

365
00:24:22,160 --> 00:24:27,160
controls across massive organizations uniformly.

366
00:24:27,160 --> 00:24:32,160
But your privileged accounts and your tier zeros and your domain admins and your global

367
00:24:32,160 --> 00:24:37,200
admins, that should be a very small subset of your users.

368
00:24:37,200 --> 00:24:41,480
And those users should be held to much stricter controls.

369
00:24:41,480 --> 00:24:48,040
Look, I think everyone on this call and probably everyone listening knows you'll never stop regular

370
00:24:48,040 --> 00:24:50,200
users being compromised.

371
00:24:50,200 --> 00:24:51,200
It's just going to happen.

372
00:24:51,200 --> 00:24:56,280
Like we said, whether it's phishing, whether it's smishing, whether it's downloading cracked

373
00:24:56,280 --> 00:24:58,920
software, that's okay.

374
00:24:58,920 --> 00:25:02,920
Like that can be inconvenient for that one user.

375
00:25:02,920 --> 00:25:09,360
What ultimately we're trying to do is stop the kind of the compromise of a single user

376
00:25:09,360 --> 00:25:12,720
turning into a compromise of all our users.

377
00:25:12,720 --> 00:25:18,800
And that's where we need to, if we harden our tier zero, we can deal with the user installing

378
00:25:18,800 --> 00:25:20,520
malware.

379
00:25:20,520 --> 00:25:24,120
We can give them a rap on the knuckles and send them a new laptop.

380
00:25:24,120 --> 00:25:25,560
That's no big deal.

381
00:25:25,560 --> 00:25:30,560
But we don't want that to be like the first domino that ends up in a huge incident.

382
00:25:30,560 --> 00:25:33,720
Just last couple of questions for you, Matt.

383
00:25:33,720 --> 00:25:36,880
Hopefully none of our listeners will have to do this.

384
00:25:36,880 --> 00:25:47,680
But if anyone out there did need the services of Microsoft IR, what is the process to get

385
00:25:47,680 --> 00:25:51,520
in touch if a customer thinks they need help?

386
00:25:51,520 --> 00:25:52,520
Yeah.

387
00:25:52,520 --> 00:25:54,040
So there's a few ways.

388
00:25:54,040 --> 00:26:00,480
So you can log a severity A case through your unified support.

389
00:26:00,480 --> 00:26:04,800
And on the Microsoft IR side, we kind of keep an eye on those.

390
00:26:04,800 --> 00:26:08,320
They don't always require full incident response.

391
00:26:08,320 --> 00:26:13,000
Sometimes the CERT team can handle it, or it's not actually a security thing.

392
00:26:13,000 --> 00:26:14,480
It might be an operational thing.

393
00:26:14,480 --> 00:26:18,000
And we'll kind of keep an eye on any that may come our way.

394
00:26:18,000 --> 00:26:23,560
Or we believe that the customer may need full incident response because maybe

395
00:26:23,560 --> 00:26:27,040
we don't appreciate kind of the situation they're in.

396
00:26:27,040 --> 00:26:29,520
We're also available under retainer.

397
00:26:29,520 --> 00:26:35,360
So if you're interested in having Microsoft IR just available, it's like, you know, pick

398
00:26:35,360 --> 00:26:36,920
up the phone, something's happened.

399
00:26:36,920 --> 00:26:39,640
We can kind of mobilize very quickly.

400
00:26:39,640 --> 00:26:45,720
There's also a link on our website, and I'll find you the link, just basically,

401
00:26:45,720 --> 00:26:49,320
you know, we're suffering a security incident at the moment.

402
00:26:49,320 --> 00:26:54,720
And that comes through to our team and we reach out and kind of see what's happening.

403
00:26:54,720 --> 00:27:01,160
And again, if we feel you need full incident response, we can try to mobilize

404
00:27:01,160 --> 00:27:03,520
a team and get going.

405
00:27:03,520 --> 00:27:05,160
So lots of ways.

406
00:27:05,160 --> 00:27:10,160
In your opinion, what is the value of some kind of protected workstation, like a

407
00:27:10,160 --> 00:27:14,360
protected access workstation or privileged access workstation, I should say, or, you

408
00:27:14,360 --> 00:27:15,640
know, secured access workstation?

409
00:27:15,640 --> 00:27:20,840
In other words, people shouldn't be, you know, going into production, like not administering

410
00:27:20,840 --> 00:27:26,400
production from their normal work machines, rather they're going in from, you

411
00:27:26,400 --> 00:27:30,440
know, machines that are designed to do a specific task, which is administration.

412
00:27:30,440 --> 00:27:31,440
And that's it.

413
00:27:31,440 --> 00:27:37,200
But you can't do your email, you can't go and look on Facebook or any of your socials.

414
00:27:37,200 --> 00:27:39,480
Is that something that sits well with customers?

415
00:27:39,480 --> 00:27:41,600
Or is it a bit of a hard sell?

416
00:27:41,600 --> 00:27:46,640
It's one of those things we always laugh about; the saying is, you know,

417
00:27:46,640 --> 00:27:47,920
never waste a crisis.

418
00:27:47,920 --> 00:27:53,760
So post incident or when we're engaged, things like the privileged access workstations and

419
00:27:53,760 --> 00:27:57,520
MFA and things like that, they all become a really good idea.

420
00:27:57,520 --> 00:28:03,200
It's very easy to kind of invoke change during the incident or in the brief

421
00:28:03,200 --> 00:28:06,120
period after.

422
00:28:06,120 --> 00:28:11,160
So often customers are very open to that idea because they've seen the ramifications of

423
00:28:11,160 --> 00:28:14,640
not having a privileged access workstation.

424
00:28:14,640 --> 00:28:16,960
And it is like a fantastic control.

425
00:28:16,960 --> 00:28:23,440
And it leads on to what I said earlier about protecting those tier zero accounts as well

426
00:28:23,440 --> 00:28:24,440
as you can.

427
00:28:24,440 --> 00:28:31,240
So, you know, a privileged access workstation is ultimately trying to stop the kind of spread

428
00:28:31,240 --> 00:28:33,040
of privileged credentials.

429
00:28:33,040 --> 00:28:39,040
Because if you think about like if you've got 10,000 workstations and you've got a domain

430
00:28:39,040 --> 00:28:45,760
admin and if a user has a problem with their workstation, they log on to the workstation

431
00:28:45,760 --> 00:28:50,160
as their domain admin account and kind of help the user out.

432
00:28:50,160 --> 00:28:56,000
What you've done is actually left your credentials on that end user workstation and an end user

433
00:28:56,000 --> 00:29:01,440
workstation is probably not going to be secured as well as a domain controller or things like

434
00:29:01,440 --> 00:29:02,440
that.

435
00:29:02,440 --> 00:29:07,320
So if an attacker was to kind of compromise that end user device, there's a chance that

436
00:29:07,320 --> 00:29:11,760
they could compromise the domain admin credentials.

437
00:29:11,760 --> 00:29:17,480
So a PAW or, you know, a secure access workstation or whatever you'd like to call it basically

438
00:29:17,480 --> 00:29:23,320
says if you're going to access a tier zero service like a domain controller, you can

439
00:29:23,320 --> 00:29:28,920
only access it from this particular device and it's got its own controls.

440
00:29:28,920 --> 00:29:31,000
And like you say, it doesn't browse the Internet.

441
00:29:31,000 --> 00:29:34,600
It doesn't have a mailbox so it can't be phished.

442
00:29:34,600 --> 00:29:35,600
It's hardened.

443
00:29:35,600 --> 00:29:42,080
It's whatever controls you want to put around it and that way we're not leaving domain admin

444
00:29:42,080 --> 00:29:46,440
credentials kind of scattered around the environment in lower tiers.

445
00:29:46,440 --> 00:29:50,440
So yeah, it's a very, very strong control.

446
00:29:50,440 --> 00:29:54,760
Definitely worth doing and I think it's gotten easier as well in time.

447
00:29:54,760 --> 00:29:59,600
I think you've got things like you can have like virtual secured workstations now running

448
00:29:59,600 --> 00:30:01,040
in Azure and things like that.

449
00:30:01,040 --> 00:30:05,480
So it doesn't always have to be, you know, another laptop or another device and things

450
00:30:05,480 --> 00:30:06,480
like that.

451
00:30:06,480 --> 00:30:12,200
Yeah, I did some work a few years ago with a healthcare company and they were moving

452
00:30:12,200 --> 00:30:14,560
some of their workloads to Azure.

453
00:30:14,560 --> 00:30:20,920
And basically it was, even trying to access production in Azure without

454
00:30:20,920 --> 00:30:27,320
using a privileged access workstation, it was actually a potentially fireable offense

455
00:30:27,320 --> 00:30:30,760
because they needed to make sure that, you know, access to the environment was through

456
00:30:30,760 --> 00:30:35,400
a secured environment, not from Bob's laptop, you know.

457
00:30:35,400 --> 00:30:36,400
Yeah, that's right.

458
00:30:36,400 --> 00:30:38,000
Yeah, that's exactly right.

459
00:30:38,000 --> 00:30:39,000
Yeah.

460
00:30:39,000 --> 00:30:42,920
And, you know, things like conditional access in Azure really do help with that now,

461
00:30:42,920 --> 00:30:48,520
because it's good to have, you know, the policy of course saying that you can be fired if

462
00:30:48,520 --> 00:30:54,600
you do that, but what you really want is like a hard technical control, because

463
00:30:54,600 --> 00:31:00,120
people will be lazy and eventually they'll probably kind of, you know, if you have an

464
00:31:00,120 --> 00:31:04,480
incident and you put a PAW in and you put all these cool rules in and that's awesome and

465
00:31:04,480 --> 00:31:09,960
kind of 12 months later, two years later, the security, you know, the compromise you

466
00:31:09,960 --> 00:31:14,680
had is kind of a distant memory and then the bad habits sneak back in.

467
00:31:14,680 --> 00:31:15,680
That's just human nature.

468
00:31:15,680 --> 00:31:19,280
Yeah, it has to be a strong preventative control, right?

469
00:31:19,280 --> 00:31:20,280
Like not a reactive control.

470
00:31:20,280 --> 00:31:25,000
It has to be something that prevents someone from accessing the environment unless they're

471
00:31:25,000 --> 00:31:27,960
using a privileged access workstation or similar.

472
00:31:27,960 --> 00:31:28,960
Yeah.

473
00:31:28,960 --> 00:31:29,960
Yeah.

474
00:31:29,960 --> 00:31:34,440
It's good to know that you're seeing people actually adopt it, but only after that you're

475
00:31:34,440 --> 00:31:38,200
like, you know, the incident's happened, but I guess that's better than never.

476
00:31:38,200 --> 00:31:41,840
So Matt, one thing you may be aware of is that every time we have a guest, we always

477
00:31:41,840 --> 00:31:43,360
ask them for a final thought.

478
00:31:43,360 --> 00:31:46,520
So if there's one final thought you'd like to leave our listeners with, what would it

479
00:31:46,520 --> 00:31:47,520
be?

480
00:31:47,520 --> 00:31:52,320
Yeah, I think my final thought kind of speaks to what we were just talking about actually

481
00:31:52,320 --> 00:31:59,320
in that, look, in our experience, we see the don't waste a crisis mentality and like it's

482
00:31:59,320 --> 00:32:04,400
great that companies go through and make the changes after the fact, but we'd really like

483
00:32:04,400 --> 00:32:06,880
to have people do it beforehand.

484
00:32:06,880 --> 00:32:13,160
And really it's easier, it's cheaper, it's less stressful to make those changes before

485
00:32:13,160 --> 00:32:16,320
an incident than when you're in the heat of it.

486
00:32:16,320 --> 00:32:21,960
And kind of leading on from that, a saying we have is, you know, don't

487
00:32:21,960 --> 00:32:25,440
let perfection be the enemy of good.

488
00:32:25,440 --> 00:32:28,600
Your security controls don't need to be perfect.

489
00:32:28,600 --> 00:32:33,920
You don't need a hundred percent coverage of MFA; it doesn't need to be that.

490
00:32:33,920 --> 00:32:36,400
It can be just iterative.

491
00:32:36,400 --> 00:32:42,000
So you can deploy MFA to your admins, to your privileged accounts, to your board members,

492
00:32:42,000 --> 00:32:43,000
things like that.

493
00:32:43,000 --> 00:32:44,280
And it's all risk reduction.

494
00:32:44,280 --> 00:32:50,600
So trying to kind of architect solutions for a hundred thousand users, there's always going

495
00:32:50,600 --> 00:32:53,840
to be exclusions and that's not a problem.

496
00:32:53,840 --> 00:32:58,520
But don't let those exclusions kind of get in the way of rolling out these things

497
00:32:58,520 --> 00:33:00,560
to the majority of users.

498
00:33:00,560 --> 00:33:04,120
I think that would be my final thought is like I say, it's all risk reduction.

499
00:33:04,120 --> 00:33:07,360
If you can reduce risk a little bit, that's a really good win.

500
00:33:07,360 --> 00:33:08,360
Excellent.

501
00:33:08,360 --> 00:33:09,360
Thanks for that.

502
00:33:09,360 --> 00:33:10,360
So it makes absolute sense.

503
00:33:10,360 --> 00:33:13,840
And with that, let's bring this episode to an end.

504
00:33:13,840 --> 00:33:16,040
So Matt, thank you so much for joining us this week.

505
00:33:16,040 --> 00:33:21,440
I always learn something on these episodes and this was absolutely no exception.

506
00:33:21,440 --> 00:33:24,600
And to all our listeners out there, thank you so much for listening.

507
00:33:24,600 --> 00:33:26,720
Stay safe and we'll see you next time.

508
00:33:26,720 --> 00:33:29,760
Thanks for listening to the Azure Security Podcast.

509
00:33:29,760 --> 00:33:36,560
You can find show notes and other resources at our website, azsecuritypodcast.net.

510
00:33:36,560 --> 00:33:41,440
If you have any questions, please find us on Twitter at AzureSecPod.

511
00:33:41,440 --> 00:34:05,320
Background music is from ccmixter.com and licensed under the Creative Commons license.

