1
00:00:00,000 --> 00:00:06,200
Welcome to the Azure Security Podcast,

2
00:00:06,200 --> 00:00:08,520
where we discuss topics relating to security,

3
00:00:08,520 --> 00:00:13,760
privacy, reliability, and compliance on the Microsoft Cloud Platform.

4
00:00:13,760 --> 00:00:17,460
Hey, everybody. Welcome to episode 76.

5
00:00:17,460 --> 00:00:20,720
This week, it's myself, Michael, with Mark and Sarah.

6
00:00:20,720 --> 00:00:22,000
We have a guest this week,

7
00:00:22,000 --> 00:00:26,480
Nigar Shabab, who's here to talk to us about Microsoft Security Research.

8
00:00:26,480 --> 00:00:27,960
But before we get to our guest,

9
00:00:27,960 --> 00:00:29,800
let's take a little lap around the news.

10
00:00:29,800 --> 00:00:31,880
Mark, why don't you kick things off?

11
00:00:31,880 --> 00:00:34,800
So a couple of things that I've been focusing on

12
00:00:34,800 --> 00:00:36,960
lately that I thought would be of interest.

13
00:00:36,960 --> 00:00:40,080
The first is a call out and request.

14
00:00:40,080 --> 00:00:41,640
So the new version of

15
00:00:41,640 --> 00:00:46,760
the Microsoft Cybersecurity Reference Architecture or MCRA is in development.

16
00:00:46,760 --> 00:00:50,160
So I'm working on that quite a bit lately.

17
00:00:50,160 --> 00:00:53,120
Definitely got a plan for all the things that we're doing in there,

18
00:00:53,120 --> 00:00:55,480
but would love to hear any requirements, thoughts,

19
00:00:55,480 --> 00:00:59,800
ideas from folks on how you use it,

20
00:00:59,800 --> 00:01:01,600
what you wish it had in it,

21
00:01:01,600 --> 00:01:05,080
and what you wish was a little bit different or a little bit clearer,

22
00:01:05,080 --> 00:01:07,400
had a little more detail, something like that.

23
00:01:07,400 --> 00:01:09,240
So I'd love to get feedback on that.

24
00:01:09,240 --> 00:01:12,040
So just hit me up on LinkedIn or Twitter or whatever.

25
00:01:12,040 --> 00:01:16,280
Love to hear what you are interested in there.

26
00:01:16,280 --> 00:01:19,000
The other thing that's been interesting to me is,

27
00:01:19,000 --> 00:01:24,040
I've had a few conversations recently around what I'm calling, for the lack of

28
00:01:24,040 --> 00:01:27,200
a better term, a security alignment paradox.

29
00:01:27,200 --> 00:01:29,080
So I posted a slide,

30
00:01:29,080 --> 00:01:33,160
it's actually part of the architecture design session module one for

31
00:01:33,160 --> 00:01:36,280
the end-to-end Zero Trust architecture that we're

32
00:01:36,280 --> 00:01:40,400
delivering through our unified engagements.

33
00:01:40,400 --> 00:01:44,560
It's part of a Rosetta Stone where you've got all these different models like

34
00:01:44,560 --> 00:01:49,240
defense in depth and Zero Trust principles and the cybersecurity framework,

35
00:01:49,240 --> 00:01:50,720
and all these other things that guide us,

36
00:01:50,720 --> 00:01:54,800
like MITRE ATT&CK, etc. And put them all together in one place and

37
00:01:54,800 --> 00:01:57,000
explain what they're good for, what they're not.

38
00:01:57,000 --> 00:01:59,520
That's part of that workshop, that end-to-end architecture.

39
00:01:59,520 --> 00:02:03,120
But one of them that we ended up creating over the course of this,

40
00:02:03,120 --> 00:02:06,080
as we were trying to do some planning and stuff,

41
00:02:06,080 --> 00:02:10,240
we realized that it's nearly impossible to map defenses

42
00:02:10,240 --> 00:02:14,640
cleanly to attacks or to business outcomes from security.

43
00:02:14,640 --> 00:02:17,840
So security is in this difficult,

44
00:02:17,840 --> 00:02:22,680
unexplainable position in some ways, and not unexplainable entirely,

45
00:02:22,680 --> 00:02:25,040
but just the things that we do that are the right things,

46
00:02:25,040 --> 00:02:27,360
the most important things to do to mitigate risk,

47
00:02:27,360 --> 00:02:32,000
they're going to mitigate on a many to many basis different attacks.

48
00:02:32,000 --> 00:02:35,720
And so it's not like, oh, you do this one thing and then magically this attack

49
00:02:35,720 --> 00:02:38,560
disappears, like that doesn't actually happen.

50
00:02:38,560 --> 00:02:43,640
And if you do this one thing, then you magically fix this business risk.

51
00:02:43,640 --> 00:02:45,080
That doesn't happen either.

52
00:02:45,080 --> 00:02:47,760
So security is sort of caught in the middle.

53
00:02:47,760 --> 00:02:50,080
And I'll be honest with you, it's sort of an interesting realization.

54
00:02:50,080 --> 00:02:54,680
It took off on like 15,000 impressions or something like that.

55
00:02:54,680 --> 00:02:57,000
And I don't know, a couple hundred reactions.

56
00:02:57,000 --> 00:02:59,880
It was very interesting to me that it took off.

57
00:02:59,880 --> 00:03:03,360
Because I don't know, I think security people like bad news for some reason.

58
00:03:03,360 --> 00:03:06,400
But I'm trying to kind of figure out what to do with it, to be honest,

59
00:03:06,400 --> 00:03:08,760
because it's a truth and it's useful.

60
00:03:08,760 --> 00:03:11,320
But I haven't quite figured out what it's useful for.

61
00:03:11,320 --> 00:03:13,320
I mean, it helps us explain that security is hard,

62
00:03:13,320 --> 00:03:16,080
and it's kind of a nice, simple visual for doing that.

63
00:03:16,080 --> 00:03:19,280
But I'm just trying to figure out what people think about it

64
00:03:19,280 --> 00:03:22,840
and how we can use it to sort of help our industry move forward.

65
00:03:22,840 --> 00:03:25,160
So yeah, definitely interested in folks' feedback.

66
00:03:25,160 --> 00:03:28,320
We'll include a link to the LinkedIn post that has a visual

67
00:03:28,320 --> 00:03:30,920
and sort of the current discussion, et cetera.

68
00:03:30,920 --> 00:03:35,800
But yeah, those are the two things for me that's going on in my world.

69
00:03:35,800 --> 00:03:44,640
So I've just got one piece of exciting news, which is my baby,

70
00:03:44,640 --> 00:03:48,560
Microsoft Sentinel is now available in China.

71
00:03:48,560 --> 00:03:54,080
So that's something that a lot of customers who have operations in China

72
00:03:54,080 --> 00:03:57,320
have been asking for for a while.

73
00:03:57,320 --> 00:04:02,520
You may or may not know that Azure in China is physically,

74
00:04:02,520 --> 00:04:08,080
it is separate from the rest of Azure, a bit like our government cloud.

75
00:04:08,080 --> 00:04:10,920
So it can adhere to local regulations.

76
00:04:10,920 --> 00:04:15,680
So if you wanted to use Sentinel and you wanted to use it in China,

77
00:04:15,680 --> 00:04:19,680
it's now available in public preview in the China East 2 region.

78
00:04:19,680 --> 00:04:25,000
And I do know that there are quite a few folks, both within China

79
00:04:25,000 --> 00:04:31,160
and some international companies that have presences in China that wanted to use it.

80
00:04:31,160 --> 00:04:33,360
So that's good news.

81
00:04:33,360 --> 00:04:37,240
And that's just my one bit of news this week.

82
00:04:37,240 --> 00:04:38,720
So I've got a few little items.

83
00:04:38,720 --> 00:04:42,520
The first one is we have some new confidential VMs.

84
00:04:42,520 --> 00:04:47,760
If you are familiar with the current sort of incarnation of the confidential VMs,

85
00:04:47,760 --> 00:04:51,040
they presently use an AMD EPYC chip.

86
00:04:51,040 --> 00:04:54,720
And essentially, the root of trust is all the way down in those CPUs.

87
00:04:54,720 --> 00:04:56,880
We now have some new confidential VMs.

88
00:04:56,880 --> 00:04:57,600
They're in preview.

89
00:04:57,600 --> 00:05:03,520
They're the DCesv5 and ECesv5 series.

90
00:05:03,520 --> 00:05:09,480
And they use Intel TDX, which, if you squint, is a similar idea.

91
00:05:09,480 --> 00:05:12,560
So they're in preview right now.

92
00:05:12,560 --> 00:05:16,520
Next one is this is actually kind of cool to see.

93
00:05:16,520 --> 00:05:20,600
We're kind of alarmed at how many people don't use auditing in Azure SQL Database.

94
00:05:20,600 --> 00:05:24,040
And we think part of that may have been the documentation wasn't exactly,

95
00:05:24,040 --> 00:05:25,880
let's just say it could be improved.

96
00:05:25,880 --> 00:05:27,160
Well, it's been improved.

97
00:05:27,160 --> 00:05:32,480
And so there's a link in the show notes to the new sort of landing page for auditing.

98
00:05:32,480 --> 00:05:35,040
It explains things really, really well.

99
00:05:35,040 --> 00:05:38,520
It explains all the requirements to set up auditing in Azure SQL Database.

100
00:05:38,520 --> 00:05:42,360
So it's had a complete rework, which is great to see.

101
00:05:42,360 --> 00:05:45,000
Next one, which has got nothing whatsoever to do with security really,

102
00:05:45,000 --> 00:05:47,760
but it's something that's in my backyard.

103
00:05:47,760 --> 00:05:52,640
There's a new .NET library for T-SQL parsing.

104
00:05:52,640 --> 00:05:56,320
Now, more importantly, it's available in open source.

105
00:05:56,320 --> 00:05:57,760
And it's called ScriptDom.

106
00:05:57,760 --> 00:06:01,680
And it's basically just a NuGet package that you can download.

107
00:06:01,680 --> 00:06:06,520
The nice thing about it is if you wanted to do static analysis of T-SQL statements

108
00:06:06,520 --> 00:06:09,600
in some code somewhere, so for example, detecting SQL injection vulnerabilities

109
00:06:09,600 --> 00:06:14,360
or something like that, you now have available at your fingertips

110
00:06:14,360 --> 00:06:17,000
a library that will actually do the parsing for T-SQL.

111
00:06:17,000 --> 00:06:21,440
So T-SQL is Transact-SQL, which is the flavor of SQL that SQL Server and Azure SQL

112
00:06:21,440 --> 00:06:22,880
Database use.
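As a rough illustration of the idea, here is a toy Python sketch of flagging a classic injection pattern, an always-true comparison, in a T-SQL statement. This is an assumption-laden stand-in, not ScriptDom itself (ScriptDom is a .NET library and would hand you a full parse tree rather than a regex match):

```python
import re

# Hypothetical toy check, NOT the ScriptDom API: flag a clause
# containing an always-true comparison such as OR 1=1 or OR 'a'='a'.
# A real analyzer built on a T-SQL parser would walk the parse tree
# instead of pattern matching on raw text.
_TAUTOLOGY = re.compile(r"\bor\s+('?\w+'?)\s*=\s*\1", re.IGNORECASE)

def find_tautology(sql: str) -> bool:
    """Return True if the statement contains an OR <x>=<x> tautology."""
    return bool(_TAUTOLOGY.search(sql))
```

Even this toy shows why a proper parser helps: the regex misses anything it wasn't written for, while a parse tree lets you reason about the statement's actual structure.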

113
00:06:22,880 --> 00:06:26,360
And the last point is I wrote a blog post.

114
00:06:26,360 --> 00:06:30,240
It's up on the tech community website.

115
00:06:30,240 --> 00:06:33,000
And it's called The Importance of TLS with SQL Server.

116
00:06:33,000 --> 00:06:35,280
And the same applies to Azure SQL Database.

117
00:06:35,280 --> 00:06:38,080
But basically, it's explaining just how important TLS is.

118
00:06:38,080 --> 00:06:40,120
And it's not for the reasons you think.

119
00:06:40,120 --> 00:06:42,120
A lot of people think that it's just about, hey,

120
00:06:42,120 --> 00:06:44,920
you know, protecting my credit cards as they fly across the wire.

121
00:06:44,920 --> 00:06:46,160
Yeah, that's important.

122
00:06:46,160 --> 00:06:48,160
But in actual fact, the most important thing it does

123
00:06:48,160 --> 00:06:49,800
is provide server authentication.

124
00:06:49,800 --> 00:06:51,360
And then once you have the server authentication,

125
00:06:51,360 --> 00:06:53,280
then you can do the channel protections.

126
00:06:53,280 --> 00:06:58,360
And unfortunately, people get that server authentication part wrong many,

127
00:06:58,360 --> 00:07:01,600
many times through various settings in their SQL connection strings.

128
00:07:01,600 --> 00:07:02,880
So I go through all of that.

129
00:07:02,880 --> 00:07:05,840
And basically, at the end of it, just plead with people to please just do TLS

130
00:07:05,840 --> 00:07:07,560
correctly.
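To make the connection-string point concrete, here is a minimal Python sketch of the two settings that matter. The server and database names are hypothetical; the keywords shown are the ones used with the SQL Server ODBC driver:

```python
# Minimal sketch, assuming the ODBC Driver 18 keyword names.
# Encrypt=yes requires TLS on the wire; TrustServerCertificate=no
# makes the client actually validate the server's certificate,
# which is the server-authentication step. Setting it to "yes"
# still encrypts the channel but skips authentication, which is
# the common mistake described above.
def build_conn_str(server: str, database: str) -> str:
    return (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={server};Database={database};"
        "Encrypt=yes;"
        "TrustServerCertificate=no;"
    )
```

The string could then be passed to a driver such as pyodbc; the key takeaway is that encryption without certificate validation is not real TLS protection.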

131
00:07:07,560 --> 00:07:11,000
So with the news out of the way, let's turn our attention this week

132
00:07:11,000 --> 00:07:12,200
to our guest.

133
00:07:12,200 --> 00:07:14,880
This week, we have Nigar Shabab, who is here

134
00:07:14,880 --> 00:07:18,520
from Microsoft in Melbourne, Australia.

135
00:07:18,520 --> 00:07:22,560
And she's here to talk to us about Microsoft Security Research.

136
00:07:22,560 --> 00:07:25,320
So Nigar, first of all, welcome to the podcast.

137
00:07:25,320 --> 00:07:28,640
We'd like to spend a moment and introduce yourself to our listeners.

138
00:07:28,640 --> 00:07:29,160
Hello.

139
00:07:29,160 --> 00:07:30,560
Hello, everyone.

140
00:07:30,560 --> 00:07:32,320
Pleasure being here.

141
00:07:32,320 --> 00:07:33,840
I'm Nigar.

142
00:07:33,840 --> 00:07:37,760
I've been with Microsoft for three years now.

143
00:07:37,760 --> 00:07:41,920
I'm part of the Microsoft Threat Intelligence community,

144
00:07:41,920 --> 00:07:44,080
working for Windows Defender.

145
00:07:44,080 --> 00:07:49,440
And I've been doing malware research for over 10 years now.

146
00:07:49,440 --> 00:07:50,600
That's what I am.

147
00:07:50,600 --> 00:07:51,360
Fantastic.

148
00:07:51,360 --> 00:07:54,560
So I want to ask the most obvious question to start off with.

149
00:07:54,560 --> 00:07:56,440
So you're actually the first person we've actually

150
00:07:56,440 --> 00:08:00,360
had on the podcast from Microsoft Security Research.

151
00:08:00,360 --> 00:08:01,200
So what do you do?

152
00:08:01,200 --> 00:08:03,000
I mean, what does a typical day look like?

153
00:08:03,000 --> 00:08:04,360
And what sort of things do you focus on?

154
00:08:04,360 --> 00:08:07,640
Do you just sit in front of IDA Pro all day, like debugging malware?

155
00:08:07,640 --> 00:08:10,600
I'm just really curious as to what you sort of do on a day-to-day basis.

156
00:08:10,600 --> 00:08:12,240
And what's the team's focus?

157
00:08:12,240 --> 00:08:15,400
We have a very large team of researchers.

158
00:08:15,400 --> 00:08:18,720
All the day-to-day jobs are kind of different.

159
00:08:18,720 --> 00:08:23,760
Even inside Defender, different teams focus on different areas

160
00:08:23,760 --> 00:08:27,400
and different types of malware or attacks.

161
00:08:27,400 --> 00:08:30,880
For me, I'm part of a team in Malware

162
00:08:30,880 --> 00:08:36,240
that mostly focuses on adware and browser-based threats.

163
00:08:36,240 --> 00:08:43,960
I personally don't use IDA every day, but once in a while.

164
00:08:43,960 --> 00:08:51,440
And so our research covers a wide range of different things

165
00:08:51,440 --> 00:08:56,920
we do from looking at one specific attack module,

166
00:08:56,920 --> 00:09:00,000
like looking at it inside IDA, as you said,

167
00:09:00,000 --> 00:09:04,640
or analyzing it with different tools,

168
00:09:04,640 --> 00:09:09,840
to looking at the whole family or the whole campaign,

169
00:09:09,840 --> 00:09:14,040
and doing some sort of threat intelligence,

170
00:09:14,040 --> 00:09:21,680
to just trying to improve detection for our known malware.

171
00:09:21,680 --> 00:09:26,160
Yeah, so something I should probably explain is IDA, just spelled I-D-A,

172
00:09:26,160 --> 00:09:31,440
it's a very commonly used tool for understanding what malware does

173
00:09:31,440 --> 00:09:34,240
and also for things like debugging patches and those sorts of things.

174
00:09:34,240 --> 00:09:36,600
So it's a very, very common tool used

175
00:09:36,600 --> 00:09:39,000
throughout the professional security community.

176
00:09:39,000 --> 00:09:43,680
You said different teams focus on different things.

177
00:09:43,680 --> 00:09:48,360
Obviously, because there's a huge number of threats out there to research.

178
00:09:48,360 --> 00:09:53,680
Are you able to talk a little bit about specifically what your team do

179
00:09:53,680 --> 00:09:56,520
and the kind of things that you can see?

180
00:09:56,520 --> 00:10:00,200
Basically, we are not in silos.

181
00:10:00,200 --> 00:10:04,040
So we kind of work together with different teams.

182
00:10:04,040 --> 00:10:13,160
For example, specifically our team is in contact with Edge

183
00:10:13,160 --> 00:10:22,520
or Bing or some other teams to find the research areas we do every day.

184
00:10:22,520 --> 00:10:29,400
So we basically focus on any kind of browser-based threats.

185
00:10:29,400 --> 00:10:32,600
This is one of our main focuses.

186
00:10:32,600 --> 00:10:37,760
To give you some examples, all the malicious browser extensions,

187
00:10:37,760 --> 00:10:41,080
that might get installed on your system.

188
00:10:41,080 --> 00:10:46,160
It might be something you install or something that you pick up

189
00:10:46,160 --> 00:10:50,320
while installing something else or just browsing the internet.

190
00:10:50,320 --> 00:10:54,680
And this is one of the main focuses of our team.

191
00:10:54,680 --> 00:10:59,440
And another thing is any other type of adware,

192
00:10:59,440 --> 00:11:03,400
not necessarily browser-based, but any kind,

193
00:11:03,400 --> 00:11:09,720
like any tool advertising, hack tools, or whatever.

194
00:11:09,720 --> 00:11:11,200
I'm kind of curious.

195
00:11:11,200 --> 00:11:14,000
There's a couple of questions that come to mind.

196
00:11:14,000 --> 00:11:15,680
It seems like you focus on browsers.

197
00:11:15,680 --> 00:11:20,320
There's obviously people that break down malware of different types, et cetera.

198
00:11:20,320 --> 00:11:23,800
So I'm kind of curious, what are the different types of researchers

199
00:11:23,800 --> 00:11:24,920
that are sort of out there?

200
00:11:24,920 --> 00:11:27,560
Because I'm sort of a novice to the whole security research space

201
00:11:27,560 --> 00:11:31,280
in some ways, career-wise.

202
00:11:31,280 --> 00:11:33,680
And then the other question I have is, how much of this

203
00:11:33,680 --> 00:11:36,560
is proactive versus reactive?

204
00:11:36,560 --> 00:11:40,600
You've got the actual malware and the actual things people are doing

205
00:11:40,600 --> 00:11:45,960
versus what could be done more of like a red team attack simulation perspective.

206
00:11:45,960 --> 00:11:48,680
I'm kind of curious how that blends out.

207
00:11:48,680 --> 00:11:53,240
So are you talking about malware research in general?

208
00:11:53,240 --> 00:11:54,920
Just in general.

209
00:11:54,920 --> 00:11:58,720
Just thinking of someone that's completely new to security research,

210
00:11:58,720 --> 00:12:01,280
how much of what you do is looking at what attackers are already doing

211
00:12:01,280 --> 00:12:04,760
versus trying to anticipate what they might try next?

212
00:12:04,760 --> 00:12:07,360
That was a very good question.

213
00:12:07,360 --> 00:12:14,280
Actually, we always look at the attacks from these two different perspectives,

214
00:12:14,280 --> 00:12:19,720
from an attacker point of view and the defender point of view.

215
00:12:19,720 --> 00:12:26,720
In our team, we don't do the red teaming or the offensive part of it.

216
00:12:26,720 --> 00:12:31,280
But it's a very big area, security research umbrella.

217
00:12:31,280 --> 00:12:34,000
I've done a little bit of that.

218
00:12:34,000 --> 00:12:41,440
I tried during these years that I'm doing security research,

219
00:12:41,440 --> 00:12:47,800
I tried getting into different areas a little bit.

220
00:12:47,800 --> 00:12:53,440
It's very cool to do red teaming.

221
00:12:53,440 --> 00:12:59,800
And I guess it also helps you when you do the other side,

222
00:12:59,800 --> 00:13:04,720
like when I'm working on the defensive side,

223
00:13:04,720 --> 00:13:10,480
it helps me to have a little bit of knowledge about the offensive side of it.

224
00:13:10,480 --> 00:13:11,280
That's interesting.

225
00:13:11,280 --> 00:13:14,480
So I didn't realize that there was sort of the red versus blue

226
00:13:14,480 --> 00:13:17,000
kind of elements within the research space.

227
00:13:17,000 --> 00:13:17,960
That's kind of cool.

228
00:13:17,960 --> 00:13:22,800
Do folks kind of cross between different technology and focus areas?

229
00:13:22,800 --> 00:13:27,080
Like people get bored with browsers and they want to work on different types of malware?

230
00:13:27,080 --> 00:13:29,320
Do folks tend to focus in the same area?

231
00:13:29,320 --> 00:13:31,280
I'm just kind of curious there.

232
00:13:31,280 --> 00:13:32,680
We are pretty flexible.

233
00:13:32,680 --> 00:13:37,720
We try to experiment with different areas.

234
00:13:37,720 --> 00:13:40,400
Definitely not a very routine job.

235
00:13:40,400 --> 00:13:46,120
And we don't get to work on the same thing or similar things over and over.

236
00:13:46,120 --> 00:13:50,200
Depends on the attacks we see every day.

237
00:13:50,200 --> 00:13:56,520
We have to kind of adapt to the evolving nature of the attacks,

238
00:13:56,520 --> 00:13:58,000
in the attack scenarios.

239
00:13:58,000 --> 00:13:58,480
Gotcha.

240
00:13:58,480 --> 00:14:02,000
So it sounds like in your world it's very attack driven

241
00:14:02,000 --> 00:14:05,080
and what the attackers are doing and trends and whatnot.

242
00:14:05,080 --> 00:14:05,600
Yep.

243
00:14:05,600 --> 00:14:06,360
That's right.

244
00:14:06,360 --> 00:14:10,560
And I assume that the research that your team does,

245
00:14:10,560 --> 00:14:13,240
a lot of that ends up in our various Defender products.

246
00:14:13,240 --> 00:14:16,160
Is there any, could it end up in any Defender product?

247
00:14:16,160 --> 00:14:19,240
Or is it like Defender for Endpoint or something like that?

248
00:14:19,240 --> 00:14:24,440
What sort of products do you end up sort of affecting?

249
00:14:24,440 --> 00:14:28,480
I guess our research ends up in any Defender product.

250
00:14:28,480 --> 00:14:32,040
Yeah, basically everything we do in our team,

251
00:14:32,040 --> 00:14:36,160
one of the main goals is for it to end up in Defender.

252
00:14:36,160 --> 00:14:38,440
Most of our research is what we do,

253
00:14:38,440 --> 00:14:43,640
we want to add it to Defender to improve the detection.

254
00:14:43,640 --> 00:14:45,640
So one thing I sort of touched on briefly is,

255
00:14:45,640 --> 00:14:48,240
you know, you have different types of researchers like,

256
00:14:48,240 --> 00:14:51,560
you know, you got like red, those in red teams and blue teams and so on.

257
00:14:51,560 --> 00:14:55,880
Someone was interested in getting into sort of security research,

258
00:14:55,880 --> 00:14:57,920
like malware research.

259
00:14:57,920 --> 00:14:59,360
I mean, where would you start?

260
00:14:59,360 --> 00:15:02,360
I mean, you know, if I'm sitting at a computer right now,

261
00:15:02,360 --> 00:15:07,120
like, you know, I really want to just sort of learn more about malware security research.

262
00:15:07,120 --> 00:15:10,200
I mean, what would you advise people to do next?

263
00:15:10,200 --> 00:15:18,200
I can say there are different areas in malware research.

264
00:15:18,200 --> 00:15:24,960
My suggestion is to go and learn about, like,

265
00:15:24,960 --> 00:15:31,040
get into any of them, try to see which one you like,

266
00:15:31,040 --> 00:15:38,000
learn from each area a little bit and try to see what you are interested in.

267
00:15:38,000 --> 00:15:41,400
And there are different, they have different,

268
00:15:41,400 --> 00:15:45,440
like there are some things in common between all of them,

269
00:15:45,440 --> 00:15:52,560
but definitely some people are more interested in the red side.

270
00:15:52,560 --> 00:15:56,040
Some of them are more interested on the blue side.

271
00:15:56,040 --> 00:15:57,640
Yeah, give it a go.

272
00:15:57,640 --> 00:16:03,160
Try to build skills on different sides and see what suits you.

273
00:16:03,160 --> 00:16:05,240
All right, let's take it one step further.

274
00:16:05,240 --> 00:16:09,880
So let's say you want to be on the blue side, right, which is the defensive side.

275
00:16:09,880 --> 00:16:13,280
So give me an example of what you, or say examples if you could,

276
00:16:13,280 --> 00:16:18,400
of what sort of tools or technologies or techniques

277
00:16:18,400 --> 00:16:22,280
you think people should sit down and study.

278
00:16:22,280 --> 00:16:28,880
I think reverse engineering is one of the main skills you need.

279
00:16:28,880 --> 00:16:37,640
Even if you want to do malware research or if you want to do incident response

280
00:16:37,640 --> 00:16:42,760
or any other areas on the blue side.

281
00:16:42,760 --> 00:16:48,960
And I think reverse engineering is, it actually helps you on the red side as well.

282
00:16:48,960 --> 00:16:53,840
But yeah, I guess it's one of the main skills you need to have.

283
00:16:53,840 --> 00:16:57,560
So, and that's where IDA comes in, right, to help with the reverse engineering.

284
00:16:57,560 --> 00:16:58,800
Yeah.

285
00:16:58,800 --> 00:17:00,840
But on that topic though, it's funny you should bring that up.

286
00:17:00,840 --> 00:17:04,920
I mean, if you're reversing some malware,

287
00:17:04,920 --> 00:17:06,640
you've really got to have a programming background, right?

288
00:17:06,640 --> 00:17:08,800
Is that a fair comment?

289
00:17:08,800 --> 00:17:11,840
You should know some programming.

290
00:17:11,840 --> 00:17:16,280
Yeah, you don't have to be a professional programmer.

291
00:17:16,280 --> 00:17:16,960
That's good to know.

292
00:17:16,960 --> 00:17:22,080
I mean, I remember this is a while ago and I was debugging something with,

293
00:17:22,080 --> 00:17:24,080
someone was watching me do it.

294
00:17:24,080 --> 00:17:26,960
And I ended up stepping through the assembly language.

295
00:17:26,960 --> 00:17:30,200
It was on x64.

296
00:17:30,200 --> 00:17:32,080
And this guy's like, what are you doing?

297
00:17:32,080 --> 00:17:34,200
Why don't you just debug the C code?

298
00:17:34,200 --> 00:17:35,200
It was written in C.

299
00:17:35,200 --> 00:17:38,320
I'm like, why don't you just debug the C code?

300
00:17:38,320 --> 00:17:42,880
I'm like, I mean, I could, but I really want to see what the code is really,

301
00:17:42,880 --> 00:17:44,440
really doing.

302
00:17:44,440 --> 00:17:49,440
And an optimizer can easily change the C code or the C++ code

303
00:17:49,440 --> 00:17:51,880
to be some completely different assembly language.

304
00:17:51,880 --> 00:17:54,840
So I want to just see precisely what the code is doing.

305
00:17:54,840 --> 00:17:59,440
And that's why I was single-stepping through individual instructions.

306
00:17:59,440 --> 00:18:03,760
And in fact, I think this is my advice, I guess, for people if they want to do

307
00:18:03,760 --> 00:18:07,400
reverse engineering, is you're probably going to have to learn assembly language

308
00:18:07,400 --> 00:18:13,320
as well, because again, knowing what the malware is doing, they're not going to

309
00:18:13,320 --> 00:18:15,480
give you the source code.

310
00:18:15,480 --> 00:18:17,120
So you need to really step through the assembly language.

311
00:18:17,120 --> 00:18:19,400
So you're going to have to learn basic assembly language as well.
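As a loose analogy in Python (an illustration of this sketch's own choosing, not something from the show), the standard library's `dis` module shows the bytecode a function actually executes, much as stepping through x64 assembly shows what the optimizer made of your C source:

```python
import dis

def check(password: str) -> bool:
    # The source line states the intent; the disassembly below
    # shows the instructions the interpreter actually runs.
    return password == "hunter2"

# Print the bytecode: LOAD_FAST, LOAD_CONST, COMPARE_OP, and so on.
dis.dis(check)
```

The function name and string here are made up for the example; the point is simply that looking one level below the source is how you see what the code is "really, really doing".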

312
00:18:19,400 --> 00:18:24,000
Is that a fair comment or do I just scare a whole bunch of people off?

313
00:18:24,000 --> 00:18:24,880
No, that's fair.

314
00:18:24,880 --> 00:18:32,080
And I can say that, if you are a C programmer, you don't always get

315
00:18:32,080 --> 00:18:36,280
malware written in C. If you want to work on different kinds of

316
00:18:36,280 --> 00:18:43,840
malware, you have to be curious and constantly learn different

317
00:18:43,840 --> 00:18:45,680
programming languages.

318
00:18:45,680 --> 00:18:49,800
Not like you want to code in that programming language, but you have to

319
00:18:49,800 --> 00:18:51,360
understand it.

320
00:18:51,360 --> 00:18:58,400
So yeah, even if you know C, it doesn't necessarily help you in

321
00:18:58,400 --> 00:19:01,640
analyzing all kinds of malware.

322
00:19:01,640 --> 00:19:02,440
Yeah, that's a good point.

323
00:19:02,440 --> 00:19:03,560
That's a really good point.

324
00:19:03,560 --> 00:19:04,720
Actually, so on that topic.

325
00:19:04,720 --> 00:19:08,640
So what languages are we seeing as the predominant languages now that

326
00:19:08,640 --> 00:19:10,240
people are using to write malware?

327
00:19:10,240 --> 00:19:15,560
Different types of malware are written in different types of languages,

328
00:19:15,560 --> 00:19:17,560
like script malware.

329
00:19:17,560 --> 00:19:25,080
Even some executables are written in script-based languages, like Node.js

330
00:19:25,080 --> 00:19:28,920
or different types of programming languages.

331
00:19:28,920 --> 00:19:31,960
And I'm not an expert in that area.

332
00:19:31,960 --> 00:19:32,880
Can you tell us a story?

333
00:19:32,880 --> 00:19:36,400
Tell us about a fun case at work where someone sort of surprised you or

334
00:19:36,400 --> 00:19:38,440
was it interesting or you learned something interesting.

335
00:19:38,440 --> 00:19:43,320
I'd love to hear sort of what it's like to sort of run something down.

336
00:19:43,320 --> 00:19:47,280
I guess we get surprised every day.

337
00:19:47,280 --> 00:19:55,520
So with any piece of new malware or new campaign, you see something new and

338
00:19:55,520 --> 00:19:57,720
then you have to learn it.

339
00:19:57,720 --> 00:20:04,120
You have to learn how to analyze it and how this would work.

340
00:20:04,120 --> 00:20:11,560
I don't have anything specific on top of my mind, but yeah, definitely like

341
00:20:11,560 --> 00:20:19,760
with every malware campaign we work on, we see like new and interesting techniques.

342
00:20:19,760 --> 00:20:28,040
With regards to what you see in your research, do we see the same sort of

343
00:20:28,040 --> 00:20:36,040
culprits doing attacks over and over again or is it lots of different people?

344
00:20:36,040 --> 00:20:38,640
It's just like the whole internet.

345
00:20:38,640 --> 00:20:43,920
Are you able to tell us a little bit about kind of what you see at a very high level there?

346
00:20:43,920 --> 00:20:48,640
It really depends on the attack itself.

347
00:20:48,640 --> 00:20:54,240
There are some attacks that, as you know, are nation-state sponsored.

348
00:20:54,240 --> 00:21:03,400
So I can tell you that we've seen like different attacks from the same actor again and again.

349
00:21:03,400 --> 00:21:12,240
So the actor might be the same and they might use the same techniques or even the same

350
00:21:12,240 --> 00:21:17,160
modules from the old attacks in the new attacks.

351
00:21:17,160 --> 00:21:21,200
But definitely there are new attackers every day.

352
00:21:21,200 --> 00:21:28,960
But what we see is that we've seen evolution in the same attack a lot.

353
00:21:28,960 --> 00:21:35,400
Attackers try to avoid getting detected, so they change the attack chain a bit,

354
00:21:35,400 --> 00:21:41,720
evading detection and getting their way with what they do.

355
00:21:41,720 --> 00:21:51,520
When we see a different attack with a new approach, we can often say that it's the same actor with a different approach.

356
00:21:51,520 --> 00:21:54,440
Yeah, that same actor.

357
00:21:54,440 --> 00:21:57,520
Do you find that the different threat actors are learning from each other?

358
00:21:57,520 --> 00:21:58,520
Yeah, definitely.

359
00:21:58,520 --> 00:22:06,920
Yeah, and we see a lot of different tools, like the same tools being used by different attackers.

360
00:22:06,920 --> 00:22:11,920
I realize this is, Mike may have an opinion on this, but it's interesting.

361
00:22:11,920 --> 00:22:15,120
My guess is there's a whole ecosystem behind this, right?

362
00:22:15,120 --> 00:22:22,920
There are people who can generate malware and then people who purchase the malware to be used against victims and so on.

363
00:22:22,920 --> 00:22:31,120
Not just threat actors writing their own malware, they're probably purchasing it from other people who are experts in writing malware.

364
00:22:31,120 --> 00:22:32,320
Any thoughts on that?

365
00:22:32,320 --> 00:22:36,120
Again, I know Mike will probably have an opinion, but is that something that we see as well?

366
00:22:36,120 --> 00:22:38,120
Yeah, definitely. You're right.

367
00:22:38,120 --> 00:22:49,920
There are some people that are writing different pieces of malware and they sell it to different people who want to use them.

368
00:22:49,920 --> 00:22:59,720
So yeah, it's not always the case that the attacker writes every piece of every module of the attack themselves from scratch.

369
00:22:59,720 --> 00:23:07,920
Are you able to talk a little bit about how the research that you do, because of course Microsoft does do the research for a reason,

370
00:23:07,920 --> 00:23:14,320
are you able to talk about how the research that you and your team do ends up in Defender products?

371
00:23:14,320 --> 00:23:16,320
That's a very interesting question.

372
00:23:16,320 --> 00:23:26,320
So as I mentioned earlier, one of the main goals for our researchers is to improve detection in Defender.

373
00:23:26,320 --> 00:23:36,720
So what we do is look at the malware, either at different modules separately or at the whole behavior of the attack,

374
00:23:36,720 --> 00:23:47,120
and we add that knowledge into Defender in the form of some sort of malware signature.

375
00:23:47,120 --> 00:24:01,120
And a malware signature could be any sequence of bytes from the malware we analyze, something like a hash, or something behavioral.

376
00:24:01,120 --> 00:24:12,120
So we see some kind of behavior in the malware, or something unique about a piece of malware.

377
00:24:12,120 --> 00:24:18,120
And we add that as a signature for that malware to Defender.

378
00:24:18,120 --> 00:24:33,120
So anywhere, on any system, Defender picks up that sort of behavior, or a file with those specific features or sequence of bytes.

379
00:24:33,120 --> 00:24:36,120
It detects that specific malware.
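In very simplified terms, the byte-sequence and hash signatures described above can be sketched as a lookup against a small signature database. This is only a toy illustration of the general idea; Defender's real engine is far more sophisticated, and the signature values, the family name `ExampleFamily.A`, and the `scan` function here are all hypothetical.

```python
import hashlib

# Hypothetical signature database built from previously analyzed samples.
# A hash signature matches one exact file; a byte-sequence signature
# matches any file containing a distinctive byte pattern.
HASH_SIGNATURES = {
    hashlib.sha256(b"known-bad sample bytes").hexdigest(),  # placeholder sample
}
BYTE_SIGNATURES = {
    "ExampleFamily.A": b"\x4d\x5a\x90\x00EVIL_MARKER",  # invented byte pattern
}

def scan(data: bytes) -> list[str]:
    """Return the names of any signatures matching the given file contents."""
    hits = []
    # Exact-file match: compare the file's SHA-256 against known-bad hashes.
    if hashlib.sha256(data).hexdigest() in HASH_SIGNATURES:
        hits.append("hash-match")
    # Pattern match: look for each known byte sequence inside the file.
    for family, pattern in BYTE_SIGNATURES.items():
        if pattern in data:
            hits.append(family)
    return hits
```

A behavioral signature would work differently, matching sequences of runtime events (process creation, registry writes, network calls) rather than static file contents.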

380
00:24:36,120 --> 00:24:38,120
That's really cool.

381
00:24:38,120 --> 00:24:48,120
I always find how we get all of this information and all this research that we do into Microsoft products

382
00:24:48,120 --> 00:24:52,120
and turning it into something for the people who are using them, really, really interesting.

383
00:24:52,120 --> 00:24:56,120
Yeah. I mean, at the end of the day, this stuff's got to end up in shipped products, right?

384
00:24:56,120 --> 00:25:01,120
So it's good to see all this work going into products to protect our customers.

385
00:25:01,120 --> 00:25:03,120
All right. So let's wrap this thing up.

386
00:25:03,120 --> 00:25:10,120
Hey, Negar. So one thing we ask our guests is, if you had just one little thought to leave our listeners with, what would it be?

387
00:25:10,120 --> 00:25:18,120
As a researcher, I want to address the other researchers in this area.

388
00:25:18,120 --> 00:25:33,120
I want to tell them to stay connected with other researchers, because you're going to get different opinions from people and learn about new areas every day.

389
00:25:33,120 --> 00:25:38,120
People have different skills and different interests.

390
00:25:38,120 --> 00:25:47,120
So you might learn new things from staying connected with the community and other researchers.

391
00:25:47,120 --> 00:25:51,120
Yeah, that's so true, isn't it? I mean, you think you know it all, but in actual fact, you really don't.

392
00:25:51,120 --> 00:25:57,120
Yeah. Talk to five security researchers and five security people in general, and you get ten different opinions anyway.

393
00:25:57,120 --> 00:26:00,120
But that's really good advice.

394
00:26:00,120 --> 00:26:03,120
All right. So with that, let's bring this episode to an end.

395
00:26:03,120 --> 00:26:05,120
Negar, thank you so much for joining us this week.

396
00:26:05,120 --> 00:26:10,120
Again, you're the very first person we've had from Microsoft Security Research. So that was good.

397
00:26:10,120 --> 00:26:13,120
Thank you to all our listeners out there. We hope you found this episode useful.

398
00:26:13,120 --> 00:26:16,120
Stay safe and we'll see you next time.

399
00:26:16,120 --> 00:26:19,120
Thanks for listening to the Azure Security Podcast.

400
00:26:19,120 --> 00:26:26,120
You can find show notes and other resources at our website azsecuritypodcast.net.

401
00:26:26,120 --> 00:26:31,120
If you have any questions, please find us on Twitter at AzureSetPod.

402
00:26:31,120 --> 00:26:46,120
All of our background music is from ccmixter.org and licensed under the Creative Commons license.

