1
00:00:00,000 --> 00:00:09,600
Welcome to the Azure Security Podcast, where we discuss topics relating to security, privacy,

2
00:00:09,600 --> 00:00:13,280
reliability and compliance on the Microsoft Cloud Platform.

3
00:00:13,280 --> 00:00:17,120
Hey everybody, welcome to episode 103.

4
00:00:17,120 --> 00:00:19,640
This week is myself, Michael, with Sarah.

5
00:00:19,640 --> 00:00:24,760
Our guest this week is Nick Fillingham, who's here to talk to us about security conferences,

6
00:00:24,760 --> 00:00:27,120
most notably Microsoft Blue Hat.

7
00:00:27,120 --> 00:00:30,520
Before we get to our guest, let's take a little wrap around the news.

8
00:00:30,520 --> 00:00:33,360
I'll kick things off just to get going.

9
00:00:33,360 --> 00:00:37,840
First one is Azure Database for Postgres SQL, my old stomping ground.

10
00:00:37,840 --> 00:00:42,120
By the way, that's always Azure Database for Postgres SQL flexible server.

11
00:00:42,120 --> 00:00:47,920
Now has support for Postgres SQL anonymizer version 1.3.2.

12
00:00:47,920 --> 00:00:53,400
One of the cool things about Postgres is there's like an extension for absolutely everything.

13
00:00:53,400 --> 00:00:58,280
The Postgres anonymizer we've now updated to 1.3.2, which I assume is a good thing,

14
00:00:58,280 --> 00:01:00,840
but we'll provide a link in the show notes.

15
00:01:00,840 --> 00:01:04,800
Next one, and this is awesome to see, I've already talked about this, I think I talk

16
00:01:04,800 --> 00:01:10,880
about this kind of stuff every other episode, is Microsoft, especially Azure, the services

17
00:01:10,880 --> 00:01:17,440
that we provide are moving, a lot of the services are moving away from using say tokens or SaaS

18
00:01:17,440 --> 00:01:20,960
tokens and that sort of stuff for authentication and authorization.

19
00:01:20,960 --> 00:01:27,920
So a dedicated gateway now has support for using RBAC support as opposed to using say

20
00:01:27,920 --> 00:01:29,240
a SaaS token.

21
00:01:29,240 --> 00:01:33,840
So historically this dedicated gateway for Cosmos DB, I should have mentioned that, would

22
00:01:33,840 --> 00:01:36,720
use a primary key to the Cosmos DB account.

23
00:01:36,720 --> 00:01:38,800
Well now you can use a managed identity.

24
00:01:38,800 --> 00:01:39,800
This is incredibly important.

25
00:01:39,800 --> 00:01:44,700
I know I've talked about this so much, but I really want to hammer this home.

26
00:01:44,700 --> 00:01:51,560
It is incredibly important that we, the industry, move away from using essentially secrets and

27
00:01:51,560 --> 00:01:54,600
these sort of credentials that you have to persist and then protect.

28
00:01:54,600 --> 00:01:58,960
So managed identities and Entra ID identities are certainly the way to go because that way

29
00:01:58,960 --> 00:02:02,080
the credential information is stored by Entra ID.

30
00:02:02,080 --> 00:02:06,320
It's managed and it's rotated and protected and audited by Entra ID and you don't have

31
00:02:06,320 --> 00:02:07,320
to worry about it.

32
00:02:07,320 --> 00:02:12,440
And the last bit of news that I have is in public preview is Azure Virtual Network IP

33
00:02:12,440 --> 00:02:13,440
address management.

34
00:02:13,440 --> 00:02:18,200
We've got a tool for it, which is really, really cool because sort of handling IP addresses

35
00:02:18,200 --> 00:02:21,920
is like where are all my IP addresses?

36
00:02:21,920 --> 00:02:23,940
It can be a bit of a pain.

37
00:02:23,940 --> 00:02:28,440
So now we have a tool, that's a new feature called the IP address management feature and

38
00:02:28,440 --> 00:02:30,320
that certainly simplifies that process substantially.

39
00:02:30,320 --> 00:02:31,960
So that's all my news.

40
00:02:31,960 --> 00:02:36,560
So we've got a couple of things in Azure AI Studio we've got that have gone into public

41
00:02:36,560 --> 00:02:38,280
preview that you might want to play around with.

42
00:02:38,280 --> 00:02:44,800
We've got evaluations for protected material, text based and then we've also got evaluations

43
00:02:44,800 --> 00:02:47,080
for indirect prompt injection attacks.

44
00:02:47,080 --> 00:02:50,280
Now indirect prompt injection attacks are very cool.

45
00:02:50,280 --> 00:02:53,440
Oh, well, they're cool and they're very interesting.

46
00:02:53,440 --> 00:03:00,280
So that's where you essentially rather than try and jailbreak the prompt directly, you

47
00:03:00,280 --> 00:03:05,120
actually get the prompt to go and reference other things that are malicious and then that's

48
00:03:05,120 --> 00:03:06,720
how you break the model.

49
00:03:06,720 --> 00:03:08,800
It's very, very interesting.

50
00:03:08,800 --> 00:03:14,400
With the Azure AI evaluation SDK, you can now simulate indirect prompt injection attacks

51
00:03:14,400 --> 00:03:20,040
and you can drill into the evaluation and of course protect the AI you're building much

52
00:03:20,040 --> 00:03:21,400
better.

53
00:03:21,400 --> 00:03:27,360
And then the last one that I have for today is the container support for pre-built text

54
00:03:27,360 --> 00:03:36,560
PII, which means if you've got container stuff with PII in it, you can actually look at redacting

55
00:03:36,560 --> 00:03:38,120
that when it gets into your system.

56
00:03:38,120 --> 00:03:40,480
So go and have a play around with that.

57
00:03:40,480 --> 00:03:44,460
That's obviously some security and privacy stuff that definitely at least some folks

58
00:03:44,460 --> 00:03:45,840
will need to use.

59
00:03:45,840 --> 00:03:48,440
And yeah, that's my news for this time.

60
00:03:48,440 --> 00:03:53,200
By the way, just in case people don't recognize this, Sarah's not feeling very well right

61
00:03:53,200 --> 00:03:54,200
now.

62
00:03:54,200 --> 00:03:57,600
So give her some slack if she feels a bit snotty and a bit under the weather.

63
00:03:57,600 --> 00:03:58,600
Yeah.

64
00:03:58,600 --> 00:04:04,000
Hopefully I don't feel too well because we all have nice mics, I'm hoping and nice software.

65
00:04:04,000 --> 00:04:09,680
I'm hoping Michael's going to make me sound very good in the post edit, but I actually

66
00:04:09,680 --> 00:04:13,440
do have the flu and might sound a little bit croaky today.

67
00:04:13,440 --> 00:04:17,720
We need to have a filter in Adobe Audition that says make someone sound like they don't

68
00:04:17,720 --> 00:04:18,720
have flu filter.

69
00:04:18,720 --> 00:04:22,760
But anyway, all right, so look, let's go and introduce our guests.

70
00:04:22,760 --> 00:04:26,680
So this week we have Nick Fillingham, who's here to talk to us, as I mentioned before,

71
00:04:26,680 --> 00:04:30,320
about sort of security conferences, most specifically Microsoft Blue Hat.

72
00:04:30,320 --> 00:04:31,560
Nick, welcome to the podcast.

73
00:04:31,560 --> 00:04:34,760
Will I take a moment and introduce yourself to our listeners?

74
00:04:34,760 --> 00:04:35,920
Yeah, hi.

75
00:04:35,920 --> 00:04:37,360
Thank you so much for having me.

76
00:04:37,360 --> 00:04:38,360
Hi, Michael.

77
00:04:38,360 --> 00:04:39,360
Hi, Sarah.

78
00:04:39,360 --> 00:04:40,360
So I'm Nick Fillingham.

79
00:04:40,360 --> 00:04:41,360
I'm a Microsoft employee.

80
00:04:41,360 --> 00:04:44,600
I work for the MSRC, Microsoft Security Response Center.

81
00:04:44,600 --> 00:04:50,560
I run the Blue Hat program, which is, it's been, the Blue Hat program has been going

82
00:04:50,560 --> 00:04:53,720
on coming up on 20 years.

83
00:04:53,720 --> 00:04:59,080
Started with a conference in 2005, which I was not at, but I do believe one of the hosts

84
00:04:59,080 --> 00:05:02,000
of this podcast was.

85
00:05:02,000 --> 00:05:07,760
And yeah, I have the great opportunity to work with security researchers, both inside

86
00:05:07,760 --> 00:05:14,840
of Microsoft and then out in the external community to really, you know, I'll say attract

87
00:05:14,840 --> 00:05:19,780
and get them to know about our conference and hopefully come and present their findings

88
00:05:19,780 --> 00:05:23,240
and their learnings and share what they've found and what they've discovered with the

89
00:05:23,240 --> 00:05:28,040
world as part of the Blue Hat conference or as part of some other sort of Blue Hat event,

90
00:05:28,040 --> 00:05:33,040
whether it's a, whether it's our podcast or a blog or some other way that we, that we

91
00:05:33,040 --> 00:05:34,120
engage with the community.

92
00:05:34,120 --> 00:05:35,280
But that's what I do.

93
00:05:35,280 --> 00:05:36,280
It's a great job.

94
00:05:36,280 --> 00:05:37,280
I love it.

95
00:05:37,280 --> 00:05:42,400
And I've also got a podcast, Blue Hat podcast, which I'll probably plug 25 times throughout

96
00:05:42,400 --> 00:05:43,400
this episode.

97
00:05:43,400 --> 00:05:48,080
We'll add a link to the podcast and also other material about the Blue Hat conference.

98
00:05:48,080 --> 00:05:53,720
So we want to sort of talk just generally about security conferences and specifically

99
00:05:53,720 --> 00:05:55,900
about Blue Hat.

100
00:05:55,900 --> 00:06:01,920
So why don't we kick things off with just talking about what sort of common conferences

101
00:06:01,920 --> 00:06:08,720
that people should really look into the different audiences that those conferences expect.

102
00:06:08,720 --> 00:06:12,520
And then we'll sort of wrap things up towards the end by going through Blue Hat, like its

103
00:06:12,520 --> 00:06:13,520
origins.

104
00:06:13,520 --> 00:06:16,120
By the way, I want everyone to know on the, who's listening, we have absolutely no agenda

105
00:06:16,120 --> 00:06:17,120
for this whatsoever.

106
00:06:17,120 --> 00:06:20,080
Literally our pre-meeting yesterday was let's talk about conferences.

107
00:06:20,080 --> 00:06:22,120
We're all professionals here and we all agreed.

108
00:06:22,120 --> 00:06:23,120
That's basically it.

109
00:06:23,120 --> 00:06:26,680
So why don't we kick things off in that way with, okay, Nick, what's your opinion?

110
00:06:26,680 --> 00:06:30,880
And then Sarah, you chime in as well about the various security conferences around the

111
00:06:30,880 --> 00:06:35,560
world and the ones that you feel are sort of the most impactful.

112
00:06:35,560 --> 00:06:41,920
So I think what I love about the security research space, which is a sort of a subset

113
00:06:41,920 --> 00:06:48,000
of people, obviously within the broader security community, is that it is a very grassroots

114
00:06:48,000 --> 00:06:50,880
organization, or very, excuse me.

115
00:06:50,880 --> 00:06:53,480
It's a very grassroots community.

116
00:06:53,480 --> 00:06:59,040
And so what I've learned in these last couple of years being in the security research space,

117
00:06:59,040 --> 00:07:06,480
and especially looking at conferences, is that there are specialists or just sort of

118
00:07:06,480 --> 00:07:10,840
community-based conferences everywhere that are just sort of popping up.

119
00:07:10,840 --> 00:07:16,000
And you really only need someone with an idea and a little bit of energy and a little bit

120
00:07:16,000 --> 00:07:22,320
of passion to bring together hackers, researchers, and responders to have a conference.

121
00:07:22,320 --> 00:07:28,840
And so you have sort of at the tippy top of the pyramid, you have the Black Hats and the

122
00:07:28,840 --> 00:07:33,840
DEF CON, and Black Hat is sort of the commercial side, and DEF CON is sort of more of the underground

123
00:07:33,840 --> 00:07:36,080
grassroots community side.

124
00:07:36,080 --> 00:07:41,360
And then from there, you have a lot of the sort of regional versions of, especially Black

125
00:07:41,360 --> 00:07:43,760
Hat.

126
00:07:43,760 --> 00:07:48,000
There'll be an Asia or Europe version that happens throughout the year.

127
00:07:48,000 --> 00:07:54,400
And then you start to sort of filter out into more of the grassroots efforts like your B-sides,

128
00:07:54,400 --> 00:08:00,160
and then down into individual CONs that are happening in certain cities or areas.

129
00:08:00,160 --> 00:08:02,000
And I think any community can do this.

130
00:08:02,000 --> 00:08:06,560
It's not exclusive to security researchers, but it really does feel like if you can pull

131
00:08:06,560 --> 00:08:15,840
together 20, 30, 40 people in the security research hacker response space, that's enough

132
00:08:15,840 --> 00:08:20,440
to have a really interesting set of talks or conversations.

133
00:08:20,440 --> 00:08:25,800
And that can be enough to actually spur an event or a conference that in just a few years

134
00:08:25,800 --> 00:08:28,720
could actually turn into something.

135
00:08:28,720 --> 00:08:34,320
And I'm sure there are hundreds of other industries or sub-elements of industries where that happens

136
00:08:34,320 --> 00:08:38,000
as well, but it's just very cool to see that in the security research space.

137
00:08:38,000 --> 00:08:43,320
So yeah, I think you've got the Black Hats and the DEF CONs at the top.

138
00:08:43,320 --> 00:08:49,080
And then you've got sort of the B-sides somewhere sort of in the middle.

139
00:08:49,080 --> 00:08:51,960
I like to think sort of Blue Hat maybe sits around there.

140
00:08:51,960 --> 00:08:55,200
And then you've got all those incredible sort of community meetups and other sort of hacker

141
00:08:55,200 --> 00:08:59,640
groups that get together on a regular basis, whether it's 20, 30 people all the way up

142
00:08:59,640 --> 00:09:02,120
to several hundred or a thousand.

143
00:09:02,120 --> 00:09:12,300
But it's interesting that the hacker space is constantly going backwards and forwards

144
00:09:12,300 --> 00:09:24,120
or sort of dancing around the sort of traditional image of the hacker as the hooded, the hoodie,

145
00:09:24,120 --> 00:09:28,320
the dark hooded figure at a laptop up to no good.

146
00:09:28,320 --> 00:09:34,360
And certain times that will be embraced and celebrated and other times that will be pushed

147
00:09:34,360 --> 00:09:40,000
away in favor of some other sort of emblem.

148
00:09:40,000 --> 00:09:42,920
But yeah, it's a really interesting subspace.

149
00:09:42,920 --> 00:09:43,920
I'll sort of pause there.

150
00:09:43,920 --> 00:09:48,680
I don't know, Michael, you've been around for a little bit longer than me.

151
00:09:48,680 --> 00:09:57,400
Does that align with your perspective on security conferences and maybe with a bend on the research

152
00:09:57,400 --> 00:09:58,400
space?

153
00:09:58,400 --> 00:10:05,320
To me, I think Black Hat started out very much as a true dyed in the wool sort of hacking

154
00:10:05,320 --> 00:10:07,560
conference for the hacking community.

155
00:10:07,560 --> 00:10:08,560
It's a little bit like RSA.

156
00:10:08,560 --> 00:10:09,560
Look at this.

157
00:10:09,560 --> 00:10:10,560
It's going to sound really cynical.

158
00:10:10,560 --> 00:10:14,480
And I don't mean it to sound really cynical, but that's the way it's going to come out.

159
00:10:14,480 --> 00:10:16,280
If you look at RSA, right?

160
00:10:16,280 --> 00:10:21,800
So the original RSA was really a conference just for cryptographic researchers.

161
00:10:21,800 --> 00:10:24,240
I mean, that's really what it was.

162
00:10:24,240 --> 00:10:27,520
But now it's very much just a straight up industry event.

163
00:10:27,520 --> 00:10:29,680
I'm not going to say it's not technical.

164
00:10:29,680 --> 00:10:32,880
I mean, obviously there are technical tracks and people talk about technical things.

165
00:10:32,880 --> 00:10:36,620
But at the end of the day, it's also very much a sales conference.

166
00:10:36,620 --> 00:10:41,320
And then Black Hat, I think, started life as a true dyed in the wool hacking conference.

167
00:10:41,320 --> 00:10:49,520
Would it be fair to say that, say, DEF CON has taken over the hardcore of the hacking

168
00:10:49,520 --> 00:10:51,120
community?

169
00:10:51,120 --> 00:10:52,120
I would say that.

170
00:10:52,120 --> 00:10:56,680
Yeah, I certainly think that if you were to put DEF CON and Black Hat side by side, I

171
00:10:56,680 --> 00:11:00,920
think even the DEF CON and the Black Hat crew that run those events would agree that Black

172
00:11:00,920 --> 00:11:09,200
Hat is maybe more of a suit and tie and DEF CON is more of the rorer, undergroundy sort

173
00:11:09,200 --> 00:11:13,100
of community grassroots bottom up approach.

174
00:11:13,100 --> 00:11:16,500
But they run sit side by side in Las Vegas every year.

175
00:11:16,500 --> 00:11:20,640
So there's a huge overlap between the two and a lot of sort of connective tissue and

176
00:11:20,640 --> 00:11:24,080
the communities very much are overlapped and integrated with each other.

177
00:11:24,080 --> 00:11:28,000
Yeah, it's quite normal for people to take to attend both.

178
00:11:28,000 --> 00:11:34,240
So they'll do Black Hat first and then the more hardcore folks will stay behind and attend

179
00:11:34,240 --> 00:11:35,240
DEF CON.

180
00:11:35,240 --> 00:11:36,760
Where do the three letter agencies go?

181
00:11:36,760 --> 00:11:37,760
Is it RSA?

182
00:11:37,760 --> 00:11:38,760
Is it Black Hat?

183
00:11:38,760 --> 00:11:39,760
Is it DEF CON?

184
00:11:39,760 --> 00:11:41,840
Where do they go to hire people?

185
00:11:41,840 --> 00:11:44,040
I mean, I think they go to all of them.

186
00:11:44,040 --> 00:11:48,000
It's just a question of whether or not those three letters are visible on their attendee

187
00:11:48,000 --> 00:11:49,360
badge or not.

188
00:11:49,360 --> 00:11:53,920
Yeah, I've heard I actually have heard of people, actually some people that I know quite

189
00:11:53,920 --> 00:11:58,680
well attending the conferences under sort of different guises.

190
00:11:58,680 --> 00:12:01,560
I have visions of Groucho Marx turning up.

191
00:12:01,560 --> 00:12:06,400
So Sarah, you got an interesting perspective as well because you have talked at various

192
00:12:06,400 --> 00:12:11,240
security related conferences, but development related, right?

193
00:12:11,240 --> 00:12:14,480
I've done, yeah, I've done most.

194
00:12:14,480 --> 00:12:20,680
I've done lots of them, but recently I have been doing a lot more of talking at developer

195
00:12:20,680 --> 00:12:29,400
conferences because I think like I am no hardcore security researcher, but I think it's important

196
00:12:29,400 --> 00:12:33,760
that we actually go talk as security folk to the rest of IT because they're the ones

197
00:12:33,760 --> 00:12:38,560
that we actually need to get on board with doing some of the basics because that's often

198
00:12:38,560 --> 00:12:41,320
where mistakes and vulnerabilities come in.

199
00:12:41,320 --> 00:12:48,560
So I have been, yeah, I've spent a lot of time the last couple of years talking at other

200
00:12:48,560 --> 00:12:54,720
types of IT conferences because you'll find if you go speak to dev conferences and other

201
00:12:54,720 --> 00:12:59,800
types that they are actually often quite open to having a security talk as long as it's

202
00:12:59,800 --> 00:13:00,920
pitched in the right way.

203
00:13:00,920 --> 00:13:05,960
I mean, they're not going to want a DEF CON hacking talk probably, but if it's a, hey,

204
00:13:05,960 --> 00:13:11,000
developers, you've been building this stuff really and securely, how can we do it better?

205
00:13:11,000 --> 00:13:13,040
That is very relevant for their audiences.

206
00:13:13,040 --> 00:13:20,760
So what is interesting is when I go to those conferences, a line that I often throw out,

207
00:13:20,760 --> 00:13:28,120
it's a slide that I put in pretty much all the talks is hands up, who here has ever felt

208
00:13:28,120 --> 00:13:31,040
personally victimized by security?

209
00:13:31,040 --> 00:13:37,040
And that comes from mean girls if you don't know and you're not of the correct generation.

210
00:13:37,040 --> 00:13:41,880
But literally when I go to conferences like that, everyone puts their hands up.

211
00:13:41,880 --> 00:13:47,560
And what that tells me is that, and this is sad, but I think it's good that we know this,

212
00:13:47,560 --> 00:13:54,480
is that a lot of folks in the wider IT space have had poor experiences with security in

213
00:13:54,480 --> 00:13:55,720
the past.

214
00:13:55,720 --> 00:14:01,000
And so, and whether that's you or someone else, they just generally have quite a negative

215
00:14:01,000 --> 00:14:05,000
idea of security because security just stop you doing things.

216
00:14:05,000 --> 00:14:09,320
You say no, security are kind of a pain in the butt.

217
00:14:09,320 --> 00:14:17,440
And so I think we as a security industry need to go and do better and probably like try

218
00:14:17,440 --> 00:14:23,480
and undo some of the damage we've done with those relationships to prove that nowadays,

219
00:14:23,480 --> 00:14:26,840
well, we should be, security is trying to help you.

220
00:14:26,840 --> 00:14:30,240
We're not just going to say no, we're going to help you come up with a solution that's

221
00:14:30,240 --> 00:14:31,240
better for everybody.

222
00:14:31,240 --> 00:14:36,800
I'm going to say Michael, with all the work you do nowadays, I'm sure you can relate to

223
00:14:36,800 --> 00:14:42,280
this because it is a challenge, no matter if you're a big or a small organization, that

224
00:14:42,280 --> 00:14:48,760
security often, security priorities often conflict with other IT and business priorities,

225
00:14:48,760 --> 00:14:51,440
but we need to find a better way to deal with that.

226
00:14:51,440 --> 00:14:54,000
Yeah, I agree 100%.

227
00:14:54,000 --> 00:14:59,200
I think the days of the security people just being curmudgeon whose job it is to stop stuff

228
00:14:59,200 --> 00:15:01,120
from shipping and what have you.

229
00:15:01,120 --> 00:15:06,640
You need to work together with the engineering teams to help them do the right thing.

230
00:15:06,640 --> 00:15:09,480
I'm a big fan of just not being a complete curmudgeon.

231
00:15:09,480 --> 00:15:14,800
I'm pretty upbeat about this sort of stuff and just helping people basically ship a more

232
00:15:14,800 --> 00:15:20,720
secure product and help them see it as a fundamental part of shipping any product, just like with

233
00:15:20,720 --> 00:15:25,440
any other illity, reliability, scalability, usability, the security.

234
00:15:25,440 --> 00:15:29,000
No, it's not an illity, but you know what I mean.

235
00:15:29,000 --> 00:15:30,800
So yeah, I'm a big fan of that.

236
00:15:30,800 --> 00:15:36,000
I do get very angry actually at security people who are just absolute curmudgeons and won't

237
00:15:36,000 --> 00:15:39,880
... almost take great delight in telling people that they're not going to ship their product.

238
00:15:39,880 --> 00:15:41,840
I take no delight in that whatsoever.

239
00:15:41,840 --> 00:15:44,720
So anyway, yeah, I agree 100%.

240
00:15:44,720 --> 00:15:46,360
That's why it's really cool talking to the developer community.

241
00:15:46,360 --> 00:15:50,000
I remember many, many years ago I talked at one of the Microsoft Professional Developer

242
00:15:50,000 --> 00:15:52,840
conferences a long, long time ago in the late 90s.

243
00:15:52,840 --> 00:15:55,240
Actually, it may have even been the mid 90s.

244
00:15:55,240 --> 00:15:58,360
Oh my God, I'm really aging myself now.

245
00:15:58,360 --> 00:16:04,840
I talked about SQL injection and I'll never forget the fire marshal came in and was really

246
00:16:04,840 --> 00:16:08,360
kind of a bit angry because there was way too many people in the room.

247
00:16:08,360 --> 00:16:14,440
But when I actually talked about SQL injection, you could see people get nervous and you could

248
00:16:14,440 --> 00:16:15,440
see people getting on there.

249
00:16:15,440 --> 00:16:19,500
But I think back then it may have even been pages, but they were talking to people back

250
00:16:19,500 --> 00:16:21,720
at the office and what have you to go looking for these kinds of issues.

251
00:16:21,720 --> 00:16:27,160
Because I actually demonstrated SQL injection and the demo was actually so good that I actually

252
00:16:27,160 --> 00:16:29,800
destroyed the demo through SQL injection.

253
00:16:29,800 --> 00:16:34,720
I actually deleted the database or one of the tables that I was going to use during

254
00:16:34,720 --> 00:16:35,720
the demo.

255
00:16:35,720 --> 00:16:38,800
So I actually cut my demo short, but it kind of made the point.

256
00:16:38,800 --> 00:16:42,320
It's like, oops, I accidentally deleted one of the tables for the demo.

257
00:16:42,320 --> 00:16:45,320
Well, I think we're done talking about SQL injection.

258
00:16:45,320 --> 00:16:49,320
So yeah, I think the intersection of security and development is certainly an area that

259
00:16:49,320 --> 00:16:51,880
I really, really like.

260
00:16:51,880 --> 00:16:56,920
Mainly if for no other reason than if you look at the topics that are discussed at Black

261
00:16:56,920 --> 00:17:03,840
Hat and Defcon and B-sides and obviously Blue Hat, a lot of them are dev related.

262
00:17:03,840 --> 00:17:04,840
They're dev related.

263
00:17:04,840 --> 00:17:06,960
Hey, there's a memory corruption vulnerability over here.

264
00:17:06,960 --> 00:17:12,280
All those things riddled with SQL injection or the web discovered a new class of vulnerabilities.

265
00:17:12,280 --> 00:17:18,240
So yeah, I'm a big fan of that intersection of developers and communities.

266
00:17:18,240 --> 00:17:20,600
So where does B-sides fit in?

267
00:17:20,600 --> 00:17:23,360
Nick, is that something you're closer to?

268
00:17:23,360 --> 00:17:26,400
Oh, I know B-sides.

269
00:17:26,400 --> 00:17:32,480
If Sarah knows the true origin, I can take a stab at it, but again, I certainly don't

270
00:17:32,480 --> 00:17:35,280
claim to be an expert on that one.

271
00:17:35,280 --> 00:17:38,080
Okay, well, I have done quite a few B-sides.

272
00:17:38,080 --> 00:17:40,520
I think now I've done a B-sides.

273
00:17:40,520 --> 00:17:45,640
One of my bucket list things was to try and do a B-sides in most parts of the world or

274
00:17:45,640 --> 00:17:47,240
at least on most continents.

275
00:17:47,240 --> 00:17:51,320
I've done B-sides down here in Oceania.

276
00:17:51,320 --> 00:17:53,280
I've done it in North America.

277
00:17:53,280 --> 00:17:55,320
I've done it in Europe.

278
00:17:55,320 --> 00:18:00,080
I think I've still got a couple of continents to go, but for those who do not know, B-sides

279
00:18:00,080 --> 00:18:04,760
is and this is my understanding, though, of course, you can feel free to tweet as if I

280
00:18:04,760 --> 00:18:06,600
have this wrong.

281
00:18:06,600 --> 00:18:12,600
B-sides started in Vegas and I have presented at B-sides Las Vegas as kind of an overflow

282
00:18:12,600 --> 00:18:15,000
to Black Hat and Def Con.

283
00:18:15,000 --> 00:18:19,840
So it's talking about like the B-side of a record or a tape back in the day.

284
00:18:19,840 --> 00:18:25,400
So hopefully everyone on the, well, if you don't know what a B-side of a record or a

285
00:18:25,400 --> 00:18:26,640
tape is, go look that up.

286
00:18:26,640 --> 00:18:29,360
That means you're too young.

287
00:18:29,360 --> 00:18:34,160
But essentially it started as a movement to basically pick up a lot of the great talks

288
00:18:34,160 --> 00:18:36,920
that weren't selected at Black Hat and Def Con.

289
00:18:36,920 --> 00:18:43,000
And if anybody's been involved in a conference, you'll know that generally you get way more

290
00:18:43,000 --> 00:18:48,020
talks than you can ever have, even if they're great.

291
00:18:48,020 --> 00:18:50,280
And so it can be really sad.

292
00:18:50,280 --> 00:18:51,720
Talks can get rejected from conferences.

293
00:18:51,720 --> 00:18:55,240
And I think we've talked about this on previous episodes.

294
00:18:55,240 --> 00:18:57,860
Talks can get rejected not because they're a bad talk.

295
00:18:57,860 --> 00:19:00,200
It's just the agenda is imbalanced.

296
00:19:00,200 --> 00:19:02,080
You don't have room, et cetera.

297
00:19:02,080 --> 00:19:04,300
And so B-sides was set up originally.

298
00:19:04,300 --> 00:19:07,800
And I think B-sides Las Vegas is the original one.

299
00:19:07,800 --> 00:19:12,200
And I believe it might be coming up for about 15 years old now to pick up some of those

300
00:19:12,200 --> 00:19:15,800
overflow talks and still give the people an opportunity to talk.

301
00:19:15,800 --> 00:19:19,640
Now since then, B-sides has turned into a bit of a global movement.

302
00:19:19,640 --> 00:19:24,560
And so there are now B-sides in lots of different cities around the world, all throughout North

303
00:19:24,560 --> 00:19:27,560
America, Europe, all over the place.

304
00:19:27,560 --> 00:19:30,680
I think they're up to something like 200 plus B-sides events.

305
00:19:30,680 --> 00:19:35,880
And the idea with the B-sides event is they're all run by individual, it's different people

306
00:19:35,880 --> 00:19:36,880
organizing them.

307
00:19:36,880 --> 00:19:41,000
But the general ethos is that B-sides is a community event.

308
00:19:41,000 --> 00:19:45,160
It's a mixture of experienced hackers, but also people who are new and want to get into

309
00:19:45,160 --> 00:19:46,160
the field.

310
00:19:46,160 --> 00:19:51,200
Often, the B-sides will have free or very cheap tickets for students or people looking

311
00:19:51,200 --> 00:19:53,000
for a job.

312
00:19:53,000 --> 00:19:56,000
There's talks, but they also do capture the flags.

313
00:19:56,000 --> 00:20:00,000
They do lot picking.

314
00:20:00,000 --> 00:20:01,960
Sometimes they have career villages.

315
00:20:01,960 --> 00:20:07,000
Like I said, it's not like a cookie cutter, exactly one size fits all for B-sides.

316
00:20:07,000 --> 00:20:09,120
But that's the general gist of what they do.

317
00:20:09,120 --> 00:20:10,360
I've been very lucky.

318
00:20:10,360 --> 00:20:12,080
I've been to lots of B-sides.

319
00:20:12,080 --> 00:20:17,640
I'm very sad this year because my hometown B-sides, B-sides Melbourne, is whilst I am

320
00:20:17,640 --> 00:20:20,760
going to be in the US, so I'm going to miss it, which is the first time I've missed it

321
00:20:20,760 --> 00:20:22,640
for a long time.

322
00:20:22,640 --> 00:20:26,160
But they're generally very supportive environments.

323
00:20:26,160 --> 00:20:32,640
The idea is that they encourage people who are new to come in and do the thing and even

324
00:20:32,640 --> 00:20:34,120
just participate.

325
00:20:34,120 --> 00:20:38,280
So I think if you're new into security or you haven't been to a security conference

326
00:20:38,280 --> 00:20:43,120
before, see if there's a B-sides near you, that's a good place to start.

327
00:20:43,120 --> 00:20:45,040
I totally echo that.

328
00:20:45,040 --> 00:20:49,360
Yeah, B-sides is such a wonderful movement and set of conferences.

329
00:20:49,360 --> 00:20:53,120
If you are new to this space or even just want to check it out and see what it's all

330
00:20:53,120 --> 00:20:57,120
about, look up if there's a B-sides happening near you.

331
00:20:57,120 --> 00:21:02,280
Yeah, a lot of the tickets are either really low cost so that as many people as possible

332
00:21:02,280 --> 00:21:07,160
can attend or sometimes they're even free if they get sponsored by a company like Microsoft

333
00:21:07,160 --> 00:21:08,400
or anyone in the industry.

334
00:21:08,400 --> 00:21:09,400
So yeah, it's great.

335
00:21:09,400 --> 00:21:11,520
Big shout out to the B-sides community.

336
00:21:11,520 --> 00:21:17,720
So while we're on this topic of the origins of various conferences, so let's talk about

337
00:21:17,720 --> 00:21:20,680
something that's quite near and dear to all of our hearts for various reasons.

338
00:21:20,680 --> 00:21:22,080
That is Blue Hat.

339
00:21:22,080 --> 00:21:25,360
So Nick, as you mentioned, this is something that you are in charge of.

340
00:21:25,360 --> 00:21:29,820
So you want to give us an overview of what Blue Hat is, how it started, why is it called

341
00:21:29,820 --> 00:21:32,280
Blue Hat, anything that people may not know.

342
00:21:32,280 --> 00:21:33,280
Yeah, sure.

343
00:21:33,280 --> 00:21:34,280
Thank you.

344
00:21:34,280 --> 00:21:37,160
So it started in 2005.

345
00:21:37,160 --> 00:21:43,080
Michael actually have a still frame here on a monitor of mine to the left of you in a

346
00:21:43,080 --> 00:21:50,120
very lovely chambray shirt, emceeing panel of hackers and researchers at the very first

347
00:21:50,120 --> 00:21:51,120
Blue Hat.

348
00:21:51,120 --> 00:21:54,960
I don't know if you remember that, but I'll pick your brain on that one in a second.

349
00:21:54,960 --> 00:21:55,960
Hold on.

350
00:21:55,960 --> 00:21:57,360
You have a picture of me?

351
00:21:57,360 --> 00:21:58,360
I do.

352
00:21:58,360 --> 00:21:59,360
Okay.

353
00:21:59,360 --> 00:22:02,680
Well, ignoring the obvious, you know, why, but can you send that to me?

354
00:22:02,680 --> 00:22:03,680
We should probably post that.

355
00:22:03,680 --> 00:22:08,280
No, I've got, what I've got is it's the video recordings of all the sessions from the first

356
00:22:08,280 --> 00:22:10,960
Blue, the first Blue Hat.

357
00:22:10,960 --> 00:22:17,680
They were, I think they were, they were all filmed on potatoes because the resolution

358
00:22:17,680 --> 00:22:21,360
is just disgusting, but the audio is great.

359
00:22:21,360 --> 00:22:22,960
Yeah, the audio is great.

360
00:22:22,960 --> 00:22:26,560
And it is, they've clearly been like transferred from VHAs or something.

361
00:22:26,560 --> 00:22:29,160
There's like tracking marks and tracking lines and stuff.

362
00:22:29,160 --> 00:22:33,520
Anyway, but very clearly you, but the audio is great and you can hear the great conversations

363
00:22:33,520 --> 00:22:34,520
that are happening.

364
00:22:34,520 --> 00:22:39,960
But yeah, 2005, I wasn't there for the beginning, as I said, but I think, you know, there was

365
00:22:39,960 --> 00:22:40,960
sort of two ideas.

366
00:22:40,960 --> 00:22:46,760
The first was let's bring sort of an external perspective to an internal audience.

367
00:22:46,760 --> 00:22:53,200
This was, you know, around the time of the trustworthy computing memo and really a tools

368
00:22:53,200 --> 00:22:56,720
down let's focus on security and get that right.

369
00:22:56,720 --> 00:23:01,280
And, but that there is a need for, you know, I'll say like a reality check that we sort

370
00:23:01,280 --> 00:23:07,400
of, we need those external folk that can really give us a true lay of the land.

371
00:23:07,400 --> 00:23:08,920
What's happening?

372
00:23:08,920 --> 00:23:14,000
What's the external perspective on the industry and what's the external perspective on Microsoft?

373
00:23:14,000 --> 00:23:19,800
And so one of the reasons it was called Blue Hat was that it was in some parts sort of

374
00:23:19,800 --> 00:23:27,360
bringing some of those presenters and sessions, the best of from a Black Hat style conference

375
00:23:27,360 --> 00:23:33,280
or even Black Hat specifically and having them come and present internally to Microsoft

376
00:23:33,280 --> 00:23:39,400
to folks that were in development and engineering roles in sort of nascent security roles.

377
00:23:39,400 --> 00:23:45,320
And that was the first Blue Hat as I understand again, I wasn't there.

378
00:23:45,320 --> 00:23:51,040
So like Michael, how does that track with your understanding, you know, being there

379
00:23:51,040 --> 00:23:52,920
and being around the community at that time?

380
00:23:52,920 --> 00:23:54,240
Yeah, I remember it well.

381
00:23:54,240 --> 00:23:56,400
I remember it very, very well.

382
00:23:56,400 --> 00:23:57,640
There is a reason why it's blue.

383
00:23:57,640 --> 00:24:00,240
There's a reason why it's Blue Hat and not some other color.

384
00:24:00,240 --> 00:24:05,200
And it's because that's the, because it was only back then you had to be a permanent Microsoft

385
00:24:05,200 --> 00:24:10,440
employee to attend and our badges for permanent employees are blue.

386
00:24:10,440 --> 00:24:11,440
Blue badge.

387
00:24:11,440 --> 00:24:13,080
Blue badge.

388
00:24:13,080 --> 00:24:14,560
That is the reason why it's Blue Hat.

389
00:24:14,560 --> 00:24:15,560
Ah, yes.

390
00:24:15,560 --> 00:24:16,560
Yeah, exactly.

391
00:24:16,560 --> 00:24:18,400
So it is Black Hat.

392
00:24:18,400 --> 00:24:19,400
So you're absolutely right.

393
00:24:19,400 --> 00:24:23,800
We would take some of the best talks that we thought were most relevant to Microsoft

394
00:24:23,800 --> 00:24:29,040
and we would bring those speakers to Redmond and they would talk.

395
00:24:29,040 --> 00:24:34,560
Now, what was interesting, and you can tell me how much it's changed since those days,

396
00:24:34,560 --> 00:24:42,600
since the early days, but we would have a day that was set aside with condensed material

397
00:24:42,600 --> 00:24:44,760
just for execs, right?

398
00:24:44,760 --> 00:24:48,920
So the likes of, you know, Brian Valentine, who was running Windows at the time, and Paul

399
00:24:48,920 --> 00:24:52,960
Flesner who is running SQL Server and, you know, all the other folks running various

400
00:24:52,960 --> 00:24:57,480
products would also attend and it would be just for the execs, no one else.

401
00:24:57,480 --> 00:25:01,720
And then on the subsequent days, that would be the development teams.

402
00:25:01,720 --> 00:25:05,920
And the reason why we did that was just so that the, you know, you may have a different

403
00:25:05,920 --> 00:25:11,880
conversation with an exec, you know, as opposed to with an engineer.

404
00:25:11,880 --> 00:25:16,000
So yeah, so that was, I thought was a brilliant idea.

405
00:25:16,000 --> 00:25:22,520
The other thing that really amazed me sort of from a cultural perspective is there was

406
00:25:22,520 --> 00:25:24,600
no resistance to this whatsoever.

407
00:25:24,600 --> 00:25:30,800
Like people were quite happy to hear, you know, areas where their products could improve

408
00:25:30,800 --> 00:25:32,400
or things that they could learn.

409
00:25:32,400 --> 00:25:38,440
And to me, one of the most telling was not long after Slammer hit.

410
00:25:38,440 --> 00:25:42,920
So Slammer took advantage of a vulnerability, it was a worm, that took advantage of a vulnerability

411
00:25:42,920 --> 00:25:43,920
in SQL Server.

412
00:25:43,920 --> 00:25:47,200
Actually, technically it wasn't SQL Server, so I'm going to be totally honest, it was

413
00:25:47,200 --> 00:25:51,640
actually in UDP 1434, not TCP 1433.

414
00:25:51,640 --> 00:25:57,600
So TCP 1433 is actually SQL Server and UDP 1434 is the management.

415
00:25:57,600 --> 00:26:01,160
So it was actually a vulnerability in the management code, not actually in the core

416
00:26:01,160 --> 00:26:02,160
engine.

417
00:26:02,160 --> 00:26:06,680
But David Litchfield, so in the back in the day, there were really only three security

418
00:26:06,680 --> 00:26:10,080
researchers in the world finding bugs in SQL databases.

419
00:26:10,080 --> 00:26:16,040
In that, I mean, I mean everything, you know, Oracle and DB2 and SQL Server and many, many

420
00:26:16,040 --> 00:26:18,800
others back in the day.

421
00:26:18,800 --> 00:26:24,120
And so David had found this bug and it led to Slammer and David came to talk on campus

422
00:26:24,120 --> 00:26:28,200
during a Blue Hat to talk about, you know, the bug, how he found it, which is fuzzing,

423
00:26:28,200 --> 00:26:29,200
by the way.

424
00:26:29,200 --> 00:26:34,240
I actually spoke to David when I was writing designing and developing secure Azure solutions,

425
00:26:34,240 --> 00:26:39,680
actually spoke to David and actually called out in a chapter on fuzzing about how he found

426
00:26:39,680 --> 00:26:40,680
that bug.

427
00:26:40,680 --> 00:26:44,120
So I have it verbatim from David.

428
00:26:44,120 --> 00:26:47,880
So he found the bug and he came on campus to talk about it in Blue Hat.

429
00:26:47,880 --> 00:26:53,360
And what was interesting is, first of all, on the exact day, Flesner was totally attentive

430
00:26:53,360 --> 00:26:55,560
and asked David a lot of questions.

431
00:26:55,560 --> 00:27:02,920
But the next day, when we went into the main hall where David was talking about the bug

432
00:27:02,920 --> 00:27:10,480
that led to Slammer, I want to say 70% of the audience was from the SQL Server team.

433
00:27:10,480 --> 00:27:16,440
And that 70% of the audience probably made up about 85, 90% of the SQL Server team at

434
00:27:16,440 --> 00:27:23,200
the time, development team at the time, which to me just shows like a real change in thought

435
00:27:23,200 --> 00:27:27,840
about what it takes to think about security in our products.

436
00:27:27,840 --> 00:27:32,600
So that to me was really, really an amazing thing to see.

437
00:27:32,600 --> 00:27:33,600
Any other insights?

438
00:27:33,600 --> 00:27:35,400
So there's going to be one this year.

439
00:27:35,400 --> 00:27:37,200
So is it always at the end of October?

440
00:27:37,200 --> 00:27:39,600
Is that the general schedule?

441
00:27:39,600 --> 00:27:40,600
Yeah.

442
00:27:40,600 --> 00:27:43,560
So the conference has evolved over the years.

443
00:27:43,560 --> 00:27:49,120
So a couple of big changes are that it is now or it has been for a long time open to

444
00:27:49,120 --> 00:27:52,400
the public, open to the external security community.

445
00:27:52,400 --> 00:27:56,120
It's no longer just internal Microsoft Blue badges.

446
00:27:56,120 --> 00:28:01,520
And so the structure we have now, which we've had for quite a few years, is days one and

447
00:28:01,520 --> 00:28:08,320
two are open to both internal Microsoft employees as well as non-Microsoft employees or external

448
00:28:08,320 --> 00:28:10,040
members of the security community.

449
00:28:10,040 --> 00:28:14,120
And then we have a third day, which is an internal only day.

450
00:28:14,120 --> 00:28:17,640
And we call that Strike Presents Blue hat.

451
00:28:17,640 --> 00:28:23,620
Strike is one of our internal sort of security training programs, has both a lot of online

452
00:28:23,620 --> 00:28:25,160
training and then in-person events.

453
00:28:25,160 --> 00:28:31,160
And so we sort of merge them together so that we can have some conversations and presentations

454
00:28:31,160 --> 00:28:38,780
and discussions amongst just employees that may have, you know, be confidential in nature

455
00:28:38,780 --> 00:28:44,920
or covering vulnerabilities or exploits that may still be active in some capacity, as well

456
00:28:44,920 --> 00:28:51,240
as just content that might not just be as relevant for an external audience.

457
00:28:51,240 --> 00:28:53,080
And yes, the next one is coming up.

458
00:28:53,080 --> 00:28:57,000
Depending on when you're listening to this episode, it's a little bit over two weeks

459
00:28:57,000 --> 00:28:58,000
from now.

460
00:28:58,000 --> 00:29:04,560
It's going to be October 29th and 30th in Redmond, Washington in the US on the Microsoft

461
00:29:04,560 --> 00:29:08,420
campus in the conference center, which we call Building 33.

462
00:29:08,420 --> 00:29:11,280
And that's the sort of home of Blue Hat over the years.

463
00:29:11,280 --> 00:29:16,640
Most of them have been in that building and most of them have been in the October timeframe,

464
00:29:16,640 --> 00:29:20,760
although there have been a few, there was a couple of years where there was a sort of

465
00:29:20,760 --> 00:29:25,080
a fall or an autumn Blue Hat and then a spring Blue Hat.

466
00:29:25,080 --> 00:29:27,960
So two in a year.

467
00:29:27,960 --> 00:29:34,520
But the current sort of cadence we're going for is one a year aiming for October.

468
00:29:34,520 --> 00:29:36,280
There's also international Blue Hats.

469
00:29:36,280 --> 00:29:44,040
There's Blue Hat Israel, which Blue Hat IL, which has been happening for a while in obviously

470
00:29:44,040 --> 00:29:45,700
in Israel.

471
00:29:45,700 --> 00:29:49,240
And last year, we actually had, oh, excuse me, earlier this year, we had the first ever

472
00:29:49,240 --> 00:29:54,880
Blue Hat India, which was really cool and hopefully something that will continue as

473
00:29:54,880 --> 00:29:57,040
well.

474
00:29:57,040 --> 00:30:02,280
But yeah, the structure which has been pretty well established now is that there is a call

475
00:30:02,280 --> 00:30:07,920
for papers, which we open up several months beforehand and we ask both internal Microsoft

476
00:30:07,920 --> 00:30:13,840
employees and the external community to submit papers, which really, you know, you don't

477
00:30:13,840 --> 00:30:15,400
actually have to have a paper.

478
00:30:15,400 --> 00:30:20,840
It's really you're submitting a talk, you're submitting a session that you would like to

479
00:30:20,840 --> 00:30:21,840
present at Blue Hat.

480
00:30:21,840 --> 00:30:25,920
And so you've got to give it a title and there needs to be an abstract that explains what

481
00:30:25,920 --> 00:30:27,280
you would be presenting.

482
00:30:27,280 --> 00:30:32,240
A lot of submitters also include sort of supporting documentation, which can be a paper.

483
00:30:32,240 --> 00:30:36,560
It can be a white paper or it could be just more of a fleshed out outline.

484
00:30:36,560 --> 00:30:41,760
And then from there, we have a CAB, C-A-B, it's a content advisory board.

485
00:30:41,760 --> 00:30:45,760
And then they go through and they read all those submissions and they help us choose

486
00:30:45,760 --> 00:30:49,120
the best to present at Blue Hat.

487
00:30:49,120 --> 00:30:55,540
This year, we had over 100 submissions, which is fantastic, but we have to narrow that down

488
00:30:55,540 --> 00:30:56,840
to about 20.

489
00:30:56,840 --> 00:31:03,360
And so, you know, there's a lot of content, very, very good content, very, very good speakers

490
00:31:03,360 --> 00:31:06,960
who, you know, don't get picked.

491
00:31:06,960 --> 00:31:13,480
And that's always tough to not be able to select more sessions to be presented.

492
00:31:13,480 --> 00:31:17,040
But we try and keep it down to 20.

493
00:31:17,040 --> 00:31:22,480
We have 10 sessions per day in two parallel tracks.

494
00:31:22,480 --> 00:31:28,000
We try very hard to have sort of 50-50 representation in both the presenters as well as the attendees.

495
00:31:28,000 --> 00:31:32,720
So if you are a presenter at Blue Hat, sorry, at the Blue Hat conference, hopefully half

496
00:31:32,720 --> 00:31:37,320
the presenters are going to be Microsoft folk who are presenting their research and their

497
00:31:37,320 --> 00:31:39,460
findings and sort of guidance to the industry.

498
00:31:39,460 --> 00:31:44,680
And then the other half, the presenters will be from the non-Microsoft sort of external

499
00:31:44,680 --> 00:31:48,960
community and from our partners who will be presenting their research findings, which

500
00:31:48,960 --> 00:31:55,440
are very often research findings targeted at a Microsoft product or technology.

501
00:31:55,440 --> 00:32:02,660
And it is often them showing us where, you know, some sort of vulnerability has been

502
00:32:02,660 --> 00:32:08,840
discovered or some sort of, you know, technical approach to a solution has been found to be

503
00:32:08,840 --> 00:32:13,760
out of date and is no longer sort of secure and the industry needs to change.

504
00:32:13,760 --> 00:32:18,920
And just, you know, looping this back to Sarah's comment about developers and the developer

505
00:32:18,920 --> 00:32:23,960
community and, you know, how do we create guidance and education to the developer community

506
00:32:23,960 --> 00:32:29,920
so that they are writing more secure code and they are starting to abandon or they're

507
00:32:29,920 --> 00:32:34,440
abandoning some of these techniques that have been proved to be insecure or just sort of,

508
00:32:34,440 --> 00:32:35,440
you know, outdated.

509
00:32:35,440 --> 00:32:41,760
So there's a really interesting sort of yin-yang relationship between the external community

510
00:32:41,760 --> 00:32:44,160
who are coming in and showing us what they've found.

511
00:32:44,160 --> 00:32:49,720
And a lot of time that's issues that have been discovered in a Microsoft technology

512
00:32:49,720 --> 00:32:50,880
or product.

513
00:32:50,880 --> 00:32:56,420
How Microsoft is also showing how that we're sort of hacking our own stuff as well as looking

514
00:32:56,420 --> 00:32:59,260
at other platforms and also sharing what we've found.

515
00:32:59,260 --> 00:33:01,760
But then more importantly, how do we turn that into guidance?

516
00:33:01,760 --> 00:33:03,600
How do we turn that into education?

517
00:33:03,600 --> 00:33:10,040
How do we turn that into instruction and teaching and learning for not just the researchers

518
00:33:10,040 --> 00:33:16,800
and responders but the broader technical community so that we are making more secure technology

519
00:33:16,800 --> 00:33:18,760
moving forward?

520
00:33:18,760 --> 00:33:24,860
And it's a great atmosphere because everyone is there on the same team and it's just wonderful

521
00:33:24,860 --> 00:33:31,880
to see the conversations that happen organically in the hallways and in the line up for tacos

522
00:33:31,880 --> 00:33:37,480
at lunch and while people are, you know, going around trading stickers and it's a wonderful

523
00:33:37,480 --> 00:33:39,580
conference.

524
00:33:39,580 --> 00:33:49,560
And yeah, two weeks from now will be the 23rd, 23rd Blue Hat, which is in the 19th year.

525
00:33:49,560 --> 00:33:52,260
And yeah, it's gonna be great.

526
00:33:52,260 --> 00:33:55,160
So I don't know if you know or not, but I was, you talk about strike.

527
00:33:55,160 --> 00:34:00,120
So there was an internal strike event last week and I was actually the emcee for that

528
00:34:00,120 --> 00:34:01,120
thing.

529
00:34:01,120 --> 00:34:03,320
There's a secure future initiative strike.

530
00:34:03,320 --> 00:34:06,920
We have one of the biggest attendances that we've ever had for a security conference,

531
00:34:06,920 --> 00:34:09,240
like in purely internal conference.

532
00:34:09,240 --> 00:34:11,640
So I was actually the emcee for it.

533
00:34:11,640 --> 00:34:13,440
It was a lot of fun.

534
00:34:13,440 --> 00:34:17,520
And just in case you don't know, Nick, the odds are, I would say 80% that I'm actually

535
00:34:17,520 --> 00:34:21,920
going to be at Blue Hat this year as an emcee for one of the tracks.

536
00:34:21,920 --> 00:34:22,960
That sounds great.

537
00:34:22,960 --> 00:34:25,840
And I did know that you're at the strike event because I actually sat next to you while we

538
00:34:25,840 --> 00:34:26,840
ate lunch together.

539
00:34:26,840 --> 00:34:27,840
That's what we did.

540
00:34:27,840 --> 00:34:28,840
That's what we did.

541
00:34:28,840 --> 00:34:29,840
In my Rusty shirt.

542
00:34:29,840 --> 00:34:31,840
Your Rusty shirt, your Etsy Rusty shirt.

543
00:34:31,840 --> 00:34:34,040
Well, I'm feeling very left out right now.

544
00:34:34,040 --> 00:34:35,040
I want to talk.

545
00:34:35,040 --> 00:34:36,040
Oh, I'm sorry, Sarah.

546
00:34:36,040 --> 00:34:37,040
I know, I know.

547
00:34:37,040 --> 00:34:40,320
Well, you presented at last year's Blue Hat and it was awesome to have you come over and

548
00:34:40,320 --> 00:34:41,880
then you helped run a village for us.

549
00:34:41,880 --> 00:34:42,880
I did.

550
00:34:42,880 --> 00:34:48,760
That's sort of another thing too, that we've adopted over the years the elements from security

551
00:34:48,760 --> 00:34:53,480
conferences like B-Sides, like Defcon, that the community love, like Villagers.

552
00:34:53,480 --> 00:34:55,320
So we do Villagers at Blue Hat as well.

553
00:34:55,320 --> 00:35:00,120
And we have a call for Villagers internally at Microsoft.

554
00:35:00,120 --> 00:35:05,880
And it's fantastic because we get, you might think, oh, Villagers at a Microsoft event,

555
00:35:05,880 --> 00:35:07,880
that's all going to be just sort of centered around product.

556
00:35:07,880 --> 00:35:09,720
And it's absolutely not.

557
00:35:09,720 --> 00:35:16,880
It's centered around other types of community groups, other types of interest groups, other

558
00:35:16,880 --> 00:35:24,520
types of sort of advocacy groups inside the security research community who want to come

559
00:35:24,520 --> 00:35:28,640
together and talk about something that's important to them.

560
00:35:28,640 --> 00:35:29,640
We have ERG.

561
00:35:29,640 --> 00:35:33,960
I don't know if you guys have used that employee resource group.

562
00:35:33,960 --> 00:35:36,360
ERG is an acronym that we have at Microsoft.

563
00:35:36,360 --> 00:35:39,520
So we have women in cybersecurity.

564
00:35:39,520 --> 00:35:41,200
That'll be one of the Villagers there.

565
00:35:41,200 --> 00:35:42,560
There'll be other ERGs.

566
00:35:42,560 --> 00:35:47,080
I'm not going to try and name them all, but those will be represented as part of the village

567
00:35:47,080 --> 00:35:48,560
community.

568
00:35:48,560 --> 00:35:53,440
There's, you know, Microsoft's got a huge investment in the gaming industry.

569
00:35:53,440 --> 00:35:56,160
So we're going to have a gaming security village.

570
00:35:56,160 --> 00:36:01,000
There will undoubtedly be sort of cybersecurity career focused Villagers there.

571
00:36:01,000 --> 00:36:02,280
There'll be an AI Village.

572
00:36:02,280 --> 00:36:06,120
There'll be the Microsoft garage will be there and we'll probably be teaching people how

573
00:36:06,120 --> 00:36:11,880
to do soldering or some sort of basic sort of hardware hacking skill.

574
00:36:11,880 --> 00:36:12,960
It's a ton of fun.

575
00:36:12,960 --> 00:36:20,640
And yeah, again, we sort of we unashamedly, but also in with, you know, in reverence to

576
00:36:20,640 --> 00:36:25,720
the community and to the amazing conferences that are out there in the security space,

577
00:36:25,720 --> 00:36:29,480
we sort of try and borrow some of those elements that we know folks love and we try and bring

578
00:36:29,480 --> 00:36:30,680
them to Blue Hat.

579
00:36:30,680 --> 00:36:31,680
So two things.

580
00:36:31,680 --> 00:36:33,480
What did you talk about last year, Sarah?

581
00:36:33,480 --> 00:36:34,480
That's the first thing.

582
00:36:34,480 --> 00:36:36,640
And the second thing is, is there going to be a lockpicking village this year?

583
00:36:36,640 --> 00:36:40,040
Yes, I did speak at Blue Hat last year.

584
00:36:40,040 --> 00:36:42,840
I spoke, I actually spoke on strike day.

585
00:36:42,840 --> 00:36:49,200
So the internal Microsoft day and I was talking about how we write things and how we create

586
00:36:49,200 --> 00:36:53,400
content for the wider world as security people.

587
00:36:53,400 --> 00:36:56,300
And there's a recording of it online.

588
00:36:56,300 --> 00:36:57,300
Something I was going to say.

589
00:36:57,300 --> 00:37:04,320
Well, if Nick can't covered it already is that eventually and I think most years now,

590
00:37:04,320 --> 00:37:09,840
Nick, we do record all the sessions and put them up online a little bit after the conference.

591
00:37:09,840 --> 00:37:16,920
So if you're not able to join, actually go to Blue Hat, you can watch the sessions later.

592
00:37:16,920 --> 00:37:17,920
That's absolutely right.

593
00:37:17,920 --> 00:37:18,920
Yeah.

594
00:37:18,920 --> 00:37:19,920
Thank you, Sarah, for reminding me.

595
00:37:19,920 --> 00:37:21,680
Yes, we record everything.

596
00:37:21,680 --> 00:37:26,480
And I would say 95 percent of it is able to be published.

597
00:37:26,480 --> 00:37:30,280
And so, yeah, we have a YouTube channel, so they all go up there.

598
00:37:30,280 --> 00:37:36,920
And we are we do have ambition to go back through the archives and publish some of the

599
00:37:36,920 --> 00:37:39,360
Blue Hat videos from back in the day.

600
00:37:39,360 --> 00:37:45,880
You can see Michael Howard in his chambray shirt emceeing this panel at Blue Hat 1.

601
00:37:45,880 --> 00:37:48,920
I hope that'll get that up in YouTube soon.

602
00:37:48,920 --> 00:37:52,440
But yeah, you know, obviously, it's amazing to be there in person and it's great to be

603
00:37:52,440 --> 00:37:56,760
there in person, but we appreciate that not everyone is able to make the trip to be at

604
00:37:56,760 --> 00:38:00,120
Redmond or the state of Washington or the country of the US.

605
00:38:00,120 --> 00:38:02,800
So we record as much as we can and we put it up online.

606
00:38:02,800 --> 00:38:05,800
And then the other thing is we have our own podcast, Blue Hat podcast.

607
00:38:05,800 --> 00:38:10,200
And one of the things we like to do there is we bring on some of those presenters and

608
00:38:10,200 --> 00:38:13,580
then also some of the folks that are just sort of part of the community and maybe didn't

609
00:38:13,580 --> 00:38:17,400
have a session, but have some content to share and we bring them on the Blue Hat podcast

610
00:38:17,400 --> 00:38:18,400
as well.

611
00:38:18,400 --> 00:38:25,080
So we try and we try and share and spread the wealth and the fantastic content that

612
00:38:25,080 --> 00:38:28,720
comes through the Blue Hat community as much as possible.

613
00:38:28,720 --> 00:38:31,260
Lockpicking Village, I believe is the answer is yes.

614
00:38:31,260 --> 00:38:33,400
We try for that every single year.

615
00:38:33,400 --> 00:38:39,040
Yeah, I mean, if we can provide links to the Blue Hat YouTube channel, that'd be really

616
00:38:39,040 --> 00:38:40,040
cool too.

617
00:38:40,040 --> 00:38:41,040
All right.

618
00:38:41,040 --> 00:38:44,760
So one thing that you mentioned is he wants to open this more up to the public.

619
00:38:44,760 --> 00:38:48,680
So can people actually register for this outside of Microsoft?

620
00:38:48,680 --> 00:38:49,840
Absolutely.

621
00:38:49,840 --> 00:38:54,480
So depending on when you're listening to this episode, registration for Blue Hat may still

622
00:38:54,480 --> 00:38:56,480
be open.

623
00:38:56,480 --> 00:39:00,560
Blue Hat is again, it's in Redmond, Washington on the Microsoft campus.

624
00:39:00,560 --> 00:39:02,440
So the tickets are free.

625
00:39:02,440 --> 00:39:07,240
We don't charge any money for the ticket, but you need to apply.

626
00:39:07,240 --> 00:39:11,000
And then we go through and sort of read the applications and we make sure that the folks

627
00:39:11,000 --> 00:39:18,240
that are asking to attend would really sort of add value to be there and certainly are

628
00:39:18,240 --> 00:39:21,960
relevant for the security research community and response community.

629
00:39:21,960 --> 00:39:25,480
But yes, in theory, when you're listening to this podcast, the registration may still

630
00:39:25,480 --> 00:39:26,600
be open.

631
00:39:26,600 --> 00:39:38,920
You can go to aka.mswacbhreg, B-H-R-E-G as in Blue Hat registration, but B-H-REG.

632
00:39:38,920 --> 00:39:42,040
And you fill out an application form there.

633
00:39:42,040 --> 00:39:45,880
And we're really just trying to make sure that everyone that applies, you know, really,

634
00:39:45,880 --> 00:39:49,920
really is applying for the right reason.

635
00:39:49,920 --> 00:39:53,720
And I will also point out that we have a really interesting sort of unwritten rule for Blue

636
00:39:53,720 --> 00:40:01,080
Hat where we say no sales, no marketing, no dunking, no fluff.

637
00:40:01,080 --> 00:40:08,680
And so Blue Hat is not a conference where either Microsoft employees or attendees are

638
00:40:08,680 --> 00:40:13,460
permitted to sell or pitch anything, certainly not anything for sale.

639
00:40:13,460 --> 00:40:19,080
So this is not a platform for Microsoft to sort of pitch their tools and services that

640
00:40:19,080 --> 00:40:20,080
are sold.

641
00:40:20,080 --> 00:40:21,880
It's not a sales conference.

642
00:40:21,880 --> 00:40:23,460
It's not a marketing event.

643
00:40:23,460 --> 00:40:27,080
We don't sort of announce and launch products or anything or do any sort of marketing in

644
00:40:27,080 --> 00:40:28,600
that sense.

645
00:40:28,600 --> 00:40:33,180
We also don't allow our attendees or our partners to do that as well.

646
00:40:33,180 --> 00:40:37,100
And we also, you know, we're not there to dunk on folks like, yes, you may have discovered

647
00:40:37,100 --> 00:40:40,200
a vulnerability in a product or service or technology.

648
00:40:40,200 --> 00:40:46,240
But the point is to use it as an opportunity for learning and for sharing guidance to make

649
00:40:46,240 --> 00:40:50,080
technology a better place and not to dunk.

650
00:40:50,080 --> 00:40:55,800
I really need a better analogy or something for dunking.

651
00:40:55,800 --> 00:40:57,880
But we don't allow people to speak.

652
00:40:57,880 --> 00:40:58,880
What was that?

653
00:40:58,880 --> 00:41:04,560
Yeah, dunking is like, you know, you can't, you know, you can't, you know, don't, don't.

654
00:41:04,560 --> 00:41:05,560
You got to be respectful, right?

655
00:41:05,560 --> 00:41:06,560
I mean.

656
00:41:06,560 --> 00:41:07,560
Absolutely.

657
00:41:07,560 --> 00:41:11,480
If you found a bug in someone's product, you know, it's like there's a constructive way

658
00:41:11,480 --> 00:41:14,640
of communicating and there's let's just say the other way.

659
00:41:14,640 --> 00:41:15,840
I'm all for being constructive.

660
00:41:15,840 --> 00:41:19,560
Look, my philosophy in life is just be kind to people.

661
00:41:19,560 --> 00:41:21,480
And I don't think this is any different.

662
00:41:21,480 --> 00:41:23,400
So yeah, I think dunking is probably the right word.

663
00:41:23,400 --> 00:41:26,120
All in the US, it could be donuts as well.

664
00:41:26,120 --> 00:41:29,060
I think it's actually I think as a basketball, I think it's a US term.

665
00:41:29,060 --> 00:41:30,060
So I need to get it.

666
00:41:30,060 --> 00:41:31,200
I need to get something better than that.

667
00:41:31,200 --> 00:41:33,320
But yeah, you don't don't diss anyone.

668
00:41:33,320 --> 00:41:35,260
Don't don't be disparaging.

669
00:41:35,260 --> 00:41:38,800
Don't be disrespectful, you know, assume, assume best intention.

670
00:41:38,800 --> 00:41:40,920
And we're not there to make anyone look bad.

671
00:41:40,920 --> 00:41:48,720
We're just we're just there to, you know, generate new ideas and push the industry forward

672
00:41:48,720 --> 00:41:50,520
so that everyone can be more secure.

673
00:41:50,520 --> 00:41:55,880
But yeah, aka dot Ms. Whack B H Reg B H R E G.

674
00:41:55,880 --> 00:41:59,080
And hopefully you can squeeze in and get yourself a ticket to Blue Hat.

675
00:41:59,080 --> 00:42:00,080
Yeah.

676
00:42:00,080 --> 00:42:04,200
But let's be honest, it is kind of late, you know, so the odds are not fantastic.

677
00:42:04,200 --> 00:42:07,680
But if you're a security researcher.

678
00:42:07,680 --> 00:42:12,840
Yeah, if we were half smart, we would have recorded this two months ago.

679
00:42:12,840 --> 00:42:13,840
But here we are.

680
00:42:13,840 --> 00:42:14,840
It's all good.

681
00:42:14,840 --> 00:42:15,840
It's all good.

682
00:42:15,840 --> 00:42:17,840
I probably can get in early for next year.

683
00:42:17,840 --> 00:42:22,400
All right, let's let's start to bring this thing to an end for a group of people where

684
00:42:22,400 --> 00:42:26,400
we had absolutely no agenda whatsoever.

685
00:42:26,400 --> 00:42:28,880
I'm kind of surprised how long we how long we went.

686
00:42:28,880 --> 00:42:29,960
But anyway, there you go.

687
00:42:29,960 --> 00:42:35,640
Nick, one question we started asking all our guests is, so what is a typical day in the

688
00:42:35,640 --> 00:42:37,000
in the life of Nick look like?

689
00:42:37,000 --> 00:42:39,000
What do you do on a daily basis?

690
00:42:39,000 --> 00:42:46,280
Gosh, well, I don't know if right now it's a it's a typical time for me because, you

691
00:42:46,280 --> 00:42:49,360
know, we're coming up on two weeks out of Blue Hat.

692
00:42:49,360 --> 00:42:58,600
So right now I am madly teamsing, emailing, signaling any any sort of communication vehicle

693
00:42:58,600 --> 00:43:03,280
possible with the prospective speakers at Blue Hat.

694
00:43:03,280 --> 00:43:08,480
So that's the that's the folks that have submitted a paper and have been selected by our cab.

695
00:43:08,480 --> 00:43:10,560
And I'm going backwards and forwards with them.

696
00:43:10,560 --> 00:43:18,320
I'm going backwards and forwards with folks on the MSRC case management and vulnerability

697
00:43:18,320 --> 00:43:20,320
response team.

698
00:43:20,320 --> 00:43:25,160
We have a we have a commitment to coordinated vulnerability disclosure, which basically

699
00:43:25,160 --> 00:43:29,720
means we we we do our absolute best to make sure that we we don't O day anyone.

700
00:43:29,720 --> 00:43:31,120
We don't zero day customers.

701
00:43:31,120 --> 00:43:32,120
We don't zero day partners.

702
00:43:32,120 --> 00:43:33,880
We don't zero to ourselves.

703
00:43:33,880 --> 00:43:40,020
And so there's a sort of a negotiation that has to happen between between the folks that

704
00:43:40,020 --> 00:43:44,000
have submitted content to present and have been accepted to present.

705
00:43:44,000 --> 00:43:52,080
And we need to make sure that if there are active or recently disclosed or recently addressed

706
00:43:52,080 --> 00:43:55,760
vulnerabilities or exploits or techniques that we've we've met all our commitments

707
00:43:55,760 --> 00:43:58,960
there from a sort of responsible disclosure perspective.

708
00:43:58,960 --> 00:44:02,280
So that's a lot of a lot of a lot of time at the moment.

709
00:44:02,280 --> 00:44:06,200
And then, you know, trying to work out how to fit all these incredible sessions into

710
00:44:06,200 --> 00:44:10,560
into an agenda as well as all the other things that have to happen around a conference, which

711
00:44:10,560 --> 00:44:14,640
some of them are all sort of logistics like what are we going to have for lunch on day

712
00:44:14,640 --> 00:44:18,800
one and how many coffee carts do we need and all the way through to coming up with some

713
00:44:18,800 --> 00:44:24,240
really cool ideas for for swag and the conference t-shirts and, you know, sort of enamel pins

714
00:44:24,240 --> 00:44:26,000
that we're going to have the attendees trade.

715
00:44:26,000 --> 00:44:32,040
So that's sort of a day in the life at the moment outside of outside of the Blue Hat

716
00:44:32,040 --> 00:44:37,040
Conference, certainly looking for really interesting people to come on the Blue Hat podcast, talk

717
00:44:37,040 --> 00:44:39,640
about research, talk about security research.

718
00:44:39,640 --> 00:44:40,960
That's a part of it.

719
00:44:40,960 --> 00:44:45,400
And then and then really sort of also just engaging with the security research community,

720
00:44:45,400 --> 00:44:50,840
which is both inside of Microsoft and and external and again, looking for ways that

721
00:44:50,840 --> 00:44:58,600
that we as as Microsoft and MSRC as a part of Microsoft can just better engage with the

722
00:44:58,600 --> 00:45:03,280
community, how we can be a better partner with them, how we can learn from them, how

723
00:45:03,280 --> 00:45:08,400
we can create content that we feel that would be beneficial for the community or how we

724
00:45:08,400 --> 00:45:11,160
could partner with other researchers to help amplify their work.

725
00:45:11,160 --> 00:45:12,160
All right.

726
00:45:12,160 --> 00:45:13,740
So let's bring this thing officially to an end.

727
00:45:13,740 --> 00:45:19,280
So one thing we always ask is if you had just one thought to leave our listeners with, what

728
00:45:19,280 --> 00:45:20,280
would it be?

729
00:45:20,280 --> 00:45:21,280
All right.

730
00:45:21,280 --> 00:45:25,960
I think what I want to leave folks with is a bit of inspiration that anyone can really

731
00:45:25,960 --> 00:45:31,000
be a security researcher, because security researchers really are just people that are

732
00:45:31,000 --> 00:45:37,320
looking under the hood at how does stuff work and coming up with ways to make that thing

733
00:45:37,320 --> 00:45:41,460
potentially work a little bit more efficiently or a little bit more securely.

734
00:45:41,460 --> 00:45:47,280
And so if in your you know, I'm sure the folks listening to this podcast have long, long

735
00:45:47,280 --> 00:45:53,480
lists of tips and tricks that they would give to their customers, their clients, their friends,

736
00:45:53,480 --> 00:45:54,480
their family.

737
00:45:54,480 --> 00:45:58,000
And I would encourage them to think about how some of those things might be able to

738
00:45:58,000 --> 00:46:04,120
be packaged together into a story that they could submit to a Blue Hat, to a B-Sides,

739
00:46:04,120 --> 00:46:09,840
to a Defcon as you know, as guidance and as as learnings and as stuff they want to pass

740
00:46:09,840 --> 00:46:12,960
on to help make the world a better place.

741
00:46:12,960 --> 00:46:18,520
And yeah, you don't have to have security researcher on your LinkedIn profile to be

742
00:46:18,520 --> 00:46:25,160
able to be someone who is researching or searching to make things more secure.

743
00:46:25,160 --> 00:46:28,440
So I just encourage everyone to give it a go.

744
00:46:28,440 --> 00:46:31,640
Yeah, actually, I want to emphasize that last point.

745
00:46:31,640 --> 00:46:36,300
I was a reviewer, I was asked to look at some of the submissions to this year's Blue Hat,

746
00:46:36,300 --> 00:46:39,800
just a couple of them, just because of the nature of what they were.

747
00:46:39,800 --> 00:46:43,400
And I was really amazed at a couple of them, which were actually for lightning talks.

748
00:46:43,400 --> 00:46:45,760
So something else I didn't even talk about was lightning talks, right?

749
00:46:45,760 --> 00:46:47,960
Just like 15 minute talks on a specific topic.

750
00:46:47,960 --> 00:46:48,960
Big fan of those things.

751
00:46:48,960 --> 00:46:51,960
And it was a really interesting lightning talk idea.

752
00:46:51,960 --> 00:46:54,960
It was a little bit out of left field.

753
00:46:54,960 --> 00:47:00,420
But it's one of those talks where the person who was proposing it, and by the way, it looks

754
00:47:00,420 --> 00:47:03,720
like that person got OK to do it, got accepted to do it.

755
00:47:03,720 --> 00:47:06,520
It was just a different way of thinking about things.

756
00:47:06,520 --> 00:47:07,520
That's all it was.

757
00:47:07,520 --> 00:47:13,720
But was it a true security vulnerability, a real spicy security kind of thing?

758
00:47:13,720 --> 00:47:14,720
Not really.

759
00:47:14,720 --> 00:47:18,480
But it was just an interesting way of looking at the problem space.

760
00:47:18,480 --> 00:47:20,080
And yes, it got accepted.

761
00:47:20,080 --> 00:47:21,080
So yeah, 100%.

762
00:47:21,080 --> 00:47:24,480
You don't have to be a complete dyed in the wall security nerd to present at these conferences.

763
00:47:24,480 --> 00:47:26,480
Sometimes a different perspective is all that's needed.

764
00:47:26,480 --> 00:47:29,160
All right, so let's bring this thing, this episode, to an end.

765
00:47:29,160 --> 00:47:30,720
This has been a great episode.

766
00:47:30,720 --> 00:47:34,840
Actually, again, we've been completely, basically been totally free with what we're talking

767
00:47:34,840 --> 00:47:35,840
about.

768
00:47:35,840 --> 00:47:37,280
But yes, always good talking to you, Nate.

769
00:47:37,280 --> 00:47:39,720
So again, thanks, Nick, for joining us this week.

770
00:47:39,720 --> 00:47:41,400
Been great having you on.

771
00:47:41,400 --> 00:47:45,040
And to all our listeners out there, again, we realize we kind of rambled a little bit,

772
00:47:45,040 --> 00:47:48,080
but hopefully you found this episode of interest.

773
00:47:48,080 --> 00:47:50,640
Stay safe and we'll see you next time.

774
00:47:50,640 --> 00:47:53,600
Thanks for listening to the Azure Security Podcast.

775
00:47:53,600 --> 00:48:00,440
You can find show notes and other resources at our website, azsecuritypodcast.net.

776
00:48:00,440 --> 00:48:05,280
If you have any questions, please find us on Twitter at Azure Set Pod.

777
00:48:05,280 --> 00:48:10,600
Background music is from ccmixtor.com and licensed under the Creative Commons license.

