1
00:00:00,000 --> 00:00:09,600
Welcome to the Azure Security Podcast, where we discuss topics relating to security, privacy,

2
00:00:09,600 --> 00:00:13,280
reliability and compliance on the Microsoft Cloud Platform.

3
00:00:13,280 --> 00:00:18,220
Hey everybody, welcome to episode 104.

4
00:00:18,220 --> 00:00:19,840
This week's one of those special episodes.

5
00:00:19,840 --> 00:00:21,600
We're not going to have any news.

6
00:00:21,600 --> 00:00:24,880
It's just myself, Michael, and our guest this week is Nick Fillingham.

7
00:00:24,880 --> 00:00:29,300
And we're going to talk about the aftermath of the Blue Hat Conference.

8
00:00:29,300 --> 00:00:34,420
So if you look at episode 103, we also have Nick talking about Blue Hat, which is coming

9
00:00:34,420 --> 00:00:36,960
up and now we're talking about Blue Hat as it has happened.

10
00:00:36,960 --> 00:00:39,200
So Nick, thank you so much for joining us this week.

11
00:00:39,200 --> 00:00:42,740
For those who haven't listened to episode 103, you just want to spend a couple of minutes

12
00:00:42,740 --> 00:00:44,640
and introduce yourself.

13
00:00:44,640 --> 00:00:47,920
Yeah, Michael, thank you for having me back.

14
00:00:47,920 --> 00:00:55,760
I feel very lucky, blessed, special to be your first repeating guest two episodes in

15
00:00:55,760 --> 00:00:56,760
a row.

16
00:00:56,760 --> 00:00:57,760
So hello, I'm Nick Fillingham.

17
00:00:57,760 --> 00:01:01,360
My accent's a little bit different to Michael Howard's, but it's pretty similar.

18
00:01:01,360 --> 00:01:06,360
And I'm the Blue Hat Program Lead here in Redmond, Washington.

19
00:01:06,360 --> 00:01:11,280
And I have the privilege, the honor and the fun to help put on the Blue Hat Conference

20
00:01:11,280 --> 00:01:13,400
and have been doing that for the last couple of years.

21
00:01:13,400 --> 00:01:14,400
Cool.

22
00:01:14,400 --> 00:01:17,840
You know how I tell people to tell Australian and New Zealand accents apart?

23
00:01:17,840 --> 00:01:19,080
How?

24
00:01:19,080 --> 00:01:22,760
So I asked them and I'm very careful how I sort of frame the question actually.

25
00:01:22,760 --> 00:01:30,920
Think of a famous British meal that involves potatoes and a meat from the sea.

26
00:01:30,920 --> 00:01:32,480
And then I said, okay, so in Australia...

27
00:01:32,480 --> 00:01:33,640
And a meat from the sea?

28
00:01:33,640 --> 00:01:36,000
Well, it's like, you know, a thing from the sea.

29
00:01:36,000 --> 00:01:37,000
Dolphin.

30
00:01:37,000 --> 00:01:38,000
Not dolphin.

31
00:01:38,000 --> 00:01:39,000
Fish.

32
00:01:39,000 --> 00:01:40,000
So I said, well...

33
00:01:40,000 --> 00:01:41,000
You mean, fush?

34
00:01:41,000 --> 00:01:42,000
Yeah.

35
00:01:42,000 --> 00:01:45,000
So in Australia, they say fish and chips.

36
00:01:45,000 --> 00:01:46,480
Fish and chips.

37
00:01:46,480 --> 00:01:49,760
Whereas in New Zealand, they say fush and chops.

38
00:01:49,760 --> 00:01:50,760
Fush and chops.

39
00:01:50,760 --> 00:01:51,760
That's right.

40
00:01:51,760 --> 00:01:52,760
So that's the fun part.

41
00:01:52,760 --> 00:01:53,760
So now you know.

42
00:01:53,760 --> 00:01:55,680
Let's get on to the actual content of this.

43
00:01:55,680 --> 00:01:58,600
So yeah, so Blue Hat was last week.

44
00:01:58,600 --> 00:02:01,120
It was in Redmond, Washington.

45
00:02:01,120 --> 00:02:02,920
I was the master of ceremonies.

46
00:02:02,920 --> 00:02:04,800
Talk about being invited back.

47
00:02:04,800 --> 00:02:08,280
I was, yeah, I was invited back to do a different event.

48
00:02:08,280 --> 00:02:12,320
I'd done one in Redmond a few weeks prior, an internal conference.

49
00:02:12,320 --> 00:02:14,080
So yeah, I was the MC.

50
00:02:14,080 --> 00:02:18,640
So as I mentioned, you know, I did the MC job a few weeks ago at a different conference

51
00:02:18,640 --> 00:02:19,640
side of Microsoft.

52
00:02:19,640 --> 00:02:22,640
To be honest, I don't want to sound like, you know, big headed or anything, but I've

53
00:02:22,640 --> 00:02:25,400
actually got some really, really positive comments about the way I am seated.

54
00:02:25,400 --> 00:02:27,960
So I want to just share some thoughts on that.

55
00:02:27,960 --> 00:02:31,360
So first of all, I want, not one of these MCs who sort of gets up there and says, you

56
00:02:31,360 --> 00:02:32,360
know, welcome, you know, here's Bruce.

57
00:02:32,360 --> 00:02:34,560
He's going to talk about blah, blah, blah.

58
00:02:34,560 --> 00:02:35,560
Yeah, hey, Bruce.

59
00:02:35,560 --> 00:02:38,320
And then when Bruce finishes, like, yeah, hey, thanks, Bruce.

60
00:02:38,320 --> 00:02:39,320
You know, here's Mary.

61
00:02:39,320 --> 00:02:41,320
You've got to show some color, right?

62
00:02:41,320 --> 00:02:44,600
So I talk, you know, I'll talk about the subject that's coming up.

63
00:02:44,600 --> 00:02:48,200
If I know the person, I'll talk about the person a little bit, especially if we've had,

64
00:02:48,200 --> 00:02:50,840
you know, some dealings in the past.

65
00:02:50,840 --> 00:02:52,360
But yeah, you've got to provide a little bit of color, right?

66
00:02:52,360 --> 00:02:54,160
If you're going to be an MC.

67
00:02:54,160 --> 00:02:56,840
So yeah, so the key thing there, I think from my perspective is, you know, if you're going

68
00:02:56,840 --> 00:03:01,120
to MC something, you know, research what the talks are, research the person who's presenting

69
00:03:01,120 --> 00:03:04,240
if you don't already know them, you'd be amazed how far that goes.

70
00:03:04,240 --> 00:03:07,520
But don't just, you know, here's Bruce, here's Mary.

71
00:03:07,520 --> 00:03:09,160
No one really appreciates that.

72
00:03:09,160 --> 00:03:11,520
Nick, there were a lot of people there, right?

73
00:03:11,520 --> 00:03:13,360
There's a lot of people there.

74
00:03:13,360 --> 00:03:15,440
Yes, that's right.

75
00:03:15,440 --> 00:03:16,440
We did.

76
00:03:16,440 --> 00:03:20,040
So Blue Hat's interesting, I think I mentioned this last episode, but just sort of a quick

77
00:03:20,040 --> 00:03:21,040
recap.

78
00:03:21,040 --> 00:03:28,560
So Blue Hat started almost 20 years ago, 2005, it was initially just for Microsoft employees

79
00:03:28,560 --> 00:03:35,400
to attend, who were sort of in this new and upcoming field of security and cybersecurity.

80
00:03:35,400 --> 00:03:39,600
And what they did was they brought some of the best speakers or the most sort of, you

81
00:03:39,600 --> 00:03:48,680
know, interesting and also sort of contentious speakers from Black Hat to Microsoft.

82
00:03:48,680 --> 00:03:52,600
And Microsoft, we have blue badges to identify FTEs.

83
00:03:52,600 --> 00:03:58,560
And so they bought this sort of subset of presenters to Redmond to have them sort of

84
00:03:58,560 --> 00:03:59,560
present their content.

85
00:03:59,560 --> 00:04:00,880
And they call that Blue Hat.

86
00:04:00,880 --> 00:04:02,680
And then over the years, it's evolved.

87
00:04:02,680 --> 00:04:06,440
And now where we find ourselves almost 20 years later is that it is a conference that

88
00:04:06,440 --> 00:04:11,640
is for both internal and external attendees, as well as internal and external presenters,

89
00:04:11,640 --> 00:04:14,900
which are really hard for a 50-50 balance.

90
00:04:14,900 --> 00:04:18,880
So we want to make sure that if you're an attendee at Blue Hat, and you work for Microsoft,

91
00:04:18,880 --> 00:04:22,960
then half the attendees there also work for Microsoft, but the other half don't.

92
00:04:22,960 --> 00:04:28,520
They are folks from other companies, other organizations, they're students, but they

93
00:04:28,520 --> 00:04:30,280
are not employees of Microsoft.

94
00:04:30,280 --> 00:04:31,280
Same with the presenters.

95
00:04:31,280 --> 00:04:35,980
We try really hard to have the day one and the day two presenters be split 50-50 as best

96
00:04:35,980 --> 00:04:37,360
we can.

97
00:04:37,360 --> 00:04:41,800
And then the topic is security research and response.

98
00:04:41,800 --> 00:04:46,320
And so the sessions that are being presented, whether they are full 45-minute breakout sessions

99
00:04:46,320 --> 00:04:49,840
or whether they are out sort of 15-minute lightning talks that happen during the lunch

100
00:04:49,840 --> 00:04:54,760
break, or whether they are conversations in the hall or in the villages, is really around

101
00:04:54,760 --> 00:04:56,480
security research and response.

102
00:04:56,480 --> 00:05:04,740
So new research findings, new research techniques, new ways to respond to up and coming sort

103
00:05:04,740 --> 00:05:11,080
of research, sorry, discoveries from research, whether that be vulnerabilities or red teaming,

104
00:05:11,080 --> 00:05:13,840
blue teaming, et cetera, et cetera.

105
00:05:13,840 --> 00:05:21,500
And yeah, we had, let's see, we had, I think over the, it was about 500, 550 people in

106
00:05:21,500 --> 00:05:24,040
total each day.

107
00:05:24,040 --> 00:05:25,480
That's not the same 550.

108
00:05:25,480 --> 00:05:30,680
We have a lot of interest in people that want to attend Blue Hat.

109
00:05:30,680 --> 00:05:36,480
So we have to sort of parcel out our internal folks and sometimes they'll get a day one

110
00:05:36,480 --> 00:05:38,760
pass or sometimes they'll get a day two pass.

111
00:05:38,760 --> 00:05:45,440
So I think throughout the whole conference, maybe like 7, 5, 800 if you sort of add everyone

112
00:05:45,440 --> 00:05:46,440
up.

113
00:05:46,440 --> 00:05:49,840
But yeah, it was a great assortment of people.

114
00:05:49,840 --> 00:05:54,640
It was a great collection of people, really great representation inside of Microsoft,

115
00:05:54,640 --> 00:05:58,660
really great representation from the community and the industry outside of Microsoft.

116
00:05:58,660 --> 00:06:02,120
We had, I think 30 different countries representing.

117
00:06:02,120 --> 00:06:04,680
We had people flying in from all over the world.

118
00:06:04,680 --> 00:06:10,720
We had people flying in from Africa, from Europe, from Australia, from Japan and South

119
00:06:10,720 --> 00:06:12,560
Korea, from China.

120
00:06:12,560 --> 00:06:18,040
Really, even though it is a conference that physically and geographically happens on the

121
00:06:18,040 --> 00:06:23,280
main campus of Microsoft in Redmond, Washington, it really has a very much a sort of a global

122
00:06:23,280 --> 00:06:24,520
flavor.

123
00:06:24,520 --> 00:06:27,680
And there are presenters who come from all over the world as well as attendees, which

124
00:06:27,680 --> 00:06:32,320
is always amazing to see as someone helping put on the conference.

125
00:06:32,320 --> 00:06:34,200
So the tracks.

126
00:06:34,200 --> 00:06:42,240
So on day one, there were two tracks, cloud and identity security and apps and OS security.

127
00:06:42,240 --> 00:06:47,560
And day two was threat hunting and Intel and AI and ML security.

128
00:06:47,560 --> 00:06:51,880
So I'll actually send a link, a post link on the show notes to the agenda so people

129
00:06:51,880 --> 00:06:54,400
can see what was going on.

130
00:06:54,400 --> 00:06:55,920
Why were those tracks chosen?

131
00:06:55,920 --> 00:06:57,640
Yeah, it's a great question.

132
00:06:57,640 --> 00:06:59,920
I got asked this a few times at Blue Hat.

133
00:06:59,920 --> 00:07:06,600
So the content that gets presented at Blue Hat comes from a public call for papers, the

134
00:07:06,600 --> 00:07:07,600
CFP.

135
00:07:07,600 --> 00:07:14,640
And so we launched this CFP, this call for papers, and we ask anyone, literally anybody

136
00:07:14,640 --> 00:07:16,280
can submit to the CFP.

137
00:07:16,280 --> 00:07:22,140
You don't have to be, there's no real criteria for proving that you're in the industry or

138
00:07:22,140 --> 00:07:25,840
proving that you're a researcher or working for some sort of subset of companies.

139
00:07:25,840 --> 00:07:28,120
Anyone can submit.

140
00:07:28,120 --> 00:07:32,060
And so we had over 100 submissions this year, which was fantastic.

141
00:07:32,060 --> 00:07:38,400
And then from there, we have a content advisory board, a CAB, which is a group of about 10

142
00:07:38,400 --> 00:07:45,480
folks who help us sort of whittle that down from 100 to 60 and then from 60 sort of down

143
00:07:45,480 --> 00:07:54,000
to 40 and then from 40 down to 25 or so that then gets selected as the final talks.

144
00:07:54,000 --> 00:08:02,160
And so the tracks that you just mentioned there, they're sort of somewhat organic in

145
00:08:02,160 --> 00:08:07,380
the sense that they're a representation of what was selected by the CAB.

146
00:08:07,380 --> 00:08:11,400
We didn't start with the tracks and then look to fill them.

147
00:08:11,400 --> 00:08:16,600
What we did is we selected the best sessions or our CAB helped us select the best sessions

148
00:08:16,600 --> 00:08:20,440
from all of the submissions through the call for papers.

149
00:08:20,440 --> 00:08:26,720
And then from there, we spent time and was like, okay, so how do we group these together

150
00:08:26,720 --> 00:08:37,660
in the most sort of logical cohorts to create tracks that then allow attendees to hopefully

151
00:08:37,660 --> 00:08:45,280
be able to manage their own time and agenda based on some of those really big sort of

152
00:08:45,280 --> 00:08:46,280
topic areas.

153
00:08:46,280 --> 00:08:54,040
So for example, cloud and identity security, it just so happened that we had six sessions.

154
00:08:54,040 --> 00:08:58,720
We actually sort of had seven, but we had six sessions that absolutely fit into that

155
00:08:58,720 --> 00:09:03,980
cloud and identity security sort of subject matter focus.

156
00:09:03,980 --> 00:09:06,000
And so we thought, well, let's put them all in one track.

157
00:09:06,000 --> 00:09:11,800
And so the folks that are at Bluehat that work specifically with a focus on cloud and

158
00:09:11,800 --> 00:09:17,120
identity security, whether that's research or response, red team, blue team, et cetera,

159
00:09:17,120 --> 00:09:21,880
they will at least know that there is a single track that they can go and sit in and they'll

160
00:09:21,880 --> 00:09:26,920
be able to get all that content in a sort of a linear fashion.

161
00:09:26,920 --> 00:09:28,840
Same with OS and app security.

162
00:09:28,840 --> 00:09:32,080
That's how that track was created.

163
00:09:32,080 --> 00:09:37,880
We had a bunch of submissions that came in and were selected and we were like, oh, look,

164
00:09:37,880 --> 00:09:43,680
these are all for essentially on-premise, on-device focused content versus cloud and

165
00:09:43,680 --> 00:09:45,200
identity.

166
00:09:45,200 --> 00:09:50,440
And then AI and ML sort of became its own thing.

167
00:09:50,440 --> 00:09:56,080
We arguably had two...

168
00:09:56,080 --> 00:10:02,240
So the track C, threat hunting and Intel, that one was maybe a bit of a stretch in some

169
00:10:02,240 --> 00:10:03,240
ways.

170
00:10:03,240 --> 00:10:08,240
And the first half of the day was very much threat hunting and Intel focused in its literal

171
00:10:08,240 --> 00:10:09,240
sense.

172
00:10:09,240 --> 00:10:14,360
And then we sort of were fitting some other sessions in there in a way that hopefully

173
00:10:14,360 --> 00:10:15,360
made sense.

174
00:10:15,360 --> 00:10:19,300
But yeah, I mean, that's a very long-winded answer to your question.

175
00:10:19,300 --> 00:10:24,520
And the question is, the tracks come about in a somewhat organic fashion based on the

176
00:10:24,520 --> 00:10:27,160
papers that are submitted as part of the CFP.

177
00:10:27,160 --> 00:10:34,160
And then once our cab help us select the final selection of papers that we want to be presented

178
00:10:34,160 --> 00:10:37,880
at Blue Hat Sessions, we then work out what do we think the best grouping is in order

179
00:10:37,880 --> 00:10:44,920
to create tracks that hopefully align to some of the core focus areas for attendees.

180
00:10:44,920 --> 00:10:46,280
I'm glad you said that.

181
00:10:46,280 --> 00:10:51,840
I'm glad we didn't just say, hey, here are four categories we want from when we're doing

182
00:10:51,840 --> 00:10:54,840
the call for papers.

183
00:10:54,840 --> 00:11:00,560
I'm very glad it's not just, here's the categories, please apply knowing these categories are

184
00:11:00,560 --> 00:11:01,560
in mind.

185
00:11:01,560 --> 00:11:05,120
It makes so much more sense to just take all the papers and then say, okay, where does

186
00:11:05,120 --> 00:11:07,240
it make sense to have the categories and have the tracks?

187
00:11:07,240 --> 00:11:08,240
I like that idea.

188
00:11:08,240 --> 00:11:11,400
And I actually agree with all four tracks.

189
00:11:11,400 --> 00:11:14,960
Based on being 2024, I think the tracks make absolute sense to me.

190
00:11:14,960 --> 00:11:18,320
If you look at the Cloud and Identity stuff, I think it's great looking at some of the

191
00:11:18,320 --> 00:11:23,120
issues with OAuth 2 and the way people use OAuth 2 because everyone's building in the

192
00:11:23,120 --> 00:11:31,720
Cloud and OAuth 2 is the fundamental authorization mechanism for accessing resources in the Cloud.

193
00:11:31,720 --> 00:11:38,120
It reminds me a little bit of back in the days, in the late 90s, when we had X.509 issues

194
00:11:38,120 --> 00:11:41,280
when people weren't doing certificates correctly.

195
00:11:41,280 --> 00:11:45,480
And it reminds me a little bit of that just 20 something years later.

196
00:11:45,480 --> 00:11:49,200
So I want to go through just some of the papers real fast.

197
00:11:49,200 --> 00:11:53,640
So I was actually also emceeing the app and OS security track.

198
00:11:53,640 --> 00:11:57,680
The first one was on Decom research, which is Distributed Comm, which is our object model

199
00:11:57,680 --> 00:11:58,680
in Windows.

200
00:11:58,680 --> 00:12:02,280
And that was really good to see because just talking about some areas where there may be

201
00:12:02,280 --> 00:12:03,280
concerns.

202
00:12:03,280 --> 00:12:05,480
So let's see if we can work those issues out.

203
00:12:05,480 --> 00:12:09,080
Next one was on some CVEs, so some actual vulnerabilities and how the person actually

204
00:12:09,080 --> 00:12:12,800
found them or the researchers found them, which I thought was really, really cool.

205
00:12:12,800 --> 00:12:17,840
My favorite one of the three was pointer problems, which is basically how we're refactoring parts

206
00:12:17,840 --> 00:12:21,920
of the Windows kernel to work around pointer concerns.

207
00:12:21,920 --> 00:12:24,440
I kind of joked before the...

208
00:12:24,440 --> 00:12:28,960
And the nerds will appreciate this, but I said pointers are completely fine.

209
00:12:28,960 --> 00:12:30,920
The problems only begin when you dereference them.

210
00:12:30,920 --> 00:12:33,920
And apparently I got some groans from the audience because it's a bit of a security

211
00:12:33,920 --> 00:12:36,560
dad joke, but anyway, that is what it is.

212
00:12:36,560 --> 00:12:39,180
So yeah, those two made absolute sense.

213
00:12:39,180 --> 00:12:44,960
The threat Intel one I found interesting because you kind of got to understand.

214
00:12:44,960 --> 00:12:48,520
If you're building software, you've kind of got to understand the threats.

215
00:12:48,520 --> 00:12:50,240
You've got to understand what you're up against.

216
00:12:50,240 --> 00:12:53,720
And I think that's really important, but there's also another aspect to that, which is just

217
00:12:53,720 --> 00:12:57,720
the whole Intel side of it, which is not software development, which is understanding what's

218
00:12:57,720 --> 00:13:02,120
going on in the marketplace and in the industry so that you can feed some of that information

219
00:13:02,120 --> 00:13:03,120
into the organization.

220
00:13:03,120 --> 00:13:08,120
And hopefully some of that data also finds its way to not just people administering systems,

221
00:13:08,120 --> 00:13:12,920
but also people who are building and developing and maintaining systems so they can update

222
00:13:12,920 --> 00:13:16,200
their environments accordingly and their codes and their designs accordingly.

223
00:13:16,200 --> 00:13:18,280
In fact, that's now what I'm working on at Microsoft.

224
00:13:18,280 --> 00:13:23,840
That's why I moved into the Mystic team is to do exactly that is to focus on not just,

225
00:13:23,840 --> 00:13:27,000
you know, what's the correct way of building secure software, but what's the best way of

226
00:13:27,000 --> 00:13:30,480
building secure software and knowing what the real threats are out there.

227
00:13:30,480 --> 00:13:33,920
And so now I have access to that threat Intel, which is really, really, really cool.

228
00:13:33,920 --> 00:13:38,000
Hey, we also had some lightning talks in the middle, which were a lot of fun.

229
00:13:38,000 --> 00:13:39,000
We did.

230
00:13:39,000 --> 00:13:45,000
The lightning talks are 15 minutes each or maximum of 15 minutes and they are very quick

231
00:13:45,000 --> 00:13:46,280
turn.

232
00:13:46,280 --> 00:13:51,400
The goal is, you know, we give about 90 minutes total for the lunch break.

233
00:13:51,400 --> 00:13:56,840
So folks get to stretch their legs, grab some food, and then they come back to this particular

234
00:13:56,840 --> 00:13:59,760
room in the Microsoft conference center called Kodiak.

235
00:13:59,760 --> 00:14:00,960
And that's where the lightning talks are.

236
00:14:00,960 --> 00:14:02,280
And we had five each day.

237
00:14:02,280 --> 00:14:05,200
We had five lightning talks back to back.

238
00:14:05,200 --> 00:14:11,840
And the lightning talks are, you know, that's something else that our cab, our content advisory

239
00:14:11,840 --> 00:14:13,880
board selects for us.

240
00:14:13,880 --> 00:14:20,880
And they're looking for, you know, bite size content, really, you know, 15 minutes in some

241
00:14:20,880 --> 00:14:22,520
ways isn't a lot of time.

242
00:14:22,520 --> 00:14:29,920
So what are some sort of topics that can be adequately covered or maybe perhaps brought

243
00:14:29,920 --> 00:14:30,920
up, right?

244
00:14:30,920 --> 00:14:35,520
And the lightning talk is also sort of posing a question or an idea or a new thought to

245
00:14:35,520 --> 00:14:37,600
the community and to the audience.

246
00:14:37,600 --> 00:14:45,160
And yeah, we had fantastic lightning talks on each day, on day one, looking at some analysis

247
00:14:45,160 --> 00:14:52,320
of online scams and some of the techniques and frameworks that they use, a personal story

248
00:14:52,320 --> 00:14:58,560
or sort of more of a human focus story for someone sort of, you know, going through their

249
00:14:58,560 --> 00:15:03,840
security engineering journey and how that relates to how they go about building tools.

250
00:15:03,840 --> 00:15:09,960
We had a very fun session from David Cross and Svetlana from Oracle where they talked

251
00:15:09,960 --> 00:15:12,560
about the synergy between red and blue teams.

252
00:15:12,560 --> 00:15:16,360
They even had costumes and props and it was a ton of fun.

253
00:15:16,360 --> 00:15:19,040
Well, I mean, David was wearing his red suit.

254
00:15:19,040 --> 00:15:22,040
I'm like, are you like being paid to wear that or something?

255
00:15:22,040 --> 00:15:24,360
David and I have known each other for a long time.

256
00:15:24,360 --> 00:15:25,680
He and I worked in Windows together.

257
00:15:25,680 --> 00:15:29,520
He worked on PKI, like smart cards and that sort of stuff back in the day.

258
00:15:29,520 --> 00:15:33,080
And someone paying you to wear that is like, no, I'm wearing red because I'm representing

259
00:15:33,080 --> 00:15:35,980
the red team and Svetlana was wearing a blue dress.

260
00:15:35,980 --> 00:15:36,980
It was incredible.

261
00:15:36,980 --> 00:15:42,660
And David probably gets maybe the prize for the best dressed blue hat attendee across

262
00:15:42,660 --> 00:15:44,080
the entire conference too.

263
00:15:44,080 --> 00:15:48,040
So shout out to Mr. Cross there.

264
00:15:48,040 --> 00:15:51,000
And then just sort of quickly wrapping up on lightning talks, you know, talked about

265
00:15:51,000 --> 00:15:55,720
sort of lessons learned from scaling open source at Microsoft and then we had Eve, Eve

266
00:15:55,720 --> 00:15:59,520
Yunan from Cisco Talos talking about entitlements on Mac OS.

267
00:15:59,520 --> 00:16:08,360
So really, you know, five very different topics, but some really sort of fascinating perspectives

268
00:16:08,360 --> 00:16:13,920
that are offered, some interesting questions that were sort of posed and little nuggets

269
00:16:13,920 --> 00:16:18,920
of sort of food for thought, which is what the goal is for lightning talks.

270
00:16:18,920 --> 00:16:22,840
I think an important point there is sometimes you don't need to cover the entire topic,

271
00:16:22,840 --> 00:16:23,840
right?

272
00:16:23,840 --> 00:16:25,400
You just want to get people thinking about things.

273
00:16:25,400 --> 00:16:28,080
Oh, I didn't even know that there were entitlements in Mac OS.

274
00:16:28,080 --> 00:16:30,120
How can I use those?

275
00:16:30,120 --> 00:16:33,500
And then you just start doing a little bit digging, a bit more digging yourself.

276
00:16:33,500 --> 00:16:39,000
So yeah, I love these sort of very small bite sized content because again, people in the

277
00:16:39,000 --> 00:16:40,680
audience are smart people.

278
00:16:40,680 --> 00:16:43,880
They can go to the ones that are of interest to them and then learn a little bit of something

279
00:16:43,880 --> 00:16:47,280
and then take that away and perhaps even learn, you know, learn even more down the track.

280
00:16:47,280 --> 00:16:50,640
Yeah, should I quickly talk through the day two lightning talks?

281
00:16:50,640 --> 00:16:51,720
Yeah.

282
00:16:51,720 --> 00:16:57,200
So we had Brett Hawkins from IBM who talked about detecting Microsoft Intune lateral movement,

283
00:16:57,200 --> 00:16:58,860
which was great.

284
00:16:58,860 --> 00:17:06,000
We had Vivek Vinod Sharma from Microsoft talking about AI rag muffins.

285
00:17:06,000 --> 00:17:09,960
We often get some fun puns in the session titles.

286
00:17:09,960 --> 00:17:14,600
Then we had a really interesting talk from Tom Williams from True Zero Technologies talking

287
00:17:14,600 --> 00:17:20,560
about sort of ransomware and I think his title was Turning the Tide Against Cyber Extortion.

288
00:17:20,560 --> 00:17:22,200
That was a really interesting talk.

289
00:17:22,200 --> 00:17:30,080
Aobami Alotunji from Microsoft talked about safe chat AI, so enhancing sort of awareness

290
00:17:30,080 --> 00:17:35,140
of cybersecurity issues using AI bots and AI chat bots.

291
00:17:35,140 --> 00:17:40,360
And then we finished off with firmware security, you know, coming from Nitin Saad from, or

292
00:17:40,360 --> 00:17:44,760
Saad from Google and I just, I thought these were all fantastic lighting talks.

293
00:17:44,760 --> 00:17:48,280
What was really interesting, someone asked me at the conference, wasn't there a hardware

294
00:17:48,280 --> 00:17:50,280
track or an IoT track?

295
00:17:50,280 --> 00:17:51,880
And it was a great question.

296
00:17:51,880 --> 00:17:56,920
And the simple, simple answer was we actually didn't really get any submissions for those

297
00:17:56,920 --> 00:17:57,920
topics.

298
00:17:57,920 --> 00:17:58,920
We got one.

299
00:17:58,920 --> 00:17:59,920
We got one, right?

300
00:17:59,920 --> 00:18:01,840
Because that was on day one.

301
00:18:01,840 --> 00:18:05,640
It was when the levy breaks, exposing critical flaws in Wi-Fi camera ecosystems.

302
00:18:05,640 --> 00:18:06,640
You're absolutely right.

303
00:18:06,640 --> 00:18:13,840
So what I meant is that we didn't get enough to create a single track.

304
00:18:13,840 --> 00:18:14,840
I think I misspoke there.

305
00:18:14,840 --> 00:18:20,480
The question I was like, why is there not a track for hardware based security or IoT

306
00:18:20,480 --> 00:18:21,480
security?

307
00:18:21,480 --> 00:18:26,480
And yes, the answer to that question is that we just didn't get enough submissions for

308
00:18:26,480 --> 00:18:30,000
that to be a standalone track.

309
00:18:30,000 --> 00:18:32,360
And so I just think that's a sort of interesting call out, right?

310
00:18:32,360 --> 00:18:38,000
That the tracks that we had and the topics that we had are a reflection of what gets

311
00:18:38,000 --> 00:18:39,760
submitted to us.

312
00:18:39,760 --> 00:18:44,420
And so that's one thing that we also need to be on the lookout for when we're running

313
00:18:44,420 --> 00:18:50,440
the Blue Hat Conference is how are we actually advertising to the industry, to the community

314
00:18:50,440 --> 00:18:55,880
that the Colf papers is open and that we want this really broad cross section.

315
00:18:55,880 --> 00:18:59,840
And we want to make sure that the IoT researchers and the hardware security researchers are

316
00:18:59,840 --> 00:19:05,400
aware and submitting to us so that we do get a sort of a broader cross section.

317
00:19:05,400 --> 00:19:08,440
Just sort of a fascinating observation from when you look at the schedule, if you think

318
00:19:08,440 --> 00:19:13,480
something's missing, it could just be simply that at that moment in time, that portion,

319
00:19:13,480 --> 00:19:17,520
that segment of the community of the industry didn't sort of submit.

320
00:19:17,520 --> 00:19:18,520
And that's not a bad thing.

321
00:19:18,520 --> 00:19:23,000
It's just a fascinating sort of snapshot in time of what's going on in cybersecurity.

322
00:19:23,000 --> 00:19:25,520
By the way, that the session that I just mentioned was absolutely terrifying.

323
00:19:25,520 --> 00:19:26,520
And it's actually kind of funny.

324
00:19:26,520 --> 00:19:31,200
It was almost like, you know, 1999 call them want their vulnerabilities back.

325
00:19:31,200 --> 00:19:36,880
You know, it's the same silly mistakes that we saw 20 something years ago, but being done,

326
00:19:36,880 --> 00:19:39,600
you know, down in cameras and so on.

327
00:19:39,600 --> 00:19:41,760
But yeah, the day two stuff was interesting.

328
00:19:41,760 --> 00:19:46,200
The AI track was fascinating, mainly because the three topics were quite different.

329
00:19:46,200 --> 00:19:49,920
First one was about red teaming AI, which is really, really cool.

330
00:19:49,920 --> 00:19:54,240
You're thinking about how you can sort of break AI or more accurately, probably, you

331
00:19:54,240 --> 00:19:56,280
know, large language models.

332
00:19:56,280 --> 00:19:58,320
Next one was about hallucination.

333
00:19:58,320 --> 00:20:02,680
And then the last one was breaking LLM applications, which is all about prompt injection, sort

334
00:20:02,680 --> 00:20:08,080
of exploitation and the latest research in that area, which is very somewhat related,

335
00:20:08,080 --> 00:20:10,080
but quite different topics.

336
00:20:10,080 --> 00:20:13,960
But you know, very pertinent for people who are building systems based on LLMs.

337
00:20:13,960 --> 00:20:15,500
That was good to say.

338
00:20:15,500 --> 00:20:20,560
And then in the afternoon, we also had on the threat hunting and Intel track, we had

339
00:20:20,560 --> 00:20:24,960
that patterns in the shadows, which is scaling threat hunting and intelligence, which is

340
00:20:24,960 --> 00:20:27,120
incredibly important.

341
00:20:27,120 --> 00:20:30,480
The afternoon, my favorite one in the afternoon was actually was Mystic, which is the team

342
00:20:30,480 --> 00:20:33,840
I'm in, threat intelligence year in review.

343
00:20:33,840 --> 00:20:40,920
The reason why I liked that session so much is because it really brings to focus the risks

344
00:20:40,920 --> 00:20:44,200
and the threats that we have to mitigate.

345
00:20:44,200 --> 00:20:48,480
A lot of people don't realize what we're up against.

346
00:20:48,480 --> 00:20:53,080
And that was a very refreshing, if not terrifying talk as well.

347
00:20:53,080 --> 00:20:57,280
Yeah, so shout out to Rachel Giacobuzzi, who presented that.

348
00:20:57,280 --> 00:21:01,080
That was a really highly reviewed and highly rated session.

349
00:21:01,080 --> 00:21:06,520
I might jump in Michael here and just sort of say that all pretty much every single session

350
00:21:06,520 --> 00:21:08,160
from Blue Hat was recorded.

351
00:21:08,160 --> 00:21:10,380
Well, they all were recorded, excuse me.

352
00:21:10,380 --> 00:21:16,040
And every single one of them will be published on our Blue Hat YouTube channel.

353
00:21:16,040 --> 00:21:20,420
If that's okay, I can maybe ask you to put the link in the show notes for when this episode

354
00:21:20,420 --> 00:21:21,420
goes live.

355
00:21:21,420 --> 00:21:25,400
Depending on when you're listening to this, hopefully those recordings will be ready to

356
00:21:25,400 --> 00:21:26,400
go.

357
00:21:26,400 --> 00:21:33,200
And yeah, there were, I know at any time during Blue Hat, attendees were having to choose

358
00:21:33,200 --> 00:21:34,400
one track over the other.

359
00:21:34,400 --> 00:21:37,560
So I think attendees will probably want to go and watch some of the sessions that they

360
00:21:37,560 --> 00:21:39,960
weren't able to see because it was happening in a competing track.

361
00:21:39,960 --> 00:21:42,240
And then obviously the folks couldn't make it in person.

362
00:21:42,240 --> 00:21:48,000
We hope they go and watch those videos and they ask any questions or reach out to the

363
00:21:48,000 --> 00:21:55,280
presenters through the information that we'll have there probably in the individual session

364
00:21:55,280 --> 00:21:56,280
recording notes.

365
00:21:56,280 --> 00:22:02,240
Another thing, if I could jump in again, just ahead of your next question, Michael, one

366
00:22:02,240 --> 00:22:07,280
thing that we saw a lot of at Blue Hat this year, which was so awesome and I really hope

367
00:22:07,280 --> 00:22:15,600
we get to see more of is we had dual presenters for sessions where one of the presenters was

368
00:22:15,600 --> 00:22:26,000
a non-Microsoft researcher or a non-Microsoft spokesperson and then combined with a Microsoft.

369
00:22:26,000 --> 00:22:32,640
So essentially there was about four sessions where it was, I'll say external as in non-Microsoft

370
00:22:32,640 --> 00:22:36,280
presenter and a Microsoft presenter sort of co-presenting on that topic.

371
00:22:36,280 --> 00:22:41,640
And what they were essentially doing is the external person that doesn't work for Microsoft

372
00:22:41,640 --> 00:22:46,000
was saying, hey, I did some research and I found a thing.

373
00:22:46,000 --> 00:22:51,600
And then the Microsoft person would then sort of come on and say, and then let me tell you

374
00:22:51,600 --> 00:22:52,720
the other side of the story.

375
00:22:52,720 --> 00:22:56,480
Let me tell you the sort of the coin of here's what happened when that research or when those

376
00:22:56,480 --> 00:23:01,880
findings were submitted to us and how we went about not just fixing them, but how we then

377
00:23:01,880 --> 00:23:08,640
went about potentially doing some sort of more longer term work, whether it's to try

378
00:23:08,640 --> 00:23:12,920
and try and mitigate a complete class or sort of do some other sort of variant hunting or

379
00:23:12,920 --> 00:23:16,480
just how sort of processes change and evolve based on that.

380
00:23:16,480 --> 00:23:21,400
And those sessions were absolutely fantastic to see that sort of both sides of the coin

381
00:23:21,400 --> 00:23:23,800
or two sides of the coin yin yang.

382
00:23:23,800 --> 00:23:27,600
I really hope that we get more and more of that at BlueHat because I think it's quite

383
00:23:27,600 --> 00:23:31,800
a unique thing that we can do at a conference like BlueHat.

384
00:23:31,800 --> 00:23:35,960
And I just love seeing that and I think the audience liked it as well.

385
00:23:35,960 --> 00:23:39,440
Yeah, I mean, so one thing we haven't covered are the two keynotes.

386
00:23:39,440 --> 00:23:42,720
So one was outside of Microsoft, one was inside of Microsoft.

387
00:23:42,720 --> 00:23:45,840
So the first one day one was Chris Weisople.

388
00:23:45,840 --> 00:23:52,280
So I've known Chris for probably 25 years, thereabouts.

389
00:23:52,280 --> 00:23:58,120
He set up a company called Veracode back in the day before he actually started Veracode.

390
00:23:58,120 --> 00:24:00,320
He and I presented a conference together.

391
00:24:00,320 --> 00:24:04,400
I can't remember what it was, probably about 20 something years ago.

392
00:24:04,400 --> 00:24:09,680
And he demonstrated some static analysis in Java that he was working on.

393
00:24:09,680 --> 00:24:14,640
And that code eventually became the starting point for Veracode.

394
00:24:14,640 --> 00:24:19,280
And I talked about some internal tools that we had at Microsoft called Prefix and Prefast.

395
00:24:19,280 --> 00:24:24,640
Prefast is actually in the Visual C++ compiler when you do slash analyze that actually invokes

396
00:24:24,640 --> 00:24:28,080
Prefast which is the name of the tool under the covers.

397
00:24:28,080 --> 00:24:33,040
But yeah, he did a magnificent job sort of walking down memory lane starting in the late

398
00:24:33,040 --> 00:24:37,760
1990s when he was part of the loft, loft heavy industries out of Boston, talking about issues

399
00:24:37,760 --> 00:24:39,840
that they'd found in Windows and reporting it.

400
00:24:39,840 --> 00:24:43,800
And you know, the fact that there was a bit of a bit of tension back then and how things

401
00:24:43,800 --> 00:24:45,200
have changed over the years.

402
00:24:45,200 --> 00:24:47,280
He did an absolutely magnificent job.

403
00:24:47,280 --> 00:24:50,880
To me it was a bit of a, it really was a trip down memory lane.

404
00:24:50,880 --> 00:24:54,040
And in fact, we had a really, really good chat.

405
00:24:54,040 --> 00:24:58,320
And to, I mean, I've really got a hat tip Chris for this one.

406
00:24:58,320 --> 00:25:00,640
He emailed me through you, right?

407
00:25:00,640 --> 00:25:04,800
Remember when he was asking about talking to, so we'll name names because it's all public.

408
00:25:04,800 --> 00:25:05,800
Yeah.

409
00:25:05,800 --> 00:25:09,320
So a long, long, long, long, long time ago, there were issues in SMB, which was known

410
00:25:09,320 --> 00:25:11,040
as CIFS back then.

411
00:25:11,040 --> 00:25:14,400
And one of the guys who was involved in that is a guy called Paul Leach.

412
00:25:14,400 --> 00:25:19,560
And Paul was a senior architect working in Windows security on protocols.

413
00:25:19,560 --> 00:25:24,360
And there was a discussion held, I believe at Black Hat one year.

414
00:25:24,360 --> 00:25:27,840
And Chris wanted to make sure that he spoke to Paul.

415
00:25:27,840 --> 00:25:31,240
So I hooked him up with Paul to make sure that what he was going to talk about, about

416
00:25:31,240 --> 00:25:33,800
that event was actually accurate.

417
00:25:33,800 --> 00:25:36,080
A lot of people would just go with the hearsay, right?

418
00:25:36,080 --> 00:25:38,800
And not actually verify that it was correct.

419
00:25:38,800 --> 00:25:44,860
But no, Chris went the extra 15 miles to make sure and confirm with Paul that what was said

420
00:25:44,860 --> 00:25:49,360
was actually said and what was actually covered was actually covered rather than just the

421
00:25:49,360 --> 00:25:50,360
hearsay.

422
00:25:50,360 --> 00:25:51,840
So, you know, hat tip to Chris for doing that.

423
00:25:51,840 --> 00:25:56,320
It made the, it made the presentation a lot more enjoyable because it was more accurate,

424
00:25:56,320 --> 00:25:58,480
which is, which is really good to say.

425
00:25:58,480 --> 00:26:04,200
And then on the second day we had Amanda Silver, corporate VP in the developer division, talking

426
00:26:04,200 --> 00:26:07,800
about some of the SFI stuff that we're doing, secure future initiative stuff and the importance

427
00:26:07,800 --> 00:26:09,680
of the, you know, this kind of work that we're all doing.

428
00:26:09,680 --> 00:26:11,280
Anything else you want to add to that?

429
00:26:11,280 --> 00:26:12,280
No, no, that's great.

430
00:26:12,280 --> 00:26:17,320
And I think, you know, I'll come back to one of the things I said at the top of the episode

431
00:26:17,320 --> 00:26:23,080
is that one of the goals we have, and we work really hard with Blue Hat is to try and find

432
00:26:23,080 --> 00:26:30,040
and keep that balance between internal presenters, whether they're keynote presenters, breakout

433
00:26:30,040 --> 00:26:33,280
sessions, lightning talks, and external.

434
00:26:33,280 --> 00:26:36,400
And when you hear external, we just mean people don't work for Microsoft.

435
00:26:36,400 --> 00:26:41,000
And we try and make sure that balance happens in the attendees as well.

436
00:26:41,000 --> 00:26:43,840
And you know, it's kind of, it's tricky.

437
00:26:43,840 --> 00:26:45,800
It's tricky to do that, to find that balance.

438
00:26:45,800 --> 00:26:51,080
You know, we, Blue Hat is, as I said earlier, sort of physically located.

439
00:26:51,080 --> 00:26:53,920
We know we do it on the Microsoft campus in Redmond, Washington.

440
00:26:53,920 --> 00:26:56,680
And like Michael, you know, for example, you don't live in Redmond, Washington.

441
00:26:56,680 --> 00:27:01,160
So to get you there, you need to get on a plane and, you know, fly and, you know, you

442
00:27:01,160 --> 00:27:07,400
need to rearrange your, you know, a couple of days and your family life and all that

443
00:27:07,400 --> 00:27:08,400
kind of stuff.

444
00:27:08,400 --> 00:27:13,520
And so, you know, it's challenging to create that balance, but it's really important for

445
00:27:13,520 --> 00:27:14,520
us.

446
00:27:14,520 --> 00:27:18,400
And I, you know, I'm not sure if we're at the end of the episode yet, but certainly

447
00:27:18,400 --> 00:27:22,920
one of the questions that I want to pose to your audiences is really just about how we

448
00:27:22,920 --> 00:27:26,400
can, you know, how are we doing on finding and keeping that balance and what can we do

449
00:27:26,400 --> 00:27:30,320
to do better next time?

450
00:27:30,320 --> 00:27:31,320
But maybe I'm jumping ahead.

451
00:27:31,320 --> 00:27:32,320
Are we at final thought?

452
00:27:32,320 --> 00:27:33,320
No, not yet.

453
00:27:33,320 --> 00:27:34,320
We're going to have a couple of minutes.

454
00:27:34,320 --> 00:27:35,320
No more rapid.

455
00:27:35,320 --> 00:27:36,320
It's all good.

456
00:27:36,320 --> 00:27:41,640
Actually, it's even worse than that because I actually just got back from a diving trip

457
00:27:41,640 --> 00:27:43,840
with my wife to Maui on the Sunday night.

458
00:27:43,840 --> 00:27:47,600
And then I had to hop on a plane to Redmond on Monday, which meant that all my diving

459
00:27:47,600 --> 00:27:52,520
gear, when I got back from Redmond was waiting for me to hose down and clean and put away.

460
00:27:52,520 --> 00:27:55,960
My wife's like, no, you're putting your own diving gear away.

461
00:27:55,960 --> 00:27:59,840
So anyway, you didn't do it before you got on the plane?

462
00:27:59,840 --> 00:28:01,600
No, I mean, I'd done some of it.

463
00:28:01,600 --> 00:28:02,600
I did the important stuff.

464
00:28:02,600 --> 00:28:06,840
I did the regulator and the BCD, but things like my mask and my fins and my dive knife

465
00:28:06,840 --> 00:28:08,120
and that sort of stuff, I didn't do.

466
00:28:08,120 --> 00:28:09,760
Did you throw those in the dishwasher?

467
00:28:09,760 --> 00:28:10,760
Yeah, the dishwasher.

468
00:28:10,760 --> 00:28:11,760
I have the dog lick them.

469
00:28:11,760 --> 00:28:12,760
All right.

470
00:28:12,760 --> 00:28:14,560
So what are the benefits of Blue Hat?

471
00:28:14,560 --> 00:28:16,520
I mean, from my perspective, I see two major benefits.

472
00:28:16,520 --> 00:28:17,520
One is obviously learning, right?

473
00:28:17,520 --> 00:28:18,520
I'm a big fan of learning.

474
00:28:18,520 --> 00:28:19,520
You've got to keep learning in this industry.

475
00:28:19,520 --> 00:28:24,080
Otherwise, you're going to, I don't know, give you, I reckon if you don't learn something

476
00:28:24,080 --> 00:28:27,440
new in every six months or so, you're going to get behind very, very quickly.

477
00:28:27,440 --> 00:28:28,440
That's obviously number one.

478
00:28:28,440 --> 00:28:29,920
And number two is just a straight networking.

479
00:28:29,920 --> 00:28:32,840
It really was magnificent catching up with so many people.

480
00:28:32,840 --> 00:28:38,400
I already shared this with you, but the day after I got back, when I was in Redmond, so

481
00:28:38,400 --> 00:28:44,180
after I got back from Redmond, there were 123 LinkedIn invitations from people who bumped

482
00:28:44,180 --> 00:28:47,520
into that Blue Hat, which is great to see.

483
00:28:47,520 --> 00:28:51,400
So I have a rule on LinkedIn, which is if I've never met you or I don't know you, I

484
00:28:51,400 --> 00:28:52,400
don't accept the invitation.

485
00:28:52,400 --> 00:28:54,320
But these are all people that I've met.

486
00:28:54,320 --> 00:28:56,960
So obviously, you know, learning and then just networking.

487
00:28:56,960 --> 00:29:01,600
It's amazing how much you can learn from other people.

488
00:29:01,600 --> 00:29:02,720
I mean, absolutely.

489
00:29:02,720 --> 00:29:05,720
You know, I think learning is priority one.

490
00:29:05,720 --> 00:29:14,360
And coming back to the original intent of Blue Hat, where it was external perspectives

491
00:29:14,360 --> 00:29:19,120
for an internal audience, obviously that's evolved, but it's so important.

492
00:29:19,120 --> 00:29:23,320
And one of the things we do at Blue Hat is we ensure that we're bringing in presenters

493
00:29:23,320 --> 00:29:30,920
and speakers and viewpoints that might be a little hard to hear or might be a little

494
00:29:30,920 --> 00:29:32,120
bit of a challenge to hear.

495
00:29:32,120 --> 00:29:36,760
If you're an engineer working on a Microsoft product and there are vulnerabilities being

496
00:29:36,760 --> 00:29:42,660
discovered in your product and the immense scale and real estate of Microsoft Digital

497
00:29:42,660 --> 00:29:48,200
Footprint means that pretty much every product is going to have some sort of vulnerability

498
00:29:48,200 --> 00:29:52,480
or issue that's found with it at some point.

499
00:29:52,480 --> 00:29:57,280
That can be challenged, but it's so important because we need to ensure that folks, no one

500
00:29:57,280 --> 00:30:02,960
is sort of stuck in sort of a digital echo chamber where they think that their work is

501
00:30:02,960 --> 00:30:07,440
flawless and everything they're doing is sort of perfect.

502
00:30:07,440 --> 00:30:11,440
So part of it is ensuring that we're bringing in those external perspectives and external

503
00:30:11,440 --> 00:30:18,200
ideas and viewpoints to maybe have sometimes some uncomfortable conversations or to push

504
00:30:18,200 --> 00:30:22,600
conversations and discussions in the right direction around evolution and around sort

505
00:30:22,600 --> 00:30:28,120
of breaking out of stale behaviors or sort of methodologies.

506
00:30:28,120 --> 00:30:29,120
That's a big part of it.

507
00:30:29,120 --> 00:30:36,280
So learning, yes, but also learning where it is bringing in new and interesting and

508
00:30:36,280 --> 00:30:40,520
different and sometimes a little bit sort of challenging perspectives, especially if

509
00:30:40,520 --> 00:30:46,200
that's external folks coming and telling us where we need to do things differently and

510
00:30:46,200 --> 00:30:47,200
do them better.

511
00:30:47,200 --> 00:30:49,080
And when I say we, obviously Microsoft.

512
00:30:49,080 --> 00:30:54,020
It's also an opportunity for us, for Microsoft, to present out on stuff that we've been working

513
00:30:54,020 --> 00:30:59,080
on and stuff that we've been evolving and how that can benefit not just customers, but

514
00:30:59,080 --> 00:31:00,320
also the industry.

515
00:31:00,320 --> 00:31:05,360
So there's a sort of knowledge sharing in both directions is sort of a really big part

516
00:31:05,360 --> 00:31:06,360
of it.

517
00:31:06,360 --> 00:31:11,640
And then, yeah, community building and networking is such a massive part of it as well too,

518
00:31:11,640 --> 00:31:14,200
especially coming out of COVID.

519
00:31:14,200 --> 00:31:23,320
I think the cybersecurity industry, folks can sometimes sort of be maybe, I don't want

520
00:31:23,320 --> 00:31:25,480
to say insular.

521
00:31:25,480 --> 00:31:29,600
It was very easy when COVID and work from home and lockdown happened for everyone to

522
00:31:29,600 --> 00:31:32,440
get extra insular.

523
00:31:32,440 --> 00:31:39,320
And so having opportunities to be in person and meet with folks in your industry that

524
00:31:39,320 --> 00:31:42,240
are dealing with the same things that you are or thinking about the same things that

525
00:31:42,240 --> 00:31:45,920
you are or just interested in those topics.

526
00:31:45,920 --> 00:31:50,660
And it's so important to create opportunities for folks to get together and meet each other

527
00:31:50,660 --> 00:31:53,360
and have conversations and have fun.

528
00:31:53,360 --> 00:31:58,760
Networking just doesn't mean that you're sharing your LinkedIn history and what you're

529
00:31:58,760 --> 00:31:59,760
doing day to day.

530
00:31:59,760 --> 00:32:04,640
A lot of it's about also identifying other fun things that help to create community.

531
00:32:04,640 --> 00:32:09,320
So some of the villages that we run at Blue Hat, we have a lockpicking village, which

532
00:32:09,320 --> 00:32:12,040
is one of the most popular villages.

533
00:32:12,040 --> 00:32:20,960
There's always tables full of people with padlocks and those awesome see-through learning

534
00:32:20,960 --> 00:32:28,100
locks where you can see all the pins and you can learn how to pick a lock and there's challenges

535
00:32:28,100 --> 00:32:34,880
and there's sticker trading and people are collecting pins.

536
00:32:34,880 --> 00:32:38,280
Fun is also a big part of how you create and sort of maintain community.

537
00:32:38,280 --> 00:32:43,840
So I totally agree with you on sort of learning and knowledge sharing and then the networking

538
00:32:43,840 --> 00:32:49,040
community building, I think are probably the two big reasons why we run Blue Hat.

539
00:32:49,040 --> 00:32:53,460
So I think one of the things that, currently if I'm wrong here, but gaming security was

540
00:32:53,460 --> 00:32:57,320
one of the, in the village and apparently that's the first time we've had that, is that

541
00:32:57,320 --> 00:32:58,320
right?

542
00:32:58,320 --> 00:32:59,320
That's right.

543
00:32:59,320 --> 00:33:05,240
Yeah, we're really excited to have, so Microsoft made an acquisition of Activision, Blizzard,

544
00:33:05,240 --> 00:33:06,240
King.

545
00:33:06,240 --> 00:33:12,240
I think those are the three sort of large brands, but Activision was sort of the big

546
00:33:12,240 --> 00:33:18,000
one and yeah, that team reached out to us and said, hey, we've attended Blue Hat in

547
00:33:18,000 --> 00:33:22,800
the past, but we'd love to get sort of more involved and we were like, let's do a gaming

548
00:33:22,800 --> 00:33:23,800
security village.

549
00:33:23,800 --> 00:33:27,740
And I think that we're inundated with folks that wanted to chat with them and talk about

550
00:33:27,740 --> 00:33:32,140
gaming security and we're hoping to do more with them in future Blue Hats.

551
00:33:32,140 --> 00:33:38,140
But yeah, gaming security was a new one and I think folks were really excited to see that

552
00:33:38,140 --> 00:33:41,400
and hopefully we'll get more gaming security stuff in the future.

553
00:33:41,400 --> 00:33:44,440
Yeah, a big part of that is obviously cheating.

554
00:33:44,440 --> 00:33:46,240
You don't want people cheating in games.

555
00:33:46,240 --> 00:33:49,400
So yeah, it was good to see how they really could chat with a whole bunch of people.

556
00:33:49,400 --> 00:33:53,400
A whole bunch of people I actually knew, they were actually ex-Microsoft people who had

557
00:33:53,400 --> 00:33:58,360
joined Activision and then found themselves back in Microsoft again after the acquisition.

558
00:33:58,360 --> 00:34:01,160
So that was really, really funny.

559
00:34:01,160 --> 00:34:04,640
But yeah, the lockpick village was really, really busy.

560
00:34:04,640 --> 00:34:08,640
I saw a lot of people, they had a whole bunch of tables and a lot of people were learning

561
00:34:08,640 --> 00:34:09,640
how to pick locks.

562
00:34:09,640 --> 00:34:14,560
A lot of people were really amazed how easy it is to pick most locks that we take for

563
00:34:14,560 --> 00:34:17,140
granted as air quotes being secure.

564
00:34:17,140 --> 00:34:18,140
But yeah.

565
00:34:18,140 --> 00:34:20,840
All right, let's bring this episode to an end.

566
00:34:20,840 --> 00:34:24,680
As you know, Nick, because you're only on a few weeks ago, one thing we always ask our

567
00:34:24,680 --> 00:34:28,560
guests if you had just one little sort of one final thought to leave our listeners

568
00:34:28,560 --> 00:34:30,160
with, what would it be?

569
00:34:30,160 --> 00:34:31,700
So I have two questions for folks.

570
00:34:31,700 --> 00:34:35,760
Question one is, if you only learnt of Blue Hat through listening to either the episode

571
00:34:35,760 --> 00:34:41,240
today or last episode, but it sounds like something you would want to be involved in

572
00:34:41,240 --> 00:34:46,520
or potentially attend or submit a paper for, but you hadn't heard of it, tell us how you

573
00:34:46,520 --> 00:34:50,960
learn of these kind of things so that we can do a better job of promoting Blue Hat and

574
00:34:50,960 --> 00:34:53,640
making sure that more people in the community are aware of it.

575
00:34:53,640 --> 00:34:58,120
So I'd love to know, how do you learn of these kind of conferences and events and what should

576
00:34:58,120 --> 00:35:01,360
we consider for how we advertise Blue Hat in the future?

577
00:35:01,360 --> 00:35:09,200
And then if you were at Blue Hat or were following along with Blue Hat and you have specific

578
00:35:09,200 --> 00:35:13,320
sort of feedback on what you liked or what you think we could do differently, please

579
00:35:13,320 --> 00:35:14,680
also reach out.

580
00:35:14,680 --> 00:35:22,720
I think the best way to do it is send an email to bluehat, B-L-U-E-H-A-T, bluehat at microsoft.com

581
00:35:22,720 --> 00:35:26,280
and myself and some other folks on the team will get your mail.

582
00:35:26,280 --> 00:35:31,120
Yeah, how can we better promote Blue Hat so more people in the community know about it?

583
00:35:31,120 --> 00:35:36,880
How can we better advertise the call for papers so we make sure we're getting a much more

584
00:35:36,880 --> 00:35:44,480
broad and deep representation across all the topics and viewpoints in the industry?

585
00:35:44,480 --> 00:35:48,600
And then yeah, just any other feedback that folks have on the conference.

586
00:35:48,600 --> 00:35:53,040
We really take all of that stuff very, very seriously and when we go into our sort of

587
00:35:53,040 --> 00:36:00,960
planning and review phases and stages, we spend a lot of time looking at the feedback

588
00:36:00,960 --> 00:36:04,920
that we get from the community both inside Microsoft and out and it's really, really

589
00:36:04,920 --> 00:36:05,920
important.

590
00:36:05,920 --> 00:36:09,560
So that's my final thought is thank you everyone that came to Blue Hat.

591
00:36:09,560 --> 00:36:11,680
Thank you everyone that submitted to the call for papers.

592
00:36:11,680 --> 00:36:14,780
Thank you everyone that applied to attend.

593
00:36:14,780 --> 00:36:17,720
Please let us know how we can do better.

594
00:36:17,720 --> 00:36:22,400
And if you love something too, it's always nice to hear what folks responded well to.

595
00:36:22,400 --> 00:36:23,400
Bluehatatmicrosoft.com.

596
00:36:23,400 --> 00:36:24,400
Magnificent.

597
00:36:24,400 --> 00:36:25,400
All right.

598
00:36:25,400 --> 00:36:30,360
Well, with that, let's bring this episode to an end.

599
00:36:30,360 --> 00:36:31,840
Again, Nick, thanks for coming on the podcast.

600
00:36:31,840 --> 00:36:35,360
I know you're a busy boy, so I appreciate you taking the time.

601
00:36:35,360 --> 00:36:38,640
And to all our listeners out there, we hope you found that this abuse, oh, by the way,

602
00:36:38,640 --> 00:36:39,640
Blue Hat is free.

603
00:36:39,640 --> 00:36:40,640
It is free.

604
00:36:40,640 --> 00:36:41,640
So with that, it is free.

605
00:36:41,640 --> 00:36:42,640
That's right.

606
00:36:42,640 --> 00:36:44,000
So with that, let's bring this episode to an end.

607
00:36:44,000 --> 00:36:46,520
So all our listeners out there, thank you so much for listening.

608
00:36:46,520 --> 00:36:48,640
Again, we hope you found this episode of use.

609
00:36:48,640 --> 00:36:50,800
Stay safe and we'll see you next time.

610
00:36:50,800 --> 00:36:53,920
Thanks for listening to the Azure Security Podcast.

611
00:36:53,920 --> 00:37:00,760
You can find show notes and other resources at our website, azsecuritypodcast.net.

612
00:37:00,760 --> 00:37:05,920
If you have any questions, please find us on Twitter at Azure Setpod.

613
00:37:05,920 --> 00:37:21,560
The music is from ccmixtor.com and licensed under the Creative Commons license.

