1
00:00:00,000 --> 00:00:09,600
Welcome to the Azure Security Podcast, where we discuss topics relating to security, privacy,

2
00:00:09,600 --> 00:00:13,280
reliability and compliance on the Microsoft Cloud Platform.

3
00:00:13,280 --> 00:00:17,720
Hey everybody, welcome to episode 106.

4
00:00:17,720 --> 00:00:20,360
This week it's myself, Michael, with Sarah and Mark.

5
00:00:20,360 --> 00:00:24,040
We don't have any guests this week because we're going to talk about Microsoft Ignite

6
00:00:24,040 --> 00:00:25,040
from a security perspective.

7
00:00:25,040 --> 00:00:30,240
But before we get into Microsoft Ignite, I know Mark has just one little piece of news

8
00:00:30,240 --> 00:00:31,480
and then we'll get stuck into Ignite.

9
00:00:31,480 --> 00:00:32,920
Mark, why don't you go?

10
00:00:32,920 --> 00:00:35,160
So a couple quick pieces here.

11
00:00:35,160 --> 00:00:37,280
One is the Zero Trust Playbook.

12
00:00:37,280 --> 00:00:38,280
There is a discount.

13
00:00:38,280 --> 00:00:44,600
This is the same link that was on the slides for those of you that attended my Ignite session.

14
00:00:44,600 --> 00:00:48,080
And for those of you that haven't, the video and the link to it is also there.

15
00:00:48,080 --> 00:00:49,840
So we got both those links there.

16
00:00:49,840 --> 00:00:55,020
And then another thing that came out around the time of Ignite, although it wasn't specifically

17
00:00:55,020 --> 00:01:01,400
announced at Ignite, was the update to the CAF, the Cloud Adoption Framework, Secure methodology.

18
00:01:01,400 --> 00:01:05,240
And so one of the big pieces I contributed there was kind of a role-by-role security

19
00:01:05,240 --> 00:01:07,400
guidance on who does what.

20
00:01:07,400 --> 00:01:09,000
And we treated cloud providers as a role.

21
00:01:09,000 --> 00:01:13,560
We treated the infrastructure and server and container and whatnot teams as a role.

22
00:01:13,560 --> 00:01:16,440
And then we went through each of the different security roles and said, this is what you

23
00:01:16,440 --> 00:01:19,000
all need to do to secure the cloud.

24
00:01:19,000 --> 00:01:22,040
And so a nice piece of work there, I think.

25
00:01:22,040 --> 00:01:23,800
Just that's really all I had.

26
00:01:23,800 --> 00:01:28,800
All right, so let's get on to Microsoft Ignite, which was in Chicago this year.

27
00:01:28,800 --> 00:01:31,760
So Mark and Sarah, you were both there.

28
00:01:31,760 --> 00:01:36,600
So why don't you give us the lowdown on sort of what you saw, what your roles were, what

29
00:01:36,600 --> 00:01:37,600
you were doing.

30
00:01:37,600 --> 00:01:40,480
And then let's get stuck into some of the news that interested each of us.

31
00:01:40,480 --> 00:01:42,960
Yeah, well, I'll go first.

32
00:01:42,960 --> 00:01:47,680
So well, if you were watching the live stream, you probably saw me.

33
00:01:47,680 --> 00:01:53,760
I was one of the co-hosts this year, which I did last year as well, which is a very interesting

34
00:01:53,760 --> 00:01:58,600
thing to do because it's very different to just doing a presentation because it's all

35
00:01:58,600 --> 00:02:04,200
live and there's a TV crew and you have someone in your ear talking to you, but it is very

36
00:02:04,200 --> 00:02:05,680
fun and very different.

37
00:02:05,680 --> 00:02:11,180
So I was sort of kept up on the stage most of the time.

38
00:02:11,180 --> 00:02:15,520
So I didn't get as much time to walk the floor as I would have liked.

39
00:02:15,520 --> 00:02:20,720
Mark probably will have done more of that and can comment more, but it was really big.

40
00:02:20,720 --> 00:02:27,440
I know it's not as big as Ignite sort of in years gone by before 2020, but it's definitely

41
00:02:27,440 --> 00:02:29,800
getting to be a pretty big event.

42
00:02:29,800 --> 00:02:36,500
And I think my main takeaway was I got to do a mixture of Microsoft leadership interviews

43
00:02:36,500 --> 00:02:43,600
and also partners, is that even if it wasn't a security interview that I did, everyone

44
00:02:43,600 --> 00:02:50,400
has a security story to tell now because of renewed focus on security more generally across

45
00:02:50,400 --> 00:02:53,160
the org with the Secure Future Initiative.

46
00:02:53,160 --> 00:02:57,280
So I thought that that's probably one of my main takeaways from the event.

47
00:02:57,280 --> 00:03:01,920
And I thought that was really good that everybody could talk about, hey, this is what we're

48
00:03:01,920 --> 00:03:09,000
doing with our product and this is our initiative to do the security bit of our product, even

49
00:03:09,000 --> 00:03:11,040
if it wasn't a defender for blah.

50
00:03:11,040 --> 00:03:13,600
So that's probably the thing that I like the most.

51
00:03:13,600 --> 00:03:15,000
Oh, and one more thing.

52
00:03:15,000 --> 00:03:20,160
I did run the pre-days. For those of you that don't know, Ignite has pre-days the day before

53
00:03:20,160 --> 00:03:23,640
the "real", in inverted commas, conference starts.

54
00:03:23,640 --> 00:03:25,120
They're usually training.

55
00:03:25,120 --> 00:03:27,960
They can be labs, or they might be lecture-based with specialists.

56
00:03:27,960 --> 00:03:33,160
We did an AI Red Team lab with the AI Red Team folks who are amazing who have been on

57
00:03:33,160 --> 00:03:35,360
the podcast in previous episodes.

58
00:03:35,360 --> 00:03:42,920
And we also did one that we'll talk about more next episode, about oversharing, how

59
00:03:42,920 --> 00:03:46,280
to control oversharing for a Copilot deployment.

60
00:03:46,280 --> 00:03:51,640
And that was very popular for obvious reasons, as a lot of people are using Copilot.

61
00:03:51,640 --> 00:03:57,640
So yeah, I think those were the main things I was involved with.

62
00:03:57,640 --> 00:03:59,440
And yeah, it was a good time.

63
00:03:59,440 --> 00:04:01,440
Very busy, but good time.

64
00:04:01,440 --> 00:04:05,760
And I didn't get to do all the celebrity stuff that Sarah got to do.

65
00:04:05,760 --> 00:04:07,600
But I did get to walk the floor a little bit.

66
00:04:07,600 --> 00:04:11,600
I got to spend a little time answering questions at the booth.

67
00:04:11,600 --> 00:04:16,240
I got to spend some time asking questions at the booth and just meet up with a partner

68
00:04:16,240 --> 00:04:20,120
and customers and a whole bunch of different folks.

69
00:04:20,120 --> 00:04:25,720
And I'm just always amazed at just how many different points on the security journey people

70
00:04:25,720 --> 00:04:26,720
are on.

71
00:04:26,720 --> 00:04:27,720
Like some of them are just starting it.

72
00:04:27,720 --> 00:04:29,720
Some of them have a really small organization.

73
00:04:29,720 --> 00:04:32,560
Sometimes they're the one person that does security on the side.

74
00:04:32,560 --> 00:04:37,280
And sometimes they're part of a huge set of teams and they're one role in one team of

75
00:04:37,280 --> 00:04:40,200
many in the security org.

76
00:04:40,200 --> 00:04:45,000
So I just really enjoy sort of kind of going out there connecting and refreshing with that

77
00:04:45,000 --> 00:04:51,840
because always trying to make sure the guidance works for as many of those folks as possible.

78
00:04:51,840 --> 00:04:52,840
So yeah.

79
00:04:52,840 --> 00:04:57,200
And then, like I said earlier, I presented the session, which went really well.

80
00:04:57,200 --> 00:05:02,320
Got a chance to collaborate with some awesome folks from NIST.

81
00:05:02,320 --> 00:05:06,200
Murugiah Souppaya, who is just a fantastic person.

82
00:05:06,200 --> 00:05:11,280
I think he goes by researcher as a title, but he's just sort of really smart at all

83
00:05:11,280 --> 00:05:12,900
sorts of things in security.

84
00:05:12,900 --> 00:05:15,640
And so just does some great work at NIST.

85
00:05:15,640 --> 00:05:22,760
And then Ulf Larsson, I got to meet and work with to talk about what they have learned

86
00:05:22,760 --> 00:05:25,080
about their Zero Trust journey.

87
00:05:25,080 --> 00:05:28,080
And that was at SEB, which is a Swedish bank.

88
00:05:28,080 --> 00:05:32,880
And I don't remember what SEB stands for, so I'll have to look that up later.

89
00:05:32,880 --> 00:05:41,600
But they've done a fantastic job adopting Zero Trust concepts and principles and technologies

90
00:05:41,600 --> 00:05:45,880
and have seen a lot of success with it, shared some really good lessons learned in our session.

91
00:05:45,880 --> 00:05:50,080
In fact, it was a session so nice that we had to present it twice.

92
00:05:50,080 --> 00:05:54,160
We ended up doing a repeat session because apparently we didn't expect about a thousand

93
00:05:54,160 --> 00:05:55,160
people to sign up for it.

94
00:05:55,160 --> 00:05:59,120
And so we had to split it up into like a 600 and something room and a 300 and something

95
00:05:59,120 --> 00:06:00,360
room.

96
00:06:00,360 --> 00:06:02,120
So we ended up doing it twice in a row on Friday.

97
00:06:02,120 --> 00:06:03,120
So good times.

98
00:06:03,120 --> 00:06:04,840
Hey, just a stupid question.

99
00:06:04,840 --> 00:06:07,760
So who is the target audience for Ignite?

100
00:06:07,760 --> 00:06:09,040
I mean, I'll take a stab at it.

101
00:06:09,040 --> 00:06:12,640
And then Sarah, you tell me what your thoughts are.

102
00:06:12,640 --> 00:06:16,120
Ignite is sort of an interesting, I think it's fairly unique in the industry or really

103
00:06:16,120 --> 00:06:21,840
in any industry is that we have, I think it's primarily an IT audience.

104
00:06:21,840 --> 00:06:28,200
I think it's like 80, 90% folks that do IT in some form or fashion as a living.

105
00:06:28,200 --> 00:06:30,600
But we also do have developers that come there.

106
00:06:30,600 --> 00:06:36,840
It's not our developer focus conference, of course, but it does have folks there.

107
00:06:36,840 --> 00:06:40,640
And then this time there's a respectable amount of security folks in person.

108
00:06:40,640 --> 00:06:45,280
I mean, I think it was somewhere in the order of, I want to say in the neighborhood of like

109
00:06:45,280 --> 00:06:47,360
800 or a thousand or something like that.

110
00:06:47,360 --> 00:06:52,720
I don't remember exactly, but it's actually a significant percentage of folks there.

111
00:06:52,720 --> 00:06:58,760
And so it was, you know, it's always an interesting mix of folks that, you know, have so many

112
00:06:58,760 --> 00:07:05,080
different angles on technology because of how broad our technology portfolio at Microsoft

113
00:07:05,080 --> 00:07:06,080
at large is.

114
00:07:06,080 --> 00:07:09,480
Yeah, everything I reviewed was developer focused.

115
00:07:09,480 --> 00:07:12,840
All right, so let's get stuck into the guts of this.

116
00:07:12,840 --> 00:07:17,680
So every year, one thing that Microsoft Ignite produces at the end of the event is a thing

117
00:07:17,680 --> 00:07:20,360
called the Book of News.

118
00:07:20,360 --> 00:07:23,880
So what we're going to do is we're going to pick out some of the things that were of interest

119
00:07:23,880 --> 00:07:24,880
to us.

120
00:07:24,880 --> 00:07:25,880
We're going to sort of round-robin this thing.

121
00:07:25,880 --> 00:07:27,680
It's not going to cover absolutely everything.

122
00:07:27,680 --> 00:07:30,640
In fact, the Book of News doesn't cover absolutely everything.

123
00:07:30,640 --> 00:07:35,080
And it really just gives you the background, you know, rather than being the whole

124
00:07:35,080 --> 00:07:39,680
sort of press announcement for things or technical documentation, you can always jump off and

125
00:07:39,680 --> 00:07:44,720
look at other information to find out about some of the things in greater depth.

126
00:07:44,720 --> 00:07:47,800
So we're going to touch on some of the items that sort of piqued each of our interests.

127
00:07:47,800 --> 00:07:50,820
And once we're sort of done, we'll just bring it to an end.

128
00:07:50,820 --> 00:07:53,640
So I'll kick things off.

129
00:07:53,640 --> 00:07:55,900
Actually, fun fact, fun fact.

130
00:07:55,900 --> 00:08:02,000
So the Book of News has 298 references to the word secure or security, which I think

131
00:08:02,000 --> 00:08:03,520
may be a record now.

132
00:08:03,520 --> 00:08:06,200
I'm not 100% sure, but that's a lot of references.

133
00:08:06,200 --> 00:08:14,080
And in fact, the opening couple of chapters, sorry, paragraphs talk about the Secure Future

134
00:08:14,080 --> 00:08:19,880
Initiative and how a big driving influence for this particular Microsoft Ignite was exactly

135
00:08:19,880 --> 00:08:24,280
that, SFI, and the things we're doing to our various products to help bolster secure

136
00:08:24,280 --> 00:08:27,600
by design, secure by default and secure operations.

137
00:08:27,600 --> 00:08:29,080
So this is really good to see.

138
00:08:29,080 --> 00:08:32,940
And as sort of Sarah mentioned, the fact that, you know, everyone you talk to, even if they

139
00:08:32,940 --> 00:08:38,480
weren't in, as she put it, Defender for blah, even if they weren't in a security feature, they

140
00:08:38,480 --> 00:08:43,620
still had security work they were doing that mapped onto secure by design, secure by default

141
00:08:43,620 --> 00:08:44,780
or secure operations.

142
00:08:44,780 --> 00:08:47,680
So it's really good to see all that work that's going on.

143
00:08:47,680 --> 00:08:49,920
So anyway, I'll kick things off.

144
00:08:49,920 --> 00:08:55,960
The first thing that piqued my interest, and I mean really, really piqued my interest, and Sarah

145
00:08:55,960 --> 00:09:02,080
even noted it when it was released, when it was announced, was the Azure Integrated Hardware

146
00:09:02,080 --> 00:09:04,480
Security Module.

147
00:09:04,480 --> 00:09:07,880
Think of this, and it's not a one-to-one map.

148
00:09:07,880 --> 00:09:10,120
So please don't go quoting me on this.

149
00:09:10,120 --> 00:09:15,880
Think of it as similar functionality to say Azure Key Vault, but actually in the hardware,

150
00:09:15,880 --> 00:09:19,840
like actually on the motherboard of the particular device.

151
00:09:19,840 --> 00:09:24,320
It's nowhere near as complete or as rich, and doesn't provide, you know, sort of the same sort

152
00:09:24,320 --> 00:09:29,080
of scalability as Azure Key Vault, but it does provide some really interesting functionality

153
00:09:29,080 --> 00:09:31,280
and there's a lot of benefits that come from this.

154
00:09:31,280 --> 00:09:38,240
So things like storing keys, signing, sealing, encryption, decryption and so on is all done

155
00:09:38,240 --> 00:09:39,240
in the HSM.

156
00:09:39,240 --> 00:09:44,640
And it has a couple of really interesting properties because it's on the motherboard

157
00:09:44,640 --> 00:09:49,720
and the main one being performance because you sort of, you don't have to worry about

158
00:09:49,720 --> 00:09:55,560
network latency, because it's all done locally on the host.

159
00:09:55,560 --> 00:10:01,240
And I'm not 100% sure how this will be exposed to applications, but this is fantastic.

160
00:10:01,240 --> 00:10:02,960
It's a great thing to see.

161
00:10:02,960 --> 00:10:06,720
And it will adhere to FIPS 140-3 level 3 security requirements.

162
00:10:06,720 --> 00:10:13,400
So it'll be FIPS 140-3 level 3 validated hardware, which is, I'm just, you know, when I saw this,

163
00:10:13,400 --> 00:10:15,440
you know, to be honest with you, that just made my night.

164
00:10:15,440 --> 00:10:16,760
That was the first thing I saw.

165
00:10:16,760 --> 00:10:19,120
So that's the first thing that took my interest.

166
00:10:19,120 --> 00:10:21,280
Sarah, what was the first thing that took your interest?

167
00:10:21,280 --> 00:10:22,280
Oh, okay.

168
00:10:22,280 --> 00:10:28,960
So for me, I think my favorite top, I'm not sure, announcement, there were a lot, but

169
00:10:28,960 --> 00:10:34,560
I think the one that I liked the most, that I'm excited about the most is Zero Day Quest.

170
00:10:34,560 --> 00:10:40,520
So if you missed it, it was in Satya's keynote and we basically announced that we're going

171
00:10:40,520 --> 00:10:45,960
to give an extra, I believe it's $4 million in our pot of money for bug bounties.

172
00:10:45,960 --> 00:10:48,520
And we have, of course, we already have bug bounty programs.

173
00:10:48,520 --> 00:10:54,460
We've had folks on the podcast come and talk to us about bug bounties, but it's an initiative

174
00:10:54,460 --> 00:11:02,040
to work even closer with the security research community and at some point next year in 2025,

175
00:11:02,040 --> 00:11:07,200
where there's going to be, I think they haven't announced quite all the details yet, but they're

176
00:11:07,200 --> 00:11:11,920
going to have an initial competition for people to submit bugs.

177
00:11:11,920 --> 00:11:16,360
And then there's going to be a live hacking event, I think in Redmond at some point next

178
00:11:16,360 --> 00:11:20,120
year as like the culmination of Zero Day Quest.

179
00:11:20,120 --> 00:11:25,200
So I think it's super early days because of course it's just been announced, but I'm really

180
00:11:25,200 --> 00:11:28,280
excited to see how that one pans out.

181
00:11:28,280 --> 00:11:33,680
Plus I want to go to the final, of course, because I like to be involved with all the

182
00:11:33,680 --> 00:11:34,680
things.

183
00:11:34,680 --> 00:11:36,480
Mark, how about you?

184
00:11:36,480 --> 00:11:44,000
So my favorite was the announcement of the GA, General Availability of Exposure Management.

185
00:11:44,000 --> 00:11:51,120
This is one of my favorite tools because when you think about what XDR, Extended Detection and

186
00:11:51,120 --> 00:11:55,880
Response, did for sort of right of bang, sort of like, hey, this incident happened, we now

187
00:11:55,880 --> 00:11:58,320
need to manage it.

188
00:11:58,320 --> 00:12:01,480
I'm excited about what Exposure Management is going to do for the left of bang.

189
00:12:01,480 --> 00:12:06,360
The incident hasn't happened yet, but we need to make sure we're blocking the potential

190
00:12:06,360 --> 00:12:08,640
for it.

191
00:12:08,640 --> 00:12:13,160
It's a tool that brings together all sorts of different things that folks would be familiar

192
00:12:13,160 --> 00:12:19,640
with through Secure Score, through external attack surface management, all the various

193
00:12:19,640 --> 00:12:25,480
different types of Defender for Cloud stuff around Identity, Endpoint, Cloud, etc.

194
00:12:25,480 --> 00:12:29,880
All those different things and what can the attackers do and what do I need to patch or

195
00:12:29,880 --> 00:12:33,360
reconfigure or fix or whatever.

196
00:12:33,360 --> 00:12:37,080
Really we're in the journey of bringing that all together in one place and then enriching

197
00:12:37,080 --> 00:12:38,920
it and connecting it.

198
00:12:38,920 --> 00:12:45,240
And so it's really giving you that operational visibility on the prevention side.

199
00:12:45,240 --> 00:12:51,680
I'm really excited about this technology because it's very much a game changer.

200
00:12:51,680 --> 00:12:55,200
I think about the way organizations often do this.

201
00:12:55,200 --> 00:12:57,840
They usually call it vulnerability management.

202
00:12:57,840 --> 00:12:59,400
Say that three times fast.

203
00:12:59,400 --> 00:13:03,240
And then they kind of do a scan and shame approach.

204
00:13:03,240 --> 00:13:08,200
I've seen this way too many times where it's like, here's your patch report, go fix it.

205
00:13:08,200 --> 00:13:11,360
We also see this in the AppSec world with the scan and shame thing.

206
00:13:11,360 --> 00:13:13,840
It could be a bunch of false positives.

207
00:13:13,840 --> 00:13:15,400
It's not usually prioritized.

208
00:13:15,400 --> 00:13:17,160
It's not usually actionable, etc.

209
00:13:17,160 --> 00:13:21,680
So it's very painful in the traditional practices of security.

210
00:13:21,680 --> 00:13:24,720
And this tool really goes after fixing that.

211
00:13:24,720 --> 00:13:32,960
So essentially treating an attack path like an incident or any other cases in a queue

212
00:13:32,960 --> 00:13:34,560
type of thing.

213
00:13:34,560 --> 00:13:38,520
And then you can prioritize it and work it and burn down that list.

214
00:13:38,520 --> 00:13:42,680
So we've got all this great stuff that our researchers have done to figure out how the

215
00:13:42,680 --> 00:13:46,960
attackers chain these things together and which one's the most severe, etc.

216
00:13:46,960 --> 00:13:52,440
And then put those into the tool and then the tool finds those in your environment.

217
00:13:52,440 --> 00:13:56,100
As things change, as people reconfigure stuff, as you have configuration drift and all those

218
00:13:56,100 --> 00:13:59,840
things that happen, boom, they pop up and then you can work them.

219
00:13:59,840 --> 00:14:03,760
So it's just a really, really powerful thing.
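The "treat an attack path like an incident, prioritize it, and burn down the list" workflow Mark describes can be pictured as a simple severity-ordered queue. This is only a toy illustration of the idea, not how Exposure Management is actually implemented; every name and score below is invented.

```python
import heapq

# Toy model: each finding is an "attack path" with a severity score.
# Higher severity gets worked first, like triaging incidents in a queue.
# All names and scores here are invented for illustration.
findings = [
    (7, "stale admin account -> domain controller"),
    (9, "internet-exposed VM -> unpatched RDP"),
    (4, "over-permissive storage container"),
]

# heapq is a min-heap, so negate severity to pop the worst path first.
queue = [(-sev, path) for sev, path in findings]
heapq.heapify(queue)

burn_down_order = []
while queue:
    _neg_sev, path = heapq.heappop(queue)
    burn_down_order.append(path)  # "work" the item; risk ticks down

print(burn_down_order)
```

The point of the sketch is the shape of the workflow: new findings enter the queue as the environment drifts, and teams always pull the most severe remaining path.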

220
00:14:03,760 --> 00:14:05,160
And of course, that can be overwhelming.

221
00:14:05,160 --> 00:14:09,160
So they have these things called security initiatives, including a catalog of pre-made

222
00:14:09,160 --> 00:14:10,160
ones.

223
00:14:10,160 --> 00:14:15,680
You can make your own that help focus on, hey, I want to specifically work on zero trust.

224
00:14:15,680 --> 00:14:16,680
I want to work on OT devices.

225
00:14:16,680 --> 00:14:17,680
I want to work on IoT devices.

226
00:14:17,680 --> 00:14:18,680
I want to work on endpoints.

227
00:14:18,680 --> 00:14:23,600
I want to work on cloud resources or containers or whatever it happens to be.

228
00:14:23,600 --> 00:14:29,400
And then you can essentially enable all these engineering and operations teams and IT and

229
00:14:29,400 --> 00:14:34,720
OT and what have you to go work their lists and then you get to watch the risk tick down

230
00:14:34,720 --> 00:14:37,560
at a big picture perspective.

231
00:14:37,560 --> 00:14:40,880
So obviously lots of collaboration there is ideal.

232
00:14:40,880 --> 00:14:42,280
But I love it.

233
00:14:42,280 --> 00:14:44,420
And the other thing I really like is how accessible it is.

234
00:14:44,420 --> 00:14:51,360
So this isn't like some super premium E5 thing that folks have to pay for extra, etc.

235
00:14:51,360 --> 00:14:53,800
It's in a lot of different licenses, including E3.

236
00:14:53,800 --> 00:14:59,360
And so whatever tools you implement and put in place, it'll include those in its analysis

237
00:14:59,360 --> 00:15:00,640
and in reports.

238
00:15:00,640 --> 00:15:04,280
So it's very much one of those kind of grow with you types of tools.

239
00:15:04,280 --> 00:15:09,120
So very, very excited about Microsoft security exposure management.

240
00:15:09,120 --> 00:15:10,120
All right.

241
00:15:10,120 --> 00:15:12,680
Next one's totally different.

242
00:15:12,680 --> 00:15:14,000
Certainly not cryptography.

243
00:15:14,000 --> 00:15:20,440
And that is the fact that port 3389 is being shut down by default on various VMs that are

244
00:15:20,440 --> 00:15:22,160
rolled out in Azure.

245
00:15:22,160 --> 00:15:28,680
If you look at the secure by design, secure by default mantras in SFI, the Secure

246
00:15:28,680 --> 00:15:31,840
Future Initiative, this is an example of secure by default.

247
00:15:31,840 --> 00:15:38,200
In other words, it's all about if you're not using 3389, then why is the port open?

248
00:15:38,200 --> 00:15:42,760
Because all you're going to do is expose some potential code to, well, potentially, depending

249
00:15:42,760 --> 00:15:45,920
on all your network policies, to untrusted users.

250
00:15:45,920 --> 00:15:49,240
But if you're not listening on that port by default, that port is closed by default, then

251
00:15:49,240 --> 00:15:54,040
if there is a vulnerability, for example, in the code behind 3389, which is the remote

252
00:15:54,040 --> 00:15:58,880
sort of desktop services server, then you can't exploit it.

253
00:15:58,880 --> 00:16:03,880
If there's a vulnerability in the code, but you're not listening on 3389, then I'm sure

254
00:16:03,880 --> 00:16:07,000
you should still apply the patch at some point, but at least it's not something you need to

255
00:16:07,000 --> 00:16:09,200
apply immediately.

256
00:16:09,200 --> 00:16:14,640
So I'm a big fan of attack surface reduction, shutting down those unnecessary ports, shutting

257
00:16:14,640 --> 00:16:16,600
down unnecessary services.

258
00:16:16,600 --> 00:16:19,360
And then again, if people want to opt in to use it, fantastic.

259
00:16:19,360 --> 00:16:21,000
Off you go and knock yourself out.

260
00:16:21,000 --> 00:16:24,200
So for those that don't need it, they're not exposed by default.

261
00:16:24,200 --> 00:16:26,000
Again, so I'm a big, big, big fan of that.

262
00:16:26,000 --> 00:16:27,560
Sarah, what else you got?

263
00:16:27,560 --> 00:16:28,560
Okay.

264
00:16:28,560 --> 00:16:34,320
So another one, and I know Michael, you will have a comment on this too, because you've

265
00:16:34,320 --> 00:16:41,040
also mentioned, you've also got this in the notes that we write up before we do this episode.

266
00:16:41,040 --> 00:16:46,000
But one of the other things that was really interesting for me is we had a lot of announcements

267
00:16:46,000 --> 00:16:48,240
around Windows security.

268
00:16:48,240 --> 00:16:52,160
So we had the Windows Resiliency Initiative.

269
00:16:52,160 --> 00:16:56,000
So we'll put links in the show notes.

270
00:16:56,000 --> 00:17:01,640
And I'm also going to put my list of like four videos from Ignite I think you should

271
00:17:01,640 --> 00:17:02,640
watch.

272
00:17:02,640 --> 00:17:07,000
That's just my personal thing, because there were a lot of things that were announced around

273
00:17:07,000 --> 00:17:08,000
Windows.

274
00:17:08,000 --> 00:17:09,480
I know you want to talk about hot patch, Michael.

275
00:17:09,480 --> 00:17:10,480
So I'm going to let you do that.

276
00:17:10,480 --> 00:17:12,800
So I don't step on your toes.

277
00:17:12,800 --> 00:17:19,360
Hot patching allows you to basically apply patches without having to essentially shut

278
00:17:19,360 --> 00:17:21,360
down the service.

279
00:17:21,360 --> 00:17:24,040
Sometimes you have to reboot a service or whatever.

280
00:17:24,040 --> 00:17:25,040
This is an example.

281
00:17:25,040 --> 00:17:26,040
You don't have to do that.

282
00:17:26,040 --> 00:17:29,580
Now for the developers out there, if you're familiar with Visual C++, there's actually

283
00:17:29,580 --> 00:17:32,640
a linker option, /hotpatch.

284
00:17:32,640 --> 00:17:38,800
And essentially what it does is it pads certain calls with a little bit of extra space so

285
00:17:38,800 --> 00:17:42,960
that new addresses can be inserted in there on the fly.

286
00:17:42,960 --> 00:17:43,960
Very nice technology.

287
00:17:43,960 --> 00:17:44,960
It works really, really well.

288
00:17:44,960 --> 00:17:49,040
But the whole point here is the ability to be able to literally patch a system without

289
00:17:49,040 --> 00:17:55,120
having to essentially bring the service down, which everyone prefers.

290
00:17:55,120 --> 00:17:56,120
So yeah, hot patching.

291
00:17:56,120 --> 00:18:01,080
I'm just glad to see that it's really, really becoming a forefront technology.
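For the developers, the idea behind hot patching can be sketched in a dynamic language: callers keep using the same entry point while the implementation underneath is swapped in place. This is only an analogy for the concept Michael describes (the OS-level mechanism rewrites padded machine-code prologues to jump to new code); all names below are invented.

```python
# A tiny "service" whose handler we patch without restarting anything.
class Service:
    def handle(self, request: str) -> str:
        return f"v1 handled {request}"

service = Service()
assert service.handle("ping") == "v1 handled ping"

# "Hot patch": rebind the method on the live class in place.
# In native code, this is where the padded function prologue would be
# rewritten to redirect execution to the new implementation.
def handle_v2(self, request: str) -> str:
    return f"v2 (patched) handled {request}"

Service.handle = handle_v2  # existing instances pick this up immediately

assert service.handle("ping") == "v2 (patched) handled ping"
print("patched without restart")
```

The key property in both cases is the same: in-flight callers never see the service go down while the fix lands.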

292
00:18:01,080 --> 00:18:03,760
I think as well, well, some of the other...

293
00:18:03,760 --> 00:18:05,200
There's actually a couple of Windows things.

294
00:18:05,200 --> 00:18:09,640
There's the security stuff, the Resiliency Initiative and Quick Machine Recovery.

295
00:18:09,640 --> 00:18:14,200
But I'll tell you what the other thing is that isn't so security, but I thought it was

296
00:18:14,200 --> 00:18:20,360
really cool, was the Microsoft Link.

297
00:18:20,360 --> 00:18:22,000
That was so cool.

298
00:18:22,000 --> 00:18:27,040
If you didn't see what that was, it's basically a teeny tiny little box that is...

299
00:18:27,040 --> 00:18:29,720
I mean, I didn't even know what to describe it as.

300
00:18:29,720 --> 00:18:35,920
It basically is a machine, an endpoint, but it's all running in the cloud.

301
00:18:35,920 --> 00:18:41,400
It's just something to hook up your keyboard, your screens to, but everything runs out of

302
00:18:41,400 --> 00:18:42,400
the cloud.

303
00:18:42,400 --> 00:18:43,400
It's tiny.

304
00:18:43,400 --> 00:18:46,440
I did get to see one up close when I was hosting.

305
00:18:46,440 --> 00:18:49,360
It really is very, very small.

306
00:18:49,360 --> 00:18:50,920
It's going to be out next year.

307
00:18:50,920 --> 00:18:52,920
I mean, it's probably...

308
00:18:52,920 --> 00:18:57,200
Well, it's the modern version of what we would have called back in the day, a thin client,

309
00:18:57,200 --> 00:19:01,360
because it's got a little bit of hardware just so it can talk to your screens, everything,

310
00:19:01,360 --> 00:19:05,120
your peripherals that you need to plug in, but it's all run off the cloud.

311
00:19:05,120 --> 00:19:07,000
I think that is very cool.

312
00:19:07,000 --> 00:19:08,800
Well, I think it's important, right?

313
00:19:08,800 --> 00:19:10,680
Because it's going to be centrally managed.

314
00:19:10,680 --> 00:19:13,800
They're relatively inexpensive in the overall scheme of things.

315
00:19:13,800 --> 00:19:15,120
They're beautiful to look at as well.

316
00:19:15,120 --> 00:19:16,280
I mean, I was actually really impressed.

317
00:19:16,280 --> 00:19:20,760
I've always been a fan of the mini PC form factor.

318
00:19:20,760 --> 00:19:26,440
I really like it, but the fact that it's being centrally managed and everything's being run

319
00:19:26,440 --> 00:19:31,720
out of our data centers, our Azure data centers, I think is a very

320
00:19:31,720 --> 00:19:32,720
interesting story.

321
00:19:32,720 --> 00:19:33,720
I mean, it's the old adage, right?

322
00:19:33,720 --> 00:19:38,240
What is old is new again, but we have all the backend infrastructure to support this

323
00:19:38,240 --> 00:19:42,200
now, which in the past didn't really exist very well.

324
00:19:42,200 --> 00:19:47,120
Yeah, and having been through thin client projects like 10, 15 years ago, it's going

325
00:19:47,120 --> 00:19:50,760
to be a lot easier with the cloud backing it than it was with having to put up these

326
00:19:50,760 --> 00:19:53,840
massive SANs and this and that and all the other stuff.

327
00:19:53,840 --> 00:20:01,000
So it's just the same ideas keep coming back, but the technology is often much better than

328
00:20:01,000 --> 00:20:02,880
back in the days of mainframes and whatnot.

329
00:20:02,880 --> 00:20:07,760
Yeah, I also did thin client deployments back in the day.

330
00:20:07,760 --> 00:20:12,040
And yeah, I think it's going to be better with cloud than when you used to have to go

331
00:20:12,040 --> 00:20:16,680
back into a local data center that could be very temperamental.

332
00:20:16,680 --> 00:20:21,640
And one of my favorite stories for this one, I obviously won't identify the customer on

333
00:20:21,640 --> 00:20:27,200
this, was that someone came in and there were two different buttons.

334
00:20:27,200 --> 00:20:33,260
One would get you out of the door because it was like a magnetic lock or whatever.

335
00:20:33,260 --> 00:20:39,520
And then the other one on the other side would be an emergency power shutoff.

336
00:20:39,520 --> 00:20:43,740
And someone made the mistake of pushing the wrong one one time when they visited the data

337
00:20:43,740 --> 00:20:45,120
center.

338
00:20:45,120 --> 00:20:52,480
And the master VM image, someone said it was like one of those circus acts where the image

339
00:20:52,480 --> 00:20:56,000
itself was fine, but everything around it looked like it had bullet holes.

340
00:20:56,000 --> 00:20:59,320
So they got so lucky that they were able to bring it back fast.

341
00:20:59,320 --> 00:21:01,800
So sorry, just bringing back a little old story there.

342
00:21:01,800 --> 00:21:03,600
We should probably get back to ignite.

343
00:21:03,600 --> 00:21:04,600
So I have another one.

344
00:21:04,600 --> 00:21:08,080
And that is from my old stomping ground, Azure Data.

345
00:21:08,080 --> 00:21:13,680
So SQL Server 2025, which, my guess, is in beta or some sort of pre-release, but

346
00:21:13,680 --> 00:21:15,600
I'm not 100% sure.

347
00:21:15,600 --> 00:21:21,480
But one thing that it supports is better use of Entra ID managed identities.

348
00:21:21,480 --> 00:21:26,040
So a big part of the secure future initiative is getting rid of credentials.

349
00:21:26,040 --> 00:21:30,040
Any way you can get rid of credentials is always a good thing because that way if there's

350
00:21:30,040 --> 00:21:32,080
no credentials, they can't be compromised, right?

351
00:21:32,080 --> 00:21:35,920
So if you get back to zero trust assume breach, if there's no credentials there, then there's

352
00:21:35,920 --> 00:21:37,920
no credentials there.

353
00:21:37,920 --> 00:21:41,660
And that means they can't be taken because they're just not there.

354
00:21:41,660 --> 00:21:45,400
So SQL Server will continue to make better use of managed identities.

355
00:21:45,400 --> 00:21:49,440
And this is really important when you've got SQL Server on prem accessing resources that

356
00:21:49,440 --> 00:21:50,440
are in the cloud.

357
00:21:50,440 --> 00:21:57,200
So for example, things like backup, you can use a managed identity on the database.

358
00:21:57,200 --> 00:22:03,560
That way you've got no credential that's being stored by SQL Server to access some resource.

359
00:22:03,560 --> 00:22:06,840
For example, a storage account for backup, there's no credential.

360
00:22:06,840 --> 00:22:09,200
It's all the identity of the actual running process.

361
00:22:09,200 --> 00:22:12,400
Again, using managed identities, not Windows identities.

362
00:22:12,400 --> 00:22:13,600
So that's always a good thing to see.

363
00:22:13,600 --> 00:22:19,160
And again, you'll see more and more products over time will start to make much deeper use

364
00:22:19,160 --> 00:22:23,600
of managed identities because again, there's no credential there.

365
00:22:23,600 --> 00:22:25,200
It's all managed by Entra ID.

366
00:22:25,200 --> 00:22:28,320
So that's something else that really piqued my interest.

367
00:22:28,320 --> 00:22:33,780
And another one that I enjoyed was seeing some of the enhancements to the Purview data

368
00:22:33,780 --> 00:22:37,400
loss prevention for M365 Copilot.

369
00:22:37,400 --> 00:22:42,600
I love the potential of the AI and the generative AI and the models and what they can do.

370
00:22:42,600 --> 00:22:47,300
But when you think about it from a security perspective, these models don't inherently

371
00:22:47,300 --> 00:22:50,000
know how to obey permissions.

372
00:22:50,000 --> 00:22:52,820
And so they're very hard to secure directly.

373
00:22:52,820 --> 00:22:59,160
And so that's why we have to put these deterministic or traditional code wrappers around them and

374
00:22:59,160 --> 00:23:01,720
say the model doesn't get access to data.

375
00:23:01,720 --> 00:23:03,640
It shouldn't be processing data for this request.

376
00:23:03,640 --> 00:23:08,600
Because if the user using the model or using the app that uses the model doesn't have

377
00:23:08,600 --> 00:23:12,320
access to the data, then the model shouldn't get access to the data because it might disclose

378
00:23:12,320 --> 00:23:13,760
a secret.

379
00:23:13,760 --> 00:23:19,120
And it almost kind of reminds me of this TV show that I enjoy watching with my kids called

380
00:23:19,120 --> 00:23:20,720
Young Sheldon.

381
00:23:20,720 --> 00:23:24,880
Because it's this brilliant kid, but the kid doesn't really have context for the world

382
00:23:24,880 --> 00:23:27,640
and can't keep a secret.

383
00:23:27,640 --> 00:23:32,480
And so if you don't want the kid to tell a secret, you don't tell the secret to the

384
00:23:32,480 --> 00:23:33,920
kid in the first place.

385
00:23:33,920 --> 00:23:35,240
So it's that kind of model.

386
00:23:35,240 --> 00:23:40,480
And so I love to see the continued development of how do you protect these models so that

387
00:23:40,480 --> 00:23:45,560
they're not giving access to things they don't need to, and allowing users to still leverage

388
00:23:45,560 --> 00:23:50,640
the full power of them using the data that they actually have access to and are entitled to.

389
00:23:50,640 --> 00:23:55,880
So I'm really excited about the continued development in that space to make these things

390
00:23:55,880 --> 00:23:57,600
as safe as possible.

391
00:23:57,600 --> 00:23:59,640
I'm a big fan of plausible deniability.

392
00:23:59,640 --> 00:24:00,640
Seriously.

393
00:24:00,640 --> 00:24:01,640
Don't tell me.

394
00:24:01,640 --> 00:24:02,640
I don't want to know.

395
00:24:02,640 --> 00:24:06,040
If I don't need to know, I don't need to know.

396
00:24:06,040 --> 00:24:07,040
It drives my wife nuts.

397
00:24:07,040 --> 00:24:08,840
She's like, oh, I need to tell you about blah, blah, blah.

398
00:24:08,840 --> 00:24:09,840
I'm like, do I need to know?

399
00:24:09,840 --> 00:24:11,120
And she's like, well, no, you don't need to know.

400
00:24:11,120 --> 00:24:12,120
And I'm like, well, don't tell me.

401
00:24:12,120 --> 00:24:13,120
So that way, you're having enough.

402
00:24:13,120 --> 00:24:14,120
It's the opposite of gossip.

403
00:24:14,120 --> 00:24:16,760
And it does it.

404
00:24:16,760 --> 00:24:20,120
She really obviously wants to get something off her chest, but I don't want to know.

405
00:24:20,120 --> 00:24:23,760
I mean, I know, perhaps it's just, I don't know.

406
00:24:23,760 --> 00:24:25,400
Anyway, it is what it is.

407
00:24:25,400 --> 00:24:27,400
Sarah, you got another one?

408
00:24:27,400 --> 00:24:29,560
Mark stole my thing there.

409
00:24:29,560 --> 00:24:36,920
But you're right, there's a lot of activity and motion around data security.

410
00:24:36,920 --> 00:24:42,440
So there's the purview side of things, but also SharePoint had quite a lot of announcements

411
00:24:42,440 --> 00:24:50,360
and tools that are integrated now to help control oversharing in M365 Copilot, which

412
00:24:50,360 --> 00:24:56,820
is important too, because as we know, generally when there's a customer problem, it usually

413
00:24:56,820 --> 00:24:59,620
straddles several products in reality.

414
00:24:59,620 --> 00:25:03,400
And so SharePoint also announced some features.

415
00:25:03,400 --> 00:25:04,600
Some of them are already existing.

416
00:25:04,600 --> 00:25:13,880
They're in SharePoint Advanced Management (SAM), and they also help with that oversharing and data loss piece before you let a Copilot run

417
00:25:13,880 --> 00:25:20,240
all over your data and possibly find stuff that it's not supposed to.

418
00:25:20,240 --> 00:25:24,000
And not because Copilot makes it insecure, because we know that Copilot inherits the

419
00:25:24,000 --> 00:25:26,420
security posture that's already there.

420
00:25:26,420 --> 00:25:29,240
It's just that Copilot is better at finding stuff than a person, right?

421
00:25:29,240 --> 00:25:37,560
So I'm glad to see we've got some more announcements around that and that we've got more tooling

422
00:25:37,560 --> 00:25:42,280
to help people control that because, let's face it, no one's done data security very

423
00:25:42,280 --> 00:25:44,200
well ever.

424
00:25:44,200 --> 00:25:49,840
And I think AI is going to give people a kick up the butt to probably sort out their data.

425
00:25:49,840 --> 00:25:56,520
I just want to go back to something that Mark talked about, which is the exposure management.

426
00:25:56,520 --> 00:26:00,920
I want to just talk about two aspects of that, because actually a lot of the information

427
00:26:00,920 --> 00:26:03,120
came out of the team that I'm currently in.

428
00:26:03,120 --> 00:26:07,400
And it's attack surface management and attack path analysis.

429
00:26:07,400 --> 00:26:11,200
So I mentioned before about turning off port 3389 by default.

430
00:26:11,200 --> 00:26:14,960
And that's a good example of driving down attack surface.

431
00:26:14,960 --> 00:26:18,800
How much of your environment is exposed to untrusted users?

432
00:26:18,800 --> 00:26:20,720
And you really want to drive that down.

433
00:26:20,720 --> 00:26:23,360
Not to the point where you can't use the environment, obviously.

434
00:26:23,360 --> 00:26:26,400
You've got to have some things running.

435
00:26:26,400 --> 00:26:29,080
But you really want to drive that down, at least to the unnecessary stuff.

436
00:26:29,080 --> 00:26:33,360
So one part of that exposure management is actually attack surface management.

437
00:26:33,360 --> 00:26:35,480
How exposed are you to the world?

438
00:26:35,480 --> 00:26:37,280
It doesn't mean that you've got vulnerabilities.

439
00:26:37,280 --> 00:26:40,440
It means that if you do have vulnerabilities, then perhaps the attacker can actually get

440
00:26:40,440 --> 00:26:44,800
in and get to the particularly vulnerable system, whatever it is.

441
00:26:44,800 --> 00:26:48,040
Which leads me nicely into the next one, which is attack path analysis.

442
00:26:48,040 --> 00:26:52,760
This is basically a graph that shows if you're at this endpoint, then you can get all the

443
00:26:52,760 --> 00:26:55,960
way down to here by doing this, this, this, and this.

444
00:26:55,960 --> 00:27:00,320
Which is actually really, really cool because that can be a real eye-opener because you

445
00:27:00,320 --> 00:27:02,880
don't realize just how exposed you are.

446
00:27:02,880 --> 00:27:06,800
So I was very excited to see that in the exposure management.

447
00:27:06,800 --> 00:27:07,800
Yeah.

448
00:27:07,800 --> 00:27:11,440
And the way I like to think about it, for those that speak the risk language, is the

449
00:27:11,440 --> 00:27:15,720
difference between potential risk and realized risk.

450
00:27:15,720 --> 00:27:16,720
Right?

451
00:27:16,720 --> 00:27:21,280
So you know, okay, we forgot to lock the door is a potential risk.

452
00:27:21,280 --> 00:27:23,440
Oh, an attacker went through the unlocked door.

453
00:27:23,440 --> 00:27:24,440
That's a realized risk.

454
00:27:24,440 --> 00:27:25,560
That's where the SOC kicks in, right?

455
00:27:25,560 --> 00:27:26,560
Bang.

456
00:27:26,560 --> 00:27:28,280
And so that's how I kind of think about that.

457
00:27:28,280 --> 00:27:31,640
And the reality is there's just a lot to secure.

458
00:27:31,640 --> 00:27:35,240
There's a lot of things that are open that don't need to be open.

459
00:27:35,240 --> 00:27:37,100
And that's what I love about that tool.

460
00:27:37,100 --> 00:27:42,800
So another thing that was discussed in the Book of News, and we can also put a direct

461
00:27:42,800 --> 00:27:48,120
blog link in for it, is a really cool technology called Zero Trust DNS.

462
00:27:48,120 --> 00:27:52,520
And this one, it took me a little while to wrap my head around, quite frankly.

463
00:27:52,520 --> 00:27:57,320
But I think the simplest way to think about it is, you know, it's really hard to keep

464
00:27:57,320 --> 00:28:01,920
up if, say, you want to make sure that your Windows devices aren't going out to a bunch

465
00:28:01,920 --> 00:28:04,000
of unknown sites, right?

466
00:28:04,000 --> 00:28:09,280
Because adversaries, you know, change IPs, like, you know, you would, I don't know, what

467
00:28:09,280 --> 00:28:10,960
do you change a lot?

468
00:28:10,960 --> 00:28:14,720
People say, they change IPs a lot, right?

469
00:28:14,720 --> 00:28:15,720
And they know that.

470
00:28:15,720 --> 00:28:18,040
And the reality is legitimate services change IPs a lot too.

471
00:28:18,040 --> 00:28:20,000
So it's really hard to keep track of that.

472
00:28:20,000 --> 00:28:25,840
And so the Zero Trust DNS, what it does is it allows you to build essentially firewall

473
00:28:25,840 --> 00:28:30,560
rules that you can't talk to this, and the apps on this site can't talk to this, unless

474
00:28:30,560 --> 00:28:33,200
they can look it up in DNS, right?

475
00:28:33,200 --> 00:28:37,960
And that all of a sudden takes this uncontrolled, you know, I can talk to anything on the internet

476
00:28:37,960 --> 00:28:42,040
as long as I have an IP address, you know, which is, you know, a very dangerous thing

477
00:28:42,040 --> 00:28:46,360
and allows you to talk to command and control servers if it's compromised and, you know,

478
00:28:46,360 --> 00:28:52,000
and download exploits, you know, etc. on that box if someone's able to convince you to click

479
00:28:52,000 --> 00:28:54,360
on a phishing link, etc.

480
00:28:54,360 --> 00:29:00,480
And it switches it so the attacker now has to expose the IPs they want the endpoint to

481
00:29:00,480 --> 00:29:03,160
talk to via DNS.

482
00:29:03,160 --> 00:29:09,160
So they have to publish that IP as a DNS record somewhere or take over somebody else's one,

483
00:29:09,160 --> 00:29:14,120
but it puts them in a much more logged and tracked space, rather than just some random anonymous

484
00:29:14,120 --> 00:29:15,120
IP connection.

485
00:29:15,120 --> 00:29:21,320
And so it's a really, really interesting technology that starts, you know, I thought it was a

486
00:29:21,320 --> 00:29:26,680
really great creative solution that starts changing, quite frankly, the cost of attack

487
00:29:26,680 --> 00:29:31,640
for the attackers and forcing them to be a lot more into the light.

488
00:29:31,640 --> 00:29:35,800
Basically the equivalent of having to show a legitimate ID, not just saying your

489
00:29:35,800 --> 00:29:40,600
name is James Bond and we'll accept your word for it kind of thing.

490
00:29:40,600 --> 00:29:44,800
So really interesting technology there that made its way into the book of news as well.

491
00:29:44,800 --> 00:29:47,320
Okay, so I've got one more.

492
00:29:47,320 --> 00:29:52,900
Also, in Purview, we've got data security posture management for AI.

493
00:29:52,900 --> 00:29:57,280
So this is building on what I was talking about before.

494
00:29:57,280 --> 00:30:02,960
It's actually a way that you can use Purview to have an overall look at your data

495
00:30:02,960 --> 00:30:09,800
security posture, which is now specific, more specifically for AI, but also more generally,

496
00:30:09,800 --> 00:30:14,040
because you need to know this, of course, because people have historically not done

497
00:30:14,040 --> 00:30:15,840
their data security super well.

498
00:30:15,840 --> 00:30:21,120
So I know there is the odd organization out there that has, so kudos to you if you have,

499
00:30:21,120 --> 00:30:26,020
but the fact is that for a lot of folks, data security hasn't been high up their priority list.

500
00:30:26,020 --> 00:30:30,640
And so the posture management in Purview is now allowing people to have a look

501
00:30:30,640 --> 00:30:35,560
and have an overall view of actually what the heck is my data estate looking like, because

502
00:30:35,560 --> 00:30:40,800
I know when I've talked to customers, they know they want to do their data security well,

503
00:30:40,800 --> 00:30:46,040
but some have no idea where to start, because, you know, often they

504
00:30:46,040 --> 00:30:49,880
don't know what they don't know, they know it's not great, but they have no idea what

505
00:30:49,880 --> 00:30:51,120
sort of state it's in.

506
00:30:51,120 --> 00:30:57,040
So this posture management will help with that: it'll actively discover things, you know,

507
00:30:57,040 --> 00:31:01,360
as lots of posture management tools do, having a look at where things are labeled,

508
00:31:01,360 --> 00:31:05,920
looking at where things might have been overshared, and giving you a nice overview.

509
00:31:05,920 --> 00:31:10,480
So then you can make a plan to fix it with all the tools that I talked about before.

510
00:31:10,480 --> 00:31:16,400
So that's definitely one to go and have a look at and have a play around with if you're

511
00:31:16,400 --> 00:31:19,880
well, I think if you've got data, so that would be everybody.

512
00:31:19,880 --> 00:31:26,840
Another one that took my interest was a thing called Security Service Edge, and part

513
00:31:26,840 --> 00:31:32,760
of that was Microsoft Entra Private Access, which is a way of kind of simplifying migrating

514
00:31:32,760 --> 00:31:35,600
from traditional VPNs.

515
00:31:35,600 --> 00:31:41,200
This kind of took my interest, or piqued my interest, just because I didn't actually

516
00:31:41,200 --> 00:31:42,920
even know we were working on this kind of stuff.

517
00:31:42,920 --> 00:31:48,640
So it's good to see essentially a VPN like technology being built into the product as

518
00:31:48,640 --> 00:31:50,320
well.

519
00:31:50,320 --> 00:31:54,640
And the thing that I love about it, because I've been following that product

520
00:31:54,640 --> 00:32:01,160
for a little while, is it's bringing together the two access control disciplines, which

521
00:32:01,160 --> 00:32:06,640
are often oil and water in terms of the cultures within the organization, you know, identity

522
00:32:06,640 --> 00:32:10,680
folks and network folks, they're both access control disciplines, they're both stuck with

523
00:32:10,680 --> 00:32:17,400
this strange dual requirement to both enable the business and organization and connectivity

524
00:32:17,400 --> 00:32:19,080
and access to things.

525
00:32:19,080 --> 00:32:23,880
But also they're the frontline of security in terms of, you know, filtering the bad stuff

526
00:32:23,880 --> 00:32:27,720
out as well to make sure that the attackers aren't following the same, you know, paths

527
00:32:27,720 --> 00:32:33,020
and badge readers and the electronic equivalents thereof to get to the stuff.

528
00:32:33,020 --> 00:32:37,360
And so I love the fact that this is now bringing it all together and it's using that same Conditional

529
00:32:37,360 --> 00:32:42,360
Access policy engine, and it's enforcing it over identity as well as network means.

530
00:32:42,360 --> 00:32:44,580
So love that technology.

531
00:32:44,580 --> 00:32:50,040
Another one that piqued my interest was some updates to Defender for Cloud around containers

532
00:32:50,040 --> 00:32:51,280
especially.

533
00:32:51,280 --> 00:32:55,920
So the ability to scan container images from their creation in a CI/CD pipeline all the

534
00:32:55,920 --> 00:33:01,560
way through to the various cloud platforms, third party and private registries and in

535
00:33:01,560 --> 00:33:04,160
Kubernetes clusters.

536
00:33:04,160 --> 00:33:08,920
It's in preview right now, but the fact that we have something like this in place now is

537
00:33:08,920 --> 00:33:09,920
really good to see.

538
00:33:09,920 --> 00:33:10,920
Anything to add to that, Sarah?

539
00:33:10,920 --> 00:33:11,920
I know containers are sort of your thing.

540
00:33:11,920 --> 00:33:12,920
I love containers.

541
00:33:12,920 --> 00:33:13,920
Containers are great.

542
00:33:13,920 --> 00:33:14,920
And we need to use them more.

543
00:33:14,920 --> 00:33:21,480
Well, I think nowadays, to be honest with you, I think most stuff is containerized in

544
00:33:21,480 --> 00:33:23,480
some way, shape or form.

545
00:33:23,480 --> 00:33:28,900
And so the more that we can do to monitor them because they are still trickier because

546
00:33:28,900 --> 00:33:34,480
of their ephemeral nature, the better, to be honest with you, because I think nowadays

547
00:33:34,480 --> 00:33:40,040
most folks are not building anything that's not containerized, which is a good thing.

548
00:33:40,040 --> 00:33:41,940
Hey, so I actually have a question for you.

549
00:33:41,940 --> 00:33:47,840
So one of the things that this update to Defender for Cloud has added is binary drift detection.

550
00:33:47,840 --> 00:33:50,000
I'll read verbatim from the Book of News.

551
00:33:50,000 --> 00:33:55,520
It says, identifies and responds to unauthorized changes in container configurations at runtime

552
00:33:55,520 --> 00:34:01,140
and helps users ensure container images remain unmodified after deployment.

553
00:34:01,140 --> 00:34:03,040
Binary drift detection is now generally available.

554
00:34:03,040 --> 00:34:04,960
So here's a question for you, Sarah.

555
00:34:04,960 --> 00:34:09,880
Don't we already have that problem kind of solved with signatures on containers?

556
00:34:09,880 --> 00:34:12,400
Does that imply that people are not using signatures?

557
00:34:12,400 --> 00:34:13,560
No, it doesn't.

558
00:34:13,560 --> 00:34:19,000
So when you have a signature, the container image is signed before you deploy it.

559
00:34:19,000 --> 00:34:24,080
So when you go to grab a container image from like a container registry or whatever, it

560
00:34:24,080 --> 00:34:27,120
will have been signed and you can check there.

561
00:34:27,120 --> 00:34:31,680
But when it's been deployed, there's not ongoing signature checking of the image.

562
00:34:31,680 --> 00:34:40,520
So binary drift detection is there to address a running container and a change there.

563
00:34:40,520 --> 00:34:41,520
That's interesting.

564
00:34:41,520 --> 00:34:44,200
Actually, it's kind of sad, but interesting.

565
00:34:44,200 --> 00:34:47,000
I mean, the whole point of signatures is you.

566
00:34:47,000 --> 00:34:48,680
Yeah, I see what you're getting at.

567
00:34:48,680 --> 00:34:50,240
I mean, once the thing's running, yeah.

568
00:34:50,240 --> 00:34:51,240
Okay.

569
00:34:51,240 --> 00:34:52,240
All right.

570
00:34:52,240 --> 00:34:53,240
That makes sense.

571
00:34:53,240 --> 00:34:55,140
Actually, in which case that's really exciting to see.

572
00:34:55,140 --> 00:34:57,800
Because actually, it has been a challenge.

573
00:34:57,800 --> 00:35:02,960
In fact, signing a container image before you deploy it and store it, that's kind of

574
00:35:02,960 --> 00:35:06,120
relatively straightforward because we can lean on technologies that we've had for a

575
00:35:06,120 --> 00:35:08,220
while to sign things.

576
00:35:08,220 --> 00:35:12,920
But when the container is running and we're past the signature checking, that is a trickier

577
00:35:12,920 --> 00:35:14,100
thing to monitor.

578
00:35:14,100 --> 00:35:17,440
So it's cool to see we have some stuff to be able to do that now.

579
00:35:17,440 --> 00:35:18,440
Nice.

580
00:35:18,440 --> 00:35:24,800
While we're on the topic of Defender for Cloud, there is now, or I believe this is coming,

581
00:35:24,800 --> 00:35:31,880
API security posture management using Defender Cloud Security Posture Management.

582
00:35:31,880 --> 00:35:37,480
So basically, it's going to be able to keep track of your API security posture, which

583
00:35:37,480 --> 00:35:38,960
is super nice as well.

584
00:35:38,960 --> 00:35:46,000
Because when you look at how many environments are compromised through APIs, through REST

585
00:35:46,000 --> 00:35:53,520
endpoints, it's good to see that we're expanding the Defender arm, as it were, to

586
00:35:53,520 --> 00:35:55,040
cover API security as well.

587
00:35:55,040 --> 00:35:59,720
Isn't there a Defender for API security or is that just part of cloud security posture

588
00:35:59,720 --> 00:36:00,720
management?

589
00:36:00,720 --> 00:36:01,720
I actually don't know.

590
00:36:01,720 --> 00:36:02,720
We need to get Yuri back on.

591
00:36:02,720 --> 00:36:05,440
Yeah, I think it's part of the Defender for Cloud family.

592
00:36:05,440 --> 00:36:08,520
I don't know if they use that standalone term anymore.

593
00:36:08,520 --> 00:36:11,040
I think that might be how the evolution works.

594
00:36:11,040 --> 00:36:12,040
I can't keep up with this.

595
00:36:12,040 --> 00:36:15,600
Yeah, I think you're right, Mark, that it's still there, the functionality.

596
00:36:15,600 --> 00:36:21,840
But I think we've stopped explicitly calling it Defender for APIs, but it's just been integrated

597
00:36:21,840 --> 00:36:23,720
more and doesn't have a separate name.

598
00:36:23,720 --> 00:36:25,320
It's not that it's gone away.

599
00:36:25,320 --> 00:36:27,000
That's what I think Mark is correct.

600
00:36:27,000 --> 00:36:32,600
But yeah, we need friend of the podcast Yuri to confirm.

601
00:36:32,600 --> 00:36:34,600
Yeah, we do.

602
00:36:34,600 --> 00:36:39,160
Okay, now I have a silly way of transitioning to my next one, so I've got to do it.

603
00:36:39,160 --> 00:36:43,800
So if you drop the P, there's also AI security posture management.

604
00:36:43,800 --> 00:36:46,280
API to AI... never mind.

605
00:36:46,280 --> 00:36:47,280
Okay.

606
00:36:47,280 --> 00:36:49,080
That's just, I like that.

607
00:36:49,080 --> 00:36:50,080
You can tell you're a father.

608
00:36:50,080 --> 00:36:51,080
That's just a bad joke.

609
00:36:51,080 --> 00:36:54,200
I was just about to say, come on dad jokes.

610
00:36:54,200 --> 00:36:59,520
I had enough dad jokes when I was at Ignite because Seth, one of the co-hosts, is the

611
00:36:59,520 --> 00:37:01,480
king of dad jokes.

612
00:37:01,480 --> 00:37:03,040
So yeah.

613
00:37:03,040 --> 00:37:04,120
Nice.

614
00:37:04,120 --> 00:37:08,640
So I'll list off like three or four that also caught my attention, but I do really

615
00:37:08,640 --> 00:37:11,760
genuinely, and I wasn't joking, it does exist.

616
00:37:11,760 --> 00:37:17,240
The AI security posture management to really, much like you would look at all your different

617
00:37:17,240 --> 00:37:21,720
SaaS apps, it's essentially a very similar approach to look at all of those different

618
00:37:21,720 --> 00:37:27,840
AI applications and apply controls and inventory and all those kinds of things to it.

619
00:37:27,840 --> 00:37:34,040
A couple of things that it was really nice to see is a lot of enhancements to essentially

620
00:37:34,040 --> 00:37:41,760
the unified SecOps experience, that converged platform of Defender XDR and Sentinel coming together into a single

621
00:37:41,760 --> 00:37:43,360
sort of SOC console.

622
00:37:43,360 --> 00:37:44,840
So a lot of good stuff there.

623
00:37:44,840 --> 00:37:50,320
There's also the addition of insider risk management alerts and events into there so

624
00:37:50,320 --> 00:37:55,400
that you can bring those in, and whether your SOC handles those, or you've

625
00:37:55,400 --> 00:38:00,440
got your HR folks or somebody else that works on those, it's all in the same tool set and

626
00:38:00,440 --> 00:38:03,120
benefits from all that cross correlation.

627
00:38:03,120 --> 00:38:09,520
And then two things on the sort of more personal Windows side. One is the

628
00:38:09,520 --> 00:38:14,360
personal data encryption where stuff is encrypted with an additional layer of security and you

629
00:38:14,360 --> 00:38:17,840
cannot access it without going through the Windows Hello thing.

630
00:38:17,840 --> 00:38:22,640
So really kind of keeping those extra secure so like an app can't sort of sneak a copy

631
00:38:22,640 --> 00:38:26,660
of the data off in the background, which is really cool.

632
00:38:26,660 --> 00:38:32,160
And then there was also a lot of progress on the Microsoft Virus Initiative.

633
00:38:32,160 --> 00:38:36,600
If y'all remember, there was some significant downtime a few months back from a vendor,

634
00:38:36,600 --> 00:38:38,400
which we won't name.

635
00:38:38,400 --> 00:38:44,360
And so there's a whole lot of good things that were being done to sort of make sure

636
00:38:44,360 --> 00:38:49,240
that everyone's doing their part to make sure it doesn't happen again, including, hey, how

637
00:38:49,240 --> 00:38:54,680
do we enhance the platform to help avoid those kinds of mistakes from happening in the future

638
00:38:54,680 --> 00:38:58,680
and engineer it so the right thing to do is the easy thing to do.

639
00:38:58,680 --> 00:39:03,000
And so there was a bunch of announcements around that and around the way that we're

640
00:39:03,000 --> 00:39:06,640
thinking about the rules of integrating within Windows and whatnot.

641
00:39:06,640 --> 00:39:09,040
So very, very happy to see those.

642
00:39:09,040 --> 00:39:10,040
All right.

643
00:39:10,040 --> 00:39:14,560
Another one I have is secure password deployment in Edge.

644
00:39:14,560 --> 00:39:19,840
This allows IT admins to deploy encrypted shared passwords to a specific set of users

645
00:39:19,840 --> 00:39:21,520
if it's needed.

646
00:39:21,520 --> 00:39:25,840
This is really, really cool because that way you're not just sending passwords in plain

647
00:39:25,840 --> 00:39:29,280
text or something and telling people to type them in or something.

648
00:39:29,280 --> 00:39:33,320
This is all being managed centrally and in a secure and encrypted manner.

649
00:39:33,320 --> 00:39:37,600
So good to see Edge getting some more sort of IT administration love.

650
00:39:37,600 --> 00:39:38,600
All right.

651
00:39:38,600 --> 00:39:43,720
So look, I don't even think we grazed the surface of everything that's

652
00:39:43,720 --> 00:39:47,200
in the book of news.

653
00:39:47,200 --> 00:39:51,120
The document is absolutely immense just from a security standpoint.

654
00:39:51,120 --> 00:39:55,120
But with that said, we had to bring this episode to an end at some point.

655
00:39:55,120 --> 00:39:59,540
So as many of you know, whenever we have an episode, we always ask our guests if they

656
00:39:59,540 --> 00:40:02,680
had one, just that one final thought to leave our listeners with and we're going to do the

657
00:40:02,680 --> 00:40:03,680
same.

658
00:40:03,680 --> 00:40:04,880
So Mark, why don't you kick things off?

659
00:40:04,880 --> 00:40:08,600
If you had just like one thought to leave our listeners with, what would it be?

660
00:40:08,600 --> 00:40:09,600
Continuous learning.

661
00:40:09,600 --> 00:40:15,560
I mean, just keeping up with security and this is just the Microsoft news, right?

662
00:40:15,560 --> 00:40:17,600
I mean, there's always more.

663
00:40:17,600 --> 00:40:20,240
There's always the attack evolution, the threat intelligence stuff.

664
00:40:20,240 --> 00:40:23,660
There's things that other platform providers are doing.

665
00:40:23,660 --> 00:40:25,880
There's things the government's doing.

666
00:40:25,880 --> 00:40:31,120
It's just really critical to always be in that continuous learning mode and definitely

667
00:40:31,120 --> 00:40:37,060
be confident in the stuff that you know, but also be willing to question it and learn something

668
00:40:37,060 --> 00:40:42,760
new at any time, because that is just the nature of our industry: it's constantly in motion.

669
00:40:42,760 --> 00:40:43,760
Yeah.

670
00:40:43,760 --> 00:40:48,960
So my final thought is reading through the book of news and watching a lot of the announcements,

671
00:40:48,960 --> 00:40:53,780
a lot of the presentations that came out of Ignite, it's impossible to walk away without

672
00:40:53,780 --> 00:40:58,720
seeing the impact Secure Future Initiative is already having across Microsoft products.

673
00:40:58,720 --> 00:41:03,960
You're seeing really important technology changes like the use of managed identities,

674
00:41:03,960 --> 00:41:07,760
protection of credentials, if we have to have credentials, being pushed all the way down

675
00:41:07,760 --> 00:41:09,960
into the hardware.

676
00:41:09,960 --> 00:41:12,920
There's lots of things we're seeing around attack surface analysis and attack surface

677
00:41:12,920 --> 00:41:13,920
reduction.

678
00:41:13,920 --> 00:41:18,080
Again, this is all secure by default, which is one of the pillars of SFI.

679
00:41:18,080 --> 00:41:25,040
So it's really heartwarming to see all the work that is still ongoing in SFI, but it's

680
00:41:25,040 --> 00:41:30,640
already manifesting itself with some pretty serious changes across the

681
00:41:30,640 --> 00:41:35,160
whole spectrum of Microsoft products from Azure to Windows to Office to everything in

682
00:41:35,160 --> 00:41:36,160
between.

683
00:41:36,160 --> 00:41:37,660
So it's fantastic to see.

684
00:41:37,660 --> 00:41:43,340
So I guess my final thought is, I can tell you for a fact, it does seem like there's

685
00:41:43,340 --> 00:41:48,000
an overwhelming amount of information at Ignite and I think there is.

686
00:41:48,000 --> 00:41:52,040
We've got loads of cool teams who work on loads of things.

687
00:41:52,040 --> 00:41:55,840
And so there's a lot to digest.

688
00:41:55,840 --> 00:42:01,600
And of course, only a tiny fraction of the Microsoft customer base gets to go to Ignite

689
00:42:01,600 --> 00:42:02,900
in person.

690
00:42:02,900 --> 00:42:07,600
So if you were there and you didn't catch everything you wanted to see, or if you weren't

691
00:42:07,600 --> 00:42:11,200
there, remember we do upload everything to YouTube.

692
00:42:11,200 --> 00:42:16,020
And if you registered as an online attendee, you can also go on demand and watch the sessions

693
00:42:16,020 --> 00:42:18,460
on the Microsoft website as well.

694
00:42:18,460 --> 00:42:22,280
So make sure that you actually do that and catch up on things.

695
00:42:22,280 --> 00:42:27,120
I can tell you as somebody who actually attended Ignite, but didn't get to see any sessions

696
00:42:27,120 --> 00:42:33,920
because I was in my little celebrity studio, I have had to catch up on quite a few of them.

697
00:42:33,920 --> 00:42:41,160
So yeah, my main final thought for this time is, even if you didn't get to go to Ignite,

698
00:42:41,160 --> 00:42:47,040
because we're recording this about two weeks-ish after Ignite, remember we put everything online

699
00:42:47,040 --> 00:42:51,520
so you can digest at your own pace and go through and find what's relevant.

700
00:42:51,520 --> 00:42:56,600
So I think it's great that folks, even if you weren't able to attend or you did attend

701
00:42:56,600 --> 00:42:59,840
and didn't see everything, you can still catch up on stuff later.

702
00:42:59,840 --> 00:43:01,160
So definitely go do that.

703
00:43:01,160 --> 00:43:03,160
Yeah, it definitely is a fire hose.

704
00:43:03,160 --> 00:43:05,480
I mean, even the book of news itself is a fire hose.

705
00:43:05,480 --> 00:43:08,000
And you've got to realize that's just like the high level.

706
00:43:08,000 --> 00:43:13,400
It's not the full sort of bit of information for whatever that particular thing is.

707
00:43:13,400 --> 00:43:19,280
Like I'm looking right now at delegated managed service accounts in Windows 24H2 or Windows

708
00:43:19,280 --> 00:43:20,280
Server 2025.

709
00:43:20,280 --> 00:43:23,760
And it's like one paragraph, it's like three sentences.

710
00:43:23,760 --> 00:43:28,360
But you know full well, there's like five pages of documentation behind that one particular

711
00:43:28,360 --> 00:43:29,360
feature.

712
00:43:29,360 --> 00:43:31,600
So yeah, it's a real fire hose.

713
00:43:31,600 --> 00:43:36,200
And to Sarah's point, make sure you take a look at not just the book of news, but also

714
00:43:36,200 --> 00:43:39,400
all the online classes or online sessions.

715
00:43:39,400 --> 00:43:42,840
All right, Mark, Sarah, let's bring this to an end.

716
00:43:42,840 --> 00:43:47,920
And to all our listeners out there, we hope you found this episode of use and of interest.

717
00:43:47,920 --> 00:43:54,040
Again, go look at the book of news and springboard off into lots of other categories and topics

718
00:43:54,040 --> 00:43:55,760
related to security.

719
00:43:55,760 --> 00:43:57,320
Stay safe and we'll see you in the next one.

720
00:43:57,320 --> 00:44:00,280
Thanks for listening to the Azure Security Podcast.

721
00:44:00,280 --> 00:44:07,120
You can find show notes and other resources at our website azsecuritypodcast.net.

722
00:44:07,120 --> 00:44:11,920
If you have any questions, please find us on Twitter at AzureSetPod.

723
00:44:11,920 --> 00:44:17,280
Background music is from ccmixter.com and licensed under the Creative Commons license.

