1
00:00:00,000 --> 00:00:09,600
Welcome to the Azure Security Podcast, where we discuss topics relating to security, privacy,

2
00:00:09,600 --> 00:00:13,280
reliability and compliance on the Microsoft Cloud Platform.

3
00:00:13,280 --> 00:00:17,800
Hey everybody, welcome to episode 107.

4
00:00:17,800 --> 00:00:20,880
This week is myself, Michael, with Sarah and Mark.

5
00:00:20,880 --> 00:00:25,500
Our guests this week are Max and Emily to talk about Purview Blueprints.

6
00:00:25,500 --> 00:00:28,360
But before we get to our guests, let's take a little lap around the news.

7
00:00:28,360 --> 00:00:30,080
Sarah, why don't you kick things off?

8
00:00:30,080 --> 00:00:31,080
Sure.

9
00:00:31,080 --> 00:00:34,160
So just one piece of news.

10
00:00:34,160 --> 00:00:38,800
The MVP Summit has been announced for March next year.

11
00:00:38,800 --> 00:00:42,680
So if you are an MVP and if you're not, why are you not?

12
00:00:42,680 --> 00:00:45,640
Then you should go to the MVP Global Summit.

13
00:00:45,640 --> 00:00:52,240
It is on the 25th of March in Redmond.

14
00:00:52,240 --> 00:00:56,880
So if you're an MVP, make sure you go sign up and go because it'll be pretty awesome.

15
00:00:56,880 --> 00:01:00,240
And everybody I know who's gone has really enjoyed it.

16
00:01:00,240 --> 00:01:03,080
So that's my one little bit of news this time.

17
00:01:03,080 --> 00:01:04,080
Okay.

18
00:01:04,080 --> 00:01:09,680
I only have one little piece of news and it's from my old stomping ground in SQL Server.

19
00:01:09,680 --> 00:01:12,760
SQL Server Management Studio version 21.

20
00:01:12,760 --> 00:01:13,760
There's been a new feature.

21
00:01:13,760 --> 00:01:17,360
It's been added for Always Encrypted.

22
00:01:17,360 --> 00:01:21,280
And this allows you to essentially do an assessment on what you want to use.

23
00:01:21,280 --> 00:01:23,080
Like, how do you want to use Always Encrypted?

24
00:01:23,080 --> 00:01:27,200
And it'll look at your data, make suggestions, say if there may be some issues.

25
00:01:27,200 --> 00:01:32,320
I'm not going to say it's been an adoption blocker, but a thing that makes

26
00:01:32,320 --> 00:01:37,160
Always Encrypted a little difficult is just knowing how to use it upfront.

27
00:01:37,160 --> 00:01:41,360
And so this Always Encrypted assessment really takes away a lot of that friction, makes things

28
00:01:41,360 --> 00:01:46,640
a lot easier to know what data to encrypt and what sort of keys to use.

29
00:01:46,640 --> 00:01:48,560
Okay, Mark, what do you got?

30
00:01:48,560 --> 00:01:54,520
So again, my area. What I've got is a couple of things I found very valuable on

31
00:01:54,520 --> 00:01:58,560
LinkedIn that I wanted to share some links to.

32
00:01:58,560 --> 00:02:01,680
The first is for the Microsoft CISO workshop.

33
00:02:01,680 --> 00:02:08,280
There's sort of this hidden gem, as it were, of how to think about security metrics

34
00:02:08,280 --> 00:02:12,800
for the overall organization, and these aren't sort of how many times you're attacked

35
00:02:12,800 --> 00:02:13,880
on the firewall or whatnot.

36
00:02:13,880 --> 00:02:17,760
They're actually meant to overcome that and get you thinking strategically about how good

37
00:02:17,760 --> 00:02:22,280
you are at preventing stuff versus responding to stuff, versus are you slowing down the business

38
00:02:22,280 --> 00:02:26,520
organization you're supporting and are the continuous improvement projects moving forward

39
00:02:26,520 --> 00:02:27,520
each month.

40
00:02:27,520 --> 00:02:29,820
So we'll send a link out to those there.

41
00:02:29,820 --> 00:02:35,080
And then there were also some mappings between the Microsoft capabilities and technologies

42
00:02:35,080 --> 00:02:39,420
and the guidance and some of our workshops and how they help you sort of bring a zero

43
00:02:39,420 --> 00:02:43,840
trust vision to life using The Open Group Zero Trust Reference Model standards.

44
00:02:43,840 --> 00:02:47,960
So I'll go ahead and share some links to those in the show notes.

45
00:02:47,960 --> 00:02:48,960
All right.

46
00:02:48,960 --> 00:02:49,960
So let's turn our attention to our guests.

47
00:02:49,960 --> 00:02:52,200
And as I mentioned at the beginning, we have two guests this week.

48
00:02:52,200 --> 00:02:56,880
We have Max and Emily who are here to talk to us about Purview Blueprints.

49
00:02:56,880 --> 00:02:58,440
So, welcome Max and Emily.

50
00:02:58,440 --> 00:02:59,440
Welcome to the podcast.

51
00:02:59,440 --> 00:03:03,000
Would you like to take a moment and introduce yourselves to our listeners?

52
00:03:03,000 --> 00:03:04,000
Yeah.

53
00:03:04,000 --> 00:03:05,000
Hi everyone.

54
00:03:05,000 --> 00:03:06,620
So I'm Emily Blundow.

55
00:03:06,620 --> 00:03:14,160
I'm part of the CXE CAT team, which is our customer-facing side of the product group.

56
00:03:14,160 --> 00:03:21,760
I'm specifically working on our Purview for AI product features, which is all about expanding

57
00:03:21,760 --> 00:03:26,760
Purview data security capabilities to AI applications.

58
00:03:26,760 --> 00:03:28,440
Looking forward to today.

59
00:03:28,440 --> 00:03:29,440
Thank you.

60
00:03:29,440 --> 00:03:30,680
And my name is Maxime Bombardier.

61
00:03:30,680 --> 00:03:32,800
I work in a similar team as Emily.

62
00:03:32,800 --> 00:03:36,720
I am focused on data security and helping customers deploy.

63
00:03:36,720 --> 00:03:37,720
All right.

64
00:03:37,720 --> 00:03:40,120
So let's get stuck into this.

65
00:03:40,120 --> 00:03:44,580
So whenever I see the word Blueprints, the first thing that comes to mind is Azure Blueprints

66
00:03:44,580 --> 00:03:50,000
with a capital B. Is this them or is this some other kind of Blueprints?

67
00:03:50,000 --> 00:03:54,200
I want to make sure that no one's confused here about whether this is part of the Azure

68
00:03:54,200 --> 00:03:56,760
Blueprints or it's something different.

69
00:03:56,760 --> 00:03:57,760
Thank you, Michael.

70
00:03:57,760 --> 00:03:58,760
It's not.

71
00:03:58,760 --> 00:04:03,400
There's no direct relation from a feature standpoint or product standpoint.

72
00:04:03,400 --> 00:04:08,840
If we had to make a comparison, I would say it's closer to the WAF, the Well-Architected

73
00:04:08,840 --> 00:04:09,840
Framework.

74
00:04:09,840 --> 00:04:14,000
So it's more of a documentation approach on how customers can best leverage it.

75
00:04:14,000 --> 00:04:19,520
Customers are attempting to solve similar challenges, hence the term Blueprint, and it's there to

76
00:04:19,520 --> 00:04:24,200
provide you a solid foundation on how to get started on whatever scenario the Blueprint

77
00:04:24,200 --> 00:04:25,820
is addressing.

78
00:04:25,820 --> 00:04:30,440
So to be sort of more specific, it's a lowercase-b blueprint as opposed to Azure

79
00:04:30,440 --> 00:04:31,440
Blueprints.

80
00:04:31,440 --> 00:04:32,440
Okay.

81
00:04:32,440 --> 00:04:33,440
With the uppercase B.

82
00:04:33,440 --> 00:04:34,440
Okay.

83
00:04:34,440 --> 00:04:35,440
Fantastic.

84
00:04:35,440 --> 00:04:36,440
Thanks.

85
00:04:36,440 --> 00:04:37,440
We're not productizing this, at least not yet.

86
00:04:37,440 --> 00:04:38,440
Okay.

87
00:04:38,440 --> 00:04:44,160
So what's the problem that you're trying to solve with these Blueprints for Purview?

88
00:04:44,160 --> 00:04:45,160
What's kind of the point?

89
00:04:45,160 --> 00:04:53,320
I think there were a few different reasons that came together as to why we started this.

90
00:04:53,320 --> 00:04:58,320
The core challenge is Purview has multiple solutions.

91
00:04:58,320 --> 00:05:04,120
Sometimes they work together, they integrate together, sometimes they overlap as well.

92
00:05:04,120 --> 00:05:07,080
And we had a lot of questions about how to deploy.

93
00:05:07,080 --> 00:05:12,880
I mean, at the base of this is how do we use the products available in Purview?

94
00:05:12,880 --> 00:05:17,420
And then it got compounded with some of the other questions we would get as we think

95
00:05:17,420 --> 00:05:22,440
of solutions such as data loss prevention, information protection, insider risk management,

96
00:05:22,440 --> 00:05:23,680
and so forth.

97
00:05:23,680 --> 00:05:28,600
How do they start integrating and touching each other, where a data loss prevention

98
00:05:28,600 --> 00:05:34,800
or DLP admin may not consider some of their components of insider risk, for example, and

99
00:05:34,800 --> 00:05:36,560
how it really works best together?

100
00:05:36,560 --> 00:05:43,560
So at the base of this, it was truly about providing a good foundation for most organizations

101
00:05:43,560 --> 00:05:47,040
to start their deployment journey.

102
00:05:47,040 --> 00:05:52,840
And then it grew into what other scenarios can we try to solve?

103
00:05:52,840 --> 00:05:59,080
So there's a lot we can expand from the tactical aspect and we'll dive into this in a little

104
00:05:59,080 --> 00:06:00,080
bit.

105
00:06:00,080 --> 00:06:07,320
But outside of that, the core is helping customers deploy at scale with the Purview solutions.

106
00:06:07,320 --> 00:06:12,760
So if it is essentially best practices and documentation and what have you, how actually

107
00:06:12,760 --> 00:06:15,640
is this delivered and what are you actually delivering?

108
00:06:15,640 --> 00:06:18,820
Yes, so we're delivering this on Microsoft Learn.

109
00:06:18,820 --> 00:06:24,960
We have carved out a section on Microsoft Learn to have a voice of engineering.

110
00:06:24,960 --> 00:06:29,760
Think of it as notes from engineering where we can provide best practices, proven

111
00:06:29,760 --> 00:06:33,280
practices on deployments that we've done with customers.

112
00:06:33,280 --> 00:06:41,920
It is backed by both our CXE team, so Emily and I and the rest of our team, as well as

113
00:06:41,920 --> 00:06:46,560
product marketing, product feature PMs as well, working with us to define what are the best

114
00:06:46,560 --> 00:06:50,840
practices and then it's delivered on Microsoft Learn.

115
00:06:50,840 --> 00:06:55,840
The small nuance compared to the existing docs that exist, I like to describe it like this:

116
00:06:55,840 --> 00:07:00,720
the technical references that we have on Learn describe a lot about the features.

117
00:07:00,720 --> 00:07:05,960
It's the what of the product, whereas the Blueprints are trying to solve the how do we

118
00:07:05,960 --> 00:07:07,400
do it.

119
00:07:07,400 --> 00:07:13,640
With the Blueprint, we have three different things that we're outputting with each of

120
00:07:13,640 --> 00:07:16,120
the different Blueprints that come out.

121
00:07:16,120 --> 00:07:18,760
Maxime touched on this a little bit.

122
00:07:18,760 --> 00:07:23,880
The first one that we have is the Blueprint diagram.

123
00:07:23,880 --> 00:07:26,280
That's really our why.

124
00:07:26,280 --> 00:07:31,960
It's going to showcase what the scenario is, what the problem we're trying to solve is,

125
00:07:31,960 --> 00:07:35,500
and then beyond that, we have our Blueprint slides.

126
00:07:35,500 --> 00:07:41,760
So this is really showing the list of activities that need to be done to solve that scenario

127
00:07:41,760 --> 00:07:45,080
that we're showing in the Blueprint diagram.

128
00:07:45,080 --> 00:07:50,920
And then for even more detail, we go over to the Microsoft Learn documentation, and

129
00:07:50,920 --> 00:07:55,520
that's going to show you how to accomplish each of the activities that are listed out

130
00:07:55,520 --> 00:07:57,820
within the Blueprint.

131
00:07:57,820 --> 00:08:03,940
With the method that we've taken to write this, it's been a fascinating process.

132
00:08:03,940 --> 00:08:10,440
When we think of the Blueprint slide, the single slide has been a very powerful visual

133
00:08:10,440 --> 00:08:16,040
because in one page, we know the list of activities taken to address that scenario.

134
00:08:16,040 --> 00:08:21,940
But it's also been powerful for us as a team, or multiple teams in fact, to go through this

135
00:08:21,940 --> 00:08:24,080
and ask: how do we make this concise?

136
00:08:24,080 --> 00:08:26,560
We've got some amazing products and features.

137
00:08:26,560 --> 00:08:28,560
How do we make it simple?

138
00:08:28,560 --> 00:08:31,240
It's something you can derive from a single slide.

139
00:08:31,240 --> 00:08:35,080
And it turns out that this Blueprint is a great format for the table of contents of

140
00:08:35,080 --> 00:08:37,680
the detailed guide after that.

141
00:08:37,680 --> 00:08:44,340
So as we attach this, the intent is for most people, whether that's technical

142
00:08:44,340 --> 00:08:48,480
resources in the field, someone who wants to present to executives, or someone trying,

143
00:08:48,480 --> 00:08:54,840
as an admin, to do the work, to just take that presentation and know how to re-present

144
00:08:54,840 --> 00:08:57,040
that to others.

145
00:08:57,040 --> 00:08:59,720
I see the Blueprint slide as a takeaway.

146
00:08:59,720 --> 00:09:03,760
We come out of that meeting, this is a list of actions we've got to move on to.

147
00:09:03,760 --> 00:09:05,000
Can you explain some of the actual scenarios?

148
00:09:05,000 --> 00:09:10,880
I mean, we've talked about what the delivery is, but we haven't really touched on yet what

149
00:09:10,880 --> 00:09:12,400
the actual material covers.

150
00:09:12,400 --> 00:09:14,360
What scenarios does it actually cover?

151
00:09:14,360 --> 00:09:19,600
Yeah, I can touch on the first one and then I'll pass it off to Maxime.

152
00:09:19,600 --> 00:09:26,160
So one of the Blueprints that we produced was an oversharing Blueprint.

153
00:09:26,160 --> 00:09:32,480
How this started was throughout a number of customer conversations, I was continuously

154
00:09:32,480 --> 00:09:38,320
hearing that customers were stopping their copilot deployments completely because of

155
00:09:38,320 --> 00:09:41,280
oversharing concerns they were having.

156
00:09:41,280 --> 00:09:46,880
At that time, we didn't really have the guidance that would fully cover how to address those

157
00:09:46,880 --> 00:09:49,720
oversharing concerns that they had.

158
00:09:49,720 --> 00:09:57,760
It needed documentation that was more scenario-based rather than documentation that was on features

159
00:09:57,760 --> 00:10:05,720
or specific products, especially because with this scenario, it would require multiple Microsoft

160
00:10:05,720 --> 00:10:08,880
products to accomplish the goal.

161
00:10:08,880 --> 00:10:16,320
I knew Maxime had started the scenario guide, so I decided to give it a shot with the Blueprint

162
00:10:16,320 --> 00:10:22,560
template that he built and it did really quickly become a hit with customers because as we've

163
00:10:22,560 --> 00:10:27,000
touched on, it's really easy to digest.

164
00:10:27,000 --> 00:10:33,480
It includes multiple products into just one story and it gives step-by-step guidance for

165
00:10:33,480 --> 00:10:34,480
the best practices.

166
00:10:34,480 --> 00:10:41,160
So yeah, that first scenario that I had tackled was about addressing oversharing concerns.

167
00:10:41,160 --> 00:10:47,280
And oversharing being essentially taking something that could be sensitive and divulging that

168
00:10:47,280 --> 00:10:49,400
through some kind of prompt.

169
00:10:49,400 --> 00:10:50,400
Exactly.

170
00:10:50,400 --> 00:10:53,040
Yeah, you got it.

171
00:10:53,040 --> 00:10:57,240
The outcome of the Blueprint that Emily worked on is fantastic because this is making

172
00:10:57,240 --> 00:11:01,560
you comfortable in leveraging M365 Copilot, ensuring that you prepare your environment

173
00:11:01,560 --> 00:11:04,520
for it in a very quick and efficient manner.

174
00:11:04,520 --> 00:11:11,280
What's also very interesting is this is not a SharePoint or an M365 or a Purview separate

175
00:11:11,280 --> 00:11:12,280
initiative.

176
00:11:12,280 --> 00:11:14,520
It's one that's joint across all the teams together.

177
00:11:14,520 --> 00:11:20,560
So truly giving you one Microsoft list of things you can do to address in this context

178
00:11:20,560 --> 00:11:22,920
the oversharing for Copilot.

179
00:11:22,920 --> 00:11:25,040
And as we evolve the...

180
00:11:25,040 --> 00:11:29,520
So the other scenario that we published back in September is called secure by

181
00:11:29,520 --> 00:11:33,920
default with Microsoft Purview data security.

182
00:11:33,920 --> 00:11:38,040
And the purpose of that one was twofold.

183
00:11:38,040 --> 00:11:40,220
First of all, it was to reframe the narrative.

184
00:11:40,220 --> 00:11:45,880
The problem statement was that information labeling takes months, sometimes years, to

185
00:11:45,880 --> 00:11:46,880
deploy.

186
00:11:46,880 --> 00:11:50,680
And as we were preparing for Copilot and thinking through features that

187
00:11:50,680 --> 00:11:57,120
we're going to have to help you, again, address the oversharing piece, we also need to ramp

188
00:11:57,120 --> 00:12:02,340
up how quickly we label at scale in organizations.

189
00:12:02,340 --> 00:12:08,880
And what secure by default does is flip that narrative: well, labeling is a sticker.

190
00:12:08,880 --> 00:12:13,980
The label is a sticker about protection itself, about what protection mechanisms are attached.

191
00:12:13,980 --> 00:12:18,680
Something like confidential all employees that is very intuitive for users to know.

192
00:12:18,680 --> 00:12:23,220
And what the blueprint tackled was, first of all, that reframe of positioning labeling

193
00:12:23,220 --> 00:12:24,300
the correct way.

194
00:12:24,300 --> 00:12:27,660
It's not necessarily just a pure classification method.

195
00:12:27,660 --> 00:12:29,360
It's about protection.

196
00:12:29,360 --> 00:12:36,960
And how can we provide you methods to label most documents that you have in SharePoint,

197
00:12:36,960 --> 00:12:40,240
OneDrive, your emails, and on endpoints as well.

198
00:12:40,240 --> 00:12:46,120
So how do we do that at scale, with a different approach to accelerate, moving away from

199
00:12:46,120 --> 00:12:52,100
the old crawl, walk, run approach that would have taken you multiple months or sometimes

200
00:12:52,100 --> 00:12:57,040
a few years, into how can I do this in less than a year?

201
00:12:57,040 --> 00:13:00,040
How can I possibly do that even faster?

202
00:13:00,040 --> 00:13:05,520
And similar to Emily's experience, we've had meetings where we had been blocked for multiple

203
00:13:05,520 --> 00:13:10,520
months, where deployment was just not progressing.

204
00:13:10,520 --> 00:13:17,160
And you showcase this again with a blueprint as the takeaway, with CISOs in a room, and simply

205
00:13:17,160 --> 00:13:21,560
in a single meeting being able to unlock and move forward with a strong plan, with the

206
00:13:21,560 --> 00:13:24,720
list of activities that they know how to get to.

207
00:13:24,720 --> 00:13:28,480
They will adapt the timelines, of course, based on the organization size and their capacity

208
00:13:28,480 --> 00:13:30,780
at what pace they can move.

209
00:13:30,780 --> 00:13:38,640
But outside of that, it was an extremely powerful tool to help unblock and get to deployment.

210
00:13:38,640 --> 00:13:43,280
And just maybe as an extension, so those are the two that are currently published as of

211
00:13:43,280 --> 00:13:44,280
now.

212
00:13:44,280 --> 00:13:47,240
We have a few other ones that are in the works.

213
00:13:47,240 --> 00:13:52,840
So deploying DLP across all the workloads that we support in Purview is

214
00:13:52,840 --> 00:13:56,400
the bigger one that's going to be out there in the coming months.

215
00:13:56,400 --> 00:13:59,840
So targeting Q1 in 2025.

216
00:13:59,840 --> 00:14:05,400
We also have more tactical scenarios, something like reducing false positives with advanced

217
00:14:05,400 --> 00:14:06,800
classification.

218
00:14:06,800 --> 00:14:10,400
We're going to have one on deploying insider risk management policies.

219
00:14:10,400 --> 00:14:16,120
Even for insider risk or IRM policies, we're going to go into more

220
00:14:16,120 --> 00:14:18,840
tactical ones, something like just for gen AI, for example.

221
00:14:18,840 --> 00:14:22,600
So key scenarios that we hear from customers that we should give them the quick steps on

222
00:14:22,600 --> 00:14:23,760
how to address that.

223
00:14:23,760 --> 00:14:31,060
So it's going to be a mix of overarching product or solution deployment down to 'I want to protect

224
00:14:31,060 --> 00:14:34,640
source code' type scenarios, something much more tactical.

225
00:14:34,640 --> 00:14:38,200
So you're absolutely speaking my language with sort of the customer outcome thing, because

226
00:14:38,200 --> 00:14:41,720
that's like one of the things I'm always talking about is, you know, ultimately, technology

227
00:14:41,720 --> 00:14:43,220
solves people's problems.

228
00:14:43,220 --> 00:14:45,000
It's not technology for technology sake.

229
00:14:45,000 --> 00:14:46,000
So I love that.

230
00:14:46,000 --> 00:14:52,000
I was wondering if you could actually dig in a little bit deeper on kind of the secure

231
00:14:52,000 --> 00:14:57,440
by default thing and kind of how you think about that and how y'all are approaching that.

232
00:14:57,440 --> 00:15:01,640
What was interesting to see is that, for everybody, it's a bit of a buzzword in one sense.

233
00:15:01,640 --> 00:15:06,460
But at the same time, there's a methodology on how we can actively do it.

234
00:15:06,460 --> 00:15:10,740
And now this can be broken down into products up to a point.

235
00:15:10,740 --> 00:15:16,080
But in the context of the first blueprint, what we've done is reshape the thinking of labeling

236
00:15:16,080 --> 00:15:23,600
to what if the default label that's out there that's used for most information is protected

237
00:15:23,600 --> 00:15:24,960
by some means.

238
00:15:24,960 --> 00:15:29,280
And what I mean by this is I give options: at the very minimum, you should have

239
00:15:29,280 --> 00:15:31,000
DLP on it.

240
00:15:31,000 --> 00:15:37,020
And you can have DLP on content that is labeled versus DLP on content that is not labeled.

241
00:15:37,020 --> 00:15:41,920
And what's your strategy with information that is either labeled or unlabeled, to start

242
00:15:41,920 --> 00:15:42,920
training users?

243
00:15:42,920 --> 00:15:47,160
So that's the flip or second super important point here.

244
00:15:47,160 --> 00:15:51,680
Training users to manage exceptions and sharing is an exception.

245
00:15:51,680 --> 00:15:54,160
External collaboration is an exception.

246
00:15:54,160 --> 00:15:59,840
Training them at that point, for that action, makes it a lot easier for them to grasp what

247
00:15:59,840 --> 00:16:03,900
is needed than trying to train users on when to secure.

248
00:16:03,900 --> 00:16:08,560
So that whole notion of secure by default is sort of putting the effort on where and

249
00:16:08,560 --> 00:16:15,240
when do we train users and in this context of labeling, it is about what if every time

250
00:16:15,240 --> 00:16:19,720
I create a document in SharePoint or OneDrive, that document is automatically labeled to

251
00:16:19,720 --> 00:16:22,640
something like confidential all employees.

252
00:16:22,640 --> 00:16:26,720
And I have DLP policies that says I'm not able to share this externally.

253
00:16:26,720 --> 00:16:29,680
It is for confidential all employees.

254
00:16:29,680 --> 00:16:34,600
And again, you can also have DLP on unlabeled content just to enforce or reinforce

255
00:16:34,600 --> 00:16:38,760
to the users the need to label properly before sending it out.

256
00:16:38,760 --> 00:16:43,920
And then you can keep the rest of the traditional DLP strategy on top of it.

257
00:16:43,920 --> 00:16:49,840
But the key focus here is what if that information can be protected.

258
00:16:49,840 --> 00:16:55,120
And I say DLP first, then the second part of the conversation would be or second option

259
00:16:55,120 --> 00:16:57,200
could be do we encrypt it?

260
00:16:57,200 --> 00:17:01,440
And the target should be to be able to encrypt this in the long term, understanding where

261
00:17:01,440 --> 00:17:04,100
it has impact when you do encrypt or not.

262
00:17:04,100 --> 00:17:09,040
So one of the examples that we do have with customers addressing this is they're going

263
00:17:09,040 --> 00:17:14,040
to start the journey of securing by default with labeling as soon as it goes in SharePoint

264
00:17:14,040 --> 00:17:19,280
and it's encrypted, so that users make a decision, when they need to share, whether that encryption

265
00:17:19,280 --> 00:17:20,280
needs to be removed.
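
A minimal Python sketch of the secure-by-default flow Max describes above: every new document gets a default label, DLP blocks that label from leaving the organization, and unlabeled content is pushed back to the user to label first. This is illustrative logic only, not the Purview API or its policy syntax; the label name and the rules are assumptions for the example.

```python
from dataclasses import dataclass

DEFAULT_LABEL = "Confidential \\ All Employees"  # default label auto-applied on creation

@dataclass
class Document:
    name: str
    label: str | None = None

def create_document(name: str) -> Document:
    # Secure by default: every new SharePoint/OneDrive document starts out labeled.
    return Document(name=name, label=DEFAULT_LABEL)

def evaluate_share(doc: Document, external: bool) -> str:
    """Return the DLP outcome for a share attempt."""
    if doc.label is None:
        # DLP on unlabeled content: push the user to label before sending anything out.
        return "blocked: label the document before sharing"
    if doc.label == DEFAULT_LABEL and external:
        # DLP on labeled content: the default label cannot leave the organization.
        # Sharing externally is an exception the user must consciously take, and it is audited.
        return "blocked: change the label to share externally (audited action)"
    return "allowed"

if __name__ == "__main__":
    doc = create_document("roadmap.docx")
    print(evaluate_share(doc, external=True))   # blocked by default
    print(evaluate_share(doc, external=False))  # internal collaboration still works
```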

266
00:17:20,280 --> 00:17:25,240
So I actually think that's a very important point that you make because I've been spending

267
00:17:25,240 --> 00:17:28,340
a lot of time on the Secure Future Initiative these days.

268
00:17:28,340 --> 00:17:35,000
And one of the pillars of SFI is secure by default, but it's very much an Azure specific

269
00:17:35,000 --> 00:17:39,120
thing, rolling out products that are secure by default.

270
00:17:39,120 --> 00:17:44,120
So for example, endpoints require TLS, for example, right?

271
00:17:44,120 --> 00:17:45,120
You can't turn it off.

272
00:17:45,120 --> 00:17:48,280
I mean, that's an example of secure by default.

273
00:17:48,280 --> 00:17:51,320
Or perhaps using TLS 1.2 and above only.

274
00:17:51,320 --> 00:17:53,480
That's another example of secure by default.

275
00:17:53,480 --> 00:17:59,280
But your scenario here is essentially around data loss prevention and how to put policies

276
00:17:59,280 --> 00:18:05,680
around information and data so that it's not lost and bring default configuration settings

277
00:18:05,680 --> 00:18:08,640
in place that enforce that by default.

278
00:18:08,640 --> 00:18:10,120
Is that what you're aiming at?

279
00:18:10,120 --> 00:18:12,000
Did I get that right?

280
00:18:12,000 --> 00:18:13,000
You are.

281
00:18:13,000 --> 00:18:18,320
And that's where the magic of combining information protection, data loss prevention, and insider

282
00:18:18,320 --> 00:18:20,840
risk management comes well together.

283
00:18:20,840 --> 00:18:24,340
When you look at this as a single set of solutions, rather than three silos in three

284
00:18:24,340 --> 00:18:28,840
different teams, as it most often is in an organization, that's how we can address it best.

285
00:18:28,840 --> 00:18:34,680
And I think that's why I'm layering also, if you can have a default that's encrypted

286
00:18:34,680 --> 00:18:35,920
everywhere, that's great.

287
00:18:35,920 --> 00:18:41,200
But I want to give you an ability to also work with sort of the slider between security

288
00:18:41,200 --> 00:18:44,960
and collaboration, which is always sort of the tug here that we get.

289
00:18:44,960 --> 00:18:49,320
How do I make sure I'm not breaking your regular or existing business process tomorrow?

290
00:18:49,320 --> 00:18:54,040
So I want to make sure that I give you some possibilities to have adoption and options

291
00:18:54,040 --> 00:18:55,040
to configure that.

292
00:18:55,040 --> 00:19:01,600
So at the base layer, a lot of organizations have stronger requirements to secure more

293
00:19:01,600 --> 00:19:05,220
in SharePoint and OneDrive, for example, than they have on Endpoint.

294
00:19:05,220 --> 00:19:07,000
We can meet that.

295
00:19:07,000 --> 00:19:11,740
And we can meet it with DLP, we can augment this with information protection, where if

296
00:19:11,740 --> 00:19:14,040
you do provide a means to...

297
00:19:14,040 --> 00:19:18,600
Well, so it becomes encrypted, but it provides a means that if it goes out,

298
00:19:18,600 --> 00:19:21,880
your information is protected wherever it goes.

299
00:19:21,880 --> 00:19:27,480
And then the third pillar of that with insider risk management is great because now I also

300
00:19:27,480 --> 00:19:31,280
force an action on the user: when they need to share, they need to change the label.

301
00:19:31,280 --> 00:19:35,400
They can do that at the file, they can do it at the site to manage scale of multiple

302
00:19:35,400 --> 00:19:39,440
files, but it's also an audited action.

303
00:19:39,440 --> 00:19:43,560
And then that's where insider risk management can become very interesting because now you're

304
00:19:43,560 --> 00:19:45,760
looking into the behavior of the user.

305
00:19:45,760 --> 00:19:51,360
If they're doing this at scale, they're now risky users and you can have further DLP policies

306
00:19:51,360 --> 00:19:52,360
to block them more.

307
00:19:52,360 --> 00:19:58,680
You know, it's funny, being a cryptographic nerd, I'm always a big fan of encryption or

308
00:19:58,680 --> 00:20:05,400
cryptographic controls on data because if it is leaked, then it's encrypted.

309
00:20:05,400 --> 00:20:08,360
You often hear people say security is only as strong as the weakest link.

310
00:20:08,360 --> 00:20:10,700
It's actually not true at all.

311
00:20:10,700 --> 00:20:16,680
If you have, for example, as you noted before, encrypted data and that data is compromised,

312
00:20:16,680 --> 00:20:21,200
as long as it's encrypted, the attacker just gets ciphertext.

313
00:20:21,200 --> 00:20:23,840
And that's why we often refer to these things as compensating controls, right?

314
00:20:23,840 --> 00:20:27,160
It's there to compensate for weaknesses elsewhere in the system.

315
00:20:27,160 --> 00:20:33,120
So I'm a big fan of things like encryption or just cryptographic controls in general

316
00:20:33,120 --> 00:20:37,480
around data and having that be the default because that way if it is leaked, then I'm

317
00:20:37,480 --> 00:20:39,400
not going to say it doesn't matter.

318
00:20:39,400 --> 00:20:44,200
It does matter, but it doesn't matter as much as it being plain text that has all the product

319
00:20:44,200 --> 00:20:47,160
plans for your next two years.

320
00:20:47,160 --> 00:20:50,000
So yeah, that's good to see.

321
00:20:50,000 --> 00:20:51,000
That's right.

322
00:20:51,000 --> 00:20:54,280
That's perfectly the reason why we're doing it like this.

323
00:20:54,280 --> 00:21:00,440
And the reason why I'm also giving you the options is I recognize that some organizations

324
00:21:00,440 --> 00:21:02,840
have a lot of line-of-business applications.

325
00:21:02,840 --> 00:21:07,840
It could be a system of record that does not yet support encrypted documents or does not

326
00:21:07,840 --> 00:21:12,800
support MIP or information protection encrypted documents.

327
00:21:12,800 --> 00:21:16,840
So giving you a little bit of that option, I would say that's why the min bar for this

328
00:21:16,840 --> 00:21:22,240
is that the default documents that you're creating should be prevented by DLP from going out without

329
00:21:22,240 --> 00:21:23,240
content inspection.

330
00:21:23,240 --> 00:21:29,080
And if you can get that in from a user training perspective, then we're forcing a user

331
00:21:29,080 --> 00:21:39,480
to make a choice when they need to exfil, whether through legitimate or non-legitimate ways of

332
00:21:39,480 --> 00:21:40,480
exfiltrating the content.

333
00:21:40,480 --> 00:21:44,040
So we're forcing the user to make an action first.

334
00:21:44,040 --> 00:21:49,400
So on the information oversharing, so what were the drivers behind that?

335
00:21:49,400 --> 00:21:51,120
Why do customers want that?

336
00:21:51,120 --> 00:21:52,120
Yeah.

337
00:21:52,120 --> 00:21:57,920
So with oversharing, as I mentioned a little bit earlier, we were having these customer

338
00:21:57,920 --> 00:22:07,520
conversations where our customers were trying to deploy M365 Copilot and that deployment

339
00:22:07,520 --> 00:22:15,200
was being stopped because there were these oversharing concerns that they needed to address.

340
00:22:15,200 --> 00:22:24,020
And similar to what Max was touching on in terms of the secure by default blueprint,

341
00:22:24,020 --> 00:22:31,720
we took a similar approach with the oversharing blueprint because a lot of those common causes

342
00:22:31,720 --> 00:22:39,860
we were seeing of oversharing in Copilot deployments were all about default sharing across

343
00:22:39,860 --> 00:22:47,400
the entire organization, through using links that share to the entire organization, or sites

344
00:22:47,400 --> 00:22:54,400
that were permissioned for all users within

345
00:22:54,400 --> 00:22:56,400
an organization.

346
00:22:56,400 --> 00:23:04,320
We took that same secure by default approach by setting the default instead to private

347
00:23:04,320 --> 00:23:05,840
for sites.

348
00:23:05,840 --> 00:23:11,140
We chose to limit entire org sharing links.

349
00:23:11,140 --> 00:23:17,800
And then as with the secure by default blueprint, we're utilizing sensitivity labels to either

350
00:23:17,800 --> 00:23:21,960
allow or disallow Copilot access to files.

351
00:23:21,960 --> 00:23:27,940
So it's still that same approach of securing by default to address the oversharing concerns

352
00:23:27,940 --> 00:23:29,200
in a Copilot deployment.
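
A small illustrative sketch of the oversharing checks Emily just described: sites default to private, org-wide sharing links are limited, broad "Everyone" permissions are flagged, and sensitivity labels gate what Copilot can use. This is conceptual Python only, not SharePoint or Purview configuration; the site fields and label names are assumptions for the example.

```python
from dataclasses import dataclass, field

# Labels whose content should not be surfaced through Copilot in this example.
COPILOT_EXCLUDED_LABELS = {"Highly Confidential"}

@dataclass
class Site:
    name: str
    default_access: str = "private"       # secure default: private, not everyone
    org_wide_sharing_links: bool = False  # limit "share with the whole org" links
    members: set[str] = field(default_factory=set)

def audit_site(site: Site) -> list[str]:
    """Flag the common oversharing causes the blueprint calls out."""
    findings = []
    if site.default_access != "private":
        findings.append("site is not private by default")
    if site.org_wide_sharing_links:
        findings.append("org-wide sharing links are enabled")
    if "Everyone" in site.members:
        findings.append("site is permissioned to all users in the organization")
    return findings

def copilot_can_use(label: str | None) -> bool:
    # Sensitivity labels decide whether a file should ground Copilot answers.
    return label is not None and label not in COPILOT_EXCLUDED_LABELS

if __name__ == "__main__":
    site = Site(name="Finance", org_wide_sharing_links=True, members={"Everyone"})
    print(audit_site(site))                        # two findings to remediate
    print(copilot_can_use("Highly Confidential"))  # False: excluded from Copilot
```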

353
00:23:29,200 --> 00:23:31,920
I have a cynical view of this.

354
00:23:31,920 --> 00:23:36,440
No, no, no, no, not negative, just cynical.

355
00:23:36,440 --> 00:23:37,440
Yeah.

356
00:23:37,440 --> 00:23:40,880
So this is from personal experience.

357
00:23:40,880 --> 00:23:43,040
I'm a huge fan of secure by default.

358
00:23:43,040 --> 00:23:45,120
And there's a reason for it.

359
00:23:45,120 --> 00:23:49,560
If you've overshared some data because you don't have a secure by default environment,

360
00:23:49,560 --> 00:23:50,560
the cat's out of the bag.

361
00:23:50,560 --> 00:23:53,560
I mean, it's gone, right?

362
00:23:53,560 --> 00:24:00,480
It's much better to have a default secure configuration that is secure out of the box.

363
00:24:00,480 --> 00:24:06,480
And then if a customer needs to loosen the screws, they can do so understanding the risks

364
00:24:06,480 --> 00:24:10,280
versus shipping something where we rely on the customer.

365
00:24:10,280 --> 00:24:14,120
Hey, security of this is completely up to you.

366
00:24:14,120 --> 00:24:19,960
I would much rather we go from our perspective, knowing the product very well and how it works

367
00:24:19,960 --> 00:24:24,520
to have a secure configuration, which then, if it doesn't meet the customer's needs,

368
00:24:24,520 --> 00:24:26,240
they can loosen the screws.

369
00:24:26,240 --> 00:24:29,840
But we do need to give them the ability to loosen those screws if they have to because

370
00:24:29,840 --> 00:24:31,860
you can't block the business.

371
00:24:31,860 --> 00:24:37,000
But at least it's better to have something secure out of the box that is not going to

372
00:24:37,000 --> 00:24:41,080
just start leaking stuff all over the internet in the case of some kind of exfiltration.

373
00:24:41,080 --> 00:24:45,600
So my hat's off to you for doing this because I think it's incredibly important.

374
00:24:45,600 --> 00:24:47,320
Yeah, absolutely.

375
00:24:47,320 --> 00:24:53,720
You're exactly correct that we want to make sure we're setting default security, but allowing

376
00:24:53,720 --> 00:24:56,960
the option for efficiency as well.

377
00:24:56,960 --> 00:25:04,360
So it's all about choosing the default being private or choosing specific users to have

378
00:25:04,360 --> 00:25:11,840
permission, but then your end users will know when they can increase the permissions there

379
00:25:11,840 --> 00:25:15,920
and have the options to send or share with more people.

380
00:25:15,920 --> 00:25:21,080
I'll say one last thing and then I promise I'll hand it over to Sarah.

381
00:25:21,080 --> 00:25:25,600
I would much rather have a support call from a customer who said, hey, how do I enable

382
00:25:25,600 --> 00:25:28,840
this business scenario because I'm not having a lot of luck with it because of the default

383
00:25:28,840 --> 00:25:34,320
configuration versus a support call that's, hey, I got whacked and all my data is all

384
00:25:34,320 --> 00:25:35,520
over the internet.

385
00:25:35,520 --> 00:25:36,520
Now what?

386
00:25:36,520 --> 00:25:38,840
That first scenario you can control.

387
00:25:38,840 --> 00:25:43,120
The second scenario, best of luck controlling that.

388
00:25:43,120 --> 00:25:44,660
The damage is done.

389
00:25:44,660 --> 00:25:45,660
That's just my opinion.

390
00:25:45,660 --> 00:25:48,080
I'll hand it over to Sarah for her question.

391
00:25:48,080 --> 00:25:50,920
Actually, I'd like to react to that.

392
00:25:50,920 --> 00:25:54,000
Go for it, Max.

393
00:25:54,000 --> 00:25:59,160
It's been key, so that statement, it resonates so well with why we've done secure by default,

394
00:25:59,160 --> 00:26:05,320
but also the second statement that I made about training users to manage exceptions.

395
00:26:05,320 --> 00:26:10,480
If you take the example that you had that you get a support call because a user tried

396
00:26:10,480 --> 00:26:14,920
to exfil something, they try to share something, it's not working, and you explain to them

397
00:26:14,920 --> 00:26:19,960
how they can do this securely, they're going to remember this action.

398
00:26:19,960 --> 00:26:21,160
They're not going to call support again.

399
00:26:21,160 --> 00:26:25,160
So not only are you more secure in itself, but you're also training them on when they should

400
00:26:25,160 --> 00:26:26,720
do that operation.

401
00:26:26,720 --> 00:26:31,480
It's a reminder too for them that these actions are an exception.

402
00:26:31,480 --> 00:26:36,720
They're tracked, they're audited, and they're secured as well.

403
00:26:36,720 --> 00:26:42,600
The key here is trying to train users, and we've seen this internally as well at Microsoft.

404
00:26:42,600 --> 00:26:49,240
We've been teaching on when to label for years, yet we could have done a better job on labeling.

405
00:26:49,240 --> 00:26:51,760
Auto labeling itself can only do so much.

406
00:26:51,760 --> 00:26:56,140
In typical percentages, we see less than 3% of documents being auto-labeled.

407
00:26:56,140 --> 00:26:59,400
This is for your highest sensitivity stuff.

408
00:26:59,400 --> 00:27:03,440
Something like confidential all employees, and that's why we reframed that, is how can

409
00:27:03,440 --> 00:27:06,520
we get that by default and move on?

410
00:27:06,520 --> 00:27:08,200
We're doing the same thing for DLP.

411
00:27:08,200 --> 00:27:15,080
So if I can share some of the thinking there on the blueprint for DLP: traditionally

412
00:27:15,080 --> 00:27:18,960
it's open by default for most organizations, and then they look for specific things they

413
00:27:18,960 --> 00:27:20,040
want to block.

414
00:27:20,040 --> 00:27:25,360
So you're essentially relying on your security team, a few admins, to think of all the scenarios

415
00:27:25,360 --> 00:27:27,280
where they should block something.

416
00:27:27,280 --> 00:27:33,720
Well, what if the strategy expands that a little bit: closing down some vectors completely,

417
00:27:33,720 --> 00:27:37,600
thinking of a strategy on conditions that are contextual.

418
00:27:37,600 --> 00:27:39,160
If you think of a...

419
00:27:39,160 --> 00:27:41,120
So the labeled versus not labeled, that was one example.

420
00:27:41,120 --> 00:27:46,720
It could be, what's the strategy for file types, file size, and just funneling down

421
00:27:46,720 --> 00:27:54,120
the risk into what's left, to do content inspection and then allow sharing.

422
00:27:54,120 --> 00:27:59,560
So it's a fundamental shift into how we think and shape policies.

423
00:27:59,560 --> 00:28:05,560
And just as a closing remark on this, what's great is the mix of simplification of your

424
00:28:05,560 --> 00:28:06,560
policies.

425
00:28:06,560 --> 00:28:12,100
It reduces the burden on incident management, but ironically it makes AI better on your policies

426
00:28:12,100 --> 00:28:17,380
and what we see for Copilot for Security, for example, to do an even better job on helping

427
00:28:17,380 --> 00:28:19,860
you identify where there is a gap.
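
A conceptual Python sketch of the layered DLP funnel Max outlines above: close some egress vectors entirely, apply contextual conditions like label, file type, and size, and only run content inspection on what remains. It is not Purview's rule engine; the vectors, limits, and inspection stub are assumptions for the example.

```python
BLOCKED_VECTORS = {"personal_cloud_storage", "usb_removable_media"}  # closed outright
MAX_FILE_MB = 100
RESTRICTED_EXTENSIONS = {".pst", ".bak"}

def inspect_content(path: str) -> bool:
    """Stub for classifier-based content inspection; True means sensitive content found."""
    return False

def evaluate_egress(path: str, vector: str, size_mb: int, label: str | None) -> str:
    # 1. Some vectors are simply closed, regardless of content.
    if vector in BLOCKED_VECTORS:
        return "blocked: vector not allowed"
    # 2. Contextual conditions: label, file type, file size.
    if label is not None and label.startswith("Confidential"):
        return "blocked: labeled content cannot leave via this vector"
    if any(path.endswith(ext) for ext in RESTRICTED_EXTENSIONS) or size_mb > MAX_FILE_MB:
        return "blocked: file type or size outside policy"
    # 3. Only what is left is funneled into content inspection before sharing is allowed.
    return "blocked: sensitive content found" if inspect_content(path) else "allowed"

if __name__ == "__main__":
    print(evaluate_egress("report.docx", "email_external", 2, "Confidential \\ All Employees"))
    print(evaluate_egress("notes.txt", "email_external", 1, None))
```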

428
00:28:19,860 --> 00:28:23,720
How can people actually use these blueprints?

429
00:28:23,720 --> 00:28:29,320
Because well, we've already touched on why you made them, but what would be your advice

430
00:28:29,320 --> 00:28:33,340
to someone who thinks, hey, I should probably look at these.

431
00:28:33,340 --> 00:28:34,660
What do I do?

432
00:28:34,660 --> 00:28:36,400
Where did they start?

433
00:28:36,400 --> 00:28:44,420
For the oversharing blueprint, I would personally suggest starting by using the blueprint diagram

434
00:28:44,420 --> 00:28:45,640
or guidance.

435
00:28:45,640 --> 00:28:52,560
So the diagram itself really gives you an idea of what we're trying to solve for in

436
00:28:52,560 --> 00:28:58,320
each of the different blueprint scenarios and how we're going to solve for it.

437
00:28:58,320 --> 00:29:04,440
After you've seen that blueprint diagram and the overview, I would move to the detailed

438
00:29:04,440 --> 00:29:11,180
slides that we have to learn what you're going to do in each of the stages of the guide.

439
00:29:11,180 --> 00:29:17,420
For the oversharing blueprint itself, it's going to be broken into a guide for E3 customers

440
00:29:17,420 --> 00:29:20,820
and then a guide for E5 customers.

441
00:29:20,820 --> 00:29:25,760
So you can choose the one that's relevant to your organization.

442
00:29:25,760 --> 00:29:33,480
For some customers, they may have already gone past the point of the pilot or the deploy

443
00:29:33,480 --> 00:29:34,480
phase.

444
00:29:34,480 --> 00:29:40,360
So I would try to get an understanding while you're looking at the diagram and the slides

445
00:29:40,360 --> 00:29:45,160
of where your organization is in a copilot deployment to know where you need to start

446
00:29:45,160 --> 00:29:47,920
within the blueprint.

447
00:29:47,920 --> 00:29:53,520
Once you have that general understanding, you can then move to our detailed guidance,

448
00:29:53,520 --> 00:29:55,480
the Learn documentation.

449
00:29:55,480 --> 00:30:00,480
That's going to actually walk you through how to take each of the steps that are in

450
00:30:00,480 --> 00:30:01,480
the guide.

451
00:30:01,480 --> 00:30:06,920
Maxime, anything you'd add on to that or thoughts from your side with the secure by default

452
00:30:06,920 --> 00:30:07,920
blueprint?

453
00:30:07,920 --> 00:30:08,920
It's very similar.

454
00:30:08,920 --> 00:30:14,880
I would say starting with consuming the content is the first go-to.

455
00:30:14,880 --> 00:30:20,280
What we're also doing is we're ensuring that all the Microsoft teams, the scale teams,

456
00:30:20,280 --> 00:30:26,000
so if I think of FastTrack or your cloud solution architect that may be assigned to your account,

457
00:30:26,000 --> 00:30:29,520
are all trained with the same blueprints in mind.

458
00:30:29,520 --> 00:30:33,640
What are the foundations that they should teach you about, for example, with the blueprints?

459
00:30:33,640 --> 00:30:39,800
They will be able to nuance some of the writing into what does it mean for you and your organization.

460
00:30:39,800 --> 00:30:46,360
We're also doing this with a number of partners that have taken the blueprints as their foundation

461
00:30:46,360 --> 00:30:48,440
on how they help customers deploy.

462
00:30:48,440 --> 00:30:52,040
I think there's a few options there available to you, but between consuming the content

463
00:30:52,040 --> 00:30:56,800
and leveraging scale teams and partners is a great way to start.

464
00:30:56,800 --> 00:30:57,800
Fantastic stuff.

465
00:30:57,800 --> 00:31:02,760
Love the approach, love the way it helps people connect to the technology capabilities and

466
00:31:02,760 --> 00:31:04,640
actually get value out of it.

467
00:31:04,640 --> 00:31:05,640
Big fan.

468
00:31:05,640 --> 00:31:10,600
Now, let's switch gears just a little bit as we get ready to wrap up.

469
00:31:10,600 --> 00:31:11,800
Let's talk about y'all.

470
00:31:11,800 --> 00:31:17,840
What's a day in the life of a Max or an Emily like?

471
00:31:17,840 --> 00:31:20,080
Let me ask that of each of you specifically.

472
00:31:20,080 --> 00:31:21,960
What's a day in the life look like, Max?

473
00:31:21,960 --> 00:31:26,520
And then Emily, just to understand what y'all do.

474
00:31:26,520 --> 00:31:31,440
I think what's interesting with the question is I think that almost every day is different.

475
00:31:31,440 --> 00:31:36,160
But what is the commonality across each of them is the impact that we can have

476
00:31:36,160 --> 00:31:38,840
when we help customers.

477
00:31:38,840 --> 00:31:45,920
But if I think of the core activities that we have, it is how do we help scale everyone at

478
00:31:45,920 --> 00:31:51,880
Microsoft, everyone in the field, partners and customers to deploy.

479
00:31:51,880 --> 00:31:57,880
But it's also to listen in on the common challenges or define the commonality in the challenges

480
00:31:57,880 --> 00:32:00,680
that customers may find in deploying.

481
00:32:00,680 --> 00:32:05,360
Identify with our product teams what would be the best way that we can address this.

482
00:32:05,360 --> 00:32:07,360
It could be sometimes documentation.

483
00:32:07,360 --> 00:32:08,580
It can be features.

484
00:32:08,580 --> 00:32:11,960
It can be some quick updates that we throw out there.

485
00:32:11,960 --> 00:32:17,240
And just really being able to help you move and get you more secure tomorrow.

486
00:32:17,240 --> 00:32:19,200
I'm not aiming for perfection tomorrow.

487
00:32:19,200 --> 00:32:20,800
Just making you more secure.

488
00:32:20,800 --> 00:32:22,280
I'll be happy.

489
00:32:22,280 --> 00:32:23,480
But that's the day in the life.

490
00:32:23,480 --> 00:32:27,200
It's going to be a mix of deployment and insights from customers.

491
00:32:27,200 --> 00:32:28,200
Yeah.

492
00:32:28,200 --> 00:32:30,080
So I am on the same team as Max.

493
00:32:30,080 --> 00:32:31,960
So it's going to be pretty similar.

494
00:32:31,960 --> 00:32:34,440
But I'll just add on to what he said.

495
00:32:34,440 --> 00:32:40,760
So to get a little bit more specific, we're part of the product group, but we're part

496
00:32:40,760 --> 00:32:46,480
of the CXE CAT side, which is the customer-facing side of the product group.

497
00:32:46,480 --> 00:32:51,840
As he mentioned, we're really there to help customers quickly deploy the product that

498
00:32:51,840 --> 00:32:53,900
we're specifically working on.

499
00:32:53,900 --> 00:32:59,160
For myself, I'm working on our Purview for AI capabilities.

500
00:32:59,160 --> 00:33:05,400
And it's really all about how do we help to implement these capabilities, implement the

501
00:33:05,400 --> 00:33:10,920
customer feedback that we're hearing, and enhance our product roadmap based off of that

502
00:33:10,920 --> 00:33:11,920
feedback.

503
00:33:11,920 --> 00:33:15,660
For day to day, he kind of talked about this too.

504
00:33:15,660 --> 00:33:16,960
We have customer calls.

505
00:33:16,960 --> 00:33:25,320
That's all about gathering feedback on our products, internal calls to plan out our roadmap,

506
00:33:25,320 --> 00:33:32,560
discuss some upcoming features, talk about maybe potential bugs or timelines for features.

507
00:33:32,560 --> 00:33:34,800
And then it's all about scaling.

508
00:33:34,800 --> 00:33:41,720
So outside of the customer and internal calls, we really just work to scale our knowledge

509
00:33:41,720 --> 00:33:48,960
from the product team to other internal or external customers.

510
00:33:48,960 --> 00:33:54,180
And that could be through documentation writing, blog posts, webinars, or just answering some

511
00:33:54,180 --> 00:33:58,900
questions in the tech communities and meeting new customers.

512
00:33:58,900 --> 00:34:05,160
So Emily and Max, we always ask our guests before we wrap up, what's your final thought

513
00:34:05,160 --> 00:34:06,780
that you want to leave our listeners with?

514
00:34:06,780 --> 00:34:07,780
It can be security.

515
00:34:07,780 --> 00:34:10,100
It can be something else.

516
00:34:10,100 --> 00:34:14,500
But if there's just one thing at the end of this episode.

517
00:34:14,500 --> 00:34:15,960
I have a couple of thoughts.

518
00:34:15,960 --> 00:34:22,340
So as I'm on the Purview for AI team, I want to touch on AI.

519
00:34:22,340 --> 00:34:29,120
As we're really embracing the power of AI and Copilot, I just want to emphasize that

520
00:34:29,120 --> 00:34:35,480
it's really crucial to think about your data security and your governance.

521
00:34:35,480 --> 00:34:40,220
But my final thought for this would be to not stop there.

522
00:34:40,220 --> 00:34:44,800
Data security is extremely important outside of AI too.

523
00:34:44,800 --> 00:34:49,840
And the problem still exists without Copilot or without AI.

524
00:34:49,840 --> 00:34:55,080
So I would use the Blueprint initiative that we're working on to start thinking about your

525
00:34:55,080 --> 00:35:01,640
oversharing and your data security outside of AI and be sure to secure your environment

526
00:35:01,640 --> 00:35:02,640
by default.

527
00:35:02,640 --> 00:35:06,860
And it would be hard for me not to mention secure by default.

528
00:35:06,860 --> 00:35:10,200
It is definitely a foundation that we're trying to push forward.

529
00:35:10,200 --> 00:35:14,520
But I would also recommend just taking a look, taking the opportunity, when you think

530
00:35:14,520 --> 00:35:17,840
across Purview, to reframe how you're doing things.

531
00:35:17,840 --> 00:35:23,680
So you might be moving in from an existing product, DLP for example, and just take the

532
00:35:23,680 --> 00:35:29,640
opportunity to reframe sometimes years of policies that you've had, leverage the best

533
00:35:29,640 --> 00:35:36,120
of what the platform offers you across multiple channels, take advantage of what the AI capabilities

534
00:35:36,120 --> 00:35:39,080
give you, but also break the silos.

535
00:35:39,080 --> 00:35:45,360
I've seen too many times, just taking the three products, MIP, DLP, and IRM,

536
00:35:45,360 --> 00:35:47,960
how siloed it is within our organization.

537
00:35:47,960 --> 00:35:51,200
It's truly one solution together.

538
00:35:51,200 --> 00:35:58,120
And really, break across those silos internally. And the final, final thought

539
00:35:58,120 --> 00:36:01,640
is absolutely about securing better.

540
00:36:01,640 --> 00:36:05,720
I understand that full security is a challenge to collaboration.

541
00:36:05,720 --> 00:36:08,040
Its adoption may be difficult.

542
00:36:08,040 --> 00:36:14,960
So bring it in waves, you know, maybe giving them an option.

543
00:36:14,960 --> 00:36:19,920
So when I talked about changing the label or a DLP override, for example, give them a way

544
00:36:19,920 --> 00:36:25,200
that they can still collaborate, but it's more secure than yesterday and keep iterating

545
00:36:25,200 --> 00:36:26,840
every week this way.

546
00:36:26,840 --> 00:36:30,560
All right, so let's bring this episode to an end.

547
00:36:30,560 --> 00:36:32,120
Max and Emily, thank you for joining us this week.

548
00:36:32,120 --> 00:36:36,840
I know you guys are busy and I always learn something on these podcast episodes and certainly

549
00:36:36,840 --> 00:36:37,840
learned a lot.

550
00:36:37,840 --> 00:36:43,000
Although I heard the oversharing word again, which always brings me back to my daughter

551
00:36:43,000 --> 00:36:45,000
on TikTok, but that's another story.

552
00:36:45,000 --> 00:36:46,480
So with that, let's bring this episode to an end.

553
00:36:46,480 --> 00:36:48,720
Again, thank you for joining us and to all our listeners out there.

554
00:36:48,720 --> 00:36:50,800
We hope you found this episode of use.

555
00:36:50,800 --> 00:36:52,720
Stay safe and we'll see you next time.

556
00:36:52,720 --> 00:36:55,680
Thanks for listening to the Azure Security Podcast.

557
00:36:55,680 --> 00:37:02,520
You can find show notes and other resources at our website, azsecuritypodcast.net.

558
00:37:02,520 --> 00:37:07,360
If you have any questions, please find us on Twitter at AzureSetPod.

559
00:37:07,360 --> 00:37:27,400
Background music is from ccmixter.com and licensed under the Creative Commons license.

