1
00:00:00,000 --> 00:00:09,600
Welcome to the Azure Security Podcast, where we discuss topics relating to security, privacy,

2
00:00:09,600 --> 00:00:13,280
reliability and compliance on the Microsoft Cloud Platform.

3
00:00:13,280 --> 00:00:16,900
Hey everybody, welcome to episode 70.

4
00:00:16,900 --> 00:00:19,800
This week, it's just myself, Michael and Sarah.

5
00:00:19,800 --> 00:00:24,920
We're here with two guests, Bo Foll and Lou McCurry, who are here to talk to us about

6
00:00:24,920 --> 00:00:26,880
Microsoft Purview.

7
00:00:26,880 --> 00:00:29,240
But before we get to our guests, let's take a quick lap around the news.

8
00:00:29,240 --> 00:00:30,240
Sarah, why don't you kick things off?

9
00:00:30,240 --> 00:00:34,240
A couple of really cool new features in Defender for Cloud that are worth going to have a

10
00:00:34,240 --> 00:00:40,920
look at, which is the Attack Path blade and also the Cloud Security Explorer.

11
00:00:40,920 --> 00:00:46,560
So if you're using Defender for Cloud, the Attack Path will actually use your environment

12
00:00:46,560 --> 00:00:51,760
context to look at your security issues and it will tell you which things are the biggest

13
00:00:51,760 --> 00:00:57,200
security risks, which obviously is important because we can't fix everything all the time

14
00:00:57,200 --> 00:00:59,640
immediately, so it will help you prioritize.

15
00:00:59,640 --> 00:01:01,840
So that's one thing you should go and look at.

16
00:01:01,840 --> 00:01:09,040
The other one is the Cloud Security Explorer, which will allow you to, again, go and run

17
00:01:09,040 --> 00:01:16,160
graph-based queries on the Cloud Security graph around identifying risks, which is another

18
00:01:16,160 --> 00:01:20,960
way of getting information out of Defender for Cloud about your security posture, if

19
00:01:20,960 --> 00:01:24,080
that's something that you need to do.

20
00:01:24,080 --> 00:01:30,120
And then moving on to my favorite, which is, of course, my baby, Azure Sentinel or Microsoft

21
00:01:30,120 --> 00:01:32,320
Sentinel even.

22
00:01:32,320 --> 00:01:37,160
It's not had that name for a while, but I think the big one I wanted to shout out here

23
00:01:37,160 --> 00:01:42,960
is that the Microsoft 365 Defender data connector is finally GA.

24
00:01:42,960 --> 00:01:48,480
For those of you who have been following this saga or use Sentinel, you know that the Microsoft

25
00:01:48,480 --> 00:01:55,720
365 Defender data connector has been in public preview for a very long time and some organizations

26
00:01:55,720 --> 00:01:59,020
will not use it until it is GA.

27
00:01:59,020 --> 00:02:03,520
So if you're one of those people, hooray, it is here.

28
00:02:03,520 --> 00:02:08,560
The other thing that's come out recently that I wanted to call out is scheduling for analytics

29
00:02:08,560 --> 00:02:14,560
rules, which is in preview, and what it means is that you can actually schedule

30
00:02:14,560 --> 00:02:16,800
your analytics rules execution times.

31
00:02:16,800 --> 00:02:20,240
You didn't used to be able to do this.

32
00:02:20,240 --> 00:02:26,680
When you clicked create on your analytics rule in Sentinel, the rule would start running and

33
00:02:26,680 --> 00:02:32,040
then it would have a defined frequency, but you couldn't set the exact time that you wanted

34
00:02:32,040 --> 00:02:33,040
it to run.

35
00:02:33,040 --> 00:02:37,040
You could try to do it by awkwardly timing when you hit the create button, but it didn't work well.

36
00:02:37,040 --> 00:02:41,120
But now we have a way that you can actually do that scheduling, which is awesome.

37
00:02:41,120 --> 00:02:44,320
And I'm going to leave it at that for Sentinel.

38
00:02:44,320 --> 00:02:45,840
There's a couple of other things.

39
00:02:45,840 --> 00:02:52,200
Make sure you go to aka.ms slash as new for all the new Sentinel stuff.

40
00:02:52,200 --> 00:02:57,000
We've also had some new announcements on the Purview side of things, but rather than me

41
00:02:57,000 --> 00:03:02,600
talk about that, I think that that would be something I will leave when we get to our

42
00:03:02,600 --> 00:03:03,960
guests.

43
00:03:03,960 --> 00:03:06,200
That's everything from me for the news this time, Michael.

44
00:03:06,200 --> 00:03:07,560
You only have one item.

45
00:03:07,560 --> 00:03:14,080
One of the things I'm a huge fan of, Trusted Launch for VMs is now available for US government

46
00:03:14,080 --> 00:03:15,080
regions.

47
00:03:15,080 --> 00:03:18,720
If you're not familiar with Trusted Launch, Trusted Launch uses virtual TPMs.

48
00:03:18,720 --> 00:03:22,520
It's a really foundational technology that's built into Windows, just this notion of Trusted

49
00:03:22,520 --> 00:03:23,520
Launch.

50
00:03:23,520 --> 00:03:27,840
It helps mitigate things like bootkits and rootkits, allows you to use Credential Guard.

51
00:03:27,840 --> 00:03:33,960
It also allows you to address some critical Department of Defense STIG requirements.

52
00:03:33,960 --> 00:03:38,200
STIGs are essentially documents that are used by the government for making sure that they're

53
00:03:38,200 --> 00:03:41,120
adhering to appropriate security practices.

54
00:03:41,120 --> 00:03:42,960
So this is really great to see.

55
00:03:42,960 --> 00:03:48,360
So it's general availability for Trusted Launch for Azure VMs in Azure for US government regions,

56
00:03:48,360 --> 00:03:49,760
which is great to see.

57
00:03:49,760 --> 00:03:56,200
Michael, I have to interrupt you there because I have trauma related to STIGs.

58
00:03:56,200 --> 00:04:00,400
In a previous job years and years ago, I know of the STIGs because we used to use them to

59
00:04:00,400 --> 00:04:06,400
manually assess the security of virtual machines for our customers.

60
00:04:06,400 --> 00:04:07,620
That was horrible.

61
00:04:07,620 --> 00:04:10,040
So I just wanted to add that in.

62
00:04:10,040 --> 00:04:11,840
That's why I know what STIGs are.

63
00:04:11,840 --> 00:04:15,720
They are great documents, by the way, but I don't recommend doing a manual assessment

64
00:04:15,720 --> 00:04:16,720
on them.

65
00:04:16,720 --> 00:04:21,600
I actually worked on one of the STIGs for secure software development back in the day.

66
00:04:21,600 --> 00:04:25,440
That was actually a lot of fun doing it with DISA to produce this document.

67
00:04:25,440 --> 00:04:26,920
So yeah, I'm a huge fan of STIGs.

68
00:04:26,920 --> 00:04:29,800
They're very complete, but I would agree they can be complex.

69
00:04:29,800 --> 00:04:33,120
And if you're doing it manually, they can be less than optimal.

70
00:04:33,120 --> 00:04:34,120
All right.

71
00:04:34,120 --> 00:04:36,520
So now let's turn our attention to our guests.

72
00:04:36,520 --> 00:04:42,520
As I mentioned, we have Bo and Lou here to talk to us about Microsoft Purview.

73
00:04:42,520 --> 00:04:46,600
Now before we get on to that discussion, here's an interesting fun fact.

74
00:04:46,600 --> 00:04:52,960
I actually work in the Azure Data Team, security engineering, and we technically have Purview

75
00:04:52,960 --> 00:04:54,960
within our sort of bailiwick.

76
00:04:54,960 --> 00:04:59,160
So I'm interested to see what you gentlemen have to talk about because as far as I'm concerned,

77
00:04:59,160 --> 00:05:00,160
we do Purview.

78
00:05:00,160 --> 00:05:01,800
So it'll be really interesting to see what...

79
00:05:01,800 --> 00:05:05,880
I'm not a marketing guy by any stretch, so let's see what you guys have got to say.

80
00:05:05,880 --> 00:05:09,480
So Bo and Lou, again, thank you so much for joining us on the podcast.

81
00:05:09,480 --> 00:05:11,920
We'd like you to spend a moment and introduce yourselves.

82
00:05:11,920 --> 00:05:13,680
It's good to be here, Michael.

83
00:05:13,680 --> 00:05:14,680
We all do Purview.

84
00:05:14,680 --> 00:05:15,680
That's the reality of it.

85
00:05:15,680 --> 00:05:16,680
So I'm Lou McCurry.

86
00:05:16,680 --> 00:05:21,320
I'm on the East Coast of Australia here looking after the risk and compliance solutions within

87
00:05:21,320 --> 00:05:22,320
the Purview stack.

88
00:05:22,320 --> 00:05:24,800
I'm one of the technical specialists here.

89
00:05:24,800 --> 00:05:26,760
And Bo is my colleague over there in Perth.

90
00:05:26,760 --> 00:05:27,760
Yeah.

91
00:05:27,760 --> 00:05:28,760
How's it going, everyone?

92
00:05:28,760 --> 00:05:29,760
So I'm Bo Foll.

93
00:05:29,760 --> 00:05:31,680
I'm a technology specialist alongside Lou.

94
00:05:31,680 --> 00:05:36,320
I operate out of Perth, Western Australia, and I look after the Purview stack as well.

95
00:05:36,320 --> 00:05:40,240
It's just interesting on that point you made, Michael, around the Purview name, if you like.

96
00:05:40,240 --> 00:05:44,080
It's a bit of a marketing exercise at the moment because what's actually happened is

97
00:05:44,080 --> 00:05:50,920
Microsoft Purview itself relates to a collection of solutions rather than a product name now.

98
00:05:50,920 --> 00:05:56,480
So what you would have thought of as Azure Purview in the past is actually is now referred

99
00:05:56,480 --> 00:06:02,240
to as data map, data catalog, or data estate insights, or the data governance side of the

100
00:06:02,240 --> 00:06:03,240
equation.

101
00:06:03,240 --> 00:06:07,800
Bo and I work on the other side of the Purview solutions stack in the risk and compliance

102
00:06:07,800 --> 00:06:09,480
solutions, the data risk stuff.

103
00:06:09,480 --> 00:06:14,600
So things relating to DLP, information protection, et cetera, those sorts of solutions.

104
00:06:14,600 --> 00:06:19,160
But collectively, it's all now under the Microsoft Purview family name.

105
00:06:19,160 --> 00:06:20,160
OK, that's good to know.

106
00:06:20,160 --> 00:06:21,160
I'm going to be honest with you.

107
00:06:21,160 --> 00:06:23,480
I was really confused when we first started.

108
00:06:23,480 --> 00:06:25,600
Again, I'm really happy.

109
00:06:25,600 --> 00:06:27,680
I'm a nerd and not a marketing guy.

110
00:06:27,680 --> 00:06:31,760
OK, Sarah, why don't you kick it off in terms of, let's see what these guys have got to

111
00:06:31,760 --> 00:06:32,760
say.

112
00:06:32,760 --> 00:06:40,080
Bo and Lou, for the uninitiated, just for a level set before we get into it, what is

113
00:06:40,080 --> 00:06:44,760
Purview and why should people care about it as well?

114
00:06:44,760 --> 00:06:50,400
Yeah, so I guess briefly speaking, it's had that rebrand that Lou was talking about before

115
00:06:50,400 --> 00:06:56,240
where they joined the old compliance section within M365 with the Azure Purview section

116
00:06:56,240 --> 00:06:58,200
and made one complete product.

117
00:06:58,200 --> 00:07:03,200
So I guess while we're talking about what it is and why people care about it, it is

118
00:07:03,200 --> 00:07:07,120
more aligned to, I guess, a data security conversation.

119
00:07:07,120 --> 00:07:09,680
It used to be more compliance focused.

120
00:07:09,680 --> 00:07:14,880
And I think that message has landed the wrong way, so with that group of technologies,

121
00:07:14,880 --> 00:07:19,160
we would often have the wrong target audience when we're having those conversations.

122
00:07:19,160 --> 00:07:24,680
A lot of it comes down to the fact that data security is the pure focus and you can get

123
00:07:24,680 --> 00:07:28,480
compliance out of it, but it encompasses a bunch of different technologies.

124
00:07:28,480 --> 00:07:33,440
So there's things like information protection, where we have a look at classifying and placing

125
00:07:33,440 --> 00:07:38,560
protective controls on that information, like, for example, encrypting it to specific users

126
00:07:38,560 --> 00:07:39,720
or groups.

127
00:07:39,720 --> 00:07:41,940
It has data loss prevention.

128
00:07:41,940 --> 00:07:45,940
So that's to help prevent data exfiltration, as most of us already know, whether that's accidental

129
00:07:45,940 --> 00:07:50,120
or malicious, and there's been some big changes announced recently that we'll have a chat

130
00:07:50,120 --> 00:07:51,360
about later.

131
00:07:51,360 --> 00:07:56,320
It's got insider risk, and this is kind of what we like to call our SIEM within

132
00:07:56,320 --> 00:07:57,320
M365.

133
00:07:57,320 --> 00:08:02,400
So it correlates other information within the tenant to assign a risk level to a user

134
00:08:02,400 --> 00:08:03,960
based on behavior.

135
00:08:03,960 --> 00:08:08,640
Let's say there might be a typical exfiltration sequence that gets detected where someone's

136
00:08:08,640 --> 00:08:12,400
put in their resignation, they've downloaded a bunch of information from SharePoint, and

137
00:08:12,400 --> 00:08:15,160
then they've copied it across to a USB.

138
00:08:15,160 --> 00:08:19,100
That will alert, we'll get telemetry on that as well, and that can also be fed into things

139
00:08:19,100 --> 00:08:21,360
like Sentinel, for example.

140
00:08:21,360 --> 00:08:23,440
There's things like comms compliance.

141
00:08:23,440 --> 00:08:27,360
So this is part of insider risk, and it gives us more visibility in alerting into comms

142
00:08:27,360 --> 00:08:29,720
platforms for things like harassment.

143
00:08:29,720 --> 00:08:34,040
And that goes into Teams, it goes into Exchange, we can put it into third party products with

144
00:08:34,040 --> 00:08:35,040
data connectors.

145
00:08:35,040 --> 00:08:39,440
And I guess the final last little bits in there is there's information governance, which

146
00:08:39,440 --> 00:08:45,720
is kind of tied to what we normally call our records management or AD RMS-related capabilities.

147
00:08:45,720 --> 00:08:50,520
So that allows us to put controls over data and use automatic workflows to retain or delete

148
00:08:50,520 --> 00:08:52,440
data at a particular time.

149
00:08:52,440 --> 00:08:55,760
And that helps with things like data hoarding, for example.

150
00:08:55,760 --> 00:09:01,020
And I guess the last bit in here that is actually focused on compliance is compliance manager.

151
00:09:01,020 --> 00:09:04,820
So that's where we use built in templates to measure and track an organization's compliance

152
00:09:04,820 --> 00:09:06,620
to a specific regulation.

153
00:09:06,620 --> 00:09:09,800
So for example, ISO 27001.

154
00:09:09,800 --> 00:09:14,400
So as you can see, there's a lot of technologies and disciplines in there.

155
00:09:14,400 --> 00:09:17,480
So it's good just to touch base on what they can do and what they can accomplish.

156
00:09:17,480 --> 00:09:23,200
Yeah, but I might add to that because we do have a lot of individual solutions within

157
00:09:23,200 --> 00:09:24,520
the Purview stack there.

158
00:09:24,520 --> 00:09:29,280
But you know, I want to stress the point that it is an integrated platform, and it does

159
00:09:29,280 --> 00:09:30,280
work seamlessly.

160
00:09:30,280 --> 00:09:34,960
And it might help to think about, I guess, the risks that we're trying to mitigate with

161
00:09:34,960 --> 00:09:35,960
these solutions.

162
00:09:35,960 --> 00:09:42,480
But if you think about the kind of collaboration our customers are doing these days where you're

163
00:09:42,480 --> 00:09:47,120
dealing with sensitive information, which might come into an organization via an email.

164
00:09:47,120 --> 00:09:52,280
So that email comes into an organization, it might contain some PII or some other sensitive

165
00:09:52,280 --> 00:09:55,000
information within an organization.

166
00:09:55,000 --> 00:09:59,080
Oftentimes that information will be exchanged via messaging.

167
00:09:59,080 --> 00:10:03,080
And there's risks here of data leakage within an organization.

168
00:10:03,080 --> 00:10:07,840
That information ends up somewhere within shared storage, so within SharePoint.

169
00:10:07,840 --> 00:10:11,040
And it's there for the sake of secure collaboration.

170
00:10:11,040 --> 00:10:13,960
But there's still a risk here of data exposure.

171
00:10:13,960 --> 00:10:19,120
From here, there's also that risk, the insider threat risk that Bo sort of spoke to where

172
00:10:19,120 --> 00:10:25,280
there's a risk that people might start offloading this to personal cloud storage locations.

173
00:10:25,280 --> 00:10:30,640
So there's some data theft potential here, whether it's malicious or accidental.

174
00:10:30,640 --> 00:10:36,480
And even at the end point itself, offloading or moving files across USB keys or even physically

175
00:10:36,480 --> 00:10:41,280
printing sensitive information, especially in the context of people working from home

176
00:10:41,280 --> 00:10:45,360
with personal devices and God knows who's living with them in their household.

177
00:10:45,360 --> 00:10:48,320
So who's seeing this hard copy?

178
00:10:48,320 --> 00:10:55,400
So with all of that risk, if you like, the idea within that Purview stack is that

179
00:10:55,400 --> 00:10:59,560
the solutions, individual solutions Bo was speaking to do actually inform each other,

180
00:10:59,560 --> 00:11:03,600
they work together, they provide insights all the way through that stack.

181
00:11:03,600 --> 00:11:08,320
Basically, what I'm hearing here, and I know you've given us a really good overview there,

182
00:11:08,320 --> 00:11:13,240
both of you, is that actually, because I always thought, honestly, that purview is just a

183
00:11:13,240 --> 00:11:14,880
compliance thing.

184
00:11:14,880 --> 00:11:17,520
And actually, that's a part of it.

185
00:11:17,520 --> 00:11:25,160
But in fact, that data security piece is probably way, way more important and kind of a bigger

186
00:11:25,160 --> 00:11:26,160
part of the product.

187
00:11:26,160 --> 00:11:31,840
There was a misconception that when it was called compliance, that was its core focus.

188
00:11:31,840 --> 00:11:36,000
It was a tool to meet regulatory compliance, for example.

189
00:11:36,000 --> 00:11:40,400
But really, it's all about that data security or data protection side of things.

190
00:11:40,400 --> 00:11:42,280
So I think it's more of a byproduct.

191
00:11:42,280 --> 00:11:48,320
So these things can achieve compliance in terms of regulation, but it's more about securing

192
00:11:48,320 --> 00:11:49,320
the data itself.

193
00:11:49,320 --> 00:11:54,360
So like information protection, for example, when we encrypt our data, that can achieve

194
00:11:54,360 --> 00:11:57,240
a compliance regulation obligation that we need to meet.

195
00:11:57,240 --> 00:12:02,280
But the core focus we're actually trying to do in that situation is safeguarding and protecting

196
00:12:02,280 --> 00:12:03,520
that data.

197
00:12:03,520 --> 00:12:08,660
So I think in general, the language that we, as Microsoft and the industry itself, use has

198
00:12:08,660 --> 00:12:10,860
to change in relation to that.

199
00:12:10,860 --> 00:12:13,360
It's not purely a compliance focus.

200
00:12:13,360 --> 00:12:17,560
We need to make sure that those conversations we're having are about data security or data

201
00:12:17,560 --> 00:12:22,160
protection, because that's what we're actually looking to protect going forward.

202
00:12:22,160 --> 00:12:27,800
But we have a lot of conversations where we're speaking about compliance and the audience

203
00:12:27,800 --> 00:12:30,520
thinks we're there to focus on regulations.

204
00:12:30,520 --> 00:12:33,380
So in the typical sense, we'll end up with the wrong people in that room.

205
00:12:33,380 --> 00:12:37,360
So we'll end up with people like auditors or regulators instead of people that are actually

206
00:12:37,360 --> 00:12:39,600
looking to safeguard that information.

207
00:12:39,600 --> 00:12:45,120
But really, we kind of need to have those conversations with the CISO, privacy officers,

208
00:12:45,120 --> 00:12:48,360
or like SecOps instead of a typical compliance crowd.

209
00:12:48,360 --> 00:12:52,280
We've got a lot of stories where we'll go in to speak to a customer or to a different

210
00:12:52,280 --> 00:12:53,280
audience.

211
00:12:53,280 --> 00:12:56,720
And then they'll realize and we'll realize that we're speaking about two completely different

212
00:12:56,720 --> 00:12:57,720
things.

213
00:12:57,720 --> 00:12:59,600
So the language you're using here is really important.

214
00:12:59,600 --> 00:13:00,600
Yeah.

215
00:13:00,600 --> 00:13:01,600
But the risks are the same.

216
00:13:01,600 --> 00:13:05,460
And that's the interesting thing as well, because I'm finding my job at the moment is

217
00:13:05,460 --> 00:13:10,320
much more interesting and rewarding than it's been for a long time, because those risks

218
00:13:10,320 --> 00:13:12,240
that I was talking about earlier don't change.

219
00:13:12,240 --> 00:13:16,560
But the difference in audience means that we need to satisfy different requirements.

220
00:13:16,560 --> 00:13:21,640
So I'm talking to people about how to implement these DLP controls and how to put in place

221
00:13:21,640 --> 00:13:26,440
protective markings with encryption and how to secure SharePoint sites.

222
00:13:26,440 --> 00:13:30,880
And we're having that conversation because they've got operational requirements and they've

223
00:13:30,880 --> 00:13:34,000
got a real threat they're trying to mitigate.

224
00:13:34,000 --> 00:13:37,960
But then I'll have the same conversation with a different audience who are more compliance

225
00:13:37,960 --> 00:13:38,960
focused.

226
00:13:38,960 --> 00:13:42,800
And for them, it's about aligning those controls with some regulatory framework.

227
00:13:42,800 --> 00:13:45,680
So we're still talking about the same thing, but the motivations are different.

228
00:13:45,680 --> 00:13:49,880
For them, it's all about we've got an audit coming and I've got to be able to prove that

229
00:13:49,880 --> 00:13:53,440
we're actually going to be able to satisfy and secure our data.

230
00:13:53,440 --> 00:13:57,800
But it all comes back to the same Purview solution stack, if you like.

231
00:13:57,800 --> 00:13:58,800
All right.

232
00:13:58,800 --> 00:14:01,960
So I have a few questions slash comments slash observations.

233
00:14:01,960 --> 00:14:06,640
So you mentioned SharePoint a few times, but this is not restricted to protecting data

234
00:14:06,640 --> 00:14:07,640
that's in SharePoint, right?

235
00:14:07,640 --> 00:14:15,000
Like I could have a blob store, say an Azure blob store, and assign labels and so on

236
00:14:15,000 --> 00:14:17,080
to that, to the data that's in a blob store, is that right?

237
00:14:17,080 --> 00:14:20,520
Like it can be in many, many places, not just SharePoint.

238
00:14:20,520 --> 00:14:21,800
Yeah, spot on.

239
00:14:21,800 --> 00:14:24,360
So it goes by default out of the box.

240
00:14:24,360 --> 00:14:30,760
It goes into all of the M365 areas: SharePoint, Teams, Exchange, OneDrive, Yammer.

241
00:14:30,760 --> 00:14:34,920
With the Azure Purview side of things that we spoke about before, that's where we can

242
00:14:34,920 --> 00:14:39,800
do things like extending those sensitivity labels under information protection across

243
00:14:39,800 --> 00:14:44,680
into things like storage blobs, SQL databases, S3 buckets.

244
00:14:44,680 --> 00:14:46,080
That's how we can extend it over.

245
00:14:46,080 --> 00:14:51,960
So we can do things like if the system anywhere detects something like a credit card, let's

246
00:14:51,960 --> 00:14:56,160
put a highly confidential sensitivity label across onto that data.

247
00:14:56,160 --> 00:15:00,400
That can do things like encryption or it can do things like watermarking or safeguarding

248
00:15:00,400 --> 00:15:02,440
that data for reporting as well.

249
00:15:02,440 --> 00:15:03,440
Yeah.

250
00:15:03,440 --> 00:15:08,280
My colleague, Andreas Walter, actually wrote a blog post that came out about a month ago

251
00:15:08,280 --> 00:15:16,040
on integration with Purview policies and SQL, which is all pretty cool stuff.

252
00:15:16,040 --> 00:15:20,240
So that's question number one and the observation as well.

253
00:15:20,240 --> 00:15:22,680
You also said DLP, data loss prevention.

254
00:15:22,680 --> 00:15:24,760
But don't we already have a product that does that?

255
00:15:24,760 --> 00:15:26,240
I can add to that as well.

256
00:15:26,240 --> 00:15:27,320
It is Purview.

257
00:15:27,320 --> 00:15:30,640
So that data loss prevention is part of the Purview stack?

258
00:15:30,640 --> 00:15:33,520
Oh, part of the Purview family.

259
00:15:33,520 --> 00:15:36,880
So we're just sort of rebranding it and bringing it all under one governance umbrella.

260
00:15:36,880 --> 00:15:37,880
Yeah, spot on.

261
00:15:37,880 --> 00:15:41,440
So it used to be called within M365, the compliance stack.

262
00:15:41,440 --> 00:15:42,440
Okay.

263
00:15:42,440 --> 00:15:46,640
Now all of that stuff that was compliance previously, so things like data loss prevention,

264
00:15:46,640 --> 00:15:51,240
insider risk, information protection, that's all under the Purview family umbrella now.

265
00:15:51,240 --> 00:15:52,240
Okay.

266
00:15:52,240 --> 00:15:55,960
So it's a little bit like, okay, this is again, I'm not a marketing person as well.

267
00:15:55,960 --> 00:16:03,000
So what Entra is to identity, Purview is to, like, compliance management of data.

268
00:16:03,000 --> 00:16:04,480
Is that a fair analogy?

269
00:16:04,480 --> 00:16:05,480
Yes.

270
00:16:05,480 --> 00:16:06,480
Risk and data governance, yeah.

271
00:16:06,480 --> 00:16:07,480
Spot on.

272
00:16:07,480 --> 00:16:11,760
The challenge here, Michael, is the fact that all of these individual solutions we're talking

273
00:16:11,760 --> 00:16:16,760
about, you know, data protection, DLP, et cetera, they're evolving as well.

274
00:16:16,760 --> 00:16:23,320
So if I come back briefly to sensitivity labels, you know, in the past, this was exclusively

275
00:16:23,320 --> 00:16:28,240
about putting a label on a Word document or an email, a little dropdown for sensitivity

276
00:16:28,240 --> 00:16:32,640
labeling, confidential versus personal, et cetera.

277
00:16:32,640 --> 00:16:37,200
We can do so much more with these labels now, including what we've just talked about.

278
00:16:37,200 --> 00:16:43,200
But we have this ability now to do container labeling versus content labeling.

279
00:16:43,200 --> 00:16:48,700
So all the traditional ways of applying a label to a document or an email, et cetera.

280
00:16:48,700 --> 00:16:54,800
But now we can create a label within our stack that exposes into your side of the equation,

281
00:16:54,800 --> 00:16:58,800
if you like, within the data governance side of things.

282
00:16:58,800 --> 00:17:03,400
But also that label can actually apply to Microsoft 365 Groups.

283
00:17:03,400 --> 00:17:06,160
We can apply a label to a Teams meeting.

284
00:17:06,160 --> 00:17:10,200
We can apply a label to a Teams channel.

285
00:17:10,200 --> 00:17:16,520
These labels are a way of identifying sensitive information or sensitive locations that need

286
00:17:16,520 --> 00:17:17,960
to be managed in some way.

287
00:17:17,960 --> 00:17:24,600
So when that label is applied to a Teams channel, for example, it doesn't apply markings, but

288
00:17:24,600 --> 00:17:28,600
it controls whether or not you can invite external guests and what happens with the

289
00:17:28,600 --> 00:17:32,640
documents within that channel and the Teams chats and so on and so forth.

290
00:17:32,640 --> 00:17:37,920
But the main point here, without overcomplicating it, is that what we're doing is centralizing

291
00:17:37,920 --> 00:17:39,260
all of that.

292
00:17:39,260 --> 00:17:44,960
So the definition of what is sensitive for my organization is done once.

293
00:17:44,960 --> 00:17:51,440
And I create that once on our site, in our console, if you like, because that never changes.

294
00:17:51,440 --> 00:17:56,400
And then now that I've defined what is sensitive, the second part of that conversation, how

295
00:17:56,400 --> 00:18:02,080
do I control this information when it's in the SQL database, when it's being discussed

296
00:18:02,080 --> 00:18:09,160
in a Teams chat, when it's being exchanged within a SharePoint site?

297
00:18:09,160 --> 00:18:15,200
And DLP is a similar thing as well, because we now have what we refer to as unified DLP.

298
00:18:15,200 --> 00:18:18,920
So it's much more than just transport rules in exchange now.

299
00:18:18,920 --> 00:18:22,080
DLP spans endpoint DLP.

300
00:18:22,080 --> 00:18:23,640
We reach into the CASB as well.

301
00:18:23,640 --> 00:18:29,040
So we've got file policies that we generate from outside that come from the same console.

302
00:18:29,040 --> 00:18:34,840
There's DLP that applies to Power BI, there's DLP that applies to Teams chat, it goes on

303
00:18:34,840 --> 00:18:35,840
and on.

304
00:18:35,840 --> 00:18:40,400
So our traditional understanding of some of these solutions is kind of evolving as well,

305
00:18:40,400 --> 00:18:45,200
which makes it a lot of fun, especially when I'm talking to customers who've been using

306
00:18:45,200 --> 00:18:46,640
a lot of these solutions for a long time.

307
00:18:46,640 --> 00:18:50,320
Because in those cases, it's often a bit of an awakening.

308
00:18:50,320 --> 00:18:53,280
It's something you're probably not aware you have access to.

309
00:18:53,280 --> 00:18:56,600
There's so much more control here and capability.

310
00:18:56,600 --> 00:18:59,000
What does a day in Purview look like?

311
00:18:59,000 --> 00:19:02,680
Here I am, I'm in charge of data governance.

312
00:19:02,680 --> 00:19:04,920
And what would my day look like?

313
00:19:04,920 --> 00:19:06,360
What things would I be administering?

314
00:19:06,360 --> 00:19:07,880
What alerts would I be looking for?

315
00:19:07,880 --> 00:19:10,080
What policies would I be setting?

316
00:19:10,080 --> 00:19:12,080
I'm just curious.

317
00:19:12,080 --> 00:19:18,360
So typically, if you've got that capability, all of that telemetry is going into the insider

318
00:19:18,360 --> 00:19:19,720
risk engine.

319
00:19:19,720 --> 00:19:23,480
So all of those DLP alerts, all of the MIP access.

320
00:19:23,480 --> 00:19:28,720
So then life as, like, a Purview admin, for example, is we'd be focusing on the insider

321
00:19:28,720 --> 00:19:31,680
risk portal and generating alerts out of that.

322
00:19:31,680 --> 00:19:33,440
Then you can do your investigations.

323
00:19:33,440 --> 00:19:38,960
For example, with an insider risk, if we've detected something, let's say we've hooked

324
00:19:38,960 --> 00:19:43,840
it into the HR system, someone's resigned, they've then dumped a bunch of SharePoint

325
00:19:43,840 --> 00:19:47,600
information onto a USB drive and we've blocked it, for example.

326
00:19:47,600 --> 00:19:50,200
We can then automatically do that investigation.

327
00:19:50,200 --> 00:19:55,080
We can reach out into eDiscovery, for example, which is our electronic discovery product

328
00:19:55,080 --> 00:19:59,760
within Purview, and we can lock down all that information that users touched and keep

329
00:19:59,760 --> 00:20:05,200
that evidence preserved so that we can actually do that further investigation.

330
00:20:05,200 --> 00:20:11,080
So it's more depending on where you are within an environment, you can do that alerting and

331
00:20:11,080 --> 00:20:12,760
that monitoring from insider risk.

332
00:20:12,760 --> 00:20:16,520
A lot of the other time, it might be, for example, you might be a records manager and

333
00:20:16,520 --> 00:20:19,360
you're looking after the information governance side of things.

334
00:20:19,360 --> 00:20:24,240
So there's particular regulations or information like credit card data that you shouldn't keep

335
00:20:24,240 --> 00:20:25,800
for longer than you need to.

336
00:20:25,800 --> 00:20:29,840
So the system could be built to say, if we've got credit card information that hasn't been

337
00:20:29,840 --> 00:20:34,840
accessed for seven years, for example, trigger what's called a disposition review where someone

338
00:20:34,840 --> 00:20:39,320
will come in and review that information and then delete it automatically.

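The disposition-review rule described above can be sketched as a simple check; this is purely illustrative (the function name, threshold, and fields here are hypothetical, not a Purview API):

```python
from datetime import datetime, timedelta

# Hypothetical sketch of the rule described: content holding credit card
# data that hasn't been accessed for roughly seven years gets routed to a
# human disposition review before any automatic deletion.
RETENTION_LIMIT = timedelta(days=365 * 7)  # roughly seven years

def needs_disposition_review(contains_credit_card: bool,
                             last_accessed: datetime,
                             now: datetime) -> bool:
    """Return True when the item should be flagged for disposition review."""
    return contains_credit_card and (now - last_accessed) > RETENTION_LIMIT

now = datetime(2023, 3, 1)
stale = datetime(2015, 1, 1)   # untouched for about eight years
fresh = datetime(2022, 6, 1)   # accessed recently
```

In the product the review step is a human decision; the sketch only shows the trigger condition, not the workflow around it.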
339
00:20:39,320 --> 00:20:44,800
I think part of the thing with Purview is that it's so broad and covers so many different

340
00:20:44,800 --> 00:20:49,280
groups and organizations within a business that, depending on what product and what

341
00:20:49,280 --> 00:20:52,600
information you're actually trying to look after, your day-to-day will be different.

342
00:20:52,600 --> 00:20:57,960
Yeah, I find often I'm also just walking customers back and just getting back to the fundamentals.

343
00:20:57,960 --> 00:21:06,280
So describe your workflow, describe your information flow, because it's possible to start at the very

344
00:21:06,280 --> 00:21:07,280
basic level.

345
00:21:07,280 --> 00:21:13,840
And, you know, let's talk about a user who is jumping into SharePoint, downgrading the

346
00:21:13,840 --> 00:21:18,840
sensitivity of the document so that they can then download it and then exfiltrate the content.

347
00:21:18,840 --> 00:21:22,680
Now, everything I just described will be visible within auditing.

348
00:21:22,680 --> 00:21:24,880
So how do I know this is happening?

349
00:21:24,880 --> 00:21:26,600
How can I be alerted of it?

350
00:21:26,600 --> 00:21:30,520
Well, you can jump into the audit log and have a look at the raw data and try and correlate

351
00:21:30,520 --> 00:21:32,480
what's going on and connect the dots.

352
00:21:32,480 --> 00:21:37,120
The next step up is to say, well, let's put some DLP policy in there to control that.

353
00:21:37,120 --> 00:21:41,680
So now I'm generating DLP alerts, which might get a bit noisy, frankly, but at the same

354
00:21:41,680 --> 00:21:43,440
time, you know, they're going to be informative.

355
00:21:43,440 --> 00:21:49,760
And I've got an operational task now to manage these DLP alerts and investigate those.

356
00:21:49,760 --> 00:21:55,760
The next step up, to Bo's point, is to look at all that activity and be a bit

357
00:21:55,760 --> 00:22:00,920
clever about this and actually generate some insight as opposed to just raw alerts.

358
00:22:00,920 --> 00:22:05,760
So tell me what my people are doing and where's all this information going and where do I prioritize

359
00:22:05,760 --> 00:22:06,760
my attention?

360
00:22:06,760 --> 00:22:13,200
And with all that in mind, give me one alert when something I really care about is happening.

361
00:22:13,200 --> 00:22:17,240
And I'll drill down from the top within insider risk management.

362
00:22:17,240 --> 00:22:21,600
And I can see that I've got an insider threat going on with this particular user.

363
00:22:21,600 --> 00:22:26,280
As I drill down the stack, I'll then get to the DLP alerts, which may or may not have

364
00:22:26,280 --> 00:22:27,280
been responded to.

365
00:22:27,280 --> 00:22:32,680
And then if I need more detail, drill down into the audit log and actually see all

366
00:22:32,680 --> 00:22:38,040
the activity: date, time, you know, where it's happening from, et cetera.

367
00:22:38,040 --> 00:22:43,720
That, for me, is the value when we talk about Purview collectively: it's that

368
00:22:43,720 --> 00:22:49,440
conversation about Purview as opposed to having four separate conversations with four different

369
00:22:49,440 --> 00:22:51,280
groups of people.

370
00:22:51,280 --> 00:22:56,120
One about data auditing, the second one about information protection, the third one about

371
00:22:56,120 --> 00:23:00,600
DLP and potentially two different people with different policies.

372
00:23:00,600 --> 00:23:03,400
Rather than having four conversations, let's have one.

373
00:23:03,400 --> 00:23:04,760
Tell me what the risk is.

374
00:23:04,760 --> 00:23:06,640
Let's think about that and we work top down.

375
00:23:06,640 --> 00:23:11,880
Yeah, and it comes down a lot to the fact that we have those conversations with

376
00:23:11,880 --> 00:23:14,880
so many different target audiences within a business.

377
00:23:14,880 --> 00:23:18,600
Let's say eDiscovery, for example, might be legal or HR.

378
00:23:18,600 --> 00:23:20,680
Same with something like communications compliance.

379
00:23:20,680 --> 00:23:24,800
If we're looking for harassment, for example, that might be an HR job.

380
00:23:24,800 --> 00:23:29,200
Then SecOps might be more concerned with data loss prevention and what's being exfiltrated.

381
00:23:29,200 --> 00:23:33,760
Whereas, like I said before, the data governance side of things might be those records managers

382
00:23:33,760 --> 00:23:37,000
or privacy officers within an organization.

383
00:23:37,000 --> 00:23:38,000
It's very broad.

384
00:23:38,000 --> 00:23:40,160
Actually, this makes Sarah happy.

385
00:23:40,160 --> 00:23:44,280
That alert that I'm talking about with an insider risk can be pumped out to Sentinel.

386
00:23:44,280 --> 00:23:48,480
Then you can do correlation against other services outside of the Microsoft 365 stack

387
00:23:48,480 --> 00:23:51,760
and just go deeper if you need to.

388
00:23:51,760 --> 00:23:54,160
That makes me very happy.

389
00:23:54,160 --> 00:23:55,160
I've got a question.

390
00:23:55,160 --> 00:24:00,200
For those of us, you've already touched on this, but when I think DLP, it reminds me

391
00:24:00,200 --> 00:24:06,160
of as an end user, it reminds me of that old school network based DLP where one of the

392
00:24:06,160 --> 00:24:13,360
employers I used to work for, let's be honest, basically all you

393
00:24:13,360 --> 00:24:19,560
had to do was zip up a file, and then you could email it out.

394
00:24:19,560 --> 00:24:23,840
I have had a play around with Purview and I know it's a lot smarter than that, but

395
00:24:23,840 --> 00:24:30,160
I wondered if you could just tell me about how much smarter it is than that old, you

396
00:24:30,160 --> 00:24:35,240
know, network based DLP that probably quite a few people who are listening are familiar

397
00:24:35,240 --> 00:24:36,240
with.

398
00:24:36,240 --> 00:24:37,240
Yeah.

399
00:24:37,240 --> 00:24:40,080
Our DLP solution goes across Microsoft 365.

400
00:24:40,080 --> 00:24:45,280
It can go out into the inbuilt agent within Windows as well with endpoint DLP to detect

401
00:24:45,280 --> 00:24:48,520
things like copying something across to a USB.

402
00:24:48,520 --> 00:24:53,600
All of these different bits of technology tie into a central area called sensitive information

403
00:24:53,600 --> 00:24:54,880
types.

404
00:24:54,880 --> 00:24:59,440
We have over, I think, 300 in the system at the moment covering things like credit card

405
00:24:59,440 --> 00:25:03,920
information, Australian driver's licenses, things like that, depending on your country.

406
00:25:03,920 --> 00:25:09,440
We can use those as well within DLP to automatically find, classify, and block that data from being

407
00:25:09,440 --> 00:25:10,540
shared.

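A sensitive information type is essentially a pattern plus a validation step. As a rough illustration only (not Purview's actual detection logic), a credit-card detector might pair a digit pattern with a Luhn checksum:

```python
import re

def luhn_valid(number: str) -> bool:
    """Standard Luhn checksum used to validate card-like digit strings."""
    digits = [int(d) for d in number][::-1]
    total = 0
    for i, d in enumerate(digits):
        if i % 2 == 1:          # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

# 16 digits, optionally separated by spaces or hyphens
CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){15}\d\b")

def find_card_numbers(text: str) -> list[str]:
    """Return candidate card numbers that also pass the Luhn check."""
    hits = []
    for match in CARD_PATTERN.finditer(text):
        digits = re.sub(r"[ -]", "", match.group())
        if luhn_valid(digits):
            hits.append(digits)
    return hits
```

The checksum is what separates a real detector from a naive regex: a random 16-digit string usually fails Luhn, which keeps false positives down.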
408
00:25:10,540 --> 00:25:15,800
As of a couple of days ago as well, we announced something called adaptive protection.

409
00:25:15,800 --> 00:25:19,400
Within our DLP engine, it can talk to insider risk.

410
00:25:19,400 --> 00:25:24,200
Insider risk, depending on a person's behavior within an organization, can assign that user

411
00:25:24,200 --> 00:25:25,200
a risk level.

412
00:25:25,200 --> 00:25:28,480
Let's say they've done a bunch of dodgy behavior over time.

413
00:25:28,480 --> 00:25:30,000
They're at a high risk level.

414
00:25:30,000 --> 00:25:35,680
We can adaptively put a DLP policy across to every user that has a high risk level.

415
00:25:35,680 --> 00:25:41,440
We can say, if you're a high risk user, automatically put in a DLP action where they can't share

416
00:25:41,440 --> 00:25:43,320
anything externally at all.

417
00:25:43,320 --> 00:25:47,880
Whereas, if we're a medium risk, let's say they've got to do an override when they try

418
00:25:47,880 --> 00:25:50,160
to share and put a justification in.

419
00:25:50,160 --> 00:25:53,800
If you're a low risk user, you can share that information externally, depending on what

420
00:25:53,800 --> 00:25:55,440
that information is.

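The tiering described here can be pictured as a simple mapping from insider-risk level to DLP enforcement. This sketch is illustrative only; the level names and action strings are assumptions, not the product's configuration schema:

```python
from enum import Enum

class RiskLevel(Enum):
    LOW = "low"
    MEDIUM = "medium"
    HIGH = "high"

# Hypothetical policy table mirroring the behavior described:
#   high risk   -> block external sharing outright
#   medium risk -> allow, but require an override with a justification
#   low risk    -> allow (still subject to what the content itself is)
ADAPTIVE_DLP_ACTIONS = {
    RiskLevel.HIGH: "block",
    RiskLevel.MEDIUM: "warn_with_override",
    RiskLevel.LOW: "allow",
}

def dlp_action(level: RiskLevel) -> str:
    """Return the DLP enforcement applied at a given insider-risk level."""
    return ADAPTIVE_DLP_ACTIONS[level]
```

The point of the adaptive model is that the user's risk level moves, so the row that applies to them changes over time without anyone editing the policy.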
421
00:25:55,440 --> 00:26:00,640
I think that adaptive protection step up is a bit of a game changer for us.

422
00:26:00,640 --> 00:26:05,240
With that tie-in to insider risk, because we have all that telemetry, we were able

423
00:26:05,240 --> 00:26:10,280
to automate that assignment of DLP policies instead of someone manually finding out a

424
00:26:10,280 --> 00:26:14,120
user is risky and having to move them into those different policies manually.

425
00:26:14,120 --> 00:26:16,040
That's a big difference for us.

426
00:26:16,040 --> 00:26:17,040
Yeah.

427
00:26:17,040 --> 00:26:21,040
That's a massive update actually, because your risk level is dynamic as well.

428
00:26:21,040 --> 00:26:28,080
I might be able to copy stuff over to my USB key today and get endpoint DLP giving me a

429
00:26:28,080 --> 00:26:33,400
warning, like a tooltip if you like, and I can dismiss that and just go on my merry way.

430
00:26:33,400 --> 00:26:37,640
But if all of a sudden, let's say for argument's sake, I tender my resignation or I'm doing something

431
00:26:37,640 --> 00:26:40,960
else that's dodgy, it will adapt.

432
00:26:40,960 --> 00:26:43,080
What I was able to do yesterday is no longer possible.

433
00:26:43,080 --> 00:26:46,360
I'm now being blocked from copying across.

434
00:26:46,360 --> 00:26:54,320
One of the nice things with this new capability is our ability to go back on existing DLP

435
00:26:54,320 --> 00:26:55,320
policies.

436
00:26:55,320 --> 00:27:00,800
One of the challenges we've seen from customers is if they're too stringent with DLP, it gets

437
00:27:00,800 --> 00:27:03,400
really noisy and users get frustrated.

438
00:27:03,400 --> 00:27:10,320
When people higher up in the food chain complain, often the response is dumb down the policy

439
00:27:10,320 --> 00:27:11,440
and just weaken it.

440
00:27:11,440 --> 00:27:15,800
Let's just drop it back from block to warn instead.

441
00:27:15,800 --> 00:27:18,160
There's a real risk there in doing that.

442
00:27:18,160 --> 00:27:24,560
By going back to those policies and actually making them adaptive, what happens is we will

443
00:27:24,560 --> 00:27:29,120
block when it's justified, but we will drop it back to just a warning to keep everybody

444
00:27:29,120 --> 00:27:30,120
happy and productive.

445
00:27:30,120 --> 00:27:32,320
It's the best of both worlds.

446
00:27:32,320 --> 00:27:36,560
It's all very well labeling an object saying, hey, this document is sensitive or whatever,

447
00:27:36,560 --> 00:27:40,200
but that doesn't really count for much unless the data is encrypted.

448
00:27:40,200 --> 00:27:42,720
If it is sensitive, you want to really encrypt it.

449
00:27:42,720 --> 00:27:48,080
Can I automatically encrypt something that's detected as being sensitive?

450
00:27:48,080 --> 00:27:51,240
There's multiple stages that we can apply to information protection.

451
00:27:51,240 --> 00:27:56,360
The first one typically is discovery where we look at your data estate after you've defined

452
00:27:56,360 --> 00:27:59,040
your sensitive information types.

453
00:27:59,040 --> 00:28:03,560
As Bo has mentioned, we've got over 300 out of the box ones and you can add your own.

454
00:28:03,560 --> 00:28:07,200
You get a sense of, well, there's all my PII and there's all my stuff.

455
00:28:07,200 --> 00:28:11,440
The next step is to say, well, we're going to map these sensitive information types to

456
00:28:11,440 --> 00:28:13,120
this label.

457
00:28:13,120 --> 00:28:14,720
What does confidential mean?

458
00:28:14,720 --> 00:28:16,840
Anything with a credit card number is confidential.

459
00:28:16,840 --> 00:28:19,840
Anything with this is confidential, et cetera.

460
00:28:19,840 --> 00:28:26,560
You then end up with the ability to apply that protective marking onto a document.

461
00:28:26,560 --> 00:28:32,000
Within the same policy builder, there's an option to say, now let's encrypt it as well

462
00:28:32,000 --> 00:28:35,520
and control people's access to this based on groups.

463
00:28:35,520 --> 00:28:36,520
It's very flexible.

464
00:28:36,520 --> 00:28:41,920
You can give part of the organization co-author rights and other parts read-only access and

465
00:28:41,920 --> 00:28:44,800
other parts of the organization zero access.

466
00:28:44,800 --> 00:28:48,120
It's protection that travels with that content, which is really important.

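The "protection that travels with the content" idea amounts to a label carrying per-group usage rights. As a sketch only (the group names, label names, and rights sets here are made up for illustration):

```python
# Illustrative only: a sensitivity label maps to per-group usage rights,
# which are enforced wherever the encrypted content goes. Anyone not
# listed for a label gets no access at all.
LABEL_RIGHTS = {
    "Confidential": {
        "finance-team": {"view", "edit", "co-author"},
        "all-staff": {"view"},  # read-only: no print, forward, or send
    },
}

def rights_for(label: str, group: str) -> set[str]:
    """Return the usage rights a group gets on content carrying this label."""
    return LABEL_RIGHTS.get(label, {}).get(group, set())
```

The key design point is that enforcement follows the document, not the storage location: the rights table is evaluated when someone opens the content, wherever it has traveled.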
467
00:28:48,120 --> 00:28:52,280
Now, I was having this conversation with a customer and they were walking through their

468
00:28:52,280 --> 00:28:55,440
console while we were talking and I was demonstrating as well.

469
00:28:55,440 --> 00:28:58,840
They did a bit of the click through and I got to the point where I said, and this is

470
00:28:58,840 --> 00:29:02,520
where you can see where your labels are currently applied.

471
00:29:02,520 --> 00:29:09,120
As you can see in my demo environment, I've got confidential applied to 4,500 emails.

472
00:29:09,120 --> 00:29:15,320
They said, oh, we've got 15,000 emails that have left our organization all marked as confidential.

473
00:29:15,320 --> 00:29:20,640
Then I said, but I'm okay with that because all my emails are protected with encryption.

474
00:29:20,640 --> 00:29:24,320
They said, oh, we haven't gotten to that bit yet.

475
00:29:24,320 --> 00:29:30,120
The problem with that is that their end users doing the right thing are creating emails

476
00:29:30,120 --> 00:29:35,360
and dragging attachments in, sending them externally and applying a label called confidential,

477
00:29:35,360 --> 00:29:41,880
which is a marking only and hitting send without any encryption or protection on that content.

478
00:29:41,880 --> 00:29:42,880
That's a problem.

479
00:29:42,880 --> 00:29:48,240
It's a long answer to your question, Michael, but yes, we can do automatic encryption.

480
00:29:48,240 --> 00:29:52,560
Hopefully, through that little anecdote, the important point is even more obvious.

481
00:29:52,560 --> 00:29:53,560
Yeah.

482
00:29:53,560 --> 00:29:56,720
The number of times I've seen an email flying across the wire where it said, this document

483
00:29:56,720 --> 00:29:59,120
is proprietary and confidential, but it's not encrypted.

484
00:29:59,120 --> 00:30:03,280
A lot of the time it'll come down to the organization's maturity as well.

485
00:30:03,280 --> 00:30:07,280
They get a bit worried that if they start encrypting things, people will lose access

486
00:30:07,280 --> 00:30:09,240
and that's not really the case.

487
00:30:09,240 --> 00:30:15,320
We can specify, for example, let's say the SOC group within AD has full access to these

488
00:30:15,320 --> 00:30:16,320
documents.

489
00:30:16,320 --> 00:30:17,600
They can do whatever they want.

490
00:30:17,600 --> 00:30:21,400
Everyone else in the organization has read-only access, so they can't print it, they can't

491
00:30:21,400 --> 00:30:23,320
forward it, they can't send it.

492
00:30:23,320 --> 00:30:27,640
As long as you go through that activity at the beginning to map those controls against

493
00:30:27,640 --> 00:30:32,000
those sensitivity labels, it's not like you're going to get locked out of the document entirely.

494
00:30:32,000 --> 00:30:33,240
You're still going to have access.

495
00:30:33,240 --> 00:30:35,200
You're still going to be able to do your work.

496
00:30:35,200 --> 00:30:38,560
It's just making sure they're at that level where they're comfortable and moving forward

497
00:30:38,560 --> 00:30:40,000
and putting encryption across everything.

498
00:30:40,000 --> 00:30:43,640
I'm not even going to go down the rabbit hole of key management, but that can be another

499
00:30:43,640 --> 00:30:45,800
conversation for another day.

500
00:30:45,800 --> 00:30:52,440
My guess is that the key management is pretty much handled by Purview, so you don't have

501
00:30:52,440 --> 00:30:54,440
to because it could become a complete nightmare.

502
00:30:54,440 --> 00:30:55,440
Correct.

503
00:30:55,440 --> 00:30:57,680
We can do bring your own key.

504
00:30:57,680 --> 00:31:00,720
There are options there for that as well, but you're right.

505
00:31:00,720 --> 00:31:07,480
Folks, I know that there were some Purview announcements this week, well, the week that

506
00:31:07,480 --> 00:31:09,440
we're recording this.

507
00:31:09,440 --> 00:31:14,640
Do you want to tell our listeners a little bit about some of those new things that we've

508
00:31:14,640 --> 00:31:17,240
just announced?

509
00:31:17,240 --> 00:31:21,480
The key thing I want to stress is the adaptive protection that we have now.

510
00:31:21,480 --> 00:31:25,960
I think that's a big game changer for us, and with the way that we have Purview and its

511
00:31:25,960 --> 00:31:31,280
integration, as far as I'm aware, I don't think anyone else can deliver that capability.

512
00:31:31,280 --> 00:31:34,840
That saves a lot of time in manually managing your DLP policies.

513
00:31:34,840 --> 00:31:39,160
I know a lot of, at least the customers that I speak to, are really keen for it.

514
00:31:39,160 --> 00:31:42,640
That's currently in public preview, so everyone should be getting access to it.

515
00:31:42,640 --> 00:31:45,560
If not already, it'll be coming really shortly to your tenant.

516
00:31:45,560 --> 00:31:47,240
Definitely worth having a play with.

517
00:31:47,240 --> 00:31:50,520
Just a quick reminder, we do all that without any agents.

518
00:31:50,520 --> 00:31:52,200
Zero deployment.

519
00:31:52,200 --> 00:31:55,200
That is cool because people hate installing agents.

520
00:31:55,200 --> 00:31:56,200
Agreed.

521
00:31:56,200 --> 00:31:57,600
Agents sometimes worry me as well.

522
00:31:57,600 --> 00:32:00,640
I've seen many vulnerabilities in agents over the years, so.

523
00:32:00,640 --> 00:32:02,680
In a previous life, I was rolling out a capability.

524
00:32:02,680 --> 00:32:09,440
I was a desktop service owner and I had a KPI on agent health, which speaks to the problem

525
00:32:09,440 --> 00:32:10,440
with agents.

526
00:32:10,440 --> 00:32:12,120
Forget about the solution we're rolling out.

527
00:32:12,120 --> 00:32:13,560
It was all about the health of the agent.

528
00:32:13,560 --> 00:32:17,040
The ability to actually implement the solution itself was secondary.

529
00:32:17,040 --> 00:32:23,920
So, Bo and Lou, the thing that we always ask our guests before we finish is, if you had

530
00:32:23,920 --> 00:32:29,680
a final thought that you wanted to leave our listeners with, what would it be?

531
00:32:29,680 --> 00:32:36,280
I guess the thing I kind of speak about a lot is don't focus on Purview being purely

532
00:32:36,280 --> 00:32:38,440
compliance focused.

533
00:32:38,440 --> 00:32:41,280
Focus on it being data protection and data security.

534
00:32:41,280 --> 00:32:45,480
We need to make sure that when we're using that language, we're having conversations

535
00:32:45,480 --> 00:32:50,240
with the right people, but the focus is to make sure that data that we are holding as

536
00:32:50,240 --> 00:32:55,160
organizations is protected and that we're not keeping it longer than it needs to be.

537
00:32:55,160 --> 00:32:58,720
If we're doing that and we're doing it correctly, it puts everyone in a better place moving

538
00:32:58,720 --> 00:32:59,720
forward.

539
00:32:59,720 --> 00:33:04,460
Yeah, my final thought is don't think about the product name.

540
00:33:04,460 --> 00:33:05,460
Think about the risk.

541
00:33:05,460 --> 00:33:11,560
So, when I have customers coming looking for a DLP conversation, in reality, it's an insider

543
00:33:11,560 --> 00:33:16,240
threat conversation, and DLP is one of the possible outcomes of that conversation in

543
00:33:16,240 --> 00:33:17,240
terms of control.

544
00:33:17,240 --> 00:33:19,400
There's a lot of capability there.

545
00:33:19,400 --> 00:33:21,600
Well, let's bring this episode to an end.

546
00:33:21,600 --> 00:33:23,640
Bo and Lou, thanks so much for joining us this week.

547
00:33:23,640 --> 00:33:25,200
I know you guys are very busy.

548
00:33:25,200 --> 00:33:29,880
From my perspective, we look after some parts of Purview and those guys are really, really

549
00:33:29,880 --> 00:33:30,880
busy.

550
00:33:30,880 --> 00:33:33,360
So, again, thank you so much for joining us.

551
00:33:33,360 --> 00:33:36,240
To all our listeners out there, we hope you found this useful.

552
00:33:36,240 --> 00:33:38,440
Stay safe and we'll see you next time.

553
00:33:38,440 --> 00:33:41,480
Thanks for listening to the Azure Security Podcast.

554
00:33:41,480 --> 00:33:48,280
You can find show notes and other resources at our website, azsecuritypodcast.net.

555
00:33:48,280 --> 00:33:53,160
If you have any questions, please find us on Twitter at AzureSecPod.

556
00:33:53,160 --> 00:34:17,040
Background music is from ccmixter.com and licensed under the Creative Commons license.

