1
00:00:00,000 --> 00:00:09,600
Welcome to the Azure Security Podcast, where we discuss topics relating to security, privacy,

2
00:00:09,600 --> 00:00:13,280
reliability and compliance on the Microsoft Cloud Platform.

3
00:00:13,280 --> 00:00:17,440
Hey everybody, welcome to the podcast.

4
00:00:17,440 --> 00:00:20,200
This week's episode is episode 99.

5
00:00:20,200 --> 00:00:23,560
Yes, we are just one away from the big 100.

6
00:00:23,560 --> 00:00:29,120
This week, our guest is Andrew McMurray, who's here to talk to us about securing copilot

7
00:00:29,120 --> 00:00:30,120
data.

8
00:00:30,120 --> 00:00:33,480
But before we get to our guest, let's take a little lap around the news.

9
00:00:33,480 --> 00:00:35,080
Mike, why don't you kick things off?

10
00:00:35,080 --> 00:00:36,080
Thanks, Michael.

11
00:00:36,080 --> 00:00:38,640
So, my big news is I took a vacation.

12
00:00:38,640 --> 00:00:42,720
Yeah, I actually ended up doing kind of like the big family vacation while the kids are

13
00:00:42,720 --> 00:00:48,680
kind of at that age that's old enough to enjoy it, but not so old that they are teenagers.

14
00:00:48,680 --> 00:00:53,660
One of the things that sort of struck me, we visited things like the ancient Roman ruins

15
00:00:53,660 --> 00:00:57,360
and some of the ancient Greece stuff and whatnot.

16
00:00:57,360 --> 00:01:03,160
I was really struck by how much history some human disciplines have.

17
00:01:03,160 --> 00:01:07,960
Like when you think about what the ancient Greeks did 25 centuries ago, like the Parthenon

18
00:01:07,960 --> 00:01:11,320
on the top of the Acropolis in the center of town, there was not a straight line on

19
00:01:11,320 --> 00:01:14,040
it because it wouldn't look straight if it was straight.

20
00:01:14,040 --> 00:01:15,960
They slightly curved the columns.

21
00:01:15,960 --> 00:01:21,040
They made the columns slightly different sizes so that it looked perfectly straight.

22
00:01:21,040 --> 00:01:25,120
And it was just an amazing amount of stuff that they were able to accomplish back then.

23
00:01:25,120 --> 00:01:28,360
And I was thinking like, wow, like 25 centuries of stuff.

24
00:01:28,360 --> 00:01:33,880
Like in cyber, we barely have two, maybe three or four decades to lean on.

25
00:01:33,880 --> 00:01:37,280
I mean, obviously we have other disciplines and other conflict and all that, but cybersecurity

26
00:01:37,280 --> 00:01:38,760
as a discipline is so new.

27
00:01:38,760 --> 00:01:43,060
I mean, we got old people around that were literally there at the beginning and figuring

28
00:01:43,060 --> 00:01:44,060
some of this stuff out.

29
00:01:44,060 --> 00:01:45,760
I mean, like Michael Howard's still with us.

30
00:01:45,760 --> 00:01:47,360
Sorry, couldn't help it.

31
00:01:47,360 --> 00:01:51,160
Ultimately, it was just, it really struck me as like how new of a discipline we are

32
00:01:51,160 --> 00:01:54,760
and how much we have to kind of figure out the basic rules of it.

33
00:01:54,760 --> 00:02:00,320
And we've been sort of bootstrapped into this like super important role in the world, you

34
00:02:00,320 --> 00:02:06,440
know, protecting elections, protecting democracy, you know, safeguarding the world's information

35
00:02:06,440 --> 00:02:07,440
and knowledge.

36
00:02:07,440 --> 00:02:11,480
It's just, that was one of the things that really struck me is like how new we are, but

37
00:02:11,480 --> 00:02:15,920
also how important we are to the world and how much we have to learn from other disciplines

38
00:02:15,920 --> 00:02:18,040
in the world as well.

39
00:02:18,040 --> 00:02:21,720
And so that's kind of the big thing I've been thinking about since the last episode.

40
00:02:21,720 --> 00:02:27,000
Okay, so I've got a couple of bits of news, both of them about Azure Container Apps.

41
00:02:27,000 --> 00:02:29,000
So hooray.

42
00:02:29,000 --> 00:02:34,760
Now Azure Container Apps supports Azure Key Vault certificates, and it's now GA, which

43
00:02:34,760 --> 00:02:36,760
of course is a very good thing.

44
00:02:36,760 --> 00:02:40,880
And you should definitely be using Key Vault to store things.
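To make the Key Vault certificate support concrete, here is a rough sketch of the ARM resource shape that binds a Container Apps environment certificate to a certificate stored in Key Vault. The resource type and property names are assumptions based on the public Microsoft.App schema at the time of this episode, so verify them against current docs before using.

```python
# Illustrative sketch only: builds the rough ARM resource body for an Azure
# Container Apps environment certificate sourced from Azure Key Vault.
# Property names (certificateKeyVaultProperties, keyVaultUrl, identity) are
# assumptions from the published schema, not a tested deployment.

def keyvault_certificate_resource(env_name: str, cert_name: str,
                                  vault_url: str, identity_id: str) -> dict:
    """Build the ARM resource body for a Key Vault-backed certificate."""
    return {
        "type": "Microsoft.App/managedEnvironments/certificates",
        "name": f"{env_name}/{cert_name}",
        "properties": {
            "certificateKeyVaultProperties": {
                # Managed identity used to read the certificate from Key Vault
                "identity": identity_id,
                "keyVaultUrl": vault_url,
            }
        },
    }

resource = keyvault_certificate_resource(
    "my-env", "my-cert",
    "https://my-vault.vault.azure.net/secrets/my-cert",
    "<user-assigned-identity-resource-id>",  # hypothetical placeholder
)
```

The point is that the certificate never leaves Key Vault as a deployment artifact; the environment pulls it via an identity at runtime.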

45
00:02:40,880 --> 00:02:46,760
And in public preview for container apps is managed identity support for scaling rules.

46
00:02:46,760 --> 00:02:49,160
So of course we do love managed identities.

47
00:02:49,160 --> 00:02:53,580
Managed identities are important, and I've been having a lot of discussions

48
00:02:53,580 --> 00:02:55,640
about those over the past year or so.

49
00:02:55,640 --> 00:02:59,640
So of course you shouldn't be just storing a random secret in an app.

50
00:02:59,640 --> 00:03:03,000
You should be using a proper managed identity wherever you can.

51
00:03:03,000 --> 00:03:05,760
So it's nice that we are now supporting that too.

52
00:03:05,760 --> 00:03:08,120
Yeah, it's good to see the managed identity stuff.

53
00:03:08,120 --> 00:03:11,960
Again, I've been talking about this for many, many years now about more and more applications

54
00:03:11,960 --> 00:03:16,600
and as you're moving to managed identities, that way you're not storing a credential somewhere

55
00:03:16,600 --> 00:03:17,600
to authenticate.

56
00:03:17,600 --> 00:03:21,520
And this is really important because this is exactly what the attackers are going after.

57
00:03:21,520 --> 00:03:26,800
So you'll see any of the applications that have historically not had managed identities,

58
00:03:26,800 --> 00:03:31,680
you're certainly going to see more and more take on that technology, which is always good

59
00:03:31,680 --> 00:03:35,880
to see because if the credential is not there, then it can't be compromised by the attackers.

60
00:03:35,880 --> 00:03:37,280
So yeah, this is good to see.
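The "if the credential is not there, it can't be compromised" point can be sketched in a few lines. On an Azure VM or container, an app with a managed identity asks the Azure Instance Metadata Service (IMDS) for a token at runtime instead of reading a stored secret. The endpoint and parameters below follow the documented IMDS contract; this snippet only builds the request and never calls Azure.

```python
# Sketch: how a managed-identity-aware app obtains a token with no stored
# secret. It asks the local IMDS endpoint at request time; there is no
# credential in config or code for an attacker to steal.
import urllib.parse
import urllib.request

IMDS_TOKEN_ENDPOINT = "http://169.254.169.254/metadata/identity/oauth2/token"

def build_token_request(resource: str, api_version: str = "2018-02-01") -> urllib.request.Request:
    """Build the IMDS request for a managed identity access token."""
    query = urllib.parse.urlencode({"api-version": api_version, "resource": resource})
    req = urllib.request.Request(f"{IMDS_TOKEN_ENDPOINT}?{query}")
    req.add_header("Metadata", "true")  # mandatory header on all IMDS calls
    return req

# Example: request a token for Azure Key Vault; note no secret appears anywhere.
req = build_token_request("https://vault.azure.net")
```

In practice you would use an SDK credential type rather than raw HTTP, but the underlying exchange is this one.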

61
00:03:37,280 --> 00:03:42,620
Well, since we are going to be talking about Purview in a little bit, and I really love

62
00:03:42,620 --> 00:03:50,920
conditional access due to how it uses information gathered from the infrastructure, including

63
00:03:50,920 --> 00:04:00,360
user and device in almost real time, and takes that information to make authorization decisions on

64
00:04:00,360 --> 00:04:01,360
the fly.

65
00:04:01,360 --> 00:04:07,520
I'm going to talk about conditional access policies that allow block access for a user

66
00:04:07,520 --> 00:04:10,800
with insider risk.

67
00:04:10,800 --> 00:04:20,360
Insider risk is a conditional access condition that basically leverages the signals from Microsoft Purview

68
00:04:20,360 --> 00:04:27,680
adaptive protection capability to detect and automatically try to mitigate insider threats.

69
00:04:27,680 --> 00:04:37,960
For example, Purview may detect unusual activity from a user and conditional access can enforce

70
00:04:37,960 --> 00:04:45,000
security measures such as requiring multi-factor authentication or blocking access.

71
00:04:45,000 --> 00:04:49,520
This is a premium feature and requires a P2 license.
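As a rough illustration of the policy being described, here is the approximate shape of a Microsoft Graph conditional access policy that blocks users whose Purview adaptive protection insider risk level is elevated. The insiderRiskLevels condition is a newer (preview-era) field, and these property names are assumptions based on public documentation, not a tested Graph payload.

```python
# Hedged sketch of a conditional access policy body keyed on insider risk.
# Field names (conditions.insiderRiskLevels, grantControls.builtInControls)
# follow the Microsoft Graph conditionalAccessPolicy shape as documented;
# treat this as illustrative, not a verified API call.

def block_elevated_insider_risk_policy(name: str) -> dict:
    return {
        "displayName": name,
        # Start in report-only mode before enforcing, as is common practice
        "state": "enabledForReportingButNotEnforced",
        "conditions": {
            "users": {"includeUsers": ["All"]},
            "applications": {"includeApplications": ["All"]},
            "insiderRiskLevels": "elevated",
        },
        "grantControls": {"operator": "OR", "builtInControls": ["block"]},
    }

policy = block_elevated_insider_risk_policy("Block elevated insider risk")
```

The body would be POSTed to the conditional access policies endpoint in Graph; the interesting part is that the condition is driven by Purview's risk signal rather than by the sign-in itself.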

72
00:04:49,520 --> 00:04:56,560
For more information, visit the "Common conditional access policy: Block access for users with

73
00:04:56,560 --> 00:04:59,160
insider risk" blog.

74
00:04:59,160 --> 00:05:03,200
I found the information provided really interesting.

75
00:05:03,200 --> 00:05:11,680
The second news that I wanted to share is about Active Directory Federation Services (ADFS).

76
00:05:11,680 --> 00:05:19,000
Microsoft has enabled some new migration capabilities provided by Entra ID.

77
00:05:19,000 --> 00:05:27,480
This is important because by migrating away from ADFS, we are decreasing the attack surface.

78
00:05:27,480 --> 00:05:33,800
Users basically do not have to maintain the extra software and devices; instead,

79
00:05:33,800 --> 00:05:37,640
they can use Microsoft Entra ID.

80
00:05:37,640 --> 00:05:45,480
The ADFS migration wizard allows customers to quickly identify which ADFS relying party

81
00:05:45,480 --> 00:05:51,760
applications are compatible with being migrated to Microsoft Entra ID.

82
00:05:51,760 --> 00:06:01,320
The tool also provides the migration readiness of each application, highlights issues, and

84
00:06:01,320 --> 00:06:05,060
provides suggested actions to remediate them.

84
00:06:05,060 --> 00:06:11,560
In addition, it provides guides to help prepare the individual application for migration and

85
00:06:11,560 --> 00:06:16,160
configure their new Microsoft Entra application.

86
00:06:16,160 --> 00:06:18,960
I've got a few items.

87
00:06:18,960 --> 00:06:24,400
The first one is, and this is a really important one, we're changing our multi-factor authentication

88
00:06:24,400 --> 00:06:27,400
requirements for Azure Sign-In.

89
00:06:27,400 --> 00:06:33,720
Right now we're starting to roll out MFA support for the Azure portal only.

90
00:06:33,720 --> 00:06:39,480
Over time, starting early next year, we will also include the Azure CLI and Azure PowerShell

91
00:06:39,480 --> 00:06:42,600
and other sort of infrastructure as code tools like that.

92
00:06:42,600 --> 00:06:45,280
Right now it's just for the Azure portal.

93
00:06:45,280 --> 00:06:48,920
So you need to make sure that you're all configured to support that.

94
00:06:48,920 --> 00:06:52,640
We will notify global admins of each tenant to make sure that they're aware that this

95
00:06:52,640 --> 00:06:54,600
thing is coming out.

96
00:06:54,600 --> 00:06:59,640
Next one, which is from my old stomping ground in Azure data, Azure Policy Support is now

97
00:06:59,640 --> 00:07:02,760
GA for PostgreSQL Flexible Server.

98
00:07:02,760 --> 00:07:03,760
This is great to see.

99
00:07:03,760 --> 00:07:05,520
I'm a huge fan of Azure Policy.

100
00:07:05,520 --> 00:07:07,640
So there's all sorts of different security policies.

101
00:07:07,640 --> 00:07:12,480
So for example, restricting locations and so on for PostgreSQL.

102
00:07:12,480 --> 00:07:14,680
So that's always good to see.
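The restricting-locations example mentioned here maps to a standard Azure Policy definition. Below is a sketch of such a rule as a Python dict; the if/then rule grammar is standard Azure Policy, but the exact resource type string for Flexible Server is an assumption to verify against the current provider docs.

```python
# Sketch of an Azure Policy definition that denies PostgreSQL Flexible Server
# resources created outside an allowed-locations list. The policyRule grammar
# (allOf / field / equals / in / effect) is standard Azure Policy; the
# resource type string is assumed and should be checked against the docs.

def allowed_locations_policy(allowed: list[str]) -> dict:
    return {
        "mode": "All",
        "parameters": {
            "allowedLocations": {"type": "Array", "defaultValue": allowed}
        },
        "policyRule": {
            "if": {
                "allOf": [
                    {"field": "type",
                     "equals": "Microsoft.DBforPostgreSQL/flexibleServers"},
                    # Deny when the location is NOT in the allowed list
                    {"not": {"field": "location",
                             "in": "[parameters('allowedLocations')]"}},
                ]
            },
            "then": {"effect": "deny"},
        },
    }

policy = allowed_locations_policy(["australiaeast", "australiasoutheast"])
```

Assigned at a subscription or management group scope, a rule like this blocks non-compliant deployments at creation time rather than auditing them after the fact.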

103
00:07:14,680 --> 00:07:23,040
Next one is Microsoft Entra ID using Windows principals for SQL Managed Instance.

104
00:07:23,040 --> 00:07:27,640
So one of the cool things about Managed Instance is that it's a really good bridge between

105
00:07:27,640 --> 00:07:32,880
on-prem SQL Server and into the cloud because it's the one that's most compatible with SQL

106
00:07:32,880 --> 00:07:36,720
Server, a lot more compatible than Azure SQL Database.

107
00:07:36,720 --> 00:07:41,760
So now we support using Entra ID login, but using Windows principals as well.

108
00:07:41,760 --> 00:07:45,520
So this is good to see because it just makes things a little bit more seamless.

109
00:07:45,520 --> 00:07:51,960
And the last one I have, which again is something that we see across the board, is moving encryption

110
00:07:51,960 --> 00:07:54,800
keys to support customer managed keys.

111
00:07:54,800 --> 00:07:59,640
And the latest product or the latest service that supports that is Backup Vaults.

112
00:07:59,640 --> 00:08:03,760
So now you can have your own customer managed key rather than just the key that's provided

113
00:08:03,760 --> 00:08:06,880
by Azure, by a service managed key.

114
00:08:06,880 --> 00:08:07,880
Great to see.

115
00:08:07,880 --> 00:08:12,240
Okay, so with that, let's turn our attention to our guest.

116
00:08:12,240 --> 00:08:17,320
As I mentioned at the top of the podcast, this week our guest is Andrew McMurray, all

117
00:08:17,320 --> 00:08:19,540
the way from Australia.

118
00:08:19,540 --> 00:08:23,060
And he's here to talk to us about securing copilot data.

119
00:08:23,060 --> 00:08:25,520
So Andrew, welcome to the podcast.

120
00:08:25,520 --> 00:08:28,840
Would you like to spend just a quick minute and give our listeners an idea of what you

121
00:08:28,840 --> 00:08:29,840
do?

122
00:08:29,840 --> 00:08:30,840
Yeah, absolutely.

123
00:08:30,840 --> 00:08:31,840
Thanks, Michael.

124
00:08:31,840 --> 00:08:34,760
So my name is Andrew McMurray, Macca to his friends.

125
00:08:34,760 --> 00:08:36,400
In fact, Macca to everybody.

126
00:08:36,400 --> 00:08:39,520
The only people that call me Andrew are people that are angry with me.

127
00:08:39,520 --> 00:08:44,760
So being Australian, everyone with a Mac in their name becomes Macca, which is unfortunate

128
00:08:44,760 --> 00:08:47,000
given that we call McDonald's Maccas out here.

129
00:08:47,000 --> 00:08:49,400
And that effectively makes me named after a hamburger.

130
00:08:49,400 --> 00:08:54,000
But apart from that, I'm part of the Purview engineering team.

131
00:08:54,000 --> 00:08:55,800
I'm a principal product manager.

132
00:08:55,800 --> 00:08:59,920
I tend to be looking after two areas of Purview.

133
00:08:59,920 --> 00:09:03,600
Firstly the data governance experience, the new data governance experience that we'll

134
00:09:03,600 --> 00:09:06,440
be bringing out later this year.

135
00:09:06,440 --> 00:09:13,280
And I also do quite a bit of work with Purview for AI and looking at how we can secure interactions

136
00:09:13,280 --> 00:09:17,000
with the various copilots that we have in our stable.

137
00:09:17,000 --> 00:09:21,720
Alright, so let's start at the very, very beginning because I know a lot of people,

138
00:09:21,720 --> 00:09:27,960
and I'll be honest, I'm one of them, are really confused about Purview and what it actually

139
00:09:27,960 --> 00:09:34,760
is. So spend as long as you want explaining to our listeners precisely what Purview is,

140
00:09:34,760 --> 00:09:36,160
including all the moving parts.

141
00:09:36,160 --> 00:09:43,620
Yeah, well, that is a fantastic question because we have done a fairly reasonable job at confusing

142
00:09:43,620 --> 00:09:45,880
a lot of our customers out there.

143
00:09:45,880 --> 00:09:56,520
Ostensibly, Purview is a suite of capabilities that spans three major pillars: data security,

144
00:09:56,520 --> 00:09:58,800
data governance, and risk and compliance posture.

145
00:09:58,800 --> 00:10:02,720
So those are the three areas that we look at.

146
00:10:02,720 --> 00:10:09,920
We've conglomerated a bunch of different capabilities into the one umbrella brand very recently

147
00:10:09,920 --> 00:10:12,440
in the last couple of years.

148
00:10:12,440 --> 00:10:19,360
And prior to that, we had Azure Purview and Azure Purview has now become the data governance

149
00:10:19,360 --> 00:10:26,480
pillar of the Microsoft Purview suite, and we also had M365 compliance, which really

150
00:10:26,480 --> 00:10:31,440
covered the data security and the risk and compliance posture capabilities.

151
00:10:31,440 --> 00:10:39,120
Now we did a merge of all of these capabilities into one umbrella term called Microsoft Purview.

152
00:10:39,120 --> 00:10:46,320
So Purview gives you firstly, data security capabilities in M365, as well as multi-cloud

153
00:10:46,320 --> 00:10:52,520
capabilities in terms of data loss prevention, insider risk management, information protection.

154
00:10:52,520 --> 00:10:57,020
And we use this capability called adaptive protection that sits underneath it to really

155
00:10:57,020 --> 00:11:02,440
understand the risk profile of a user and then dynamically adjust their access to information

156
00:11:02,440 --> 00:11:04,780
based on that current risk profile.

157
00:11:04,780 --> 00:11:12,300
In the data governance space, we're all about finding and governing rather than securing

158
00:11:12,300 --> 00:11:13,780
structured data.

159
00:11:13,780 --> 00:11:17,460
And we're starting to make forays into unstructured data as well.

160
00:11:17,460 --> 00:11:22,660
We have a capability called the data map, which is an area that allows us to scan in

161
00:11:22,660 --> 00:11:27,900
various different data source types and get an understanding of the metadata of those

162
00:11:27,900 --> 00:11:32,360
types and what's actually out there in our structured data estate.

163
00:11:32,360 --> 00:11:35,300
We can do that across a multitude of different providers.

164
00:11:35,300 --> 00:11:41,120
We have scanning connectors into around about 150 plus different providers ranging from

165
00:11:41,120 --> 00:11:47,120
Azure, AWS, GCP, and the various workloads that live inside of those.

166
00:11:47,120 --> 00:11:51,300
When we scan that information into our data map, we're then able to promote that information

167
00:11:51,300 --> 00:11:58,700
into a data catalog that is essentially an area where a data consumer working in a business

168
00:11:58,700 --> 00:12:02,980
can easily go in and find the types of data products that they need to work with based

169
00:12:02,980 --> 00:12:04,640
on their job role.

170
00:12:04,640 --> 00:12:08,060
This stuff can be curated by data stewards in the organization.

171
00:12:08,060 --> 00:12:14,140
We can provide access control and policy-based frameworks for getting access to that data.

172
00:12:14,140 --> 00:12:19,060
And we also have the ability to start scanning unstructured data in certain areas such as

173
00:12:19,060 --> 00:12:20,420
AWS S3.

174
00:12:20,420 --> 00:12:25,420
And we're really bringing to life a lot of unstructured scanning of data sources.

175
00:12:25,420 --> 00:12:30,660
Worth being aware that the data governance solution is a metadata scanner.

176
00:12:30,660 --> 00:12:36,100
So whilst we go and scan the data in situ, we don't bring any of that data back into

177
00:12:36,100 --> 00:12:37,100
the data map.

178
00:12:37,100 --> 00:12:39,940
We simply bring metadata about that data back.

179
00:12:39,940 --> 00:12:45,540
And then finally, we have the risk and compliance posture capabilities that are in Purview.

180
00:12:45,540 --> 00:12:50,940
Now these capabilities include things like compliance manager that allow you to get an

181
00:12:50,940 --> 00:12:56,180
understanding of whether your environment meets certain regulatory compliance acts and

182
00:12:56,180 --> 00:12:59,300
is compliant with those.

183
00:12:59,300 --> 00:13:05,180
We have e-discovery and audit, which is very important for us to keep hold of information

184
00:13:05,180 --> 00:13:07,540
when it's needed in discovery cases.

185
00:13:07,540 --> 00:13:12,380
And of course, the audit capability is really manifested in the unified audit log that's

186
00:13:12,380 --> 00:13:17,660
currently sitting inside M365 so that we can get an understanding of all of the administrative

187
00:13:17,660 --> 00:13:24,220
actions that are taken across the tenant and what's actually happening there.

188
00:13:24,220 --> 00:13:27,820
We also have communication compliance and data lifecycle management that sit inside

189
00:13:27,820 --> 00:13:29,380
this pillar as well.

190
00:13:29,380 --> 00:13:35,300
Communication compliance allows us to really understand what people are doing in a team's

191
00:13:35,300 --> 00:13:41,420
environment and be able to track down issues in that environment, such as profanity, sexual

192
00:13:41,420 --> 00:13:46,900
harassment and various other types of non-acceptable conduct.

193
00:13:46,900 --> 00:13:51,660
And then data lifecycle management with its component records management capability allows

194
00:13:51,660 --> 00:13:55,500
us to really understand how long we should be keeping our data.

195
00:13:55,500 --> 00:14:00,820
So many of the organizations out there will keep data indefinitely and that is not necessarily

196
00:14:00,820 --> 00:14:02,300
a great thing to do.

197
00:14:02,300 --> 00:14:06,480
So being able to say, how long should we be retaining this data?

198
00:14:06,480 --> 00:14:08,620
Should we be retaining data no more than five years?

199
00:14:08,620 --> 00:14:10,460
And if that is the case, fine.

200
00:14:10,460 --> 00:14:14,660
As the data hits that particular mark, we'll go and mark that for deletion and remove it

201
00:14:14,660 --> 00:14:18,680
from the tenant so that we don't have old data that's no longer relevant, but could

202
00:14:18,680 --> 00:14:22,260
potentially impose a security risk if it was left lying around.
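The retention rule Macca describes (keep data no more than, say, five years, then mark it for deletion) boils down to a simple age check. Real data lifecycle management in Purview is policy-driven; this toy sketch just illustrates the logic.

```python
# Toy sketch of the retention rule described above: flag any item whose age
# exceeds the retention window for deletion. The five-year window and item
# names are illustrative.
from datetime import datetime, timedelta

RETENTION = timedelta(days=5 * 365)  # roughly five years

def items_to_delete(items: dict[str, datetime], now: datetime) -> list[str]:
    """Return names of items older than the retention window."""
    return [name for name, created in items.items() if now - created > RETENTION]

now = datetime(2024, 9, 1)
stale = items_to_delete(
    {
        "q3-report.docx": datetime(2023, 7, 1),     # ~1 year old: keep
        "old-hr-policy.docx": datetime(2015, 1, 1),  # ~9 years old: delete
    },
    now,
)
```

Items flagged this way are exactly the "old data that's no longer relevant but could impose a security risk" the episode warns about.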

203
00:14:22,260 --> 00:14:28,500
So that suite of capabilities represents what we consider today as Microsoft Purview from

204
00:14:28,500 --> 00:14:34,580
the M365 side of the tenants all the way through to the structured data existing inside Azure

205
00:14:34,580 --> 00:14:36,680
and outside sources as well.

206
00:14:36,680 --> 00:14:42,220
Now what we wanted to talk about, now we've done our little recap of Purview, is talk

207
00:14:42,220 --> 00:14:48,500
about more about, as we know, tons of people are doing AI things.

208
00:14:48,500 --> 00:14:56,260
And for a lot of people who have Microsoft environments, their first foray into AI is

209
00:14:56,260 --> 00:14:58,940
using Copilot for M365.

210
00:14:58,940 --> 00:15:04,980
Now I know from talking to customers, and I'm sure you do too, that one of the concerns

211
00:15:04,980 --> 00:15:12,340
they have is around, can I put AI, a Copilot across my data when I don't know what's there

212
00:15:12,340 --> 00:15:14,300
and things can be overshared?

213
00:15:14,300 --> 00:15:18,980
So do you want to kick off and tell us, am I right?

214
00:15:18,980 --> 00:15:22,580
Is that what people are worried about and what can we do about it?

215
00:15:22,580 --> 00:15:23,580
Sure.

216
00:15:23,580 --> 00:15:26,620
So it is something that people need to be worried about.

217
00:15:26,620 --> 00:15:33,380
It's fairly common in larger organizations and even smaller organizations for permissions

218
00:15:33,380 --> 00:15:39,420
to become poor over time, for areas of the organization to have sensitive data inside

219
00:15:39,420 --> 00:15:42,620
them that may be a little bit more open than it should be.

220
00:15:42,620 --> 00:15:49,580
I mean, I think anyone out there can probably think of a time when they've gone to an internal

221
00:15:49,580 --> 00:15:55,980
intranet site, typed in a reasonably innocent search term, and then seen stuff come back

222
00:15:55,980 --> 00:16:00,820
that is not necessarily something they should see.

223
00:16:00,820 --> 00:16:02,420
Copilot is great.

224
00:16:02,420 --> 00:16:08,020
Copilot really extends our ability to find and use information, but Copilot is extremely

225
00:16:08,020 --> 00:16:11,220
good at finding information based on the prompt that you give it.

226
00:16:11,220 --> 00:16:15,500
And sometimes that information that comes back might be stuff that you shouldn't see

227
00:16:15,500 --> 00:16:21,580
that just happens to be sitting inside a SharePoint site or something similar that is probably

228
00:16:21,580 --> 00:16:23,820
more open than it needs to be.

229
00:16:23,820 --> 00:16:30,620
Perhaps that document is effectively security by obscurity because nobody is really aware

230
00:16:30,620 --> 00:16:35,220
of the presence of the SharePoint site and we haven't effectively locked that down.

231
00:16:35,220 --> 00:16:40,220
So I could quite innocently type in, give me some salary expectations for level X and

232
00:16:40,220 --> 00:16:44,900
all of a sudden I could be pulling back potential salary information for executives or something

233
00:16:44,900 --> 00:16:47,500
like that if it's not locked down securely.

234
00:16:47,500 --> 00:16:50,180
So that becomes something that we really need to think about.

235
00:16:50,180 --> 00:16:55,520
How do we make sure that we're not oversharing data in the first place or at least making

236
00:16:55,520 --> 00:17:01,700
sure that we have appropriate controls over that data so that Copilot is not accidentally

237
00:17:01,700 --> 00:17:04,940
giving a user something that they shouldn't be seeing?

238
00:17:04,940 --> 00:17:10,080
And that definitely correlates with the experience I've seen with customers: with data security,

239
00:17:10,080 --> 00:17:12,980
everybody knows data is the thing that's important.

240
00:17:12,980 --> 00:17:14,520
And people have talked about that.

241
00:17:14,520 --> 00:17:18,300
But it's always been on the list, it's just always been towards the bottom of the list

242
00:17:18,300 --> 00:17:21,900
after SOC, after identity, after all these pressing concerns.

243
00:17:21,900 --> 00:17:25,500
And oh, we're migrating to the cloud, we need to take care of our infrastructure and make

244
00:17:25,500 --> 00:17:28,820
sure they're doing DevOps, we need to make sure there's security in it.

245
00:17:28,820 --> 00:17:34,000
And so one of the things I've noticed is that it always ended up just slipping to last place.

246
00:17:34,000 --> 00:17:40,220
And I've really seen since the advent of AI how much it's really popped forward onto so

247
00:17:40,220 --> 00:17:43,020
many security organizations' radars.

248
00:17:43,020 --> 00:17:47,180
And it's just been kind of like one of those, instead of giving it lip service that yes,

249
00:17:47,180 --> 00:17:50,620
it's important, we're starting to see some real action on it.

250
00:17:50,620 --> 00:17:55,940
And I'm really glad to hear about all the different purview tools that can help with

251
00:17:55,940 --> 00:17:56,940
those challenges.

252
00:17:56,940 --> 00:17:59,900
At the end of the day, it's just fundamentals, right, Andrew?

253
00:17:59,900 --> 00:18:01,460
I mean, it really is just fundamental.

254
00:18:01,460 --> 00:18:02,460
It's labeling.

255
00:18:02,460 --> 00:18:06,460
My guess is that Copilot honors those labels, is that true?

256
00:18:06,460 --> 00:18:07,460
Absolutely right.

257
00:18:07,460 --> 00:18:09,140
Yeah, for sure.

258
00:18:09,140 --> 00:18:15,860
So if we think about purview's capability of sensitivity labeling, this is something

259
00:18:15,860 --> 00:18:20,460
that becomes really important when thinking about where the data is sitting in the data

260
00:18:20,460 --> 00:18:24,260
estate and the level of access that I should or shouldn't have to it.

261
00:18:24,260 --> 00:18:28,140
Now labeling has been around for quite some time, but it tends to be one of those things

262
00:18:28,140 --> 00:18:35,540
that's a little bit intimidating for organizations to put in place, because an aspect of sensitivity

263
00:18:35,540 --> 00:18:40,300
labeling is the ability for us to apply an encryption template to unstructured data,

264
00:18:40,300 --> 00:18:43,860
to documents sitting inside various document repositories.

265
00:18:43,860 --> 00:18:49,420
And that encryption can make it very secure, and that encryption will move with the file

266
00:18:49,420 --> 00:18:51,700
because it's baked into the file.

267
00:18:51,700 --> 00:18:57,980
But the danger there is if we're putting too much of a control onto it, or too stringent

268
00:18:57,980 --> 00:19:03,260
a control onto it, then users that might legitimately need to open that particular file that aren't

269
00:19:03,260 --> 00:19:08,540
necessarily referenced inside the ACL associated with the encryption template, that they may

270
00:19:08,540 --> 00:19:11,140
not be able to open that particular file.

271
00:19:11,140 --> 00:19:16,280
But it's one of those things that when planned properly is actually very, very powerful and

272
00:19:16,280 --> 00:19:18,320
quite easy to do.

273
00:19:18,320 --> 00:19:24,180
One of the benefits that we provide you out of the box in M365 is a default sensitivity

274
00:19:24,180 --> 00:19:25,180
label taxonomy.

275
00:19:25,180 --> 00:19:30,500
A lot of customers out there have not really thought about the taxonomy they should use

276
00:19:30,500 --> 00:19:32,700
for their sensitivity labels.

277
00:19:32,700 --> 00:19:39,340
And over a period of many years now, when we think about the acquisition of a company

278
00:19:39,340 --> 00:19:46,420
called Secure Islands that became our sensitivity labeling capability back in 2016, we've always

279
00:19:46,420 --> 00:19:51,540
had a set of default sensitivity labels, but we've really optimized them over time so that

280
00:19:51,540 --> 00:19:57,540
they make sense for most customers that are coming into this as a new customer.

281
00:19:57,540 --> 00:20:02,220
Those five default sensitivity labels, when we look at the upper two, confidential and

282
00:20:02,220 --> 00:20:07,660
highly confidential, we assign encryption templates to them to allow access to people

283
00:20:07,660 --> 00:20:12,980
within the organizational tenant, but not necessarily outside that.

284
00:20:12,980 --> 00:20:17,540
However, we need to think about how this is also done.

285
00:20:17,540 --> 00:20:23,140
So if I have those generic labels, then I could assign something as confidential and

286
00:20:23,140 --> 00:20:29,060
that would allow anyone within the organization that has an Entra ID account to get access

287
00:20:29,060 --> 00:20:30,220
to that info.

288
00:20:30,220 --> 00:20:32,740
We're probably going to need to make some changes to that.

289
00:20:32,740 --> 00:20:37,140
There's going to be certain pieces of information that are more sensitive than what the default

290
00:20:37,140 --> 00:20:38,300
taxonomy will give me.

291
00:20:38,300 --> 00:20:41,260
So I'll probably want to tweak those a little bit.

292
00:20:41,260 --> 00:20:45,460
Executive salary information, for instance, should probably be very highly confidential

293
00:20:45,460 --> 00:20:48,900
and be locked down to only certain people in the organization.

294
00:20:48,900 --> 00:20:53,300
Now when I'm searching for things via copilot, one of the cool things that copilot will do

295
00:20:53,300 --> 00:20:57,820
is it will aggregate sources of data to try and get me a final answer as to the question

296
00:20:57,820 --> 00:20:59,260
that I'm asking.

297
00:20:59,260 --> 00:21:04,380
And those sources of data might be multiple files with different sensitivity labels.

298
00:21:04,380 --> 00:21:10,100
If I'm attempting to access something through copilot that is protected with a sensitivity

299
00:21:10,100 --> 00:21:14,780
label that I do not have access to based on an encryption template, the first thing that

300
00:21:14,780 --> 00:21:17,900
will occur is copilot will say, I will not give you that information.

301
00:21:17,900 --> 00:21:20,100
I will indicate there's information out there.

302
00:21:20,100 --> 00:21:24,620
However, it is a sensitivity level that you do not have access to and will not show me

303
00:21:24,620 --> 00:21:26,760
the actual results of that information.

304
00:21:26,760 --> 00:21:32,340
So if I want to find out the CEO salary, that is completely locked down via a sensitivity

305
00:21:32,340 --> 00:21:35,260
label that only allows three or four people to see it.

306
00:21:35,260 --> 00:21:38,620
So copilot will tell me, sorry, can't show you that.

307
00:21:38,620 --> 00:21:44,060
Additionally, the copilot response itself will be tagged with a sensitivity.

308
00:21:44,060 --> 00:21:50,140
So if for instance, I produce a prompt that gives me back, let's say a document that is

309
00:21:50,140 --> 00:21:55,140
labeled General, in other words, has no real problems with me seeing it.

310
00:21:55,140 --> 00:21:59,540
And another response that's highly confidential that I happen to have access to, the entire

311
00:21:59,540 --> 00:22:03,740
response will be labeled as the highest of those sensitivity labels because I'm giving

312
00:22:03,740 --> 00:22:06,700
you information that is considered highly confidential.

313
00:22:06,700 --> 00:22:11,160
I will tag the entire conversation as highly confidential as well.

314
00:22:11,160 --> 00:22:17,460
So having the ability there to ensure that we're not returning data you shouldn't see.

315
00:22:17,460 --> 00:22:21,980
But when we are returning data, you should see, we will make sure you know what the highest

316
00:22:21,980 --> 00:22:24,720
sensitivity flag of that response is.

317
00:22:24,720 --> 00:22:29,380
This makes sure that the end user can be somewhat responsible with how they use that response.
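The "highest label wins" behavior described above can be shown in a few lines: when a response draws on sources with different sensitivity labels, the response inherits the most sensitive one. The label order below mirrors the default five-label taxonomy discussed earlier; this is an illustration of the rule, not Copilot's implementation.

```python
# Sketch of label aggregation: a Copilot-style response built from several
# sources is tagged with the highest sensitivity label among them. The
# ordering follows the default M365 label taxonomy, least to most sensitive.
LABEL_ORDER = ["Personal", "Public", "General", "Confidential", "Highly Confidential"]

def response_label(source_labels: list[str]) -> str:
    """Return the highest sensitivity label among the sources used."""
    return max(source_labels, key=LABEL_ORDER.index)

# A General document plus a Highly Confidential one: the whole response
# is treated as Highly Confidential.
label = response_label(["General", "Highly Confidential"])
```

This matches Michael's restatement later in the conversation: it does not matter what is actually in the result; the aggregate takes the most sensitive source's label.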

318
00:22:29,380 --> 00:22:35,680
Kind of reminds me a little bit of back in the day, and I'm already aging myself here.

319
00:22:35,680 --> 00:22:41,940
Index server back in Windows used to take the access control lists off the files that

320
00:22:41,940 --> 00:22:43,020
were being indexed.

321
00:22:43,020 --> 00:22:48,340
So that way it could actually honor the ACLs in the query results, because

322
00:22:48,340 --> 00:22:50,680
Index Server runs as SYSTEM on the box.

323
00:22:50,680 --> 00:22:51,980
So of course it can read everything.

324
00:22:51,980 --> 00:22:56,200
And the reason why it runs as SYSTEM is for very good technical reasons.

325
00:22:56,200 --> 00:22:58,380
But because it's running as SYSTEM, it can read everything.

326
00:22:58,380 --> 00:22:59,380
But that doesn't mean you can.

327
00:22:59,380 --> 00:23:02,780
That doesn't mean the person doing the query can read everything.

328
00:23:02,780 --> 00:23:06,500
So yeah, the way index server used to do it back in the day was by maintaining the ACL

329
00:23:06,500 --> 00:23:07,500
information from the files.
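The ACL trimming described here can be sketched in a few lines. This is a minimal illustration of the idea, not Index Server's actual implementation; all names and data structures are invented for the example:

```python
# Minimal sketch of query-time ACL trimming, as Index Server did it:
# the indexer (running with full read access) stores each file's ACL
# alongside its index entry, and results are filtered per caller.
# All names here are illustrative, not a real API.

from dataclasses import dataclass

@dataclass
class IndexedDoc:
    path: str
    content: str
    allowed_readers: frozenset  # ACL captured at index time

INDEX = [
    IndexedDoc("/hr/ceo-salary.docx", "salary figures", frozenset({"cfo", "ceo"})),
    IndexedDoc("/pub/handbook.docx", "salary bands overview", frozenset({"everyone"})),
]

def search(query: str, user: str, groups: set) -> list:
    """Return only the hits the querying user is allowed to read."""
    principals = {user} | groups
    return [
        d.path
        for d in INDEX
        if query in d.content and (principals & d.allowed_readers)
    ]

print(search("salary", "alice", {"everyone"}))  # only the public handbook
print(search("salary", "ceo", {"everyone"}))    # both documents
```

The key point matches the transcript: the service itself can read everything, but the caller's identity decides what comes back.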

330
00:23:07,500 --> 00:23:09,820
So I was trying to make sure I get this 100% right.

331
00:23:09,820 --> 00:23:16,900
So you're saying that if I do a copilot query, and let's say just humor me, that two sources

332
00:23:16,900 --> 00:23:20,420
of data, two files are used to build up the results.

333
00:23:20,420 --> 00:23:24,380
And one is public and the other one is confidential.

334
00:23:24,380 --> 00:23:28,260
The result, it doesn't matter what's in the result, will be confidential.

335
00:23:28,260 --> 00:23:30,260
Absolutely correct.

336
00:23:30,260 --> 00:23:31,340
That's great.
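The inheritance rule just confirmed amounts to taking a maximum over an ordered taxonomy. A minimal sketch, assuming a hypothetical four-level label set (the names and ordering are illustrative, not Purview's actual configuration):

```python
# Sketch of the "highest label wins" rule described above.
# The label names and their ordering are assumptions for the
# example, not Purview's real taxonomy.

SENSITIVITY_ORDER = ["Public", "General", "Confidential", "Highly Confidential"]

def response_label(source_labels: list) -> str:
    """A response inherits the most sensitive label among its sources."""
    return max(source_labels, key=SENSITIVITY_ORDER.index)

# One public source and one confidential source: the whole
# response is labeled Confidential.
print(response_label(["Public", "Confidential"]))  # Confidential
```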

337
00:23:31,340 --> 00:23:34,900
I think one of the other things too, to be aware of with that is in the response, it

338
00:23:34,900 --> 00:23:39,180
will also clearly identify the sensitivity rating of each of the sources that it used

339
00:23:39,180 --> 00:23:40,980
to compile that result as well.

340
00:23:40,980 --> 00:23:45,860
So there should never be any confusion over which one of those references that came back

341
00:23:45,860 --> 00:23:48,340
was the one that was the highly confidential.

342
00:23:48,340 --> 00:23:51,100
It's all very clearly laid out to the end user.

343
00:23:51,100 --> 00:23:55,260
If I have a derivative work, which is the query result, can I save that file?

344
00:23:55,260 --> 00:23:56,620
Can I save that result?

345
00:23:56,620 --> 00:23:58,500
How does that work?

346
00:23:58,500 --> 00:24:03,860
So the result will appear in obviously the interaction between the user and copilot.

347
00:24:03,860 --> 00:24:09,780
When the results come back, the assets that we use to compile that result are linked into

348
00:24:09,780 --> 00:24:13,220
the result that comes back from copilot.

349
00:24:13,220 --> 00:24:18,940
So if I have access to any of that information, the guts of the information will be in the

350
00:24:18,940 --> 00:24:21,500
response, but there will also be a link to the file.

351
00:24:21,500 --> 00:24:24,720
The file behind the link will still remain encrypted.

352
00:24:24,720 --> 00:24:28,060
So if I go and download that file and then I go and give it to somebody else who shouldn't

353
00:24:28,060 --> 00:24:33,020
have it, that end person that I've given it to, if they're not part of the encryption

354
00:24:33,020 --> 00:24:37,780
template that determines the ACL for who is allowed to access and decrypt it, they

355
00:24:37,780 --> 00:24:39,780
won't be able to get access to it.

356
00:24:39,780 --> 00:24:44,420
So it's not like the result is shipping me decryption keys that I can then utilize outside

357
00:24:44,420 --> 00:24:46,500
of that experience.

358
00:24:46,500 --> 00:24:55,460
So at the beginning, you mentioned all the different services that are part of the Purview

359
00:24:55,460 --> 00:25:00,420
suite and the data protection capabilities they provide.

360
00:25:00,420 --> 00:25:06,180
Can you elaborate on why logging and eDiscovery are important and what capabilities they enable,

361
00:25:06,180 --> 00:25:08,180
especially with copilot?

362
00:25:08,180 --> 00:25:09,500
Absolutely.

363
00:25:09,500 --> 00:25:12,660
So logging and eDiscovery are particularly important.

364
00:25:12,660 --> 00:25:19,460
Any type of interaction with copilot gets logged into the unified audit log so that

365
00:25:19,460 --> 00:25:23,500
we can get an understanding of who is using copilot and what they're doing.

366
00:25:23,500 --> 00:25:29,220
eDiscovery becomes particularly important because in the event of some kind of issue

367
00:25:29,220 --> 00:25:36,540
that is of a legal nature, eDiscovery can capture the exact interactions that occurred

368
00:25:36,540 --> 00:25:43,140
between a user and the service and take a copy of that into a discovery package that

369
00:25:43,140 --> 00:25:45,460
can then be used in a legal scenario.

370
00:25:45,460 --> 00:25:48,100
And it doesn't matter where that's coming from, we'll capture that.

371
00:25:48,100 --> 00:25:53,620
So if I'm using copilot in a Teams environment, we'll capture that information.

372
00:25:53,620 --> 00:25:58,500
If I'm using copilot inline, inside Word, Excel, PowerPoint, for instance, we can capture

373
00:25:58,500 --> 00:26:01,460
that information as well as part of eDiscovery.

374
00:26:01,460 --> 00:26:06,620
So it's really important that we have the ability to see exactly how the interactions

375
00:26:06,620 --> 00:26:12,660
with copilot are occurring and that we have the evidence that we need to provide if things

376
00:26:12,660 --> 00:26:15,260
are brought up in a legal scenario.

377
00:26:15,260 --> 00:26:19,420
So capturing all of that stuff automatically, extremely important and allows us to bolster

378
00:26:19,420 --> 00:26:21,420
any cases that we might find ourselves involved in.
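The kind of record that makes this possible can be sketched as follows. This is purely illustrative: the field names are assumptions made for the example, not the unified audit log's real schema:

```python
# Hypothetical shape of an audit record for a Copilot interaction,
# carrying enough context (who, where, what) for eDiscovery to pull
# the exact exchange later. Field names are invented for the sketch.

import json
from datetime import datetime, timezone

def log_copilot_interaction(audit_log: list, user: str, app: str,
                            prompt: str, response_label: str) -> dict:
    """Append an audit record for one interaction and return it."""
    record = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "operation": "CopilotInteraction",
        "user": user,
        "app": app,  # e.g. Teams, Word, Excel, PowerPoint
        "prompt": prompt,
        "response_label": response_label,
    }
    audit_log.append(record)
    return record

log = []
rec = log_copilot_interaction(log, "alice@contoso.com", "Teams",
                              "Summarize the design spec", "Confidential")
print(json.dumps(rec, indent=2))
```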

379
00:26:21,420 --> 00:26:24,540
Can you give me a practical example of that?

380
00:26:24,540 --> 00:26:25,540
Sure.

381
00:26:25,540 --> 00:26:34,020
So let's just say that I've been working as a designer for some time and I'm working on

382
00:26:34,020 --> 00:26:41,380
a brand new product and I want to have a look at some of those confidential design documents.

383
00:26:41,380 --> 00:26:46,420
Now let's assume at the moment that we don't have a really good sensitivity label taxonomy

384
00:26:46,420 --> 00:26:47,500
in place.

385
00:26:47,500 --> 00:26:53,260
So what I do is I go and open up my document and I'm working with that.

386
00:26:53,260 --> 00:26:57,700
Then I go to copilot and I say, please give me a list of all the information related to

387
00:26:57,700 --> 00:27:00,260
the spec for this particular design.

388
00:27:00,260 --> 00:27:02,500
So copilot gives me that information.

389
00:27:02,500 --> 00:27:06,660
And in this case, because we don't necessarily have a sensitivity label framework in place,

390
00:27:06,660 --> 00:27:12,500
I take that information and I email it to a competitor, one of my friends at a competitor and say,

391
00:27:12,500 --> 00:27:14,580
look at this cool thing that we're building.

392
00:27:14,580 --> 00:27:17,620
Now we've got ourselves a bit of a problem.

393
00:27:17,620 --> 00:27:18,820
We've leaked some information.

394
00:27:18,820 --> 00:27:20,180
It's highly confidential.

395
00:27:20,180 --> 00:27:23,580
Unfortunately, we didn't have those other controls across it.

396
00:27:23,580 --> 00:27:28,260
The logging and the e-discovery allow us to go back and look at the interactions that

397
00:27:28,260 --> 00:27:33,780
occurred and then ensure that we maintain those interactions and don't delete them so

398
00:27:33,780 --> 00:27:38,780
that if something does occur from a legal perspective, we now have evidence as to what

399
00:27:38,780 --> 00:27:41,660
happened, who did it and when it occurred.

400
00:27:41,660 --> 00:27:47,580
People are listening to this and they're thinking, okay, I want to use copilot and I am pretty

401
00:27:47,580 --> 00:27:53,780
sure our data is a mess, or we don't even know, because from my experience, I think people

402
00:27:53,780 --> 00:28:00,220
don't even know, but they probably suspect that their data is overshared, etc.

403
00:28:00,220 --> 00:28:01,220
What can we do?

404
00:28:01,220 --> 00:28:06,660
Now I know that we've been working on some steps like a maturity model that people can

405
00:28:06,660 --> 00:28:14,180
take, but let's talk about maybe the first two phases, what you recommend people do

406
00:28:14,180 --> 00:28:16,500
to start with.

407
00:28:16,500 --> 00:28:18,940
Absolutely, yes.

408
00:28:18,940 --> 00:28:24,420
We are definitely working on guidance around things like oversharing concerns for your

409
00:28:24,420 --> 00:28:28,260
copilot for M365 deployment, absolutely doing that.

410
00:28:28,260 --> 00:28:33,380
Hopefully that collateral will be released soon, but just at the very start, one of the

411
00:28:33,380 --> 00:28:38,980
things you want to get a handle on is exactly how much of my content is overshared in the

412
00:28:38,980 --> 00:28:41,700
first place and where is that content residing?

413
00:28:41,700 --> 00:28:47,900
One of the first things that you can do is go into your AI Hub in the compliance portal

414
00:28:47,900 --> 00:28:49,340
and run an oversharing report.

415
00:28:49,340 --> 00:28:55,940
An oversharing report in AI Hub is really useful because it will give you a 30-day backdated

416
00:28:55,940 --> 00:29:01,300
report of the number of unprotected files in SharePoint Online that were referenced

417
00:29:01,300 --> 00:29:04,300
by Microsoft copilot, which is really, really useful.

418
00:29:04,300 --> 00:29:09,700
Over the last 30 days, here's all the unprotected files that Copilot accessed.

419
00:29:09,700 --> 00:29:13,420
Then just by doing that, you can get an understanding of, well, what sites are they in?

420
00:29:13,420 --> 00:29:14,780
Why are they not locked down?

421
00:29:14,780 --> 00:29:21,340
Maybe I need to go in there and start working on securing my SharePoint Online environments

422
00:29:21,340 --> 00:29:23,820
a little bit more.

423
00:29:23,820 --> 00:29:27,660
If you look at something like the SharePoint Advanced Management capabilities in SharePoint

424
00:29:27,660 --> 00:29:34,540
Premium, there's a data access governance report that you can run to get an understanding

425
00:29:34,540 --> 00:29:39,620
of what are the permissions across my sites and what do I necessarily need to do in order

426
00:29:39,620 --> 00:29:40,620
to fix that.

427
00:29:40,620 --> 00:29:45,900
At the very beginning, we shouldn't just be jumping into saying, right, let's throw sensitivity

428
00:29:45,900 --> 00:29:48,780
labels on everything with the highest levels of encryption.

429
00:29:48,780 --> 00:29:53,060
Let's first find out what our level of exposure is before we go any further.

430
00:29:53,060 --> 00:29:59,900
Once you've done that, you can start taking some steps to look at your sensitivity label

431
00:29:59,900 --> 00:30:07,220
taxonomy, make sure it works for you and make sure that the ability to lock down that information

432
00:30:07,220 --> 00:30:08,500
is in there.

433
00:30:08,500 --> 00:30:10,420
Some of your labels will require encryption.

434
00:30:10,420 --> 00:30:16,220
Will there be a static set of ACLs in the encryption, or will it be more open so that the user,

435
00:30:16,220 --> 00:30:21,980
when they apply the label, can then choose which users, groups, domains should have access

436
00:30:21,980 --> 00:30:23,820
to this information as well?

437
00:30:23,820 --> 00:30:30,060
But also, don't forget that SharePoint sites themselves have the ability to have sensitivity

438
00:30:30,060 --> 00:30:32,100
labels applied to them.

439
00:30:32,100 --> 00:30:36,900
In my organization, I can create sensitivity labels that determine the privacy settings

440
00:30:36,900 --> 00:30:41,240
of SharePoint sites as well as the access control settings of SharePoint sites.

441
00:30:41,240 --> 00:30:46,380
By labeling a SharePoint site itself as highly confidential, I might immediately restrict

442
00:30:46,380 --> 00:30:48,740
it to a private group, a private site.

443
00:30:48,740 --> 00:30:54,540
I might immediately say that on unmanaged devices, I can't download any information

444
00:30:54,540 --> 00:30:55,540
from it.

445
00:30:55,540 --> 00:31:00,700
Just various things that will ensure that Copilot is not getting too much access to

446
00:31:00,700 --> 00:31:03,260
information itself from the very beginning.

447
00:31:03,260 --> 00:31:06,740
And then of course, there are other things you can do around things like site lifecycle

448
00:31:06,740 --> 00:31:11,660
management, making sure that sites are not left lying around when they're no longer relevant.

449
00:31:11,660 --> 00:31:14,540
Things like auto labeling.

450
00:31:14,540 --> 00:31:21,380
So getting to a point where we're not relying on the end users to label things themselves,

451
00:31:21,380 --> 00:31:26,260
but actually setting up auto labeling rules across things like Exchange and SharePoint

452
00:31:26,260 --> 00:31:28,700
to look for the presence of sensitive data.

453
00:31:28,700 --> 00:31:32,940
Find those credit card numbers and automatically label the files that contain them with the right

454
00:31:32,940 --> 00:31:35,640
sensitivity labeling and the right encryption template.

455
00:31:35,640 --> 00:31:38,380
So again, the user doesn't have to worry about it.

456
00:31:38,380 --> 00:31:43,260
But also we know that Copilot will not accidentally be giving out information that maybe it shouldn't.
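An auto-labeling rule of this kind boils down to a pattern match plus a checksum. A hedged sketch of the idea, not Purview's detection logic; the label names and function are invented for the example:

```python
# Illustrative auto-labeling rule: scan text for candidate credit
# card numbers (digit pattern plus Luhn checksum) and decide which
# label to apply. Labels and API are assumptions for the sketch.

import re

def luhn_valid(number: str) -> bool:
    """Standard Luhn checksum over a string of digits."""
    checksum = 0
    for i, ch in enumerate(reversed(number)):
        d = int(ch)
        if i % 2 == 1:  # double every second digit from the right
            d *= 2
            if d > 9:
                d -= 9
        checksum += d
    return checksum % 10 == 0

def auto_label(text: str) -> str:
    """Return the label an auto-labeling rule might apply to text."""
    candidates = re.findall(r"\b(?:\d[ -]?){13,16}\b", text)
    for c in candidates:
        if luhn_valid(re.sub(r"[ -]", "", c)):
            return "Highly Confidential"  # sensitive data detected
    return "General"

print(auto_label("Card on file: 4111 1111 1111 1111"))  # Highly Confidential
print(auto_label("Meeting notes, nothing sensitive"))   # General
```

The real service uses managed sensitive information types rather than a hand-rolled regex, but the shape of the decision is the same: detect, then label without user involvement.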

457
00:31:43,260 --> 00:31:45,220
Are you serious?

458
00:31:45,220 --> 00:31:47,220
There's an oversharing report?

459
00:31:47,220 --> 00:31:48,220
Absolutely.

460
00:31:48,220 --> 00:31:49,220
Yep.

461
00:31:49,220 --> 00:31:50,220
There's an oversharing report.

462
00:31:50,220 --> 00:31:55,380
If you go to your AI Hub, you run your oversharing report and it will give you the number of

463
00:31:55,380 --> 00:32:00,220
unprotected files in SharePoint Online that were referenced by M365 Copilot in the last

464
00:32:00,220 --> 00:32:01,220
30 days.

465
00:32:01,220 --> 00:32:02,220
That's funny.

466
00:32:02,220 --> 00:32:04,660
We need one of those for social media, I think.

467
00:32:04,660 --> 00:32:07,220
Anyway, that's another discussion.

468
00:32:07,220 --> 00:32:12,340
So tell me what like a day in the life, what it's like to work with Microsoft Purview.

469
00:32:12,340 --> 00:32:16,220
Like is it one role that generally tends to do all these things?

470
00:32:16,220 --> 00:32:21,460
Is it different sort of customer roles and jobs within the organization and different

471
00:32:21,460 --> 00:32:22,460
teams?

472
00:32:22,460 --> 00:32:23,860
Like what does that look like?

473
00:32:23,860 --> 00:32:28,340
And then, you know, like who, you know, basically who uses it and what is their kind of daily

474
00:32:28,340 --> 00:32:29,820
workflow look like?

475
00:32:29,820 --> 00:32:30,820
Sure.

476
00:32:30,820 --> 00:32:35,220
So generally speaking, when you think about Microsoft Purview as an umbrella, it does

477
00:32:35,220 --> 00:32:41,180
tend to span, you know, certain teams and those teams tend to match up quite nicely

478
00:32:41,180 --> 00:32:42,420
with the pillars themselves.

479
00:32:42,420 --> 00:32:48,500
And that was very deliberately done because we tend to find in our customers that we have

480
00:32:48,500 --> 00:32:54,780
teams dedicated to data security, stuff like, you know, monitoring data loss prevention,

481
00:32:54,780 --> 00:32:59,660
checking for data loss incidents, looking at things like insider risk.

482
00:32:59,660 --> 00:33:03,980
We have the data governance office, which is generally very different to the data security

483
00:33:03,980 --> 00:33:04,980
office.

484
00:33:04,980 --> 00:33:10,460
The data governance office is all about democratizing access to data for your data consumers, whilst

485
00:33:10,460 --> 00:33:17,580
ensuring that access is governed effectively, so that the right people are accessing

486
00:33:17,580 --> 00:33:19,820
the right information at the right time.

487
00:33:19,820 --> 00:33:24,380
And then you have your risk and your compliance groups, which are really responsible for that

488
00:33:24,380 --> 00:33:28,900
more legal side of the fence, ensuring that data is no longer left around that doesn't

489
00:33:28,900 --> 00:33:30,020
need to be there.

490
00:33:30,020 --> 00:33:36,420
And also making sure that anything that does occur is, you know, prepared for any legal

491
00:33:36,420 --> 00:33:38,400
proceedings that come along.

492
00:33:38,400 --> 00:33:44,300
So we tend to find that those three pillars generally represent three different teams.

493
00:33:44,300 --> 00:33:48,380
And the levels of interactions between those teams can be greater or less depending on

494
00:33:48,380 --> 00:33:50,660
the company in question.

495
00:33:50,660 --> 00:33:55,260
Smaller companies, you tend to find data security and risk and compliance merged into a single

496
00:33:55,260 --> 00:33:56,260
area.

497
00:33:56,260 --> 00:34:03,020
So your average purview user in there will have quite broad responsibilities, but data

498
00:34:03,020 --> 00:34:07,380
governance tends to be something that is siloed into its own department.

499
00:34:07,380 --> 00:34:16,060
And it's quite often not necessarily controlled by IT itself, but by high level business roles.

500
00:34:16,060 --> 00:34:22,780
And IT becomes very much a provider for the data governance function.

501
00:34:22,780 --> 00:34:28,260
That's actually a beautiful segue into something we want to add to each of the episodes, which

502
00:34:28,260 --> 00:34:33,500
is when we talk to our guests is get an idea of what their day in the life looks like.

503
00:34:33,500 --> 00:34:35,940
Like what does Mac do on an average day?

504
00:34:35,940 --> 00:34:37,660
I mean, what does your job involve?

505
00:34:37,660 --> 00:34:40,620
You know, you wake up in the morning and then what's next?

506
00:34:40,620 --> 00:34:42,580
Just sort of walk through what a typical day looks like.

507
00:34:42,580 --> 00:34:43,580
Yeah.

508
00:34:43,580 --> 00:34:49,780
So for me, basically my day is split between engineering internally focused work.

509
00:34:49,780 --> 00:34:57,220
So things like reviewing specs, commenting on plans for what we're intending to do over

510
00:34:57,220 --> 00:34:59,060
the next six months.

511
00:34:59,060 --> 00:35:05,140
Also talking with engineering and helping to advocate for certain pieces of functionality

512
00:35:05,140 --> 00:35:09,180
in a product over others over the next six months because of customer demand.

513
00:35:09,180 --> 00:35:11,300
I spend an awful lot of time in front of customers.

514
00:35:11,300 --> 00:35:16,300
In fact, I would say probably 70% of my time is talking to customers, understanding their

515
00:35:16,300 --> 00:35:21,180
blockers, helping them get deployed with the Purview solution, and then taking

516
00:35:21,180 --> 00:35:22,940
those results back to engineering.

517
00:35:22,940 --> 00:35:28,180
So in any purview deployment, you are going to find blockers for someone, for anyone.

518
00:35:28,180 --> 00:35:31,460
There will always be something that they need that's not in the product.

519
00:35:31,460 --> 00:35:37,420
And it's one of my jobs to make sure that I'm getting those requirements and then

520
00:35:37,420 --> 00:35:41,980
interpreting those requirements in a way that engineering can understand and act on. It keeps

521
00:35:41,980 --> 00:35:43,340
me busy.

522
00:35:43,340 --> 00:35:49,940
So Macca, the thing that we ask folks right at the end of the podcast to wrap up is if

523
00:35:49,940 --> 00:35:55,740
you had a final thought to leave our listeners with, what would it be?

524
00:35:55,740 --> 00:35:56,740
Sure.

525
00:35:56,740 --> 00:36:00,460
So we've talked a lot about, you know, the types of work that you probably need to put

526
00:36:00,460 --> 00:36:05,500
in if you want to ensure that you are appropriately securing copilot interactions.

527
00:36:05,500 --> 00:36:11,420
And I think Mark said before, you know, quite often a lot of this work is that

528
00:36:11,420 --> 00:36:16,340
piece that gets forgotten about during the rush to actually get the solution in place.

529
00:36:16,340 --> 00:36:19,020
And my advice is don't put the work off.

530
00:36:19,020 --> 00:36:23,340
If you're starting to think about using copilot, take into account the stuff we've been through

531
00:36:23,340 --> 00:36:26,940
today and make it part of your initial deployment plan.

532
00:36:26,940 --> 00:36:31,340
Fundamentally, don't try to fit the roof before you build the walls.

533
00:36:31,340 --> 00:36:32,340
Yeah.

534
00:36:32,340 --> 00:36:34,140
Words to live by.

535
00:36:34,140 --> 00:36:37,100
Um, so, Hey Mac, thanks so much for joining us this week.

536
00:36:37,100 --> 00:36:38,100
Yeah.

537
00:36:38,100 --> 00:36:39,100
Purview's a complex beast.

538
00:36:39,100 --> 00:36:42,140
So it's good to have someone from the, you know, the engineering side of the house sort

539
00:36:42,140 --> 00:36:46,340
of talk about it, especially on the more practical aspects, you know, integration with

540
00:36:46,340 --> 00:36:47,740
copilot.

541
00:36:47,740 --> 00:36:49,740
So with that, again, thank you so much for joining us.

542
00:36:49,740 --> 00:36:51,260
Really appreciate you taking the time.

543
00:36:51,260 --> 00:36:55,260
And to all our listeners out there, our next podcast is episode 100.

544
00:36:55,260 --> 00:36:57,260
Um, I'll leave it at that.

545
00:36:57,260 --> 00:36:58,260
It's going to be a special episode.

546
00:36:58,260 --> 00:37:01,220
Everyone stay safe and we'll see you next time.

547
00:37:01,220 --> 00:37:02,220
Episode 100.

548
00:37:02,220 --> 00:37:03,220
Take care.

549
00:37:03,220 --> 00:37:05,580
Thanks for listening to the Azure security podcast.

550
00:37:05,580 --> 00:37:12,420
You can find show notes and other resources at our website, azsecuritypodcast.net.

551
00:37:12,420 --> 00:37:17,980
If you have any questions, please find us on Twitter @AzureSecPod. Background music

552
00:37:17,980 --> 00:37:27,740
is from ccmixter.com and licensed under the Creative Commons license.

