1
00:00:00,000 --> 00:00:06,200
Welcome to the Azure Security Podcast,

2
00:00:06,200 --> 00:00:09,380
where we discuss topics relating to security, privacy,

3
00:00:09,380 --> 00:00:13,720
reliability, and compliance on the Microsoft Cloud Platform.

4
00:00:13,720 --> 00:00:17,460
Hey everybody, welcome to Episode 59.

5
00:00:17,460 --> 00:00:19,740
This week is another one of those special weeks.

6
00:00:19,740 --> 00:00:21,220
We actually don't have a guest.

7
00:00:21,220 --> 00:00:23,460
This week we have Mark and he's going to talk about

8
00:00:23,460 --> 00:00:25,700
something that's near and dear to his heart,

9
00:00:25,700 --> 00:00:27,740
but more on that in a moment.

10
00:00:27,740 --> 00:00:30,980
First up, let's take a little quick lap around the news.

11
00:00:30,980 --> 00:00:32,740
Sarah, why don't you kick things off?

12
00:00:32,740 --> 00:00:37,380
Sure. I haven't got too much news this time around,

13
00:00:37,380 --> 00:00:41,180
but the one thing that is exciting is,

14
00:00:41,180 --> 00:00:44,920
we announced a public preview release of Gateway Load Balancer.

15
00:00:44,920 --> 00:00:50,180
So what that is is a load balancer that is actually

16
00:00:50,180 --> 00:00:51,820
designed for use with

17
00:00:51,820 --> 00:00:54,700
NVAs or network virtual appliances.

18
00:00:54,700 --> 00:00:57,380
So if you're not familiar with what those are,

19
00:00:57,380 --> 00:01:00,980
those are generally our third-party firewalls.

20
00:01:00,980 --> 00:01:03,740
They have to run as NVAs in Azure.

21
00:01:03,740 --> 00:01:08,780
So previously, other load balancers didn't work with them,

22
00:01:08,780 --> 00:01:12,020
but of course, load balancing is important for firewalls.

23
00:01:12,020 --> 00:01:15,740
If you'd rather use a third-party firewall rather than an Azure one,

24
00:01:15,740 --> 00:01:18,540
you can now have a look at the Gateway Load Balancer

25
00:01:18,540 --> 00:01:22,500
to do your load balancing across those NVAs.

26
00:01:22,500 --> 00:01:24,200
So yeah, pretty exciting.

27
00:01:24,200 --> 00:01:26,820
I think that's going to be super useful.

28
00:01:26,820 --> 00:01:28,900
So go and have a look if you are using

29
00:01:28,900 --> 00:01:30,580
NVAs in your environment.

30
00:01:30,580 --> 00:01:33,340
Then my other exciting bit of news is,

31
00:01:33,340 --> 00:01:35,780
I get to go to Black Hat and DEF CON this year.

32
00:01:35,780 --> 00:01:38,740
Hooray. So that'll be really exciting.

33
00:01:38,740 --> 00:01:43,260
It's of course, really exciting to go back to doing some in-person events.

34
00:01:43,260 --> 00:01:44,500
That Gateway Load Balancer,

35
00:01:44,500 --> 00:01:45,820
that's a new load balancer.

36
00:01:45,820 --> 00:01:47,380
So it's another one we've added

37
00:01:47,380 --> 00:01:49,620
to the list of load balancers. Is that right?

38
00:01:49,620 --> 00:01:54,780
Yes, it is. It's a new one because our other load balancers

39
00:01:54,780 --> 00:01:57,060
couldn't work with NVAs.

40
00:01:57,060 --> 00:02:00,540
They would only work with native Azure devices.

41
00:02:00,540 --> 00:02:03,460
So this one is NVA aware.

42
00:02:03,460 --> 00:02:08,140
So yeah, it's a brand new type of load balancer.

43
00:02:08,140 --> 00:02:11,220
An NVA is a network virtual appliance, is that correct?

44
00:02:11,220 --> 00:02:13,180
It is, yes. That's right, Mark.

45
00:02:13,180 --> 00:02:15,900
So usually when we're talking about an NVA,

46
00:02:15,900 --> 00:02:20,660
we're talking about if you choose to use a Cisco ASA,

47
00:02:20,660 --> 00:02:27,540
Palo Alto, a Check Point, some non-Microsoft firewall in your environment.

48
00:02:27,540 --> 00:02:32,180
In Azure, they run as NVAs, and the architecture,

49
00:02:32,180 --> 00:02:34,740
this is a whole rabbit hole we won't go down,

50
00:02:34,740 --> 00:02:36,860
but architecturally,

51
00:02:36,860 --> 00:02:39,860
because they're running on top of a virtual machine,

52
00:02:39,860 --> 00:02:44,020
we have to set them up a little bit differently to a native,

53
00:02:44,020 --> 00:02:47,260
like say Azure firewall or Azure WAF.

54
00:02:47,260 --> 00:02:50,100
One of the things that was a disadvantage was that

55
00:02:50,100 --> 00:02:56,420
the Azure load balancers couldn't work with those NVAs,

56
00:02:56,420 --> 00:02:57,980
but now this new one can.

57
00:02:57,980 --> 00:03:01,260
So that's why it's important and exciting.

58
00:03:01,260 --> 00:03:03,100
All right. I've got a few news items.

59
00:03:03,100 --> 00:03:06,820
The first one is that Azure Confidential Ledger is now generally available.

60
00:03:06,820 --> 00:03:08,340
Now, a few weeks ago,

61
00:03:08,340 --> 00:03:10,860
we had a discussion about Azure SQL Ledger,

62
00:03:10,860 --> 00:03:13,860
which is not the same as the Confidential Ledger.

63
00:03:13,860 --> 00:03:15,740
Now, one thing that's really cool is that

64
00:03:15,740 --> 00:03:19,140
the SQL Ledger can use Confidential Ledger.

65
00:03:19,140 --> 00:03:22,500
So Confidential Ledger is basically at the back end

66
00:03:22,500 --> 00:03:26,380
using blockchain-like technology to provide things like

67
00:03:26,380 --> 00:03:29,980
tamper-evident data streams and so on.

68
00:03:29,980 --> 00:03:33,300
I've been working actually with the folks at

69
00:03:33,300 --> 00:03:35,300
Microsoft Research in Cambridge on some of

70
00:03:35,300 --> 00:03:38,540
the sample code that they had over the last few weeks.

71
00:03:38,540 --> 00:03:40,060
Long story behind that,

72
00:03:40,060 --> 00:03:42,980
but fantastic technology, very simple to use.

73
00:03:42,980 --> 00:03:46,780
It's basically a whole bunch of APIs you can call and you'll end up building

74
00:03:46,780 --> 00:03:50,100
yourself a Confidential Ledger at the back end.

75
00:03:50,100 --> 00:03:53,700
Also, now that I'm in the Azure Database Platform,

76
00:03:53,700 --> 00:03:56,540
Azure Database for MySQL flexible server now

77
00:03:56,540 --> 00:03:59,220
supports data encryption with Customer Managed Keys.

78
00:03:59,220 --> 00:04:01,100
That is in public preview.

79
00:04:01,100 --> 00:04:03,660
So this is where you can store your keys in

80
00:04:03,660 --> 00:04:08,180
Key Vault and that way you can do any degree of key backup, key rotation.

81
00:04:08,180 --> 00:04:12,060
So basically the whole key lifecycle is totally up to you

82
00:04:12,060 --> 00:04:14,740
rather than having the platform manage the key.

83
00:04:14,740 --> 00:04:18,660
The last one is that we've now added

84
00:04:18,660 --> 00:04:23,340
Trusted Launch support for some new VM types.

85
00:04:23,340 --> 00:04:29,380
So DCsv3 and DCdsv3 VMs now support

86
00:04:29,380 --> 00:04:31,420
Trusted Boot or Trusted Launch.

87
00:04:31,420 --> 00:04:33,220
So this is another thing that's from the stable of

88
00:04:33,220 --> 00:04:34,620
Confidential Computing.

89
00:04:34,620 --> 00:04:36,840
Essentially, if you're familiar with the way

90
00:04:36,840 --> 00:04:39,700
TPMs, Trusted Platform Modules, work in Windows,

91
00:04:39,700 --> 00:04:41,980
where you can do what's called a measured boot to make sure that

92
00:04:41,980 --> 00:04:45,940
the whole boot sequence is free of malware and root kits and boot kits.

93
00:04:45,940 --> 00:04:47,940
Same thing. Exact same thing.

94
00:04:47,940 --> 00:04:49,780
The big difference is, rather than being a TPM,

95
00:04:49,780 --> 00:04:51,940
it's a vTPM, a virtualized TPM,

96
00:04:51,940 --> 00:04:54,780
but the technology is essentially the same.

97
00:04:54,780 --> 00:04:57,220
So that's all the news I have.

98
00:04:57,220 --> 00:04:59,460
So with that, let's turn our attention to Mark,

99
00:04:59,460 --> 00:05:03,260
who's here to talk about something that is absolutely near and dear to his heart,

100
00:05:03,260 --> 00:05:07,340
and that is a new Chief Information Security Officer workshop.

101
00:05:07,340 --> 00:05:11,100
So, Mark, why don't you give us a quick overview of what on Earth this thing is?

102
00:05:11,100 --> 00:05:14,660
Yeah, we'd like to call it the CISO workshop.

103
00:05:14,660 --> 00:05:18,660
I actually finally decided on the pronunciation of that when,

104
00:05:18,660 --> 00:05:24,020
at one of our CISO summits, Bret Arsenault asked the audience of these top 50 or 100 CISOs,

105
00:05:24,020 --> 00:05:25,260
how do you actually pronounce it?

106
00:05:25,260 --> 00:05:28,940
And it was like overwhelmingly CISO, so that's actually what I go with.

107
00:05:28,940 --> 00:05:33,580
But ultimately, this is an update of a workshop.

108
00:05:33,580 --> 00:05:37,860
It's a pretty significant overhaul of something that we had initially put out,

109
00:05:37,860 --> 00:05:40,380
and I think it was 2016 or 2017,

110
00:05:40,380 --> 00:05:42,700
that's still fairly popular,

111
00:05:42,700 --> 00:05:46,540
something like a couple thousand unique visitors a month.

112
00:05:46,540 --> 00:05:53,980
And so we decided, especially with all the changes and all the things that we've learned in the past five years or so,

113
00:05:53,980 --> 00:05:56,980
to go ahead and update this.

114
00:05:56,980 --> 00:06:00,660
And then, so this is the new version of the CISO workshop.

115
00:06:00,660 --> 00:06:06,180
It pretty much covers just about everything that someone in that role,

116
00:06:06,180 --> 00:06:09,420
or a similar role, because not everyone who does the job gets that title,

117
00:06:09,420 --> 00:06:14,900
would need to care about, and as many insights and lessons learned,

118
00:06:14,900 --> 00:06:17,820
and best practices and models as we could pack into it,

119
00:06:17,820 --> 00:06:25,100
to help the folks that are doing that job really learn from all the things that we've been learning,

120
00:06:25,100 --> 00:06:29,580
both inside Microsoft as well as across our customer base.

121
00:06:29,580 --> 00:06:31,420
So I don't even know where to start with this.

122
00:06:31,420 --> 00:06:32,660
I'm going to be totally honest with you.

123
00:06:32,660 --> 00:06:35,420
So what are you trying to achieve with this thing?

124
00:06:35,420 --> 00:06:40,940
And also, I guess that if it's five or six years old, as you said, it's a big overhaul,

125
00:06:40,940 --> 00:06:45,020
my guess is the threat landscape has changed significantly.

126
00:06:45,020 --> 00:06:50,780
The attackers' methods have changed, as have the defenders'.

127
00:06:50,780 --> 00:06:54,300
So you want to just give our listeners an overview as to who this is aimed at,

128
00:06:54,300 --> 00:06:56,620
and what are you trying to achieve with all of this?

129
00:06:56,620 --> 00:07:01,980
So ultimately, we aim this at, and I think it's on the landing page itself,

130
00:07:01,980 --> 00:07:08,380
we talk about CISOs, CIOs, those sorts of executive leaders,

131
00:07:08,380 --> 00:07:09,740
whatever title they happen to have.

132
00:07:10,380 --> 00:07:14,700
Generally, they're the direct reports and the directors that run a function or a department

133
00:07:14,700 --> 00:07:21,020
for them. And then any other roles that have sort of an enterprise-wide scope,

134
00:07:22,060 --> 00:07:27,180
such as a lead architect or a team of architects within the organization,

135
00:07:27,180 --> 00:07:32,540
or a governance lead, et cetera. So anybody that deals with all of the security across

136
00:07:33,100 --> 00:07:38,540
an entire technical estate or organization is really kind of our target audience.

137
00:07:38,540 --> 00:07:42,940
A lot of other folks would get a lot out of it, but that's really where we aimed it at.

138
00:07:42,940 --> 00:07:47,820
The thing that we've seen that's really changed in the past five years is just

139
00:07:47,820 --> 00:07:53,580
they've got a much more mature view and understanding of what the CISO faces as a job,

140
00:07:53,580 --> 00:07:59,900
because it's one of the toughest jobs in security, period. You have to be literate and

141
00:07:59,900 --> 00:08:04,220
conversant on technical topics, and so you can sort of understand those.

142
00:08:05,100 --> 00:08:10,380
You have to be literate and conversant on security topics and kind of the dynamics of security,

143
00:08:10,380 --> 00:08:16,060
the threat environment, et cetera. And you also have to be literate and conversant on business

144
00:08:16,060 --> 00:08:23,020
topics and risk and how the organization and senior leaders and board members view the world,

145
00:08:23,020 --> 00:08:28,140
how they think about things, how they manage risk, the taxonomies that they use, whether it's

146
00:08:28,140 --> 00:08:34,060
informal or formal. You've got to be able to talk to them about what are their business initiatives,

147
00:08:34,060 --> 00:08:39,580
where are they going to be looking for revenue, what does risk mean to them in terms of actual

148
00:08:39,580 --> 00:08:44,940
loss and what do they care about, what do they not care about. So the CISO is really a bridge

149
00:08:44,940 --> 00:08:52,060
between a lot of different worlds, and it's a really, really tough job. Basically, the goal

150
00:08:52,060 --> 00:08:55,980
of this is to help people be successful with that job. That's the primary goal,

151
00:08:56,780 --> 00:09:03,580
so they can mature their programs and be successful with it. And then a lot of it, honestly, is

152
00:09:03,580 --> 00:09:07,740
there's so many people that aspire to this role, and we need people aspiring to this role because

153
00:09:09,660 --> 00:09:15,500
it's a very important job, and folks need to understand kind of what they're getting into

154
00:09:15,500 --> 00:09:19,820
and just sort of learn that. So we fully expect folks, we use this as a ramp up as well, to

155
00:09:19,820 --> 00:09:22,780
sort of learn the job, whether they want to do it or not.

156
00:09:22,780 --> 00:09:29,020
So Mark, I know a little bit about this being that I did help you out on some of those videos,

157
00:09:29,020 --> 00:09:34,620
just a little bit. So why don't you tell us a little bit more about the structure

158
00:09:35,180 --> 00:09:39,580
of the workshop and how we go through the material?

159
00:09:40,540 --> 00:09:46,380
To Michael's sort of comment earlier, ultimately, the first thing that we start with is context,

160
00:09:46,380 --> 00:09:50,460
right? So the threat environment is one of the big pieces of that context, and how has that evolved?

161
00:09:51,420 --> 00:09:58,300
And we've seen a lot of maturity, as it were, in the not so good sense of the attackers and

162
00:09:58,300 --> 00:10:03,980
the way that they buy and sell things and their business models through extortion, ransomware, and

163
00:10:03,980 --> 00:10:10,300
other things, and data theft has matured as well. So we've seen a lot of sophistication

164
00:10:10,300 --> 00:10:14,060
get put into there, whether it's business model or technical sophistication.

165
00:10:14,060 --> 00:10:21,500
And so the first part is really focused on not only the threat environment, but also the business

166
00:10:21,500 --> 00:10:26,220
environment and how is that changing security and how are the cloud and technical platform

167
00:10:26,220 --> 00:10:32,060
changes changing security and these sort of drivers for modernizing. And then,

168
00:10:33,100 --> 00:10:37,580
in the first part, we also cover roles and responsibilities and how are those jobs changing

169
00:10:37,580 --> 00:10:42,220
and how do all these things connect together and what are the sort of jobs of the future or jobs of

170
00:10:42,220 --> 00:10:48,460
the current. And then strategy and how do we recommend, we basically include a reference

171
00:10:48,460 --> 00:10:53,580
strategy, including defining very specific initiatives with outcomes and goals, etc.

172
00:10:54,140 --> 00:10:59,660
And these bring in our trust principles and they tie in our cloud adoption framework of how

173
00:10:59,660 --> 00:11:04,380
organizations are modernizing and adopting the cloud, etc. So the first section is all about

174
00:11:04,380 --> 00:11:09,580
context setting, right? So that, okay, let's start with a common baseline. And then the second and

175
00:11:09,580 --> 00:11:15,020
third sections are kind of, I guess the easiest way to think about it is the top half and the

176
00:11:15,020 --> 00:11:19,500
bottom half of the job, right? So the top half of the job is how do you align your security

177
00:11:19,500 --> 00:11:24,140
program to the business, right? To the organization you're in, how do you align it to the risk management,

178
00:11:25,740 --> 00:11:30,540
taxonomy and system? And how do you engage business leaders and make them successful?

179
00:11:31,660 --> 00:11:35,500
How do you integrate with your IT departments? And what is the north star of the program that

180
00:11:35,500 --> 00:11:40,380
you're trying to achieve? Business resilience, that's the short answer. And then the third section

181
00:11:40,380 --> 00:11:45,980
is a little bit more of sort of the bits and bytes of the program itself and the strategy. So it's

182
00:11:45,980 --> 00:11:51,180
not getting into technical stuff, but it is the disciplines that you need to run and you need

183
00:11:51,180 --> 00:11:56,300
to have a sort of an ongoing, we're going to do this for a very long period of time. So things

184
00:11:56,300 --> 00:12:01,980
like access control, security operations, asset protection, governance, etc., and innovation

185
00:12:01,980 --> 00:12:07,660
security. And like these are the things that you need to do on an ongoing basis and have sort of

186
00:12:07,660 --> 00:12:12,700
an ongoing program of record that makes sure this is always getting done. And so it's really kind of

187
00:12:12,700 --> 00:12:17,420
that three parts, the context, that top half of the program, how do you connect security to the rest

188
00:12:17,420 --> 00:12:23,340
of the organization and that bottom half of how do you structure security. And that bottom half,

189
00:12:23,340 --> 00:12:27,180
we actually lean heavily on some of the work with the Open Group that we've been part of

190
00:12:27,180 --> 00:12:32,140
sort of what does a modern, zero trust enabled security program look like?

191
00:12:32,780 --> 00:12:37,740
That's super interesting, actually. I'm going to have to dig into this just because I'm nosy,

192
00:12:37,740 --> 00:12:45,020
but tell me more about your partnership with the Open Group. That's very interesting

193
00:12:45,020 --> 00:12:51,340
that I just want to know more about. Oh, absolutely. So I was actually, I just flew in today

194
00:12:51,340 --> 00:12:58,060
as of the recording of this from a conference with the Open Group. And so

195
00:12:58,860 --> 00:13:04,860
they're a standards organization that dates back to, I believe they actually defined the UNIX

196
00:13:04,860 --> 00:13:09,180
standard, if I recall correctly. I guess I should say "we" because I'm part of the Open

197
00:13:09,180 --> 00:13:14,380
Group as well. But I was definitely not a part of defining UNIX. I'm much too young for that.

198
00:13:14,380 --> 00:13:22,140
And then things that folks might have heard of in the security industry, the Jericho forum was

199
00:13:22,140 --> 00:13:27,420
actually hosted by the Open Group. It's since been retired and become part of the Security Forum in

200
00:13:27,420 --> 00:13:32,380
general. But the folks that put out the Jericho commandments and de-perimeterization back in

201
00:13:32,940 --> 00:13:39,420
the early to mid-2000s were hosted there. The TOGAF standard, The Open Group Architecture

202
00:13:39,420 --> 00:13:46,060
Framework, I think, which is a very popular architectural definition and certification

203
00:13:47,420 --> 00:13:53,020
and lots of things on digital transformation as well. So it's basically an open standards

204
00:13:53,020 --> 00:13:58,140
organization. And we're actually working towards, and that was the talk that I gave with my

205
00:13:59,020 --> 00:14:04,780
co-chair of the Zero Trust Architecture working group there, was about

206
00:14:04,780 --> 00:14:12,300
the upcoming standard for Zero Trust that we're defining for a Zero Trust reference model.

207
00:14:13,340 --> 00:14:18,220
And one of the cool things, and this is just me personally because I've been working hard towards

208
00:14:18,220 --> 00:14:23,900
this, is one of the things I really, really liked about the conference today is we had a

209
00:14:23,900 --> 00:14:28,620
Microsoft perspective, which actually was from Joseph Davis. And then there was an Open Group

210
00:14:28,620 --> 00:14:35,740
perspective, which was the session that I did as well as the one that Jim Hietala did.

211
00:14:36,940 --> 00:14:40,540
And so the Open Group had a couple of different perspectives on Zero Trust. And then there

212
00:14:40,540 --> 00:14:47,340
was also NIST there. So Ruzia Supaya presented on the NIST perspective and their update from their

213
00:14:47,340 --> 00:14:53,820
National Cybersecurity Center of Excellence, or NCCoE for short, project on Zero Trust.

214
00:14:53,820 --> 00:14:59,340
And they have something on the order of a couple dozen vendors in there implementing Zero Trust

215
00:14:59,340 --> 00:15:04,940
in the lab. And so it was really fascinating to hear all these different perspectives and

216
00:15:04,940 --> 00:15:10,460
present it in my case. And they're actually all pretty closely aligned. There's a lot of people

217
00:15:10,460 --> 00:15:14,700
that remarked on that. And it's like finally giving me hope that, hey, in security, we're

218
00:15:14,700 --> 00:15:19,260
actually starting to agree on something. We actually have consensus on what Zero Trust is,

219
00:15:19,260 --> 00:15:25,020
which was really awesome because we've had such a sort of like fractured, individual,

220
00:15:25,020 --> 00:15:29,900
specialized view of security for so long. We're all finally agreeing on how to modernize the thing.

221
00:15:29,900 --> 00:15:36,140
So that was particularly exciting for me. That is super cool. And as someone who has also done

222
00:15:36,140 --> 00:15:43,980
a little bit with the NCCoE as well, that's been really cool. I helped out with the lab

223
00:15:43,980 --> 00:15:49,980
a little bit. So not nearly as exciting as Mark. But so how would you say all of that's feeding in

224
00:15:49,980 --> 00:15:56,700
then to the CISO workshop? Is there anything else you can add there? Are we going to see a lot of

225
00:15:56,700 --> 00:16:05,420
Zero Trust stuff in the CISO workshop? Yes. Effectively, Zero Trust, the way that we think

226
00:16:05,420 --> 00:16:11,260
about it, and I can say we in a much broader sense now, having had that confirmation, is it's,

227
00:16:11,260 --> 00:16:17,260
if you think about what's happening, starting with digital transformation, the businesses are

228
00:16:17,820 --> 00:16:21,180
shifting. And we talked about this quite a bit in the CISO workshop recordings.

229
00:16:21,980 --> 00:16:28,220
Businesses are shifting because, hey, you got these startups, no longer startups like Netflix and

230
00:16:28,220 --> 00:16:33,820
Amazon and Uber that have really disrupted the industries that they're in and taken a whole

231
00:16:33,820 --> 00:16:38,940
different view of it using cloud-based technology. And then that starts off this cloud transformation

232
00:16:38,940 --> 00:16:43,740
because organizations, in order to compete with that, businesses especially, have to actually

233
00:16:44,300 --> 00:16:48,860
meet customers where they want to be and not just run things like they used to in retail and all

234
00:16:48,860 --> 00:16:53,660
these other industries. And so you end up with this business transforming and figuring out

235
00:16:53,660 --> 00:16:59,580
its processes. You find the technology through cloud is transforming everything and changing it.

236
00:16:59,580 --> 00:17:04,780
Like everything changes at least a little bit, sometimes a lot. And then security, basically,

237
00:17:04,780 --> 00:17:11,980
zero trust is that third leg of the transformation. It's how security is modernizing and thinking of

238
00:17:11,980 --> 00:17:18,460
things differently and linking into those strategic changes. And so when we say zero trust is a part

239
00:17:18,460 --> 00:17:23,020
of this, it's basically woven throughout. And there's very concrete zero trust principles.

240
00:17:23,020 --> 00:17:27,420
There's very concrete technology that's different than the way things used to be. But ultimately,

241
00:17:27,420 --> 00:17:33,100
it's also changing roles and responsibilities. And it's changing mindsets in the way people think

242
00:17:33,100 --> 00:17:39,260
about it. I mean, it's a full top-to-bottom transformation where, yeah, you can see the old

243
00:17:39,260 --> 00:17:43,660
in it when you look at it, but you also see that it's very different than it used to be.

244
00:17:43,660 --> 00:17:49,500
So it's sort of just underpinning all these different things. And we kind of call it out

245
00:17:49,500 --> 00:17:53,900
in a bunch of places. So I don't want to elaborate too much unless we have to, but

246
00:17:54,700 --> 00:17:59,500
so you mentioned zero trust. I mean, the Open Group has a particular stance on that, right?

247
00:17:59,500 --> 00:18:03,580
Like they have, there is a zero trust working group within the Open Group.

248
00:18:03,580 --> 00:18:06,380
Oh, exactly. I'm one of the co-chairs of it. Yeah.

249
00:18:06,380 --> 00:18:13,660
Yeah. And so sort of to Sarah's point, I mean, are we seeing more customers adopting zero trust

250
00:18:13,660 --> 00:18:20,860
principles, not necessarily every single one, but starting somewhere and working towards what it

251
00:18:20,860 --> 00:18:24,780
means to go on a sort of a zero trust journey. Are we seeing that sort of picking up more steam?

252
00:18:24,780 --> 00:18:28,460
I realize this really doesn't have a lot to do necessarily with the CISO workshop, but

253
00:18:28,460 --> 00:18:32,940
I'm just curious more than anything else. Oh yeah. And we do talk about this,

254
00:18:32,940 --> 00:18:39,180
I think in several of the videos in there, but the answer is yes. So ultimately this transformation,

255
00:18:40,540 --> 00:18:45,100
we're seeing a lot of customers on the journey. I mean, many of them explicitly call it out.

256
00:18:46,140 --> 00:18:51,340
And many of those that are sort of transforming and modernizing are actually using the zero

257
00:18:51,340 --> 00:18:56,140
trust terminology. But just about every organization, if you're going to cloud,

258
00:18:56,140 --> 00:19:00,860
you can't just get away with a firewall, IDS, and IPS and call it a day like we used to,

259
00:19:01,900 --> 00:19:05,020
or throwing a bunch of logs in the SIEM and not actually,

260
00:19:05,900 --> 00:19:09,260
and having a whole bunch of false positives that burn people out. Those sort of like classic

261
00:19:09,260 --> 00:19:14,220
security problems of previous generations, everybody's trying to solve those problems.

262
00:19:14,780 --> 00:19:19,820
And so ultimately, just about everybody's on this journey, whether they know it or not,

263
00:19:19,820 --> 00:19:23,820
and whether they call it zero trust or not. Some people say they're adopting SASE.

264
00:19:23,820 --> 00:19:28,300
Great. SASE is basically a subset and a component, a specific architecture

265
00:19:29,740 --> 00:19:35,580
that fits within zero trust. SASE is secure access service edge, "sassy."

266
00:19:36,780 --> 00:19:42,620
I don't know why they pronounce it "sassy," but everybody does. But yeah, just about everybody's

267
00:19:42,620 --> 00:19:47,740
on that journey. It's interesting. Now that I've taken this position sort of on

268
00:19:47,740 --> 00:19:54,940
the back end of Azure, as it were, it's interesting how important zero trust is to the actual running

269
00:19:54,940 --> 00:20:03,740
of Azure. I see a lot of documentation. I see a lot of issues getting raised and so on around

270
00:20:03,740 --> 00:20:08,700
core zero trust principles, especially things like assume breach and least privilege.

271
00:20:08,700 --> 00:20:12,620
Those are probably the two biggest ones that I see referenced the most. I think,

272
00:20:13,180 --> 00:20:16,780
I'm not going to say that they're the easiest to do necessarily. I think that they're easiest to

273
00:20:16,780 --> 00:20:21,020
sort of understand really quickly. Yeah, we're seeing a lot of references to that. Again,

274
00:20:21,020 --> 00:20:25,340
on the Azure back end. When you sort of step back from it and you sort of understand the

275
00:20:25,340 --> 00:20:30,700
zero trust information, you kind of think about it. In many ways, all we've done with zero trust

276
00:20:31,340 --> 00:20:38,060
is we've taken away one bad assumption that we used as a shortcut, which is that we can create

277
00:20:38,060 --> 00:20:44,620
a safe network and then everything on it is de facto safe, which was a bad assumption.

278
00:20:44,620 --> 00:20:48,860
We essentially got called on it by the attackers and we got called on it by the business that's

279
00:20:49,420 --> 00:20:55,420
operating outside with SaaS, software as a service, and mobile devices, and all these other kind of

280
00:20:55,420 --> 00:21:01,420
things that are outside of your perimeter. Essentially, all zero trust is resetting security

281
00:21:01,420 --> 00:21:05,020
back to where it should have been, which is, let's think through this problem completely,

282
00:21:05,900 --> 00:21:10,460
without this assumption that we are on a safe network and therefore everything in it is magically safe.

283
00:21:10,460 --> 00:21:16,700
All we've done is ripped out that one huge assumption that was a shortcut that just wasn't valid.

284
00:21:17,420 --> 00:21:21,340
Okay, so Mark, as you know, I know a little bit about this because...

285
00:21:22,140 --> 00:21:25,260
A little bit. Come on, you were literally there during the recording.

286
00:21:26,060 --> 00:21:29,580
Yeah, I know a little bit about it because I was helping you do the recording.

287
00:21:29,580 --> 00:21:40,060
So, just for those of you listening: myself and our very awesome colleague,

288
00:21:40,060 --> 00:21:46,780
Elizabeth, helped do the recordings for this, which will be public, so you can go watch

289
00:21:46,780 --> 00:21:52,060
myself, Mark, and Elizabeth do the recordings for this workshop. Mark, let's talk about

290
00:21:52,700 --> 00:21:59,260
that recording time and some of the things that we did. I'm going to let you go and

291
00:21:59,260 --> 00:22:04,140
start this one off because I could tell lots of stories, probably half of which are not relevant.

292
00:22:04,140 --> 00:22:09,980
So, I'm going to let you go first here. Okay, so for the most part, most of the sections,

293
00:22:09,980 --> 00:22:13,580
most of the videos that you're going to see are pretty much a reference presentation,

294
00:22:13,580 --> 00:22:18,060
similar to how we would actually engage with the customer and present and discuss a particular

295
00:22:18,060 --> 00:22:24,220
topic. So, that's really the basis for all of these different sections that it's broken into.

296
00:22:24,220 --> 00:22:27,340
I can't remember how many. It feels like a dozen or two videos.

297
00:22:27,340 --> 00:22:32,380
They're anywhere between six, eight minutes and like 20, 25 minutes. But the one that sticks out

298
00:22:32,380 --> 00:22:36,540
in my mind that I remember the most, because I honestly didn't have a plan for how to record

299
00:22:36,540 --> 00:22:41,820
this going in, there's a very special section in Part B. So, Part B is the one where you're

300
00:22:41,820 --> 00:22:47,100
aligning to the business. And we decided that we wanted to not just say, here's all the stuff you

301
00:22:47,100 --> 00:22:53,740
have to do and just kind of leave it there. We wanted to actually give people a set of slides

302
00:22:53,740 --> 00:22:58,540
and the ability to have a conversation. If I'm sitting in the CISO role, maybe I'm new to it,

303
00:22:58,540 --> 00:23:03,980
maybe I'm trying to bring a business leader to my side or to try and build a relationship.

304
00:23:03,980 --> 00:23:09,340
How do I as a security leader engage with a business leader? How do I talk about security

305
00:23:09,340 --> 00:23:13,820
in a very simple, straightforward way that they can relate to? And then how do I ask for the things

306
00:23:13,820 --> 00:23:20,380
I need to make my program successful? And so, as we got into this one, I think it was like

307
00:23:20,380 --> 00:23:24,940
two minutes before we were starting the recording, it's like, hey, do you want to do this as a role

308
00:23:24,940 --> 00:23:35,820
play? It's like, okay, cool, we got three people. Elizabeth, CEO, Sarah, CIO, Mark, our hapless CISO

309
00:23:35,820 --> 00:23:42,380
trying to convince them to come along with this security journey thing that they probably

310
00:23:42,380 --> 00:23:51,340
classically don't care about. And so we did. And it was just about 100% improvised. And in

311
00:23:51,340 --> 00:23:57,820
retrospect, it was fun. In the moment, it was tough because Elizabeth played an amazingly tough

312
00:23:57,820 --> 00:24:05,260
business leader, like hardcore military-style business leader. And she made it tough for me

313
00:24:05,260 --> 00:24:10,060
as a CISO. And so I was doing everything I could to keep it on track and keep it focused.

314
00:24:10,060 --> 00:24:13,420
And I don't think it made the final cut, but at one point I was like, okay,

315
00:24:13,420 --> 00:24:17,740
you've got to give me an opening, you just shut me out completely. I've got no chance of showing

316
00:24:17,740 --> 00:24:25,180
how to engage with a business leader. But we did have a lot of fun with it and really kind

317
00:24:25,180 --> 00:24:30,860
of showed that interplay in the toughest possible situation, so that we could give people

318
00:24:30,860 --> 00:24:36,620
kind of some material that they could use and copy as they're engaging their business leaders

319
00:24:36,620 --> 00:24:41,820
and bringing them into understanding security and how to help security be successful. Because

320
00:24:41,820 --> 00:24:48,300
the thing that we learned, and this is a huge, huge learning as we work with our customers,

321
00:24:48,300 --> 00:24:53,580
is security cannot be successful without business leader support. Like ultimately,

322
00:24:54,940 --> 00:25:01,260
the way that accountability and responsibility are typically laid out at customers doesn't set

323
00:25:01,260 --> 00:25:05,660
people up for success. Because if security gets blamed for everything, they're sort of

324
00:25:05,660 --> 00:25:12,620
in a CYA or cover your assets mode, right? Because they're just waiting to be blamed for

325
00:25:12,620 --> 00:25:18,540
the next thing. And then if business doesn't include security risk as part of the rest of the

326
00:25:18,540 --> 00:25:25,580
things they worry about, like legal and political and natural disasters and economic and monetary

327
00:25:25,580 --> 00:25:31,740
rate kind of things that as a business owner, a factory owner, a product line owner, like if

328
00:25:31,740 --> 00:25:35,900
you don't include security in there, you just dump it on the security dude, then you're not making a

329
00:25:35,900 --> 00:25:40,860
good balanced decision. And so there's this sort of, you know, how do we bring security into that

330
00:25:40,860 --> 00:25:46,060
conversation and help make the business folks literate to have that conversation? And so we had

331
00:25:46,060 --> 00:25:50,860
a, I don't know, I just had a lot of fun with that role play, fun being painful at the time.

332
00:25:52,300 --> 00:25:58,460
And I had a lot of fun with that. Yeah, I just got to talk about some of the arguments that I've

333
00:25:58,460 --> 00:26:05,980
heard in the upper echelons of IT management and well, not even IT management, just general business

334
00:26:05,980 --> 00:26:10,620
leadership, because there's a lot of conflicts there, right? I mean, who's going to take care of

335
00:26:10,620 --> 00:26:15,660
things, how much it's going to cost, blah, blah, blah, blah, blah. So yeah, I thought it was really

336
00:26:15,660 --> 00:26:22,380
fun. Yeah, and you got to watch me squirm too. Yeah, I know, I know. And the reality is that,

337
00:26:23,180 --> 00:26:27,420
you know, this is the stuff that comes up. It doesn't matter what industry vertical you're in,

338
00:26:27,420 --> 00:26:33,020
how big you are, like, these are the same sort of challenges that everyone will have when trying

339
00:26:33,020 --> 00:26:40,140
to implement this. So hopefully for those of you who are in CISO leadership positions, at least

340
00:26:40,140 --> 00:26:44,620
you feel like you're not alone. Just real quick. So what was the role, no pun intended, what was

341
00:26:44,620 --> 00:26:47,740
the role of the role playing? Was it just to sort of play different positions and then

342
00:26:48,620 --> 00:26:54,860
see how you would respond or you're using material that you had to sort of help guide

343
00:26:54,860 --> 00:27:00,700
some of the responses? So we roughly stuck to the material that was in that section. I think it's

344
00:27:00,700 --> 00:27:06,780
about five slides, six slides, I think, and then there's some optional ones in case it goes dark.

345
00:27:06,780 --> 00:27:11,340
We didn't end up going dark, which is, okay, this is what a ransomware attack is really like, and

346
00:27:11,340 --> 00:27:15,980
let me explain it to you in terms that you understand. So we didn't really cover that one

347
00:27:15,980 --> 00:27:20,380
there, but the slides are there for that. But, you know, we roughly stayed with that storyline,

348
00:27:20,380 --> 00:27:24,300
and the thing I was doing as the struggling CISO was to try and keep the conversation on

349
00:27:24,300 --> 00:27:32,380
track, you know, despite Elizabeth's efforts. Does that make sense? It does, yeah. And then...

350
00:27:33,500 --> 00:27:38,220
Elizabeth was a very strong leader. I was a little bit intimidated.

351
00:27:39,740 --> 00:27:44,620
So these role play videos, this role play is available in the videos?

352
00:27:44,620 --> 00:27:48,380
Yep, that's one of the videos. It's the engaging business leaders on security one.

353
00:27:48,380 --> 00:27:55,260
Yeah, that'd be cool. And then the other thing that we had a lot of fun with was we didn't really

354
00:27:55,260 --> 00:28:01,260
cover it too in depth. It's more for the slides and PDFs and whatnot. But we did cover the maturity

355
00:28:01,260 --> 00:28:06,780
models, because that's one of the things we included in the CISO workshop: maturity models for your

356
00:28:06,780 --> 00:28:12,460
overall program and how it integrates with the business, as well as the governance over the

357
00:28:12,460 --> 00:28:17,580
different pieces of the program and how those progress, and then how do you get from maturity

358
00:28:17,580 --> 00:28:24,460
model level one to two to three to four, et cetera. And we put a lot of effort into those

359
00:28:24,460 --> 00:28:32,300
because we didn't want to have some abstract academic, you know, basic to standard to optimized

360
00:28:32,300 --> 00:28:35,820
to dynamic or something, right? Which is a typical way of doing maturity models,

361
00:28:35,820 --> 00:28:42,220
or rather at least a common way. And we focused on, you know, what are the actual journeys

362
00:28:42,220 --> 00:28:48,140
organizations take? And sometimes they overpivot on like security operations, because, hey, we got

363
00:28:48,140 --> 00:28:54,140
a big incident. And then they realize, oh, my gosh, we haven't invested into prevention as much as

364
00:28:54,140 --> 00:28:58,140
we should have. Otherwise, you know, if we don't, we're going to have to hire like 200 more security

365
00:28:58,140 --> 00:29:03,340
analysts in the SOC. And so we tried to really, you know, have each of those levels of maturity

366
00:29:03,340 --> 00:29:07,820
reflect the real journey and kind of encourage people to skip as many of the, you know, off-balance

367
00:29:07,820 --> 00:29:12,700
mistakes as possible. But those are something that we also included in there.

368
00:29:12,700 --> 00:29:17,580
So Mark, one of the things I remember, because, well, I know you, you wrote all of this material,

369
00:29:17,580 --> 00:29:21,420
so you know it off by heart. But one of the things that really sticks out for me

370
00:29:21,420 --> 00:29:27,900
is when we were talking about the strategic initiatives part of the workshop. So just for

371
00:29:27,900 --> 00:29:34,380
our audience, do you want to give us a little bit of a summary about that and what that involves

372
00:29:34,380 --> 00:29:40,380
and what we cover? Yeah, that's a really good point. So one of the things we introduced in

373
00:29:40,380 --> 00:29:45,580
that first part in the sort of context is that reference strategy, right? Like what does good

374
00:29:45,580 --> 00:29:51,740
look like from a security strategy standpoint? And you know, this is very much in contrast with

375
00:29:51,740 --> 00:29:56,620
kind of classic security, which really built a strategy around a single tactic, which is, you

376
00:29:56,620 --> 00:30:02,620
know, a security perimeter based on network technology, you know, it's like taking one page

377
00:30:02,620 --> 00:30:08,940
out of a military handbook and saying this is the one thing that we're doing is kind of

378
00:30:08,940 --> 00:30:14,700
what we did without meaning to in the early days of security, because so many folks focus so heavily

379
00:30:14,700 --> 00:30:19,980
on that network perimeter. And so we really broadened it out to be genuinely aligned with,

380
00:30:19,980 --> 00:30:24,220
you know, the typical digital transformation and cloud transformation strategies.

381
00:30:24,220 --> 00:30:32,700
And then we broke it into, okay, here are the six different specific modernization initiatives

382
00:30:32,700 --> 00:30:38,140
that most organizations are either undertaking or should start undertaking. So

383
00:30:38,140 --> 00:30:44,940
modernizing identity to secure all types of access, including network access, modernizing

384
00:30:44,940 --> 00:30:51,100
security operations, OT and IoT, infrastructure and development, which I know is near and dear

385
00:30:51,100 --> 00:30:55,820
to Michael's heart. And, you know, so modernizing each of those and, you know, coming up with that

386
00:30:55,820 --> 00:31:02,380
coherent thing. And those strategic initiatives are actually how we structured

387
00:31:02,380 --> 00:31:07,580
the follow-on architecture design sessions. So they're discussed in the CISO

388
00:31:07,580 --> 00:31:11,820
workshop, but they are not out yet. We're still in the process of finishing those up and, you

389
00:31:11,820 --> 00:31:17,660
know, getting the recording schedules scheduled, etc. at the time of this podcast. But ultimately,

390
00:31:17,660 --> 00:31:23,740
those become that sort of structure for the rest of the guidance of, you

391
00:31:23,740 --> 00:31:28,060
know, here's all the different things that you need to do to modernize your program and take it

392
00:31:28,060 --> 00:31:33,980
from wherever it's at, you know, with maturity models and plans and all that to help that journey

393
00:31:33,980 --> 00:31:39,740
along. But ultimately, you know, here are those six work streams that, you know, you might

394
00:31:39,740 --> 00:31:43,900
have a different priority depending on whether you're in manufacturing or whether you're in retail

395
00:31:43,900 --> 00:31:49,580
or, you know, whether you're in banking; you may or may not care about OT or IoT at the same level as

396
00:31:49,580 --> 00:31:55,500
a different industry. But, you know, these are the six modernization things that just about

397
00:31:55,500 --> 00:32:01,820
every organization we've seen tends to structure around. And then all the lessons learned that we

398
00:32:01,820 --> 00:32:07,260
could pack into the reference plans and reference architectures to help people be successful.

399
00:32:07,260 --> 00:32:11,260
And so a lot of that is still to come in the architecture design sessions as we get into the

400
00:32:11,260 --> 00:32:17,660
technical details. But we do introduce those and, you know, work through those in the CISO workshop.

401
00:32:17,660 --> 00:32:21,020
You know, I'm going to ask you, right? Because I just can't help myself. I think there better

402
00:32:21,020 --> 00:32:25,340
be something in there for developers, right? We actually put that in the innovation security

403
00:32:25,900 --> 00:32:32,300
area. And the reason that it doesn't have sort of a familiar term like DevSecOps or development

404
00:32:32,300 --> 00:32:38,380
security is because we wanted to make room for not only professional developers that are really

405
00:32:38,380 --> 00:32:44,460
moving from securing a waterfall approach to securing DevSecOps, or something in between.

406
00:32:45,340 --> 00:32:51,580
But we also wanted to include this emerging trend of citizen developers. So we're starting to see

407
00:32:51,580 --> 00:32:58,460
like normal, non-technical people using things like Power BI and Power Apps to connect systems

408
00:32:58,460 --> 00:33:03,660
and data across the organization, which is, you know, very much an early emerging space right now,

409
00:33:03,660 --> 00:33:08,700
but it's an area that, you know, as soon as you can create business value with it, you can create risk,

410
00:33:08,700 --> 00:33:14,460
right? And so that double-edged sword dynamic. And so we're very heavily focused right now on

411
00:33:14,460 --> 00:33:20,620
the DevSecOps elements and how do you integrate security into a DevOps process and not get in

412
00:33:20,620 --> 00:33:25,020
the way of the developers and not dump a bunch of false positives and a long report to them,

413
00:33:25,020 --> 00:33:30,220
but to actually, you know, give them clean alerts, actionable stuff they can do, embed the knowledge

414
00:33:30,220 --> 00:33:36,220
in there. So very much focused on that sort of dev sec ops approach. But yeah, that's the innovation

415
00:33:36,220 --> 00:33:43,260
security discipline. And then that gets modernized by that infrastructure and development security

416
00:33:43,980 --> 00:33:48,540
initiative, which is going to be module four of the architecture design session.

417
00:33:49,180 --> 00:33:53,180
If there's a section called innovation, what other sort of topics are in there?

418
00:33:53,980 --> 00:33:57,740
It's mostly those two: the professional developers, or the DevSecOps scenario,

419
00:33:57,740 --> 00:34:02,780
right, that everybody's trying to get to. And then the citizen developer sort of power apps,

420
00:34:02,780 --> 00:34:08,620
low code, no code. So those are the two types of how do you secure the innovation that's happening.

421
00:34:08,620 --> 00:34:13,260
So it's much more a recognition of the innovation happening in the technical and business realms

422
00:34:13,260 --> 00:34:16,380
and how do you provide security for that without slowing it down?

423
00:34:17,260 --> 00:34:22,620
Fun fact, my uncle, who's in his 80s, he's been doing Excel development for a long, long time.

424
00:34:22,620 --> 00:34:28,860
And yeah, he can crank some stuff out. So he's probably a more advanced citizen developer than most,

425
00:34:28,860 --> 00:34:33,980
but definitely not a software engineer by any stretch of anyone's imagination. But

426
00:34:34,860 --> 00:34:39,420
yeah, he's a whiz when it comes to Excel, I'll give him that. Well, you know, as well as all of us,

427
00:34:39,420 --> 00:34:44,780
but we always like to ask our guests, if you had one thought to leave our listeners with,

428
00:34:44,780 --> 00:34:49,660
what would it be? In addition to the obvious, please watch the videos and of course provide

429
00:34:49,660 --> 00:34:56,540
feedback, etc. The big thing that I would keep in mind for folks is as you're going through this

430
00:34:56,540 --> 00:35:01,900
and as you're thinking about this topic in your day to day job, recognize that it's a big transformation.

431
00:35:02,460 --> 00:35:06,700
And, you know, the best way to sort of get through something that's,

432
00:35:07,260 --> 00:35:10,860
you know, where everything you're familiar with and used to is changing or could change around you,

433
00:35:10,860 --> 00:35:16,140
is, you know, to set a North Star and keep going, right? And I think this is one of

434
00:35:16,140 --> 00:35:22,140
the tips early in the workshop, and that you're going to be making continuous progress each day.

435
00:35:22,140 --> 00:35:26,380
And so you just want to make sure you have a clear vision. So when you're walking as much as you can

436
00:35:26,380 --> 00:35:30,860
on that day or in that hour, you're walking in the right direction, but just kind of have

437
00:35:30,860 --> 00:35:35,980
that expectation that this is a longer journey, it is a longer transformation. And just, you know,

438
00:35:36,700 --> 00:35:39,980
just expect that you're going to make continuous progress. So do what you can now

439
00:35:39,980 --> 00:35:44,380
and just always be working in that direction. That's, that's the big thing that we've learned,

440
00:35:44,380 --> 00:35:50,140
you know, as all this is happening. And, you know, it's always a mix. Also, it's, you know,

441
00:35:50,140 --> 00:35:53,660
kind of a mix of the old and the new, right? So a lot of the things that we've learned in the

442
00:35:53,660 --> 00:35:57,260
network security and all these things will still apply. We're not taking firewalls down, right?

443
00:35:58,060 --> 00:36:01,900
But recognize you're going to have to learn new things. Like, I mean, even you look at network

444
00:36:01,900 --> 00:36:06,140
security, it's becoming a part of like three or four or five different jobs. It's not one job

445
00:36:06,140 --> 00:36:10,860
anymore. And so just kind of expect that a lot of things are going to change. But, you know,

446
00:36:10,860 --> 00:36:15,020
have that clarity of where you're going and have the confidence that, hey, I'm going to do the best

447
00:36:15,020 --> 00:36:18,140
I can today. That's, that's kind of, you know, the way I think about it.

448
00:36:18,140 --> 00:36:21,420
I've had a brief look through some of the material. I think it's absolutely fantastic. And I think

449
00:36:22,060 --> 00:36:25,820
a lot of our listeners will get a great deal of use out of those documents and the videos.

450
00:36:26,380 --> 00:36:30,460
So again, thanks for, I'm going to say thanks for joining us because you're here anyway,

451
00:36:30,460 --> 00:36:34,460
but thanks for joining us. And to all our listeners out there, thank you so much for

452
00:36:34,460 --> 00:36:38,940
listening. I hope you found this useful. Stay safe out there and we'll see you next time.

453
00:36:38,940 --> 00:36:43,740
Thanks for listening to the Azure Security Podcast. You can find show notes and other

454
00:36:43,740 --> 00:36:51,100
resources at our website azsecuritypodcast.net. If you have any questions, please find us on

455
00:36:51,100 --> 00:36:57,900
Twitter at AzureSecPod. Background music is from ccmixter.com and licensed under the Creative

456
00:36:57,900 --> 00:37:09,580
Commons license.

