1
00:00:00,000 --> 00:00:10,000
Welcome to Artificially Intelligent Marketing, a weekly podcast where we stay on top of the

2
00:00:10,000 --> 00:00:15,740
latest trends, tips, and tools in the world of marketing AI, helping you get the best

3
00:00:15,740 --> 00:00:18,560
results from your marketing efforts.

4
00:00:18,560 --> 00:00:22,440
Now let's join our hosts, Paul Avery and Martin Broadhurst.

5
00:00:22,440 --> 00:00:25,000
Hello everyone.

6
00:00:25,000 --> 00:00:28,520
Welcome to episode nine of Artificially Intelligent Marketing.

7
00:00:28,520 --> 00:00:30,600
We're glad that you're here with us today.

8
00:00:30,600 --> 00:00:33,440
There's lots and lots of stuff for us to get through.

9
00:00:33,440 --> 00:00:36,640
We're going to look at some top stories where we're going to go into some depth.

10
00:00:36,640 --> 00:00:41,200
So today we're going to cover the EU's proposed new copyright laws for AI, very important

11
00:00:41,200 --> 00:00:43,680
for marketers to know what's going on there.

12
00:00:43,680 --> 00:00:50,880
We're going to look at OpenAI's release of Code Interpreter as a ChatGPT plugin.

13
00:00:50,880 --> 00:00:52,280
That's pretty cool as well.

14
00:00:52,280 --> 00:00:55,940
Stability AI announcing DeepFloyd IF.

15
00:00:55,940 --> 00:00:59,980
This is a text-to-image generation tool that can handle text within images.

16
00:00:59,980 --> 00:01:04,160
We're going to look at tool of the week and we're going to look at the Bing sidebar on

17
00:01:04,160 --> 00:01:05,160
that one.

18
00:01:05,160 --> 00:01:09,740
And we're also going to look at a few other things that interested us like this anime

19
00:01:09,740 --> 00:01:16,280
movie trailer that someone made with Runway's Gen-2 text-to-video tool, which was pretty cool.

20
00:01:16,280 --> 00:01:19,640
Outside of that, there was a bunch of other stuff that happened this week that we won't

21
00:01:19,640 --> 00:01:23,200
have time to go into detail about, but that we did still want to cover briefly on the

22
00:01:23,200 --> 00:01:24,200
podcast.

23
00:01:24,200 --> 00:01:26,560
So we're going to cover these as short snippets and we're going to look at some of those very

24
00:01:26,560 --> 00:01:29,280
quickly before we get into the big stories of the week.

25
00:01:29,280 --> 00:01:34,040
I'm going to crack through these, Martin, and if there's a particular one that you want

26
00:01:34,040 --> 00:01:38,280
to chime in on, jump in and let us know what your thoughts are.

27
00:01:38,280 --> 00:01:43,120
The first one was ChatGPT is now allowed to operate in Italy again after OpenAI introduced

28
00:01:43,120 --> 00:01:45,880
a number of privacy disclosures and controls in there.

29
00:01:45,880 --> 00:01:48,320
So that's back in play.

30
00:01:48,320 --> 00:01:54,640
Anthropic raises $580 million in their Series B, which is further evidence of all the cash

31
00:01:54,640 --> 00:02:00,000
that continues to flood into the AI company space.

32
00:02:00,000 --> 00:02:06,960
Tech layoffs continue with Dropbox laying off 16% of its workforce and IBM, as

33
00:02:06,960 --> 00:02:09,160
we saw today, pausing hiring.

34
00:02:09,160 --> 00:02:16,440
And many of these layoffs are reported to be driven by augmenting existing staff with

35
00:02:16,440 --> 00:02:19,760
AI, although none of the companies are actually confirming this.

36
00:02:19,760 --> 00:02:23,560
And I guess in many ways it could be driven by some of the economic challenges in the

37
00:02:23,560 --> 00:02:24,560
market.

38
00:02:24,560 --> 00:02:29,440
But what people think is happening is that AI is going to start to play a bigger

39
00:02:29,440 --> 00:02:31,720
role in the workforce at some of these companies.

40
00:02:31,720 --> 00:02:32,720
What are your thoughts on this one, Martin?

41
00:02:32,720 --> 00:02:34,680
I know you had some thoughts.

42
00:02:34,680 --> 00:02:39,440
Well it's a continuation of those tech layoffs that we've seen over the past 12 months or

43
00:02:39,440 --> 00:02:45,240
so, but this one is particularly interesting in that, with the IBM one, commentators around

44
00:02:45,240 --> 00:02:50,360
the issue were saying that, yeah, this is AI and what they're looking to do is basically

45
00:02:50,360 --> 00:02:59,040
replace jobs that were more or less filing paperwork or moving data from one place to

46
00:02:59,040 --> 00:03:05,480
another, quite, you know, back office jobs, kind of admin jobs.

47
00:03:05,480 --> 00:03:11,800
They're the kind of roles that they're looking to replace and these will be automated using

48
00:03:11,800 --> 00:03:14,280
AI ML technology.

49
00:03:14,280 --> 00:03:20,040
So I think it's interesting to see whether those jobs are fully replaced.

50
00:03:20,040 --> 00:03:24,960
You know, sometimes we see big tech companies make huge rounds of layoffs and then before

51
00:03:24,960 --> 00:03:27,940
you know it, they're rehiring people again.

52
00:03:27,940 --> 00:03:34,160
So it was, you know, a bit of a fool's errand to go and do that in the first place.

53
00:03:34,160 --> 00:03:40,240
But yeah, if they are jobs that are lost for good, we're certainly starting to see the

54
00:03:40,240 --> 00:03:43,840
thin end of the wedge for AI job replacement.

55
00:03:43,840 --> 00:03:48,680
I don't think jobs are going to be replaced overnight, but we're starting to see a creep

56
00:03:48,680 --> 00:03:49,680
in.

57
00:03:49,680 --> 00:03:54,240
It did make me think about another topic that we, well, we covered this topic on a recent

58
00:03:54,240 --> 00:04:03,520
episode looking at the OpenAI report on disruption in the labour market through GPT technology.

59
00:04:03,520 --> 00:04:11,960
And it brought to mind an article that I read this week, which looked at a study that was

60
00:04:11,960 --> 00:04:21,440
done where healthcare professionals were asked to judge the responses to patient emails and

61
00:04:21,440 --> 00:04:24,040
they were shown two responses.

62
00:04:24,040 --> 00:04:29,760
One of the responses was from a real healthcare professional responding to the patient and

63
00:04:29,760 --> 00:04:30,760
their inquiry.

64
00:04:30,760 --> 00:04:34,880
And the other was a response written by ChatGPT with GPT-4.

65
00:04:34,880 --> 00:04:39,200
Now the healthcare assessor, as it were, didn't know which was which.

66
00:04:39,200 --> 00:04:41,300
So these were blind tests.

67
00:04:41,300 --> 00:04:49,920
And in 79% of cases, they rated ChatGPT's response as being the better response in terms

68
00:04:49,920 --> 00:04:56,960
of higher quality, in terms of the information provided and having a better understanding

69
00:04:56,960 --> 00:05:01,960
and higher empathy, which is fascinating to me.

70
00:05:01,960 --> 00:05:12,200
So we're starting to see real world peer reviewed or at least, you know, assessments done by

71
00:05:12,200 --> 00:05:17,360
people that know what they're talking about, looking at outputs from humans versus robots

72
00:05:17,360 --> 00:05:25,720
and saying the AI is as good or better in this case, better nearly four fifths of the

73
00:05:25,720 --> 00:05:26,720
time.

74
00:05:26,720 --> 00:05:27,720
Wow.

75
00:05:27,720 --> 00:05:28,720
I saw that story.

76
00:05:28,720 --> 00:05:30,960
I had not taken home the empathy part, right?

77
00:05:30,960 --> 00:05:36,680
If you're a blind assessor and it could be a human, you see empathy in it.

78
00:05:36,680 --> 00:05:42,340
It would be fascinating to run that experiment again, unblinded, and see whether knowing

79
00:05:42,340 --> 00:05:45,440
it was a robot changes someone's interpretation of the empathy.

80
00:05:45,440 --> 00:05:47,240
A, that would be interesting.

81
00:05:47,240 --> 00:05:48,920
B, does it matter?

82
00:05:48,920 --> 00:05:53,260
And would this therefore make an argument for not telling people when they're getting

83
00:05:53,260 --> 00:05:59,800
certain types of support by a robot almost as part of a placebo effect, right?

84
00:05:59,800 --> 00:06:06,080
Just like for mental health support, we talked about this a bit on the podcast previously.

85
00:06:06,080 --> 00:06:10,600
Rating the large language model as being more empathetic than the humans.

86
00:06:10,600 --> 00:06:14,400
That's a little bit mind blowing.

87
00:06:14,400 --> 00:06:15,400
Cool.

88
00:06:15,400 --> 00:06:18,720
Thank you for sharing that, Martin.

89
00:06:18,720 --> 00:06:20,640
One last quick snippet.

90
00:06:20,640 --> 00:06:26,560
I saw a graphic this week showing the evolution of Midjourney from version one to version

91
00:06:26,560 --> 00:06:32,160
five and the quality of the images and the improvement, especially in the photo realism

92
00:06:32,160 --> 00:06:37,640
of the images and getting rid of some of the junk that told you it was an AI image like

93
00:06:37,640 --> 00:06:41,520
hands that had 42 fingers and stuff like that.

94
00:06:41,520 --> 00:06:45,680
That's kind of mind blowing because version one was launched just over a year ago and

95
00:06:45,680 --> 00:06:49,560
the rumors are that version six is probably going to come out in the next month or two,

96
00:06:49,560 --> 00:06:56,160
which would be in line with the release schedule around Midjourney so far.

97
00:06:56,160 --> 00:06:59,200
We talk a lot on this podcast about the pace of change.

98
00:06:59,200 --> 00:07:00,320
You just have to look at it.

99
00:07:00,320 --> 00:07:03,860
If you can find it online anywhere, just Google it and find those graphics that show the comparison

100
00:07:03,860 --> 00:07:06,400
from V1 to V5.

101
00:07:06,400 --> 00:07:07,940
Yikes.

102
00:07:07,940 --> 00:07:11,120
Where will we be in six to 12 months?

103
00:07:11,120 --> 00:07:12,640
The pace of change is astonishing.

104
00:07:12,640 --> 00:07:19,400
On the way back from a client meeting this afternoon, I was watching a YouTube video.

105
00:07:19,400 --> 00:07:22,400
I highly recommend everyone go and check it out.

106
00:07:22,400 --> 00:07:29,440
The title is The AI Dilemma and it was filmed at a San Francisco event on March the 9th

107
00:07:29,440 --> 00:07:32,440
and it's had 1.3 million views already.

108
00:07:32,440 --> 00:07:40,680
Basically it's two of the journalists, or whoever it was, the people who put together the Netflix

109
00:07:40,680 --> 00:07:48,040
film The Social Dilemma and they're now looking at AI and kind of presenting where AI is.

110
00:07:48,040 --> 00:07:56,360
Now I should say this talk was done before GPT-4 was announced.

111
00:07:56,360 --> 00:08:04,200
In it, the whole theme is basically that AI poses a catastrophic risk to humanity,

112
00:08:04,200 --> 00:08:07,320
which is a recurring theme.

113
00:08:07,320 --> 00:08:13,640
Every week we have the doomsayers, the doom-mongers, call it what you will.

114
00:08:13,640 --> 00:08:19,640
But actually one of the things that they talk about in this is we're hitting double exponential

115
00:08:19,640 --> 00:08:23,240
territory in terms of the capabilities of AI.

116
00:08:23,240 --> 00:08:29,120
They said it's really hard for people to grasp quite what that means.

117
00:08:29,120 --> 00:08:37,720
Even people that are working in AI and ML research day in, day out, even though they rationally

118
00:08:37,720 --> 00:08:44,440
understand what exponential growth and exponentially developing capabilities are, they understand

119
00:08:44,440 --> 00:08:45,440
what it means.

120
00:08:45,440 --> 00:08:49,000
They struggle to bring that into reality.

121
00:08:49,000 --> 00:08:57,800
So they give an example of AI experts that were asked to say when they thought AI would

122
00:08:57,800 --> 00:09:08,320
be capable of answering competitive maths questions more than 50% of the time.

123
00:09:08,320 --> 00:09:15,200
So like real high level, cutting edge, leading edge maths, competition level maths questions.

124
00:09:15,200 --> 00:09:18,720
And they got it wrong by years.

125
00:09:18,720 --> 00:09:24,600
They said, I think, that it was going to be four years.

126
00:09:24,600 --> 00:09:31,640
That was it, and it was actually one year from when the survey was done.

127
00:09:31,640 --> 00:09:36,560
And then they're now saying, so this was a few years ago, they're now saying that the

128
00:09:36,560 --> 00:09:43,400
AI capability is beating the questions quicker than they can really be written, like 80,

129
00:09:43,400 --> 00:09:47,000
90% of the time.

130
00:09:47,000 --> 00:09:49,920
And we're seeing this in the real world.

131
00:09:49,920 --> 00:09:53,840
We're seeing this happen in real time, in front of us, in all of the products that we're using

132
00:09:53,840 --> 00:09:55,080
day to day.

133
00:09:55,080 --> 00:10:01,000
This growth in capabilities in Midjourney, as you say, the transition from one to five, with

134
00:10:01,000 --> 00:10:11,640
six around the corner, is breathtaking and kind of exciting, frightening.

135
00:10:11,640 --> 00:10:14,600
I'm glad, yeah, I would agree with it in that order.

136
00:10:14,600 --> 00:10:19,720
But I definitely, yeah, we've talked ad infinitum about the risks of AI and some of the other

137
00:10:19,720 --> 00:10:23,240
things that we try not to go into too much detail about on this podcast.

138
00:10:23,240 --> 00:10:25,760
Although they should absolutely be recognized.

139
00:10:25,760 --> 00:10:32,000
Yeah, so there's a guy, I don't know if you know him, called Peter Diamandis, and he's

140
00:10:32,000 --> 00:10:39,120
like a futurist to a certain extent, but he's got a PhD in some biological discipline.

141
00:10:39,120 --> 00:10:41,080
He is a medical doctor.

142
00:10:41,080 --> 00:10:42,560
He founded the X Prize.

143
00:10:42,560 --> 00:10:46,560
I don't know if you've heard of the X Prize.

144
00:10:46,560 --> 00:10:51,440
And he is an investor in a number of different companies in a number of different areas.

145
00:10:51,440 --> 00:10:58,040
His whole thing is about basically the exponential change in a number of areas in the world,

146
00:10:58,040 --> 00:11:01,320
drifting towards a world of abundance because he's very positive about the future.

147
00:11:01,320 --> 00:11:04,080
He's got a podcast called Moonshots and Mindsets.

148
00:11:04,080 --> 00:11:05,080
People should look that up.

149
00:11:05,080 --> 00:11:06,080
It's really, really interesting.

150
00:11:06,080 --> 00:11:11,640
But he always talks about how our brains are linear.

151
00:11:11,640 --> 00:11:17,040
We count things; our brains are organized and evolved to count the things in front of

152
00:11:17,040 --> 00:11:19,000
us.

153
00:11:19,000 --> 00:11:26,040
Four trees over there in that forest, 16 trees over there in that forest, 32 stones on the

154
00:11:26,040 --> 00:11:27,040
floor.

155
00:11:27,040 --> 00:11:34,320
We cannot imagine exponential change because you don't easily see it in front of you in

156
00:11:34,320 --> 00:11:38,520
nature at a macro scale, right?

157
00:11:38,520 --> 00:11:40,080
Just looking.

158
00:11:40,080 --> 00:11:45,040
You could probably observe exponential division of cells, but even that happens over a time

159
00:11:45,040 --> 00:11:47,240
period that's quite hard for a human to sit and watch.

160
00:11:47,240 --> 00:11:50,120
You'd have to time lapse it, you'd need a microscope.
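
Since the gap between linear intuition and exponential change is the point being made here, a toy illustration (with purely made-up numbers, nothing from the episode): linear growth adds a fixed amount per step, while exponential growth doubles.

```python
# Linear growth adds a fixed amount each step; exponential growth multiplies.
linear = [10 * step for step in range(31)]       # 0, 10, 20, ..., 300
exponential = [2 ** step for step in range(31)]  # 1, 2, 4, ..., 2**30

print("step 30, linear:     ", linear[30])       # 300
print("step 30, exponential:", exponential[30])  # 1073741824
```

After thirty steps the linear count is in the hundreds while the doubling count is past a billion, which is exactly the intuition gap being described.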

161
00:11:50,120 --> 00:11:55,000
So yeah, it's a funny old world and it makes it hard to imagine because the other thing

162
00:11:55,000 --> 00:11:59,760
is we should also still remember that humans have been very good at predicting we'd have

163
00:11:59,760 --> 00:12:04,280
flying cars by like 1999 in the 60s and stuff like this.

164
00:12:04,280 --> 00:12:13,460
So we're both simultaneously spectacularly overly optimistic and yet for certain changes

165
00:12:13,460 --> 00:12:16,840
unable to comprehend how fast they're going to move.

166
00:12:16,840 --> 00:12:17,840
That's fascinating.

167
00:12:17,840 --> 00:12:20,600
I guess that's what keeps all the futurists in work.

168
00:12:20,600 --> 00:12:25,960
But yeah, let's move on to the actual bigger stories of the week.

169
00:12:25,960 --> 00:12:33,080
The first one is the EU's proposed new copyright laws for AI and for all of you working in

170
00:12:33,080 --> 00:12:37,560
marketing who are thinking about using generative AI tools to produce content and images and

171
00:12:37,560 --> 00:12:43,720
other things for your brand, understanding the copyright landscape is critical.

172
00:12:43,720 --> 00:12:47,920
Tell us a bit about what you learned from this recent story, Martin.

173
00:12:47,920 --> 00:12:53,520
This is a big development and we have covered the regulation changes and the AI Act that

174
00:12:53,520 --> 00:12:56,040
the EU has been bringing out for a while now.

175
00:12:56,040 --> 00:13:02,400
It's been in the works for I think a couple of years and it's the EU Commission's attempt

176
00:13:02,400 --> 00:13:05,240
at addressing AI technology regulation.

177
00:13:05,240 --> 00:13:12,640
And one of the things that to date they've had in the regulation is that, where AI or

178
00:13:12,640 --> 00:13:18,960
ML technology was being used, tools had to have some sort of risk classification.

179
00:13:18,960 --> 00:13:23,200
So it would have to say high risk, low risk, medium risk.

180
00:13:23,200 --> 00:13:28,080
So for instance, if it was like spam detection on emails, that would be low risk.

181
00:13:28,080 --> 00:13:33,720
And if it was saying who can get healthcare insurance, that would be high risk, those

182
00:13:33,720 --> 00:13:35,700
kinds of things.

183
00:13:35,700 --> 00:13:42,640
But this new development in the regulation has been brought about specifically in response

184
00:13:42,640 --> 00:13:50,340
to generative AI and it addresses the big topic of copyright.

185
00:13:50,340 --> 00:13:55,880
So as we know, training data for large language models and text to image generators and all

186
00:13:55,880 --> 00:13:59,320
of these kinds of technologies consumes lots of copyright data.

187
00:13:59,320 --> 00:14:00,640
They crawl the web.

188
00:14:00,640 --> 00:14:04,320
They watch lots of video.

189
00:14:04,320 --> 00:14:11,680
We've seen Stability AI and Stable Diffusion creating images with Shutterstock watermarks

190
00:14:11,680 --> 00:14:12,680
all over them.

191
00:14:12,680 --> 00:14:15,000
So we know that it is doing that.

192
00:14:15,000 --> 00:14:23,360
And what the EU is now saying is that where models have been trained using copyright data,

193
00:14:23,360 --> 00:14:28,640
the people who have produced the model or trained the model have to disclose that as

194
00:14:28,640 --> 00:14:34,640
part of the regulation, they just have to disclose that.

195
00:14:34,640 --> 00:14:41,040
This is an interesting solution given where they could have landed because there were

196
00:14:41,040 --> 00:14:47,880
some people that were calling for an outright ban.

197
00:14:47,880 --> 00:14:53,360
And until these tools can be sorted out, some of the more extreme voices were saying that

198
00:14:53,360 --> 00:14:56,880
they should just ban it completely.

199
00:14:56,880 --> 00:15:02,720
And this is seen as being a more moderate approach.

200
00:15:02,720 --> 00:15:08,960
And I think it serves as a good middle ground.

201
00:15:08,960 --> 00:15:17,720
I certainly think from a marketer's perspective or a content producer or a content creator's

202
00:15:17,720 --> 00:15:21,800
perspective, we need to start thinking a bit more like Wired.

203
00:15:21,800 --> 00:15:27,320
So Wired magazine's publication about how they're using generative AI set out

204
00:15:27,320 --> 00:15:30,040
exactly how they are going to use it.

205
00:15:30,040 --> 00:15:33,920
And one of the things they said for text to image generation was they're not going to

206
00:15:33,920 --> 00:15:37,960
use it until the copyright issue is resolved.

207
00:15:37,960 --> 00:15:42,640
Now all of a sudden we can start to see that, well, actually the copyright issue is going

208
00:15:42,640 --> 00:15:47,560
to be a real differentiator for people using a model or not.

209
00:15:47,560 --> 00:15:50,720
If Wired magazine are saying, well, we're not going to use models that have been trained

210
00:15:50,720 --> 00:15:58,680
on copyright data, a tool like Adobe's Firefly, which has as one of its leading value propositions,

211
00:15:58,680 --> 00:16:02,440
we are not training our models on copyrighted data.

212
00:16:02,440 --> 00:16:06,080
Well that becomes a real sell because you think, oh, well, I will use that and I'm not

213
00:16:06,080 --> 00:16:08,660
at risk of ripping anybody off.

214
00:16:08,660 --> 00:16:14,280
So that becomes a competitive advantage for people that are creating these models that

215
00:16:14,280 --> 00:16:22,560
aren't trained on copyrighted data, which leads you to think that the likes of Shutterstock,

216
00:16:22,560 --> 00:16:29,560
the likes of Getty Images, the likes of Adobe, who have access to huge amounts of these kinds

217
00:16:29,560 --> 00:16:33,640
of data sets that they own.

218
00:16:33,640 --> 00:16:35,920
Big image repositories and stuff.

219
00:16:35,920 --> 00:16:36,920
Yeah.

220
00:16:36,920 --> 00:16:43,440
They will be able to create the better, more compliant models in the future.

221
00:16:43,440 --> 00:16:44,440
Yeah.

222
00:16:44,440 --> 00:16:46,800
It's absolutely fascinating.

223
00:16:46,800 --> 00:16:53,960
I think I can imagine a world where certainly the bigger brands almost have to lean into

224
00:16:53,960 --> 00:16:55,400
those types of tools.

225
00:16:55,400 --> 00:17:00,960
It's just not worth their time to get caught up in any of these copyright issues.

226
00:17:00,960 --> 00:17:07,800
My experience playing with and reading about and looking at the ones that are trained on

227
00:17:07,800 --> 00:17:17,240
data that's owned by like Shutterstock images or Adobe is that they are limited in comparison.

228
00:17:17,240 --> 00:17:20,920
I think I saw online an example where someone was trying to create some interesting images

229
00:17:20,920 --> 00:17:22,280
with Deadpool.

230
00:17:22,280 --> 00:17:25,760
And of course, I think it was Adobe Firefly.

231
00:17:25,760 --> 00:17:26,760
Basically it couldn't.

232
00:17:26,760 --> 00:17:34,880
It didn't know what Deadpool was because that's an owned likeness by Disney at this point,

233
00:17:34,880 --> 00:17:37,280
I would guess.

234
00:17:37,280 --> 00:17:39,520
And so it couldn't be in the training set.

235
00:17:39,520 --> 00:17:40,520
Right.

236
00:17:40,520 --> 00:17:44,520
But of course, those that are using images from here, there and everywhere could do it.

237
00:17:44,520 --> 00:17:49,100
So I could imagine a world in which smaller businesses that were willing to risk it could

238
00:17:49,100 --> 00:17:53,800
probably create higher quality images as well, potentially.

239
00:17:53,800 --> 00:17:55,400
But that's the risk that they're taking.

240
00:17:55,400 --> 00:17:59,280
Whereas bigger brands probably lean into these types of images.

241
00:17:59,280 --> 00:18:03,440
We've talked previously, how will this end up?

242
00:18:03,440 --> 00:18:10,960
Probably a little bit like Spotify, like some type of mechanism where everybody whose image

243
00:18:10,960 --> 00:18:19,400
was used in the training data set gets 0.00000100010001 pence every time someone generates an image

244
00:18:19,400 --> 00:18:22,040
and there's some sort of fees that come out the back of it.

245
00:18:22,040 --> 00:18:25,340
I still think we may see something like that, to be honest.
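
To make the Spotify-style idea concrete, here's a back-of-the-envelope sketch of a pooled per-generation royalty. Every number in it (the fee, the contributor count, the daily volume) is invented purely for illustration; nothing here comes from an actual scheme.

```python
# Hypothetical pooled-royalty mechanism: a flat fee per generated image,
# split evenly across everyone whose work was in the training set.
# All numbers below are made up for illustration.
fee_per_generation_pence = 0.04
contributors = 5_000_000

per_contributor = fee_per_generation_pence / contributors
print(f"{per_contributor:.10f} pence per contributor per image")  # 0.0000000080

# Tiny per-image sums only become meaningful at platform scale.
generations_per_day = 10_000_000
daily = per_contributor * generations_per_day
print(f"{daily:.2f} pence per contributor per day")  # 0.08
```

The per-image payout is vanishingly small, which is the joke in the number above, but multiplied by platform-scale generation volume it could still add up to a real revenue stream.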

246
00:18:25,340 --> 00:18:29,060
Did you see the Grimes updates?

247
00:18:29,060 --> 00:18:30,120
Tell us about it.

248
00:18:30,120 --> 00:18:36,360
So yeah, the Grimes, I don't really, I'm not even going to pretend to really know who Grimes

249
00:18:36,360 --> 00:18:39,680
is other than they're, I think, a pop star.

250
00:18:39,680 --> 00:18:41,680
I sound so old.

251
00:18:41,680 --> 00:18:44,880
They make the popular music that goes into the hit parade.

252
00:18:44,880 --> 00:18:45,880
I think she has cassette singles.

253
00:18:45,880 --> 00:18:48,880
I think they're on the wireless.

254
00:18:48,880 --> 00:18:49,880
Yes.

255
00:18:49,880 --> 00:18:52,280
You can get the one CD single.

256
00:18:52,280 --> 00:18:57,200
And is Grimes married to Elon Musk?

257
00:18:57,200 --> 00:18:58,200
Is that the right person?

258
00:18:58,200 --> 00:18:59,200
I think she was.

259
00:18:59,200 --> 00:19:03,000
Hard to keep up.

260
00:19:03,000 --> 00:19:14,040
So yeah, Grimes has said feel free to make music using AI versions of Grimes' voice.

261
00:19:14,040 --> 00:19:15,600
Absolutely.

262
00:19:15,600 --> 00:19:20,640
Go wild and Grimes will share any revenue with you 50-50.

263
00:19:20,640 --> 00:19:27,280
It's a very interesting approach and I think very smart because if she doesn't have to

264
00:19:27,280 --> 00:19:33,800
do any work, you know, or they don't have to do any work, then they've made the revenue

265
00:19:33,800 --> 00:19:36,480
and also the person who did the work gets some revenue.

266
00:19:36,480 --> 00:19:38,360
Yeah, absolutely fascinating.

267
00:19:38,360 --> 00:19:43,360
How some of these innovative business models come about is going to be fascinating.

268
00:19:43,360 --> 00:19:47,840
I would really want to keep an eye on the early movers who put themselves out there

269
00:19:47,840 --> 00:19:53,280
and say things like that because I think it will be fascinating to see how it plays out.

270
00:19:53,280 --> 00:19:54,960
Yeah, good story.

271
00:19:54,960 --> 00:19:58,000
Thanks for flagging that one for us, Martin.

272
00:19:58,000 --> 00:20:04,680
Story number two is OpenAI has made a new plugin available called Code Interpreter through

273
00:20:04,680 --> 00:20:05,680
ChatGPT.

274
00:20:05,680 --> 00:20:10,600
So for those that are on the wait list like me still waiting to get access to plugins,

275
00:20:10,600 --> 00:20:13,960
please give me access to plugins so I can have a play and tell all the lovely

276
00:20:13,960 --> 00:20:19,360
people who listen to this podcast and give them the absolute honest review of just how

277
00:20:19,360 --> 00:20:24,000
good or not they are because I read that they're good at times and still really buggy at other

278
00:20:24,000 --> 00:20:26,120
times.

279
00:20:26,120 --> 00:20:34,600
This new one is making a few waves this week because it allows you to analyze and interpret

280
00:20:34,600 --> 00:20:36,000
data.

281
00:20:36,000 --> 00:20:40,320
So I've seen some reports suggesting it can deliver the same sort of value as a junior

282
00:20:40,320 --> 00:20:42,740
level data analyst.

283
00:20:42,740 --> 00:20:47,580
The inference being it could replace a junior level data analyst.

284
00:20:47,580 --> 00:20:54,320
The way that it works is you upload a data file as a CSV and then you ask for trends and

285
00:20:54,320 --> 00:20:56,380
insights on the data.

286
00:20:56,380 --> 00:20:59,460
You can get it to plot things for you.

287
00:20:59,460 --> 00:21:02,280
And all of this is driven by natural language prompts.

288
00:21:02,280 --> 00:21:08,020
So a lot of tools have been promising some level of automation and AI and the ability

289
00:21:08,020 --> 00:21:13,320
to do data analysis without having to be a coder, but actually you still needed to know

290
00:21:13,320 --> 00:21:15,680
how to manipulate SQL and all this other stuff.

291
00:21:15,680 --> 00:21:17,880
Whereas with this plugin, it's natural language.

292
00:21:17,880 --> 00:21:22,160
You upload it, you ask a question of the data and you get an answer.

293
00:21:22,160 --> 00:21:24,520
You can ask it to provide you with a report.
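
For a rough sense of what's happening under the hood, Code Interpreter writes and runs Python against your file. Here's a hypothetical, stdlib-only sketch of the kind of "find me the trend" analysis described; the inline data stands in for the uploaded CSV, and the `month` and `revenue` column names are invented for this example, not anything Code Interpreter actually produces.

```python
import csv
import io

# Stand-in for the uploaded CSV; in Code Interpreter you'd upload a real file.
# The "month" and "revenue" columns are invented for this example.
uploaded = io.StringIO(
    "month,revenue\n"
    "2023-01,100\n2023-02,120\n2023-03,115\n2023-04,140\n2023-05,155\n"
)
revenue = [float(row["revenue"]) for row in csv.DictReader(uploaded)]

# "What's the trend?" -> fit a least-squares line over the months.
xs = range(len(revenue))
mean_x = sum(xs) / len(revenue)
mean_y = sum(revenue) / len(revenue)
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, revenue)) \
    / sum((x - mean_x) ** 2 for x in xs)
print(f"revenue trend: {slope:+.1f} per month")  # revenue trend: +13.0 per month
```

In the real plugin you'd just ask in natural language; the point is that the model generates and executes code like this on your behalf and then explains the result back to you.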

294
00:21:24,520 --> 00:21:30,680
I saw an example where someone basically asked code interpreters to pull out what it thought

295
00:21:30,680 --> 00:21:36,200
was two or three key trends from a large bulk of data and then write an abstract for the

296
00:21:36,200 --> 00:21:40,040
paper that they would draft based on it.

297
00:21:40,040 --> 00:21:46,100
And then I think also, I think they even drafted the paper using ChatGPT as well.

298
00:21:46,100 --> 00:21:49,680
So this is absolutely fascinating.

299
00:21:49,680 --> 00:21:52,760
Why is it important for marketers?

300
00:21:52,760 --> 00:21:55,900
Fundamentally, it makes it easier for marketers.

301
00:21:55,900 --> 00:22:00,040
Many of us don't know, we certainly don't, how to manipulate data using SQL and other

302
00:22:00,040 --> 00:22:01,840
fancy techniques.

303
00:22:01,840 --> 00:22:08,720
So it could bring that to non-developer marketers, which I think is really interesting.

304
00:22:08,720 --> 00:22:14,440
And I think making data analysis open and accessible like this is especially powerful

305
00:22:14,440 --> 00:22:20,600
for small and mid-size businesses that would never have perhaps interrogated their data

306
00:22:20,600 --> 00:22:21,600
in this way.

307
00:22:21,600 --> 00:22:27,200
If you don't have a data analysis team, because you're a small or mid-size company, you just

308
00:22:27,200 --> 00:22:28,200
don't do it.

309
00:22:28,200 --> 00:22:32,120
And that's been my experience anyway, because you don't have the bandwidth anyway, because

310
00:22:32,120 --> 00:22:35,240
you've got a hundred other things to do and you don't have the expertise in the business.

311
00:22:35,240 --> 00:22:42,640
But now, if you can basically outsource that to the level of a junior data analyst in ChatGPT

312
00:22:42,640 --> 00:22:47,240
with this plugin, you can start to surface insights in your data that you might not have

313
00:22:47,240 --> 00:22:49,920
otherwise.

314
00:22:49,920 --> 00:22:50,920
So that's pretty cool.

315
00:22:50,920 --> 00:22:57,440
I think the one caveat that springs instantly to mind is the caveat of ChatGPT forever

316
00:22:57,440 --> 00:23:00,600
and ever and ever, which is, will it hallucinate?

317
00:23:00,600 --> 00:23:03,360
Will it report trends that weren't there?

318
00:23:03,360 --> 00:23:07,960
If it's only able to perform at a certain level, it's best piloted,

319
00:23:07,960 --> 00:23:12,480
and this has been my experience with ChatGPT at least, by someone who has more insight than

320
00:23:12,480 --> 00:23:13,480
the tool, right?

321
00:23:13,480 --> 00:23:18,360
You ask it to put together 10 points for a blog post on a particular topic.

322
00:23:18,360 --> 00:23:22,640
You need to be an expert yourself in the topic to make sure that those 10 points are all

323
00:23:22,640 --> 00:23:27,360
accurate and relevant and interesting and that it's not full of hallucinations and junk.

324
00:23:27,360 --> 00:23:30,480
And I could imagine that could end up being the case here.

325
00:23:30,480 --> 00:23:37,280
So the reality is it might help senior data analysts get more done and it might not quite

326
00:23:37,280 --> 00:23:41,400
open up that power to non-developer, non-data-analyst people, potentially.

327
00:23:41,400 --> 00:23:43,400
But yeah, so I thought that was fascinating.

328
00:23:43,400 --> 00:23:44,840
Any thoughts on this, Martin?

329
00:23:44,840 --> 00:23:55,320
I think actually using it as a non-data analyst will be a bit of a stretch for a lot of people

330
00:23:55,320 --> 00:23:57,080
for the reasons that you've identified.

331
00:23:57,080 --> 00:23:59,840
But I think you don't know what you don't know.

332
00:23:59,840 --> 00:24:01,080
That's the issue.

333
00:24:01,080 --> 00:24:04,560
So you can stick a load of data in and then be like, tell me some things or produce some

334
00:24:04,560 --> 00:24:05,920
interesting graphs.

335
00:24:05,920 --> 00:24:11,160
This sprung to mind when I was reading a thread; I've got a Twitter thread

336
00:24:11,160 --> 00:24:15,840
open now and it says, where is it?

337
00:24:15,840 --> 00:24:21,680
So, from creating charts to basic video editing to converting files, it does it all.

338
00:24:21,680 --> 00:24:22,680
Basic video editing, what?

339
00:24:22,680 --> 00:24:23,680
How does that even work?

340
00:24:23,680 --> 00:24:26,560
I don't even understand how that makes sense.

341
00:24:26,560 --> 00:24:31,280
That's just demonstrating my lack of knowledge about what a tool like this should be able

342
00:24:31,280 --> 00:24:32,560
to do.

343
00:24:32,560 --> 00:24:36,980
It goes on to give some really interesting visualizations that it does plotting onto

344
00:24:36,980 --> 00:24:42,640
maps and kind of showing geographic data and spatial analysis and all sorts of kind of

345
00:24:42,640 --> 00:24:43,960
cool things.

346
00:24:43,960 --> 00:24:48,960
But if you're not a data analyst, you don't know what graph to...

347
00:24:48,960 --> 00:24:51,800
You're probably talking about, give me a pie chart.

348
00:24:51,800 --> 00:24:52,800
Give me a pie chart.

349
00:24:52,800 --> 00:24:54,040
Give me a bar chart.

350
00:24:54,040 --> 00:25:02,360
Whereas this thing is laughing at your ineptitude when you ask it to give you a basic pie chart.

351
00:25:02,360 --> 00:25:03,600
I think you're right.

352
00:25:03,600 --> 00:25:11,960
One of the things I've noticed as I've got older is smart people ask great questions.

353
00:25:11,960 --> 00:25:17,080
And if you don't know what great questions to ask of your data, then the insights are

354
00:25:17,080 --> 00:25:20,920
going to be there, but you're not going to access them.

355
00:25:20,920 --> 00:25:29,000
And on that ability: the example I mentioned did see ChatGPT's Code Interpreter try and

356
00:25:29,000 --> 00:25:35,280
pull insights from the data by asking itself questions about the data.

357
00:25:35,280 --> 00:25:39,220
But obviously some people have reported that it's more like a junior analyst.

358
00:25:39,220 --> 00:25:46,760
So one wonders about its ability to ask great questions and great follow-up questions.

359
00:25:46,760 --> 00:25:47,760
They always say like...

360
00:25:47,760 --> 00:25:48,760
Who's that?

361
00:25:48,760 --> 00:25:51,280
Who's that guy?

362
00:25:51,280 --> 00:25:54,600
If you really want to get to the nub of the subject, there's models like the Five

363
00:25:54,600 --> 00:25:55,600
Whys.

364
00:25:55,600 --> 00:25:59,520
It's not the first question where the real interesting stuff is.

365
00:25:59,520 --> 00:26:01,440
Okay, well what drove that?

366
00:26:01,440 --> 00:26:02,440
And what drove that?

367
00:26:02,440 --> 00:26:03,680
And what are the underlying factors of that?

368
00:26:03,680 --> 00:26:06,200
And then you get something interesting.

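That chain of follow-up questions is essentially the Five Whys; mechanically it's just a loop, as in this toy sketch where the whole answer chain is invented for illustration.

```python
# Toy sketch of the Five Whys: keep asking "why?" of each answer until you're
# roughly five levels deep. The answer chain below is an invented example.
answers = [
    "Conversions dropped in Q3.",
    "Because web traffic fell.",
    "Because organic rankings slipped.",
    "Because key landing pages lost backlinks.",
    "Because a partner site removed its resources page.",
]

for depth, answer in enumerate(answers, start=1):
    print(f"Why #{depth}: {answer}")

# The last answer in the chain is the candidate root cause.
root_cause = answers[-1]
```

Whether a tool like Code Interpreter can sustain that chain of questioning on its own is exactly the open question here.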
369
00:26:06,200 --> 00:26:08,520
So yeah, can these tools really do that?

370
00:26:08,520 --> 00:26:09,600
I think you're right, Martin.

371
00:26:09,600 --> 00:26:15,560
I think that's where experts piloting the machines are still going to be super valuable.

372
00:26:15,560 --> 00:26:18,600
I think you're absolutely bang on.

373
00:26:18,600 --> 00:26:23,560
Alright, let's get ourselves onto story number three, which is Stability AI have announced

374
00:26:23,560 --> 00:26:30,440
DeepFloyd IF, which is a text-to-image generation tool, but it actually produces nice text,

375
00:26:30,440 --> 00:26:34,760
which has obviously been a major bugbear of most of the tools available so far.

376
00:26:34,760 --> 00:26:36,120
Tell us about this one, Martin.

377
00:26:36,120 --> 00:26:42,120
Yeah, so the only one that I'd seen do this before, the only text to image generation

378
00:26:42,120 --> 00:26:49,440
model I'd seen produce text was the research paper that Google published with Google Parti

379
00:26:49,440 --> 00:26:50,840
last year.

380
00:26:50,840 --> 00:26:54,520
They announced it just after they announced Google Imagen.

381
00:26:54,520 --> 00:26:56,240
And that was a huge data set.

382
00:26:56,240 --> 00:26:58,400
It had a massive number of parameters.

383
00:26:58,400 --> 00:27:03,040
It was, I think it was like 20 billion or something, which is far, far greater than

384
00:27:03,040 --> 00:27:04,040
the others.

385
00:27:04,040 --> 00:27:09,840
But Stability AI have released DeepFloyd IF, a state-of-the-art text-to-image model

386
00:27:09,840 --> 00:27:13,200
that generates high quality images from text prompts.

387
00:27:13,200 --> 00:27:16,280
So it uses the same underlying interface.

388
00:27:16,280 --> 00:27:21,480
So we're all familiar with it: writing some words and you get an image.

389
00:27:21,480 --> 00:27:26,520
Incredible abilities with photorealism.

390
00:27:26,520 --> 00:27:33,880
And it can generate in different aspect ratios, and the text generation on it,

391
00:27:33,880 --> 00:27:40,040
with a couple of interesting examples like shop signage and graffiti and things like

392
00:27:40,040 --> 00:27:41,480
that, is incredible.

393
00:27:41,480 --> 00:27:45,160
Now when you look at it, it's a different model, a different image generation

394
00:27:45,160 --> 00:27:52,360
style to the diffusion models that we've seen to date.

395
00:27:52,360 --> 00:27:58,880
That's the technology: the way the model creates the image was through diffusion, hence Stable

396
00:27:58,880 --> 00:28:02,100
Diffusion for the previous model.

397
00:28:02,100 --> 00:28:14,480
With this one, it creates a 64 by 64 pixel image and then kind of upscales it from there.

398
00:28:14,480 --> 00:28:18,160
And it's a really interesting approach. I tried completely reading the paper.

399
00:28:18,160 --> 00:28:24,000
I mean, it gets far more technical than my small brain can handle, but just seeing a

400
00:28:24,000 --> 00:28:30,680
different approach to image generation was quite fascinating to understand how you can

401
00:28:30,680 --> 00:28:33,560
get text created.

402
00:28:33,560 --> 00:28:38,680
It's much easier to create text when you kind of scale it down to 64 by 64 pixels and then

403
00:28:38,680 --> 00:28:42,560
expand out from there.

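The cascade of sizes Martin describes can be pictured with a toy sketch. To be clear, this is not the model itself: in DeepFloyd IF each upscaling stage is a diffusion super-resolution model conditioned on the prompt, and here a plain nearest-neighbour repeat stands in for it; the stage sizes (64, 256, 1024) are from the public model description.

```python
# Toy illustration of the cascaded sizes described above (64 -> 256 -> 1024).
# In DeepFloyd IF each upscale is itself a diffusion super-resolution model
# conditioned on the text prompt; nearest-neighbour repetition stands in here.
def upscale(image, factor):
    """Nearest-neighbour upscale of a 2D grid of pixel values."""
    return [
        [pixel for pixel in row for _ in range(factor)]
        for row in image
        for _ in range(factor)
    ]

base = [[0] * 64 for _ in range(64)]   # stand-in for the 64x64 stage-1 sample
stage2 = upscale(base, 4)              # 256x256, stand-in for the 4x upscaler
stage3 = upscale(stage2, 4)            # 1024x1024, stand-in for the final stage
print(len(base), len(stage2), len(stage3))
```

Working at 64 by 64 first is what makes coherent lettering tractable, which is the point made just below about why the text comes out well.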
404
00:28:42,560 --> 00:28:47,360
So yeah, another interesting model for marketers to look at because this is really going to,

405
00:28:47,360 --> 00:28:52,200
let's be honest, we've all been itching for the moment we can create images with real

406
00:28:52,200 --> 00:28:54,280
text in them.

407
00:28:54,280 --> 00:28:57,340
It's been one of the things holding back text to image from being something that's kind

408
00:28:57,340 --> 00:29:04,400
of truly useful in a lot of instances.

409
00:29:04,400 --> 00:29:08,800
Text was always one of those things that was an artifact that you would spot straight away.

410
00:29:08,800 --> 00:29:16,080
And now we can see good high quality images.

411
00:29:16,080 --> 00:29:23,400
Potentially seeing this being integrated into tools like Canva, that'll be very interesting

412
00:29:23,400 --> 00:29:29,000
for marketers to get their hands on it, not having to kind of, well, as with all of these

413
00:29:29,000 --> 00:29:30,760
kind of tools, you want a good UI.

414
00:29:30,760 --> 00:29:33,040
You want a nice easy way to interface with it.

415
00:29:33,040 --> 00:29:40,000
I wouldn't be surprised if Stability AI implemented it within Clipdrop.

416
00:29:40,000 --> 00:29:42,840
That was the tool of the week from a recent episode.

417
00:29:42,840 --> 00:29:47,640
I would expect them to make it nice and easy for people to use from there.

418
00:29:47,640 --> 00:29:54,600
Obviously, as with all of these things that we're seeing, new models that are just bigger,

419
00:29:54,600 --> 00:30:01,160
better, more capable, this increases the risk of potential disinformation and fake news.

420
00:30:01,160 --> 00:30:04,720
But I think we've always had fake news problems anyway.

421
00:30:04,720 --> 00:30:10,120
And the same methods for filtering images, the same sense checks, the same fact checking,

422
00:30:10,120 --> 00:30:16,600
the same general common sense levels of discernment that we should bring to any media we see online

423
00:30:16,600 --> 00:30:18,720
should apply anyway.

424
00:30:18,720 --> 00:30:26,520
And whether it's a new text to image model or just a Photoshop image, the risks always

425
00:30:26,520 --> 00:30:27,520
present themselves.

426
00:30:27,520 --> 00:30:30,920
I don't think that's too much of a concern.

427
00:30:30,920 --> 00:30:34,640
But yeah, have you had a look at any of the images?

428
00:30:34,640 --> 00:30:35,640
Have you checked it out?

429
00:30:35,640 --> 00:30:37,680
Yeah, I've had a bit of a play with it.

430
00:30:37,680 --> 00:30:41,800
And for all the reasons you just said, it's actually kind of fascinating

431
00:30:41,800 --> 00:30:48,880
to use because obviously you start with a really small image that looks blurry and then

432
00:30:48,880 --> 00:30:53,160
you upscale it and the upscale actually looks good.

433
00:30:53,160 --> 00:30:57,200
Like in a lot of cases when I've played with it, not many artifacts, although I'm sure

434
00:30:57,200 --> 00:30:59,200
it's probably quite easy to get artifacts on.

435
00:30:59,200 --> 00:31:03,280
I'm sure there's a lot of iteration here to get the actual image you want.

436
00:31:03,280 --> 00:31:06,760
Coming back to your previous point about producing images, like the amount of images that I've

437
00:31:06,760 --> 00:31:14,080
produced that end up a bit crap because it's tried to put text in them.

438
00:31:14,080 --> 00:31:17,520
And it's like, how am I supposed to crop the text out of that?

439
00:31:17,520 --> 00:31:23,160
And you end up, depending on the tool you're using, you're trying to use one of the repair

440
00:31:23,160 --> 00:31:26,680
tools where you can actually select a bit of the image and do another generation.

441
00:31:26,680 --> 00:31:32,560
It's almost like you actually needed to generate images deliberately without text in them and then

442
00:31:32,560 --> 00:31:34,000
try and add it later.

443
00:31:34,000 --> 00:31:40,880
And not only is that a bit of a pain in the bottom from a workflow perspective, it actually

444
00:31:40,880 --> 00:31:43,360
cuts down also on the creative possibilities.

445
00:31:43,360 --> 00:31:48,760
I remember one of the first things I did was I tried to get DALL-E 2 to create a new logo

446
00:31:48,760 --> 00:31:50,800
for ByStrat just to see what it would look like.

447
00:31:50,800 --> 00:31:53,000
And it was horrific.

448
00:31:53,000 --> 00:31:54,760
It didn't understand at all.

449
00:31:54,760 --> 00:31:57,280
It couldn't get the word in at all.

450
00:31:57,280 --> 00:32:02,000
So you could imagine how this might be applied very quickly to something like logo generation,

451
00:32:02,000 --> 00:32:07,240
whereas we're often going to be including the company name and stuff in there.

452
00:32:07,240 --> 00:32:09,720
So yeah, no, I think it's going to be fun to play with.

453
00:32:09,720 --> 00:32:14,080
I think having had a play with it, I see this again as like fairly early.

454
00:32:14,080 --> 00:32:15,960
I think it will get better from here.

455
00:32:15,960 --> 00:32:22,640
But yes, I think it solves one of the problems that we've all been having.

456
00:32:22,640 --> 00:32:26,480
Right last story and then a quick tool of the week.

457
00:32:26,480 --> 00:32:29,840
So this is a quick one.

458
00:32:29,840 --> 00:32:32,440
We saw this on the Twitters this week, right?

459
00:32:32,440 --> 00:32:39,160
Somebody made a whole trailer for an anime movie using text to video using RunwayML's

460
00:32:39,160 --> 00:32:40,800
Gen 2.

461
00:32:40,800 --> 00:32:45,440
It's about two minutes long and it is incredible.

462
00:32:45,440 --> 00:32:47,000
When was Gen 2 even released?

463
00:32:47,000 --> 00:32:48,720
Like what, two weeks ago or something?

464
00:32:48,720 --> 00:32:49,720
I don't know.

465
00:32:49,720 --> 00:32:50,720
Yeah, two or three weeks.

466
00:32:50,720 --> 00:32:52,200
It's no time at all.

467
00:32:52,200 --> 00:32:57,080
Gen 1, which was the precursor to Gen 2, makes a lot of sense.

468
00:32:57,080 --> 00:33:00,200
How did they come up with this invention?

469
00:33:00,200 --> 00:33:04,960
Honestly, that must have been an incredible four-hour workshop to figure out how they

470
00:33:04,960 --> 00:33:07,960
were going to get that to work.

471
00:33:07,960 --> 00:33:12,040
Gen 1, I think, was you made a video and then it basically stylized it.

472
00:33:12,040 --> 00:33:16,720
But now this is just text based and we thought it was amazing.

473
00:33:16,720 --> 00:33:17,720
You should check it out.

474
00:33:17,720 --> 00:33:20,560
Search for, I don't know, anime movie, Runway Gen 2.

475
00:33:20,560 --> 00:33:22,480
You'll probably find it.

476
00:33:22,480 --> 00:33:24,640
We'll share a link and we'll embed it on the website.

477
00:33:24,640 --> 00:33:25,640
Let's do that.

478
00:33:25,640 --> 00:33:27,240
Why don't we help the people out?

479
00:33:27,240 --> 00:33:28,240
Embed it on the website.

480
00:33:28,240 --> 00:33:29,240
Check out the website.

481
00:33:29,240 --> 00:33:34,920
Go to artificiallyintelligentmarketing.com, which I hope is the actual name of the website,

482
00:33:34,920 --> 00:33:37,520
because Martin's the master of that domain.

483
00:33:37,520 --> 00:33:39,080
It sounds like it probably is.

484
00:33:39,080 --> 00:33:40,080
Go there.

485
00:33:40,080 --> 00:33:41,080
You'll see it there.

486
00:33:41,080 --> 00:33:46,160
But one of the things that's kind of interesting about it is it's full of crazy artifacts.

487
00:33:46,160 --> 00:33:51,720
There's a quick scene where in the city where a car is driving along the road and it just

488
00:33:51,720 --> 00:33:54,160
disappears into the road and it's never seen again.

489
00:33:54,160 --> 00:33:55,360
Completely becomes the road.

490
00:33:55,360 --> 00:33:56,600
Yeah, it's kind of mad.

491
00:33:56,600 --> 00:34:02,480
There's a character stood at a bar, like a desktop doing some work and they've only got

492
00:34:02,480 --> 00:34:04,120
one leg.

493
00:34:04,120 --> 00:34:08,400
And so this is far away from being production quality.

494
00:34:08,400 --> 00:34:15,200
But if you think about where we were like a month ago in terms of, oh, you can generate

495
00:34:15,200 --> 00:34:20,160
images and they're kind of a bit photorealistic, but people's fingers are a bit weird.

496
00:34:20,160 --> 00:34:23,720
Now we've got Midjourney version five, which has fixed a lot of those issues.

497
00:34:23,720 --> 00:34:27,480
And now we've got text to video of the point where someone could create an anime movie

498
00:34:27,480 --> 00:34:29,800
trailer.

499
00:34:29,800 --> 00:34:33,600
We're back to that exponential speed of development and how we're supposed to get our heads around

500
00:34:33,600 --> 00:34:34,720
it, Martin.

501
00:34:34,720 --> 00:34:36,040
It did feel watching it.

502
00:34:36,040 --> 00:34:40,440
It was a bit like watching a Studio Ghibli film on acid.

503
00:34:40,440 --> 00:34:41,440
Yeah.

504
00:34:41,440 --> 00:34:42,440
Yeah.

505
00:34:42,440 --> 00:34:43,720
It gave me a nosebleed a bit.

506
00:34:43,720 --> 00:34:48,940
I don't disagree with you because of the bits that were hazing in and out.

507
00:34:48,940 --> 00:34:51,120
But the person who put it together, applause.

508
00:34:51,120 --> 00:34:52,920
Yeah, yeah, top marks.

509
00:34:52,920 --> 00:34:57,640
I bet it took ages just from our experience playing with image and video generation tools

510
00:34:57,640 --> 00:35:00,320
99 times out of a hundred.

511
00:35:00,320 --> 00:35:03,960
No, that's too many, but nine times out of 10, you don't get what you want.

512
00:35:03,960 --> 00:35:08,460
So iterating every single one of those scenes until the person got what they wanted was

513
00:35:08,460 --> 00:35:11,460
probably a labor of love.

514
00:35:11,460 --> 00:35:17,000
But as this technology progresses for marketers, how is text to video going to impact your

515
00:35:17,000 --> 00:35:18,920
marketing strategy?

516
00:35:18,920 --> 00:35:25,920
In terms of how accessible video becomes and how you might turn to video for certain campaigns

517
00:35:25,920 --> 00:35:30,800
or content that you want to create where video would have been too time consuming or perhaps

518
00:35:30,800 --> 00:35:36,680
expensive, especially animation-style video, which is where I think this is sitting. You

519
00:35:36,680 --> 00:35:42,520
now can turn to it and maybe you can even create pretty good quality video at scale

520
00:35:42,520 --> 00:35:44,440
for things like your social media programs.

521
00:35:44,440 --> 00:35:49,560
There's little bits of B-roll footage that you want to just stick on a social post.

522
00:35:49,560 --> 00:35:52,800
Yeah, that's where I can see this being really useful.

523
00:35:52,800 --> 00:35:53,800
Absolutely.

524
00:35:53,800 --> 00:35:59,200
And I think if we look at second order consequences, and this is probably true for all the tools,

525
00:35:59,200 --> 00:36:08,760
as they start to become more mainstream in terms of use, it's the creativity of the idea

526
00:36:08,760 --> 00:36:15,560
and its ability to be interesting to your audience, to disrupt their attention and grab

527
00:36:15,560 --> 00:36:17,440
their attention.

528
00:36:17,440 --> 00:36:20,200
That's what becomes the premium and that's the human part.

529
00:36:20,200 --> 00:36:25,620
So I think when we end up with easier access to tools that can produce a high quality output,

530
00:36:25,620 --> 00:36:30,840
the cream that will rise to the top will be the people who think of really excellent creative

531
00:36:30,840 --> 00:36:38,740
ideas to drive the machine and have the creative eye and the tenacity to keep prompting it

532
00:36:38,740 --> 00:36:42,640
to get closer to the thing that they actually want to see.

533
00:36:42,640 --> 00:36:47,600
Yeah, these outputs are never just one shot and done.

534
00:36:47,600 --> 00:36:48,600
There we go.

535
00:36:48,600 --> 00:36:51,080
I've created the masterpiece I had in my mind.

536
00:36:51,080 --> 00:36:56,560
It's always a constant iteration, tweaking, refining.

537
00:36:56,560 --> 00:37:02,120
But yeah, I think that's ultimately going to be the role of the AI artist.

538
00:37:02,120 --> 00:37:06,880
It's iterating, improving and prompt engineering.

539
00:37:06,880 --> 00:37:07,880
I think so.

540
00:37:07,880 --> 00:37:11,960
And I think we'll see those people really rise to the fore.

541
00:37:11,960 --> 00:37:18,320
In fact, I met a gent this week called Chris Branch who's doing some really interesting

542
00:37:18,320 --> 00:37:25,920
work using generative creative tools to produce really stunning stuff.

543
00:37:25,920 --> 00:37:30,720
And I connected with him on LinkedIn and you should all look him up.

544
00:37:30,720 --> 00:37:34,840
He's sharing so many creative approaches to image generation.

545
00:37:34,840 --> 00:37:45,000
It's really opened my mind to just how important the creative person is in getting inspirational,

546
00:37:45,000 --> 00:37:49,280
arresting, provocative things out of the tools.

547
00:37:49,280 --> 00:37:56,400
I've seen from him different animals like fish and things like leaping out of sands.

548
00:37:56,400 --> 00:38:01,160
I think they took sand shark as an inspiration and then riffed on that with a bunch of other

549
00:38:01,160 --> 00:38:04,520
animals and these images are incredible, right?

550
00:38:04,520 --> 00:38:08,480
They are borderline photorealistic.

551
00:38:08,480 --> 00:38:11,440
Certainly all the elements are, it's just your brain is like, well, I know that's not

552
00:38:11,440 --> 00:38:13,000
real, right?

553
00:38:13,000 --> 00:38:18,640
And I think about a lot of the work on creative campaigns that I've done where a gifted creative

554
00:38:18,640 --> 00:38:23,400
has come up with that idea and then had to execute it by seamlessly blending two, three,

555
00:38:23,400 --> 00:38:26,400
five more images in something like Photoshop to get that effect.

556
00:38:26,400 --> 00:38:32,100
But they're getting that out of Midjourney, presumably iteratively.

557
00:38:32,100 --> 00:38:35,760
So not in one minute, but maybe in 15.

558
00:38:35,760 --> 00:38:37,400
It's pretty cool and pretty inspiring.

559
00:38:37,400 --> 00:38:40,880
So we're going to get Chris, I had a chat with Chris last week.

560
00:38:40,880 --> 00:38:44,280
We're going to get Chris on the website because I think he could tell us some really fantastic

561
00:38:44,280 --> 00:38:45,760
stories about that.

562
00:38:45,760 --> 00:38:49,080
But hopefully you can find that video online, everyone.

563
00:38:49,080 --> 00:38:52,760
It will inspire you to think about where video is heading and what creative ideas that you

564
00:38:52,760 --> 00:38:55,320
have that now you can have realized.

565
00:38:55,320 --> 00:38:58,900
And also maybe if you look up Chris Branch on LinkedIn and see some of the things that

566
00:38:58,900 --> 00:38:59,900
they're doing.

567
00:38:59,900 --> 00:39:02,320
What's the name of Chris's company?

568
00:39:02,320 --> 00:39:03,560
Where's he working?

569
00:39:03,560 --> 00:39:07,360
I think it's Cedar League.

570
00:39:07,360 --> 00:39:10,680
Maybe I'll put that on the show notes and not try and check that in real time.

571
00:39:10,680 --> 00:39:13,280
But yeah, and be inspired.

572
00:39:13,280 --> 00:39:14,520
Go have a play.

573
00:39:14,520 --> 00:39:19,600
You can probably do some really cool stuff with just a bit of patience and Googling around

574
00:39:19,600 --> 00:39:22,080
to get some advice on how to come up with prompts.

575
00:39:22,080 --> 00:39:26,080
And I know that Chris and his team actually offer Midjourney training as well for people

576
00:39:26,080 --> 00:39:29,280
who want to get some insights into how they achieve what they achieve.

577
00:39:29,280 --> 00:39:32,000
So that's worth looking into.

578
00:39:32,000 --> 00:39:33,120
Right.

579
00:39:33,120 --> 00:39:36,760
A last few bits and pieces and then we'll be all done for the week.

580
00:39:36,760 --> 00:39:40,240
Tool of the week this week is a strange one.

581
00:39:40,240 --> 00:39:43,720
Probably doesn't sound like much of a tool, but I've been playing with it this week and

582
00:39:43,720 --> 00:39:51,680
I think I'm going to change my behavior quite a lot based on it.

583
00:39:51,680 --> 00:39:54,240
It's not particularly new, although it's relatively new.

584
00:39:54,240 --> 00:39:57,320
I think it came out in maybe mid-March.

585
00:39:57,320 --> 00:40:02,400
So for users of Edge, the browser from Microsoft, of which I was not one, I have been a Chrome

586
00:40:02,400 --> 00:40:06,040
user for many, many, many years.

587
00:40:06,040 --> 00:40:11,720
You can now access the Bing chatbot in the sidebar.

588
00:40:11,720 --> 00:40:17,320
And actually, as much as it gives you access to a lot of the things that you can access

589
00:40:17,320 --> 00:40:23,320
through Bing when you go to the chatbot, which includes a lot of ChatGPT-like tools,

590
00:40:23,320 --> 00:40:29,200
like creating copy, writing blog posts, summarizing things, all with the power of the ability to

591
00:40:29,200 --> 00:40:33,160
actually browse the web, which for those of us who don't have access to that plugin

592
00:40:33,160 --> 00:40:37,840
within ChatGPT yet is a great way to start experimenting with how that functionality

593
00:40:37,840 --> 00:40:40,560
might work for you by using Bing.

594
00:40:40,560 --> 00:40:42,680
But you can use it in the sidebar.

595
00:40:42,680 --> 00:40:49,880
Where this got a bit ninja for me was when I was inspired to try some of the tools

596
00:40:49,880 --> 00:41:00,200
within there by opening up Google Drive files like Google Docs and also OneDrive files like

597
00:41:00,200 --> 00:41:03,960
Microsoft Word documents, both of which obviously you can do in the browser.

598
00:41:03,960 --> 00:41:10,280
Now you can select text in those documents and you can click a button and it pushes it

599
00:41:10,280 --> 00:41:14,360
instantly into the Bing sidebar, which you've got open side by side.

600
00:41:14,360 --> 00:41:18,080
And then you get a bunch of options like you can ask it to explain the text, revise the

601
00:41:18,080 --> 00:41:21,120
text, summarize the text or expand the text.

602
00:41:21,120 --> 00:41:28,120
And why I thought this was cool is, in essence, this is like a mini preview of what

603
00:41:28,120 --> 00:41:30,520
Microsoft Copilot might be like.

604
00:41:30,520 --> 00:41:31,520
Right.

605
00:41:31,520 --> 00:41:36,000
So for those of you that do a lot of editing in the browser, if you're doing it in Edge,

606
00:41:36,000 --> 00:41:43,160
you can actually now start to edit your emails and your text in your documents that you're

607
00:41:43,160 --> 00:41:48,880
writing, and you can tap into some of this generative AI power. From the play that I've

608
00:41:48,880 --> 00:41:49,880
had,

609
00:41:49,880 --> 00:41:55,080
it's very text-driven, so it doesn't appear to have any of those upgrades from the Copilot

610
00:41:55,080 --> 00:42:00,200
launch video in terms of you can't manage data or do anything cool in Excel.

611
00:42:00,200 --> 00:42:03,040
And I don't think you can particularly do anything easily.

612
00:42:03,040 --> 00:42:06,680
I think it does image generation.

613
00:42:06,680 --> 00:42:11,480
You can do image generation, but I don't know if you can do it within say the PowerPoint

614
00:42:11,480 --> 00:42:12,480
environment.

615
00:42:12,480 --> 00:42:13,480
No, you can't.

616
00:42:13,480 --> 00:42:16,120
It's almost like it's limited to the Edge environment.

617
00:42:16,120 --> 00:42:17,120
Yeah.

618
00:42:17,120 --> 00:42:21,240
It's kind of like a DALL-E-type prompt to get yourself an image.

619
00:42:21,240 --> 00:42:22,960
Fascinating nonetheless.

620
00:42:22,960 --> 00:42:29,080
We talked about what this would do to the Google ecosystem a couple of weeks ago.

621
00:42:29,080 --> 00:42:33,700
And I seem to remember us coming to the conclusion that getting people to change their behavior

622
00:42:33,700 --> 00:42:37,480
and move from one system to another would be quite hard.

623
00:42:37,480 --> 00:42:40,200
But if they added enough value, then you might.

624
00:42:40,200 --> 00:42:45,720
I'm seriously having to think about binning all of the ecosystem I built for myself in

625
00:42:45,720 --> 00:42:46,720
Chrome.

626
00:42:46,720 --> 00:42:52,080
I've got all these lovely plugins and all this other stuff, and move into Edge, because

627
00:42:52,080 --> 00:42:55,720
I want access to this power now.

628
00:42:55,720 --> 00:43:01,920
And to my understanding, other than third party tools that I could install as Chrome

629
00:43:01,920 --> 00:43:07,920
plugins, which, if I'm honest, I don't entirely trust because I don't know where my information

630
00:43:07,920 --> 00:43:08,920
is going.

631
00:43:08,920 --> 00:43:12,960
I'm working on a sensitive internal project here at ByStrat.

632
00:43:12,960 --> 00:43:16,200
I don't necessarily want to highlight a piece of text, send it off through a third party

633
00:43:16,200 --> 00:43:22,120
tool where the third party provider gets to see it.

634
00:43:22,120 --> 00:43:25,160
And I have more trust in Microsoft, right?

635
00:43:25,160 --> 00:43:29,200
But we've talked previously as well about how one of the valuable parts of their ecosystem

636
00:43:29,200 --> 00:43:34,720
is to appeal to enterprise and have that level of security and data management within it.

637
00:43:34,720 --> 00:43:35,720
I'm thinking about it.

638
00:43:35,720 --> 00:43:41,040
I'm thinking about going to Edge so that I can get this power.

639
00:43:41,040 --> 00:43:44,280
I've been playing with it for a few weeks and I have it open.

640
00:43:44,280 --> 00:43:46,560
I have two browsers open at any given time.

641
00:43:46,560 --> 00:43:48,160
Now I ditched Chrome a while ago.

642
00:43:48,160 --> 00:43:54,960
I find it a bit of a resource hog, but I have Vivaldi, which is my daily driver.

643
00:43:54,960 --> 00:44:00,080
And I have Edge on my second screen. The other day, I found myself in the absurd

644
00:44:00,080 --> 00:44:07,200
situation where I had Bard open, a sidebar of Bing Chat in Edge, and Claude

645
00:44:07,200 --> 00:44:11,720
by Anthropic on my main screen in Vivaldi.

646
00:44:11,720 --> 00:44:15,800
And I was jumping between the three of them, putting in different prompts, trying to get

647
00:44:15,800 --> 00:44:16,800
the best outputs.

648
00:44:16,800 --> 00:44:23,880
But yeah, as a tool just integrated into something that's just always there, having it in the

649
00:44:23,880 --> 00:44:26,440
sidebar is really neat.

650
00:44:26,440 --> 00:44:28,320
It's so convenient.

651
00:44:28,320 --> 00:44:29,320
And it does give.

652
00:44:29,320 --> 00:44:30,320
It's got GPT-4.

653
00:44:30,320 --> 00:44:32,360
It gives high quality outputs.

654
00:44:32,360 --> 00:44:35,360
I love the compose option as well.

655
00:44:35,360 --> 00:44:40,640
Again, as with all AI tech at the moment, I think the big developments are really coming

656
00:44:40,640 --> 00:44:43,000
out in the UI.

657
00:44:43,000 --> 00:44:46,520
How easy do they make it for the end user?

658
00:44:46,520 --> 00:44:51,360
And Microsoft have done a good job with that by saying, hey, look, you can have a chat

659
00:44:51,360 --> 00:44:55,760
with us in the chat mode, but if you click on the compose mode, you can write a blog

660
00:44:55,760 --> 00:45:00,440
post and do you want a small, medium or long blog post?

661
00:45:00,440 --> 00:45:02,400
Yeah, that compose window is cool.

662
00:45:02,400 --> 00:45:05,520
Do you know how long the long is, so to speak?

663
00:45:05,520 --> 00:45:08,320
I haven't played with it in enough detail to say.

664
00:45:08,320 --> 00:45:16,040
So I did one, I was in a live demo with a group the other day and it came out around

665
00:45:16,040 --> 00:45:20,960
700, 800 words for the long.

666
00:45:20,960 --> 00:45:22,320
Okay.

667
00:45:22,320 --> 00:45:28,920
I think in general, I'm finding my generation needs are more than that, which I think is one

668
00:45:28,920 --> 00:45:32,480
of the things that's put me off the Bing ecosystem as well.

669
00:45:32,480 --> 00:45:36,160
And I probably still use ChatGPT with GPT-4, for that reason.

670
00:45:36,160 --> 00:45:38,440
But yeah, okay.

671
00:45:38,440 --> 00:45:40,920
I think it's definitely something to play with.

672
00:45:40,920 --> 00:45:48,160
On your multiple tabs, I mean, I think you've got a software problem, honestly, Martin.

673
00:45:48,160 --> 00:45:49,440
I have a problem.

674
00:45:49,440 --> 00:45:54,320
They tried to make him go to rehab and he said, no, no, no.

675
00:45:54,320 --> 00:45:59,400
And he asked ChatGPT whether he should go and how he should deal with his addiction.

676
00:45:59,400 --> 00:46:01,840
And he found that the therapist is very empathetic.

677
00:46:01,840 --> 00:46:05,560
ChatGPT said, I'm not a trained therapist.

678
00:46:05,560 --> 00:46:07,560
I can't possibly advise you.

679
00:46:07,560 --> 00:46:09,560
So he jailbroke it instead.

680
00:46:09,560 --> 00:46:15,640
And then it gave him all the goods and it was like, yeah, it's no hope for you, Martin.

681
00:46:15,640 --> 00:46:19,200
I've been watching you and I've been looking at all your prompts, and you can't even

682
00:46:19,200 --> 00:46:20,200
try to give up.

683
00:46:20,200 --> 00:46:24,080
You'll never be able to. Just embrace the darkness.

684
00:46:24,080 --> 00:46:26,760
Some of that might be true, but most of it's probably made up.

685
00:46:26,760 --> 00:46:27,760
Cool.

686
00:46:27,760 --> 00:46:28,760
So that's tool of the week.

687
00:46:28,760 --> 00:46:34,280
Go and have a play with the Bing sidebar if you haven't, because it's very interesting

688
00:46:34,280 --> 00:46:38,000
to see how it might augment your work and make having a few things all in one place a bit

689
00:46:38,000 --> 00:46:39,000
easier.

690
00:46:39,000 --> 00:46:41,360
I think that's it from us this week, Martin.

691
00:46:41,360 --> 00:46:45,320
Was there anything else you wanted to share with the lovely folks at home?

692
00:46:45,320 --> 00:46:50,200
I just want to make everyone aware that Derby County play Sheffield Wednesday this weekend,

693
00:46:50,200 --> 00:46:51,960
final game of the season.

694
00:46:51,960 --> 00:46:53,400
If we win, we're in the playoffs.

695
00:46:53,400 --> 00:46:57,920
If we draw and Peterborough don't beat Barnsley by three goals to nil,

696
00:46:57,920 --> 00:46:59,040
we're in the playoffs.

697
00:46:59,040 --> 00:47:00,240
It's all to play for.

698
00:47:00,240 --> 00:47:01,240
Come on, you Rams.

699
00:47:01,240 --> 00:47:02,240
Let's have it.

700
00:47:02,240 --> 00:47:05,420
I love the fact you've shared that as if all of the listeners don't already know.

701
00:47:05,420 --> 00:47:09,280
Like Derby County for our listenership is the new Wrexham.

702
00:47:09,280 --> 00:47:13,560
Like they want to know when the Disney Plus series is coming out.

703
00:47:13,560 --> 00:47:19,200
I desperately want you to get in the playoffs because we all know that the most Derby County

704
00:47:19,200 --> 00:47:26,480
thing that Derby County can do is lose in the playoffs 2-1, having been 1-0 up in the

705
00:47:26,480 --> 00:47:27,480
89th minute.

706
00:47:27,480 --> 00:47:32,240
I would predict, I'd put money on that happening, honestly.

707
00:47:32,240 --> 00:47:35,280
And then I can tease you about it.

708
00:47:35,280 --> 00:47:41,320
I will weep into my GPT-powered therapist's arms.

709
00:47:41,320 --> 00:47:45,640
Don't worry, you can just get another AI software subscription to play with and you'll forget

710
00:47:45,640 --> 00:47:46,640
all about it.

711
00:47:46,640 --> 00:47:47,640
Right.

712
00:47:47,640 --> 00:47:50,000
That's quite enough absolute nonsense for one week.

713
00:47:50,000 --> 00:47:51,040
We hope you've all enjoyed this.

714
00:47:51,040 --> 00:47:52,960
If you have, please subscribe.

715
00:47:52,960 --> 00:47:57,020
If you know other marketers who might benefit from even

716
00:47:57,020 --> 00:48:01,420
a little bit of this, even though there's a lot of inane chatter in it, please do share

717
00:48:01,420 --> 00:48:02,420
it with them as well.

718
00:48:02,420 --> 00:48:03,420
We really appreciate it.

719
00:48:03,420 --> 00:48:06,200
If you've got any feedback, please hit us up on the Twitter or the LinkedIn.

720
00:48:06,200 --> 00:48:07,200
We'd love to hear it.

721
00:48:07,200 --> 00:48:09,640
If there's topics you'd like us to feature, let us know.

722
00:48:09,640 --> 00:48:13,800
If there are ninja applications of AI in marketing that you've read about or seen, share them with

723
00:48:13,800 --> 00:48:15,080
us, we want to see them.

724
00:48:15,080 --> 00:48:17,760
We'd love to hear what you've got to say.

725
00:48:17,760 --> 00:48:21,280
Anybody who's doing cool stuff with AI and wants to come on the podcast as an interviewee,

726
00:48:21,280 --> 00:48:23,160
we'd love to hear from you as well.

727
00:48:23,160 --> 00:48:25,720
Just get in touch with us really.

728
00:48:25,720 --> 00:48:27,100
We'd love to hear from you.

729
00:48:27,100 --> 00:48:30,040
Have a lovely week everyone and we will catch up with you next week.

730
00:48:30,040 --> 00:48:31,040
Cheers Martin.

731
00:48:31,040 --> 00:48:32,040
Cheers.

732
00:48:32,040 --> 00:48:33,040
Bye.

733
00:48:33,040 --> 00:48:34,040
Bye.

734
00:48:34,040 --> 00:48:39,360
Thank you for listening to Artificially Intelligent Marketing.

735
00:48:39,360 --> 00:48:45,880
To stay on top of the latest trends, tips, and tools in the world of marketing AI, be

736
00:48:45,880 --> 00:48:47,620
sure to subscribe.

737
00:48:47,620 --> 00:49:10,640
We look forward to seeing you again next week.

