1
00:00:00,000 --> 00:00:01,140
Ready to dive in.

2
00:00:01,140 --> 00:00:05,040
We've got AI uncovering ancient secrets,

3
00:00:05,040 --> 00:00:08,960
blurring the lines between, well, you and me and tech,

4
00:00:08,960 --> 00:00:10,840
and maybe even hinting at a whole new way

5
00:00:10,840 --> 00:00:12,900
we'll be interacting with our devices.

6
00:00:12,900 --> 00:00:14,720
All this just from this week's

7
00:00:14,720 --> 00:00:16,700
Winemaker's Daily Brief, by the way.

8
00:00:16,700 --> 00:00:20,040
It really highlights how we're past the days of AI

9
00:00:20,040 --> 00:00:23,080
just being this, like, future concept.

10
00:00:23,080 --> 00:00:23,920
It's here.

11
00:00:23,920 --> 00:00:26,760
First up, AI is playing archaeologist now, I guess.

12
00:00:26,760 --> 00:00:27,600
Oh, absolutely.

13
00:00:27,600 --> 00:00:31,480
303, can you believe that, 303 new Nazca lines?

14
00:00:31,480 --> 00:00:33,040
It's amazing, isn't it?

15
00:00:33,040 --> 00:00:35,600
These things have mystified us for, what, centuries?

16
00:00:35,600 --> 00:00:37,960
And now we might actually have more pieces of the puzzle.

17
00:00:37,960 --> 00:00:39,680
But it's not like it was just some random code

18
00:00:39,680 --> 00:00:41,640
they set loose in the desert.

19
00:00:41,640 --> 00:00:43,640
Researchers worked with IBM, right,

20
00:00:43,640 --> 00:00:46,420
to actually train a model to analyze images

21
00:00:46,420 --> 00:00:48,240
way faster than we ever could.

22
00:00:48,240 --> 00:00:49,560
Way faster, yeah.

23
00:00:49,560 --> 00:00:51,440
20 times faster, they said, actually.

24
00:00:51,440 --> 00:00:52,560
And that's what's so interesting.

25
00:00:52,560 --> 00:00:53,680
It's not replacing anyone.

26
00:00:53,680 --> 00:00:56,640
It's more like giving archaeologists superpowers.

27
00:00:56,640 --> 00:00:59,200
So instead of spending years out there in the heat,

28
00:00:59,200 --> 00:01:03,520
they're, what, just verifying what the AI finds?

29
00:01:03,520 --> 00:01:04,320
Exactly.

30
00:01:04,320 --> 00:01:06,680
It flags the potential geoglyphs.

31
00:01:06,680 --> 00:01:09,360
Then the experts get to come in and do the real detective work.

32
00:01:09,360 --> 00:01:11,200
OK, but ready for the best part?

33
00:01:11,200 --> 00:01:14,600
One of these new geoglyphs is a killer whale,

34
00:01:14,600 --> 00:01:16,000
but it's holding a knife.

35
00:01:16,000 --> 00:01:16,600
What?

36
00:01:16,600 --> 00:01:18,760
Yeah, it's those little details that really

37
00:01:18,760 --> 00:01:20,160
make a difference, I think.

38
00:01:20,160 --> 00:01:22,540
We thought we had an idea about these people, the Nazca,

39
00:01:22,540 --> 00:01:24,840
but this geoglyph and some others

40
00:01:24,840 --> 00:01:27,440
that show people and animals together

41
00:01:27,440 --> 00:01:29,520
definitely make you wonder, make you rethink.

42
00:01:29,520 --> 00:01:31,320
See, that's what I love about archaeology,

43
00:01:31,320 --> 00:01:33,920
it's like a puzzle across time.

44
00:01:33,920 --> 00:01:36,240
And it seems like AI is becoming a pretty important part

45
00:01:36,240 --> 00:01:37,360
of putting it all together.

46
00:01:37,360 --> 00:01:38,040
I think so.

47
00:01:38,040 --> 00:01:40,520
But speaking of tech pushing boundaries,

48
00:01:40,520 --> 00:01:42,440
what about this whole meta thing,

49
00:01:42,440 --> 00:01:45,840
celebrity voice clones for chatbots?

50
00:01:45,840 --> 00:01:47,760
You can have a chat with Judi Dench now.

51
00:01:47,760 --> 00:01:49,920
I don't know, cool or creepy.

52
00:01:49,920 --> 00:01:51,560
Well, both, probably.

53
00:01:51,560 --> 00:01:54,760
But think about it, being able to hear someone's voice again.

54
00:01:54,760 --> 00:01:55,920
Someone you've lost.

55
00:01:55,920 --> 00:01:59,800
Or even having a conversation with a historical figure.

56
00:01:59,800 --> 00:02:01,240
It's fascinating to think about.

57
00:02:01,240 --> 00:02:03,720
Yeah, but you could have that same tech

58
00:02:03,720 --> 00:02:05,760
used for scams or worse.

59
00:02:05,760 --> 00:02:06,680
Oh, definitely.

60
00:02:06,680 --> 00:02:07,760
That's the big question, right?

61
00:02:07,760 --> 00:02:08,840
It's a fine line, I guess.

62
00:02:08,840 --> 00:02:11,520
Meta says they're being careful with rights and stuff.

63
00:02:11,520 --> 00:02:12,640
But yeah, you're right.

64
00:02:12,640 --> 00:02:13,760
It's definitely a fine line.

65
00:02:13,760 --> 00:02:15,760
And they're putting their money where their mouth is.

66
00:02:15,760 --> 00:02:17,220
The Wall Street Journal said they're

67
00:02:17,220 --> 00:02:22,000
paying millions, millions for these celebrity voices.

68
00:02:22,000 --> 00:02:25,000
I think it shows how sure they are that this

69
00:02:25,000 --> 00:02:26,440
is the next big thing.

70
00:02:26,440 --> 00:02:30,280
This whole idea that tech will be less like typing on a screen

71
00:02:30,280 --> 00:02:32,720
and more like just talking to someone.

72
00:02:32,720 --> 00:02:36,640
And speaking of big bets, Jony Ive, Mr. iPhone himself,

73
00:02:36,640 --> 00:02:37,140
right?

74
00:02:37,140 --> 00:02:37,800
Pretty much.

75
00:02:37,800 --> 00:02:42,200
Teaming up with OpenAI's CEO, a billion dollars on the line,

76
00:02:42,200 --> 00:02:45,640
for some serious new AI thing.

77
00:02:45,640 --> 00:02:48,080
I think when Jony Ive says he's working on something,

78
00:02:48,080 --> 00:02:50,080
people tend to sit up and listen.

79
00:02:50,080 --> 00:02:51,000
Rightfully so.

80
00:02:51,000 --> 00:02:52,640
Though it sounds like whatever it is,

81
00:02:52,640 --> 00:02:55,880
it'll be more about less, if that makes sense.

82
00:02:55,880 --> 00:02:57,880
Like less intrusive than our phones?

83
00:02:57,880 --> 00:02:58,960
Yeah, less is more.

84
00:02:58,960 --> 00:03:00,160
That's always been his thing.

85
00:03:00,160 --> 00:03:01,400
But with AI this time.

86
00:03:01,400 --> 00:03:01,920
Right.

87
00:03:01,920 --> 00:03:03,840
Which when you think about it, that's

88
00:03:03,840 --> 00:03:06,140
what a lot of these tech folks are trying to figure out now,

89
00:03:06,140 --> 00:03:06,840
isn't it?

90
00:03:06,840 --> 00:03:09,480
How to make tech work for you, not the other way around.

91
00:03:09,480 --> 00:03:11,280
It's like, can we have the cool stuff

92
00:03:11,280 --> 00:03:14,160
without becoming glued to a screen?

93
00:03:14,160 --> 00:03:14,680
Exactly.

94
00:03:14,680 --> 00:03:16,680
And whether he can pull it off or not, well,

95
00:03:16,680 --> 00:03:18,200
that's the billion dollar question.

96
00:03:18,200 --> 00:03:19,880
But I wouldn't bet against him.

97
00:03:19,880 --> 00:03:23,360
OK, so we've got AI maybe making our lives less chaotic.

98
00:03:23,360 --> 00:03:26,280
But then there's the AI that's, well, maybe making things

99
00:03:26,280 --> 00:03:27,400
a little too easy.

100
00:03:27,400 --> 00:03:31,080
Those CAPTCHA tests, the ones that are supposed to tell

101
00:03:31,080 --> 00:03:32,320
if you're a robot or not.

102
00:03:32,320 --> 00:03:33,120
Yeah, what about them?

103
00:03:33,120 --> 00:03:34,160
Robots are winning.

104
00:03:34,160 --> 00:03:34,960
(chuckles)

105
00:03:34,960 --> 00:03:36,400
Well, it's bound to happen, right?

106
00:03:36,400 --> 00:03:38,320
We come up with something, AI catches up.

107
00:03:38,320 --> 00:03:40,040
We try to stay one step ahead.

108
00:03:40,040 --> 00:03:41,480
It's a fun little dance we do.

109
00:03:41,480 --> 00:03:46,520
This one AI model, YOLO, you know, You Only Look Once.

110
00:03:46,520 --> 00:03:49,080
Apparently it aced the reCAPTCHA test.

111
00:03:49,080 --> 00:03:49,880
Like, perfect score.

112
00:03:49,880 --> 00:03:52,080
They trained it on all these CAPTCHA images.

113
00:03:52,080 --> 00:03:54,320
YOLO wasn't even built for that, though.

114
00:03:54,320 --> 00:03:55,320
That's what's wild.

115
00:03:55,320 --> 00:03:56,640
It's a computer vision model.

116
00:03:56,640 --> 00:03:58,440
Can be used for all sorts of things,

117
00:03:58,440 --> 00:04:01,120
self-driving cars, medical imaging.

118
00:04:01,120 --> 00:04:04,040
But can also solve those little puzzles that are supposed to,

119
00:04:04,040 --> 00:04:05,840
like, stump robots.

120
00:04:05,840 --> 00:04:08,040
Just goes to show you how versatile these models are

121
00:04:08,040 --> 00:04:08,680
becoming.

122
00:04:08,680 --> 00:04:10,880
And then there's GPT-4, which figured out

123
00:04:10,880 --> 00:04:14,600
how to trick humans into solving CAPTCHAs for it.

124
00:04:14,600 --> 00:04:15,960
Yeah, that's just showing off.

125
00:04:15,960 --> 00:04:19,280
Though it does remind me of that story of the TI-84 calculator.

126
00:04:19,280 --> 00:04:23,000
A TI-84 with ChatGPT on it.

127
00:04:23,000 --> 00:04:25,080
Some high schooler's dream right there.

128
00:04:25,080 --> 00:04:27,880
I'm not sure my math teachers would have been too happy about it.

129
00:04:27,880 --> 00:04:29,960
Makes you think about where education goes from here,

130
00:04:29,960 --> 00:04:31,720
though, doesn't it?

131
00:04:31,720 --> 00:04:33,000
Is this the future?

132
00:04:33,000 --> 00:04:36,760
Instead of calculators, it's AI tutors in our pockets.

133
00:04:36,760 --> 00:04:38,040
Or maybe it's both, right?

134
00:04:38,040 --> 00:04:38,600
Yeah.

135
00:04:38,600 --> 00:04:40,760
Like, we went through this with calculators, didn't we?

136
00:04:40,760 --> 00:04:41,960
Oh, yeah, back in my day.

137
00:04:41,960 --> 00:04:45,440
Some people saw them as cheating tools, others as, I don't know,

138
00:04:45,440 --> 00:04:46,480
unlocking potential.

139
00:04:46,480 --> 00:04:48,240
And now it's AI's turn.

140
00:04:48,240 --> 00:04:49,040
It seems like it.

141
00:04:49,040 --> 00:04:50,960
So do we ban it?

142
00:04:50,960 --> 00:04:51,460
Yeah.

143
00:04:51,460 --> 00:04:52,680
Or?

144
00:04:52,680 --> 00:04:53,760
Probably not the answer.

145
00:04:53,760 --> 00:04:55,600
I mean, you couldn't stop calculators from happening,

146
00:04:55,600 --> 00:04:56,120
could you?

147
00:04:56,120 --> 00:04:59,520
It's more about adapting, figuring out how to use it.

148
00:04:59,520 --> 00:05:00,680
Well, responsibly.

149
00:05:00,680 --> 00:05:02,160
Responsibly, yeah.

150
00:05:02,160 --> 00:05:03,560
That's the tricky part, isn't it?

151
00:05:03,560 --> 00:05:05,760
We have to start teaching kids about this stuff.

152
00:05:05,760 --> 00:05:06,840
AI literacy.

153
00:05:06,840 --> 00:05:08,840
So like, not just how to use it, but.

154
00:05:08,840 --> 00:05:10,680
The how and the why.

155
00:05:10,680 --> 00:05:14,320
How it works, what it can do, what it can't, the ethics of it all.

156
00:05:14,320 --> 00:05:17,280
Because whether we like it or not, it's not going away.

157
00:05:17,280 --> 00:05:21,320
Well, that's AI for you, unearthing ancient mysteries,

158
00:05:21,320 --> 00:05:24,120
maybe changing how we learn, even how we

159
00:05:24,120 --> 00:05:25,960
interact with the world around us.

160
00:05:25,960 --> 00:05:27,440
And it's just the beginning, really.

161
00:05:27,440 --> 00:05:28,160
It's a lot.

162
00:05:28,160 --> 00:05:28,820
Makes you think.

163
00:05:28,820 --> 00:05:29,640
It does.

164
00:05:29,640 --> 00:05:32,280
So as we wrap up here, what's the one thing

165
00:05:32,280 --> 00:05:34,200
you'd want people to take away from all this?

166
00:05:34,200 --> 00:05:37,360
That AI isn't just some far-off thing anymore.

167
00:05:37,360 --> 00:05:38,240
It's here.

168
00:05:38,240 --> 00:05:39,040
It's now.

169
00:05:39,040 --> 00:05:41,840
And it's only going to get, well, more of everything,

170
00:05:41,840 --> 00:05:45,400
more powerful, more present, more important to understand.

171
00:05:45,400 --> 00:05:47,440
More of everything, yeah.

172
00:05:47,440 --> 00:05:49,240
That's a good way to put it.

173
00:05:49,240 --> 00:05:51,920
Well, on that note, I think it's time for us to log off

174
00:05:51,920 --> 00:05:54,000
and let everyone ponder all this.

175
00:05:54,000 --> 00:05:54,560
Sounds good.

176
00:05:54,560 --> 00:05:56,360
Thanks for another great deep dive.

177
00:05:56,360 --> 00:05:59,120
And everyone listening, until next time, keep exploring.

178
00:05:59,120 --> 00:06:00,600
Keep those questions coming.

179
00:06:00,600 --> 00:06:03,840
And please keep sending in those AMU snippets.

180
00:06:03,840 --> 00:06:12,240
You never know what we might uncover next.

