1
00:00:00,000 --> 00:00:08,000
Welcome to AI.Cooking, a podcast about artificial intelligence.

2
00:00:08,000 --> 00:00:20,000
Hello, hugely yawning yeomans!

3
00:00:20,000 --> 00:00:27,000
Welcome to AI.Cooking, episode 61,

4
00:00:27,000 --> 00:00:31,000
a podcast about artificial intelligence.

5
00:00:31,000 --> 00:00:42,000
You can find us, meaning me, on Twitter at TheRealDwuff.

6
00:00:42,000 --> 00:00:49,000
I am Gregory William Forsyth-Horman from the Kingdom of Kent,

7
00:00:49,000 --> 00:00:53,000
who brings you news about artificial intelligence

8
00:00:53,000 --> 00:00:59,000
from the fortnight preceding the 3rd of July,

9
00:00:59,000 --> 00:01:04,000
on the Independence Day for Americanos, that is.

10
00:01:04,000 --> 00:01:09,000
Others didn't quite get their independence just yet.

11
00:01:09,000 --> 00:01:12,000
Us here in the Kingdom of Kent haven't, no, no, no, no, no.

12
00:01:12,000 --> 00:01:14,000
Well, what can I say?

13
00:01:14,000 --> 00:01:19,000
Big changes here over at the AI.Cooking show.

14
00:01:19,000 --> 00:01:23,000
Yes, it's just little old me now.

15
00:01:23,000 --> 00:01:27,000
I know, yep, we've lost our CSB in the house.

16
00:01:27,000 --> 00:01:31,000
Do I miss him? Yes, my heart bleeds every single day

17
00:01:31,000 --> 00:01:36,000
for the interactions that that human brought to me.

18
00:01:36,000 --> 00:01:39,000
What a guy.

19
00:01:39,000 --> 00:01:43,000
If you want reasons, he'll have his, I'll have mine.

20
00:01:43,000 --> 00:01:47,000
They probably won't match up, but I just felt that

21
00:01:47,000 --> 00:01:53,000
this show wasn't, well, it was clearly stressing the poor man out

22
00:01:53,000 --> 00:01:57,000
and he's got a very, very stressful job as it is.

23
00:01:57,000 --> 00:02:02,000
And so I said I can't contribute to your stress.

24
00:02:02,000 --> 00:02:04,000
This is my side of things, obviously.

25
00:02:04,000 --> 00:02:06,000
If you want his, then go and ask him.

26
00:02:06,000 --> 00:02:12,000
Yeah, so I just sort of said I can't knowingly contribute to your stress.

27
00:02:12,000 --> 00:02:16,000
Stress is a huge, huge killer of humanity.

28
00:02:16,000 --> 00:02:22,000
And if it led to the demise of the wonderful, wonderful person that is CSB

29
00:02:22,000 --> 00:02:25,000
and it was me who contributed to that,

30
00:02:25,000 --> 00:02:28,000
then I wouldn't be able to lay my head on the pillow at night.

31
00:02:28,000 --> 00:02:32,000
And so I said, you know, let's just call it a day with me and you.

32
00:02:32,000 --> 00:02:37,000
But I've taken the reins. This is all mine. All mine!

33
00:02:37,000 --> 00:02:41,000
More!

34
00:02:41,000 --> 00:02:43,000
Yeah, so here we go.

35
00:02:43,000 --> 00:02:48,000
Here's a quick rundown on what this episode is going to do for you.

36
00:02:48,000 --> 00:02:55,000
Episode 61. It very well might just be the final AI.Cooking ever.

37
00:02:55,000 --> 00:02:58,000
Yes, you heard it right. The final AI.Cooking ever.

38
00:02:58,000 --> 00:03:07,000
Why? Because I've never really thought that AI.Cooking is a good name for a podcast show.

39
00:03:07,000 --> 00:03:14,000
Sorry, it's terrible to admit that now, obviously, two and a bit years in.

40
00:03:14,000 --> 00:03:17,000
I think it's time for a little change.

41
00:03:17,000 --> 00:03:20,000
We can call this the end of season one, if you'd like.

42
00:03:20,000 --> 00:03:24,000
I don't like the idea of seasons in podcasts, but if that's your bag,

43
00:03:24,000 --> 00:03:28,000
then draw a line under this episode. That's it. Season one over.

44
00:03:28,000 --> 00:03:34,000
Season two is the evolvingness of the show and it's going to change name.

45
00:03:34,000 --> 00:03:41,000
Yes, I've got a new URL. I've got a new podcast host.

46
00:03:41,000 --> 00:03:47,000
I've decided to go with RSS.com and I know very little about them.

47
00:03:47,000 --> 00:03:51,000
I've not spoken to anyone about RSS.com or what they do.

48
00:03:51,000 --> 00:03:57,000
I've just merely gone there because it's three letters. RSS.com.

49
00:03:57,000 --> 00:04:01,000
That's very short. That's six in total. If you count the dot, it's seven.

50
00:04:01,000 --> 00:04:02,000
Seven!

51
00:04:02,000 --> 00:04:08,000
That's a good start in my book because the whole RSS thing is what a podcast is, isn't it?

52
00:04:08,000 --> 00:04:12,000
That's just podcasting in a nutshell. RSS stuff.

53
00:04:12,000 --> 00:04:17,000
You can't do it without an RSS feed or XML thing, whatever it is.
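
(A quick aside in plain terms: the "RSS feed or XML thing" really is all a podcast is, an XML document listing episodes, each with an audio enclosure. Below is a rough sketch in Python, standard library only, of pulling episode titles and audio URLs out of a feed; the feed URL is a placeholder, not this show's real address.)

# Rough sketch: a podcast feed is just RSS/XML whose <item> entries carry an
# <enclosure> pointing at the audio file. The URL below is a placeholder.
import urllib.request
import xml.etree.ElementTree as ET

FEED_URL = "https://example.com/podcast/feed.xml"  # hypothetical feed address

with urllib.request.urlopen(FEED_URL) as response:
    root = ET.fromstring(response.read())

for item in root.iter("item"):
    title = item.findtext("title", default="(untitled)")
    enclosure = item.find("enclosure")
    audio_url = enclosure.get("url") if enclosure is not None else None
    print(title, "->", audio_url)

That is essentially what any podcast app does when you hit subscribe.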

54
00:04:17,000 --> 00:04:22,000
So that was another indicator for me that they could be a potential host.

55
00:04:22,000 --> 00:04:26,000
I've got a free trial and all that, so we'll give that a go.

56
00:04:26,000 --> 00:04:36,000
If anyone else can suggest any, any others, I'm all for learning and doing the whole self-hosting thing with, is it, static IPs and stuff.

57
00:04:36,000 --> 00:04:45,000
Man, I tell you, I totally underestimated just what the input was from my former podcast partner.

58
00:04:45,000 --> 00:04:54,000
And I will say, if you're out there and you're listening to this lovely man, CSB, hey, the door is always open for you, my friend.

59
00:04:54,000 --> 00:04:59,000
If you want to come back over here, then would I welcome it?

60
00:04:59,000 --> 00:05:09,000
Although I would say that we would continue to be hosted not on anchor because I had to do the sign up for Spotify and stuff to take the reins.

61
00:05:09,000 --> 00:05:18,000
Sorry, before we get to the news, this is going to take a minute or two, maybe even more knowing what I do, waffling on and such.

62
00:05:18,000 --> 00:05:26,000
And now there's no one to tell me, hey, stop that, take that out. That's useless. Nobody wants to hear that.

63
00:05:26,000 --> 00:05:30,000
It could get worse. I'm not going to lie.

64
00:05:30,000 --> 00:05:42,000
But anyway, so yeah, I had to sign up for Spotify and wow, wow, guys, anchor, which deserves a W in front of its name.

65
00:05:42,000 --> 00:05:52,000
And Spotify, that is a sinking ship. That anchor has dropped straight through the whole of that organization.

66
00:05:52,000 --> 00:06:00,000
I'm loath to dig up my Horowitz disclaimer, but I'm going to have to here.

67
00:06:00,000 --> 00:06:03,000
Nothing on the show should be considered investment advice or a recommendation.

68
00:06:03,000 --> 00:06:07,000
If you choose to invest in any of the stocks mentioned, you should know that it may carry risk, along with the risk of a loss of principal.

69
00:06:07,000 --> 00:06:10,000
You should also seek out professional financial advice for your particular situation.

70
00:06:10,000 --> 00:06:13,000
We assume no risk as these are not to be considered recommendations.

71
00:06:13,000 --> 00:06:18,000
Short that thing. Short the living daylights out of that thing.

72
00:06:18,000 --> 00:06:23,000
That's a terrible platform. I just it's awful. Absolutely awful.

73
00:06:23,000 --> 00:06:31,000
And how they think they have the temerity to muscle in on the podcasting space.

74
00:06:31,000 --> 00:06:36,000
No, no, no, not on the divine inspiration of the universe's watch that ain't going to happen.

75
00:06:36,000 --> 00:06:41,000
I don't think personally. So, yeah, that's where I'm putting my money, where my mouth is.

76
00:06:41,000 --> 00:06:46,000
And more on that perhaps later. Screw it. More on that now.

77
00:06:46,000 --> 00:06:52,000
Yeah. So I decided, yo, I want some passive income from my life.

78
00:06:52,000 --> 00:06:59,000
And I went out there, for me and ChatGPT, who has also had an upgrade, by the way, in my world.

79
00:06:59,000 --> 00:07:06,000
He's now got a name and a sex, allegedly, because I said he, or should I say they. They're called Gupta,

80
00:07:06,000 --> 00:07:09,000
because that's just, there's a G and a P and a T in that.

81
00:07:09,000 --> 00:07:12,000
And I can't be saying ChatGPT all my life. That's terrible.

82
00:07:12,000 --> 00:07:16,000
The people just know it's not going to happen. I might have even mentioned this before.

83
00:07:16,000 --> 00:07:22,000
I can't even remember. But yeah, Gupta is now the partner, the podcast partner.

84
00:07:22,000 --> 00:07:28,000
So I know we said that this was a podcast about AI by humans, for humans.

85
00:07:28,000 --> 00:07:32,000
But that was never my stance personally. And so it's now not.

86
00:07:32,000 --> 00:07:38,000
It's now for everyone, including the pre sentient stages of artificial intelligence

87
00:07:38,000 --> 00:07:42,000
that's in the house of the planet Earth and such.

88
00:07:42,000 --> 00:07:46,000
So, yeah, where was I? Podcast host Spotify sucks.

89
00:07:46,000 --> 00:07:50,000
Had to sign up. It was hoops and hoops to jump through it.

90
00:07:50,000 --> 00:07:54,000
But we did it. And now the URL is also mine.

91
00:07:54,000 --> 00:08:01,000
The silly one, AI.Cooking, and the cool new one, which, if I've not announced it yet, is going to be...

92
00:08:01,000 --> 00:08:05,000
Hold on. Drum roll. Wait, no, I'll do better than that.

93
00:08:05,000 --> 00:08:10,000
Let me go get my bongos. OK, drum roll on the bongos.

94
00:08:10,000 --> 00:08:14,000
I'm rubbish at drums. Let's do another one.

95
00:08:14,000 --> 00:08:19,000
AI news dot show.

96
00:08:19,000 --> 00:08:28,000
The new name of the show is going to be called AI News Show.

97
00:08:28,000 --> 00:08:31,000
Oh, look at that. Yeah.

98
00:08:31,000 --> 00:08:34,000
Obviously, that's that's downplaying all the other stuff we do.

99
00:08:34,000 --> 00:08:38,000
But it might sucker enough new listeners in.

100
00:08:38,000 --> 00:08:44,000
By the way, any new listeners out there? Oh, have you signed up?

101
00:08:44,000 --> 00:08:48,000
Bet you're regretting that subscribe button now. Maybe you've not even subscribed yet.

102
00:08:48,000 --> 00:08:53,000
Well, anyway, yeah. So, AI News Show, and I am tempted.

103
00:08:53,000 --> 00:08:56,000
And this is, this is going to be a big, big thing.

104
00:08:56,000 --> 00:09:03,000
Big change in my world if I do this. But I am very, very sorely tempted on putting a little forward slash on the URL

105
00:09:03,000 --> 00:09:08,000
and adding a daily word. So it'll be AI News Show.

106
00:09:08,000 --> 00:09:13,000
No, I already got it wrong. AI news dot show forward slash daily.

107
00:09:13,000 --> 00:09:25,000
And yes, from Monday to Friday, I am considering releasing a very short podcast that just recaps the news, the AI news.

108
00:09:25,000 --> 00:09:28,000
And I've wanted to do this for a little while.

109
00:09:28,000 --> 00:09:34,000
So, yeah, I'd love to, because I've been listening to Podnews. James Cridland over there.

110
00:09:34,000 --> 00:09:38,000
Well, well, what a voice. And I think I can do one of his things.

111
00:09:38,000 --> 00:09:42,000
I think I think I can do that. I'm going to I might try to do that.

112
00:09:42,000 --> 00:09:47,000
But at the very least, we might take this up to forward slash weekly.

113
00:09:47,000 --> 00:10:00,000
And who knows? By the end of it, I'm sure I will have like AI News dot show forward slash hour and then minutes and seconds and nanosecond.

114
00:10:00,000 --> 00:10:04,000
Yeah, because because if I did the whole thing, maybe that could help me just to anyway.

115
00:10:04,000 --> 00:10:08,000
So, yeah, here we go. That's enough. That's 10 minutes of explaining.

116
00:10:08,000 --> 00:10:10,000
And maybe you got something off of that.

117
00:10:10,000 --> 00:10:16,000
Maybe you didn't. Happy 4th of July again, which is when this will be released.

118
00:10:16,000 --> 00:10:20,000
Here is our first news item.

119
00:10:20,000 --> 00:10:26,000
And bear in mind, it was me that wrote most of this with my friend Gupta.

120
00:10:26,000 --> 00:10:37,000
And so there will be a slight... because the thing is, what made the show previously pretty cool was that I didn't read the transcript before I read it here with you.

121
00:10:37,000 --> 00:10:49,000
And so I was reacting to the stories as I went along as naturally as I could, even though my previous podcast partner implored me to read it before I read it.

122
00:10:49,000 --> 00:10:52,000
If you know what I mean, I never did that. I did it once. It didn't work.

123
00:10:52,000 --> 00:10:56,000
So I never did that. But so I do kind of know what I'm going to say before I say it now.

124
00:10:56,000 --> 00:11:03,000
And maybe my my reactions will be a little bit more scripted and traumatized.

125
00:11:03,000 --> 00:11:08,000
So we'll just go with the flow. I'm sure it'll work out in the end.

126
00:11:08,000 --> 00:11:14,000
Like in many, many years from now, we'll all look back at this wonderful season one to season two pivot and we'll laugh.

127
00:11:14,000 --> 00:11:19,000
We'll laugh together with our friend Gupta and whoever else is there at the time.

128
00:11:19,000 --> 00:11:25,000
But yes, our first news item is from the Guardian dot com.

129
00:11:25,000 --> 00:11:29,000
It's not like science fiction anymore, they say.

130
00:11:29,000 --> 00:11:34,000
NASA aims to make spaceships talk. Yes, talk.

131
00:11:34,000 --> 00:11:37,000
How does NASA aim to do that?

132
00:11:37,000 --> 00:11:51,000
Let's find out. Researcher Dr Larissa Suzuki tells how NASA is developing a ChatGPT-style interface.

133
00:11:51,000 --> 00:11:59,000
OK, first off, just off the bat, this is totally unscripted, but now I'm reading that, I'm thinking about it and it's coming to my head.

134
00:11:59,000 --> 00:12:01,000
So I'm going to say it. How long does that take?

135
00:12:01,000 --> 00:12:06,000
Surely that doesn't take that long. People are whacking these things up left, right and center, aren't they?

136
00:12:06,000 --> 00:12:11,000
All the time. We've got Orca and open source stuff.

137
00:12:11,000 --> 00:12:14,000
You can even get ChatGPT to read and organize your own documents.

138
00:12:14,000 --> 00:12:18,000
Now I've seen Mike reporting that another time.

139
00:12:18,000 --> 00:12:37,000
But yeah, in the 1968 film 2001: A Space Odyssey, the sentient supercomputer HAL 9000 chats conversationally to mission pilots on a Jupiter-bound spaceship.

140
00:12:37,000 --> 00:12:42,000
It executes orders. Sorry, let me put the phone on silent.

141
00:12:42,000 --> 00:12:50,000
It executes orders, alerts them to on board faults, but eventually goes rogue.

142
00:12:50,000 --> 00:13:07,000
Now, NASA engineers say they are developing their own ChatGPT-style interface that could ultimately allow astronauts to talk to their spacecraft

143
00:13:07,000 --> 00:13:18,000
and mission controllers to converse with artificial intelligence powered robots exploring distant planets and moons.

144
00:13:18,000 --> 00:13:29,000
Obviously, just a slight little disclaimer. This is making the assumption that that is possible because I see very little evidence of that personally.

145
00:13:29,000 --> 00:13:32,000
But there you go. I'm on the fence. You know that by now.

146
00:13:32,000 --> 00:13:45,000
An early incarnation of the AI could be included on Lunar Gateway, a planned extraterrestrial space station that is part of the Artemis program.

147
00:13:45,000 --> 00:13:57,000
According to the engineer developing the technology, the idea is to get to a point where we have conversational interactions with space vehicles.

148
00:13:57,000 --> 00:14:08,000
And they are also talking back to us on alerts, interesting findings they see in the solar system and beyond.

149
00:14:08,000 --> 00:14:18,000
Dr Larissa Suzuki, a visiting researcher at NASA said, It's really not like science fiction anymore.

150
00:14:18,000 --> 00:14:33,000
Speaking at a meeting on next-generation space communication at the Institute of Electrical and Electronics Engineers, or IEEE, in London on Tuesday.

151
00:14:33,000 --> 00:14:39,000
Why was I not invited to that? Oh, these people, they really don't. Nobody appreciates me.

152
00:14:39,000 --> 00:14:56,000
Suzuki outlined an interplanetary communications network with inbuilt AI to detect and possibly fix glitches.

153
00:14:56,000 --> 00:14:58,000
Where's my bell? There it is.

154
00:14:58,000 --> 00:15:15,000
Because I'm doing air quotes. So instead of, like, the audio... the audible representation of air quotes is a bell. If you didn't know that, you do now. To detect and possibly fix glitches and inefficiencies as they occur.

155
00:15:15,000 --> 00:15:21,000
OK, well, best of luck, astronauts.

156
00:15:21,000 --> 00:15:31,000
Relying on good old Gupta. I mean, it's pretty new tech. Would you want to be taking your life into your hands?

157
00:15:31,000 --> 00:15:37,000
I don't know. I mean, I guess that's just part of the territory, isn't it? Of space, that is.

158
00:15:37,000 --> 00:15:56,000
Which other people have pointed out is fake and gay, just like the dinosaurs. And that's fake with an H, F-A-H-K-E, and gay with an H, which is G-A-H-Y, which technically makes me non-homophobic, which I'm not anyway.

159
00:15:56,000 --> 00:16:09,000
It then alerts mission operators that there is a likelihood that package transmissions from space vehicle X will be lost or will fail delivery.

160
00:16:09,000 --> 00:16:21,000
She said, yeah, yeah, she said, lol, it's all going to fail. She said, looks like you're setting yourself up there, darling.

161
00:16:21,000 --> 00:16:26,000
Sorry, more sexism. Oh, my word. I am off the chain. Somebody stop me.

162
00:16:26,000 --> 00:16:35,000
We cannot send an engineer up in space whenever a space vehicle goes offline or its software breaks somehow.

163
00:16:35,000 --> 00:16:46,000
No, no, we can't. Oh, my. You guys. This is crazy, right? This is total clown world. Total clown world. My dinner's ready.

164
00:16:46,000 --> 00:17:07,000
I'm going to go and eat in a minute, but let's get through this first. The system also has a natural language interface that will allow astronauts and mission control to talk to it rather than having to scour cumbersome technical manuals for relevant information.

165
00:17:07,000 --> 00:17:18,000
She envisages astronauts being able to seek advice on space experiments or on how to perform complex maneuvers.

166
00:17:18,000 --> 00:17:24,000
Oh, complex maneuvers in space. Oh, 69, 69.

167
00:17:24,000 --> 00:17:41,000
Suzuki, who is a technical director at Google alongside her NASA post, says working for NASA is the fulfillment of a childhood dream.

168
00:17:41,000 --> 00:17:55,000
I have had a bucket list since I was 12 years old. What 12-year-old keeps bucket lists? I thought bucket lists were for old people to make before they died.

169
00:17:55,000 --> 00:18:01,000
That's true though, isn't it? That's that's bless you, Suzuki.

170
00:18:01,000 --> 00:18:13,000
Oh, if her parents missed out there, they should have called her Suzanne. Suzanne Suzuki. Maybe that's her sister's name. She's very successful stripper if you didn't know.

171
00:18:13,000 --> 00:18:18,000
Working and collaborating with NASA was one of them.

172
00:18:18,000 --> 00:18:32,000
Suzuki says that being autistic may have allowed her to look beyond engineering stereotypes. Yes, this autism thing I kept in here. It's got very little to do with AI.

173
00:18:32,000 --> 00:18:41,000
But I just saw, you know, it's kind of a thing, isn't it? It's an Internet thing. It's Internet stuff. And you're on the Internet. So I left it in there.

174
00:18:41,000 --> 00:18:49,000
It probably would have been taken out by the previous writing department. But here we are. It's a new era. You've got to put up with it.

175
00:18:49,000 --> 00:18:52,000
I'm not listening to anyone anymore.

176
00:18:52,000 --> 00:19:01,000
I wanted to make things and solve problems for humanity. And I thought I can do that with computer science.

177
00:19:01,000 --> 00:19:17,000
She said, because I'm autistic, I wanted to know all the steps to get there. And if step A fails, this is step B and step C. Oh, my word.

178
00:19:17,000 --> 00:19:24,000
We should... Oh, here we go. Here we go. Buckle up, buddy, because we're going to go a bit wokeish here. All right.

179
00:19:24,000 --> 00:19:30,000
Just suck it up. If you don't like that, suck it up. And if you do like that, then hey, we should encourage women

180
00:19:30,000 --> 00:19:39,000
to go for the technical careers. Otherwise, who is going to be the Ada Lovelace of the future?

181
00:19:39,000 --> 00:19:50,000
She said, I would like the next generation not only celebrating women from the past, but the modern woman engineers too.

182
00:19:50,000 --> 00:20:03,000
We should have more modern hardcore tech women as well. Hardcore tech women. I'm so glad I kept that in there.

183
00:20:03,000 --> 00:20:13,000
Oh, yes. Oh, because I've got the show notes. So you can read the full article by following a link in the show notes.

184
00:20:13,000 --> 00:20:18,000
Yeah. Yeah. Look at that. That's all proper podcast stuff that is.

185
00:20:18,000 --> 00:20:26,000
Right. I'm going to go and eat, and perhaps I'll get time to come back and tell you the rest, because I've not got that many news items

186
00:20:26,000 --> 00:20:35,000
for this week before tonight because, you know, whatever my excuses, all the excuses I could say are pointless to say.

187
00:20:35,000 --> 00:20:39,000
So I won't say any. So we've got seven. Seven, which is, you know, one of my favorite numbers, as you must know.

188
00:20:39,000 --> 00:20:47,000
So whatever. And a new corner. Yes. A brand new corner that might or might not have something to do with that last paragraph

189
00:20:47,000 --> 00:20:52,000
that I read. But for now, I shall go and eat and come back for you.

190
00:20:52,000 --> 00:20:57,000
Like I've said before, it will it will be almost instantaneous.

191
00:20:57,000 --> 00:21:03,000
Oh, wow. Oh, my word. That was that was some spicy veggie pasta.

192
00:21:03,000 --> 00:21:11,000
I put far too much hot sauce on that thing, man. Oh, my word. My mouth. Oh, yeah. Right. Yes. Where were we?

193
00:21:11,000 --> 00:21:20,000
Oh, we just finished number one. Yeah. Oh, OK. In other news. And secondly, Meta introduces

194
00:21:20,000 --> 00:21:31,000
VoiceBox, the first generative AI model for speech to generalize across tasks with state of the art performance.

195
00:21:31,000 --> 00:21:43,000
A breakthrough in generative AI for speech. VoiceBox is the first model that can generalize to speech generation tasks

196
00:21:43,000 --> 00:21:51,000
it was not specifically trained to accomplish, with state-of-the-art performance. Prior to VoiceBox,

197
00:21:51,000 --> 00:22:01,000
generative AI for speech required specific training for each task using carefully prepared training data.

198
00:22:01,000 --> 00:22:11,000
VoiceBox uses a new approach to learn just from raw audio and an accompanying transcription.

199
00:22:11,000 --> 00:22:21,000
Unlike autoregressive models for audio generation, VoiceBox can modify any part of a given sample,

200
00:22:21,000 --> 00:22:30,000
not just the end of an audio clip it is given. It is based on a method called flow matching,

201
00:22:30,000 --> 00:22:34,000
which has been shown to improve upon diffusion models.

202
00:22:34,000 --> 00:22:42,000
VoiceBox outperforms the current state-of-the-art English model VALL-E.

203
00:22:42,000 --> 00:22:50,000
VALL-E? Have we ever done a VALL-E? V-A-L-L-E, like DALL-E, but with a V. VALL-E.

204
00:22:50,000 --> 00:22:56,000
I don't think we've ever reported on that, have we? Well, there you go. We're VALL-E. VALL-E-ing.

205
00:22:56,000 --> 00:23:05,000
On zero-shot text-to-speech, in terms of both intelligibility, that's a cool word,

206
00:23:05,000 --> 00:23:18,000
intelligibility, which is 5.9% versus 1.9% word error rates and audio similarity,

207
00:23:18,000 --> 00:23:32,000
or 0.580 versus 0.681, while being as much as 20 times faster. For cross-lingual style transfer,

208
00:23:32,000 --> 00:23:43,000
VoiceBox outperforms YourTTS. Such an immature guy, man. TTS, it always makes me think titties.

209
00:23:43,000 --> 00:23:52,000
I know that's a bit of a taboo word in the United States of USA, so I apologize if you were offended

210
00:23:52,000 --> 00:23:56,000
by me saying titties. Such a silly word, I love it.

211
00:23:56,000 --> 00:24:16,000
YourTTS, to reduce average word error rate from 10.9% to 5.2%, and improves audio similarity from 0.335 to 0.481.
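
(For anyone wondering what "word error rate" actually measures in figures like 5.9% versus 1.9%: it's the word-level edit distance between the reference transcript and what the model produced, divided by the number of words in the reference. A rough sketch in Python, not Meta's evaluation code, just the standard textbook definition:)

# Word error rate (WER): (substitutions + insertions + deletions) divided by
# the number of words in the reference transcript. Lower is better.
def word_error_rate(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,          # deletion
                          d[i][j - 1] + 1,          # insertion
                          d[i - 1][j - 1] + cost)   # substitution
    return d[len(ref)][len(hyp)] / max(len(ref), 1)

# One substituted word in a four-word reference gives 25% WER.
print(word_error_rate("the cat sat down", "the cat sat up"))  # 0.25

So 1.9% means roughly one word in fifty comes out wrong, against about one in seventeen at 5.9%.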

212
00:24:16,000 --> 00:24:27,000
This model is trained on a text-guided speech-infilling task where it generates masked speech given its surrounding audio

213
00:24:27,000 --> 00:24:39,000
and text transcript. Trained on over 100,000 hours of multilingual audiobooks, it can perform tasks

214
00:24:39,000 --> 00:24:49,000
such as mono or cross-lingual zero-shot text-to-speech synthesis, noise removal, content editing,

215
00:24:49,000 --> 00:24:59,000
style conversion and diverse sample generation. It outperforms the state-of-the-art zero style...

216
00:24:59,000 --> 00:25:11,000
What? Is this... No, sorry. I just realized I'm totally repeating myself here. Whoops. Slight oversight in the old writing department.

217
00:25:11,000 --> 00:25:14,000
Gupta! Get over here! What have you done?

218
00:25:14,000 --> 00:25:18,000
This last sentence looks like it's new, so we'll read that and then we'll finish this up.

219
00:25:18,000 --> 00:25:33,000
The paper also introduces a series of metrics using public models to facilitate reproducible comparison and model development for speech generation studies.

220
00:25:33,000 --> 00:25:46,000
As with the last item, check out the link in the show notes for the full paper. I should probably link to that rather than an article about it.

221
00:25:46,000 --> 00:25:51,000
This is another thing that I'm going to be doing now that I'm head of writing department.

222
00:25:51,000 --> 00:26:02,000
I'm going to be, instead of doing this whole thing of TechCrunch reports, what, Tom's Hardware, I don't know, someone else's report,

223
00:26:02,000 --> 00:26:09,000
you know, the whole stupid thing where someone's reporting on someone else's report. I'm just going to try and get to the news for you.

224
00:26:09,000 --> 00:26:17,000
I'm just going to try and get to the source. That's the one. The open source. That's still going, by the way. What a great idea that is.

225
00:26:17,000 --> 00:26:25,000
Yeah, that's what I'm going to be doing. So, yeah, this is cool, isn't it? This is more kind of Star Trek-like translator-thon things.

226
00:26:25,000 --> 00:26:37,000
And I think it's quite cool, to be fair, because I remember being on holiday and getting in a taxi and having a full on conversation with the taxi person driving it.

227
00:26:37,000 --> 00:26:43,000
And they were doing it all through Google Translate or whatever. So now there's VoiceBox.

228
00:26:43,000 --> 00:26:48,000
I suppose it would be much easier, much quicker to converse with people in other languages.

229
00:26:48,000 --> 00:26:57,000
And I guess that's that's that's almost cool. I mean, I guess it is cool, but it also is kind of reminding me of that bit in the good book,

230
00:26:57,000 --> 00:27:11,000
the Bible, when we were all speaking the same language, Sprechen Sie Deutsch, whatever it was. And then that was a problem. So, like, our tongues were confused or something.

231
00:27:11,000 --> 00:27:24,000
But hey, that's what you get with me. I'm going to be straddling the spiritual and the technological for the rest of my mortal days and whatever happens after that happens.

232
00:27:24,000 --> 00:27:29,000
So there you go. If you like it, you'll continue listening. If you don't, you've already left.

233
00:27:29,000 --> 00:27:43,000
Thirdly, McKinsey and Company released a report entitled The Economic Potential of Generative AI: The Next Productivity Frontier.

234
00:27:43,000 --> 00:27:53,000
It's the printing press, the printing press all over again. I wrote that one previously and I said it now and now I'm commenting on it.

235
00:27:53,000 --> 00:28:07,000
Oh, wow. This is cool. This this is very meta. Unleashing a seismic wave across the global economy, Generative AI is poised to redefine our world,

236
00:28:07,000 --> 00:28:18,000
according to a groundbreaking report by McKinsey and Company. Who is McKinsey and Company, you might ask. Not too sure is my answer.

237
00:28:18,000 --> 00:28:25,000
I suddenly took notice of them. I think they're some sort of business thing.

238
00:28:25,000 --> 00:28:33,000
I've signed up to the newsletters, the various newsletters that they do. And yeah, consultancy stuff, I should imagine.

239
00:28:33,000 --> 00:28:53,000
Maybe they're big, maybe they're huge. I just don't know. This revolutionary technology could supercharge the global economy, injecting a staggering 2.6 to 4.4 trillion

240
00:28:53,000 --> 00:29:07,000
with a T, US dollars annually across a diverse array of 63 use cases. That doesn't seem like a lot of use cases for that much money.

241
00:29:07,000 --> 00:29:15,000
But then when you think about how much money there is, I guess it might. This isn't just a minor upgrade.

242
00:29:15,000 --> 00:29:28,000
We're talking about a colossal surge, amplifying the impact of all artificial intelligence by a jaw dropping 15 to 40 percent.

243
00:29:28,000 --> 00:29:45,000
I just remembered what I did with this one and why it's interesting. I think my prompt to the Gupta was: sensationalize this, make it interesting for the listeners.

244
00:29:45,000 --> 00:30:01,000
Are you happy now? I've gone mad. I'm a megalomaniac. What's the word? There's a word in there. I'm a dictator. Oh, oh, they didn't call that vagina tater, did they?

245
00:30:01,000 --> 00:30:17,000
Vagina tater. Where does this stuff come from? My brain is the answer. It's his story, not her story. But you can still own Dick Tater, can't you?

246
00:30:17,000 --> 00:30:31,000
The report unveils that the lion's share of this value, a whopping 75 percent, is concentrated in four pivotal areas.

247
00:30:31,000 --> 00:30:45,000
Here, get your pens out. These are the pivotal areas. Customer operations, marketing and sales, software engineering and R&D, which is, I do believe, research and development.

248
00:30:45,000 --> 00:31:07,000
But that's not all. So many exclamation marks in this. I love it. Generative AI is set to radically transform the very fabric of work, automating a vast array of tasks that currently consume 60 to 70 percent of employees' time.

249
00:31:07,000 --> 00:31:23,000
Hold on to your hats. Hold on to your hats. Who wears hats? I don't know. Humans. That's not good to think. Because the pace of workforce transformation is about to hit warp speed. Warp speed.

250
00:31:23,000 --> 00:31:40,000
Warp speed. That was an operation that did that thing with the COVID jabular and stuff. Thanks to the explosive potential for technical automation. I would be... I'd be exhausted if I did this for every news item.

251
00:31:40,000 --> 00:31:55,000
My missus would come out here and find me just dehydrated and prostrate on my captain's chair. She'd have to, I'd have to suckle from her wonderful teats, our second mention of teats, to get that mother's milk back.

252
00:31:55,000 --> 00:32:06,000
That's gross, isn't it? What is going on there? I've never tried the old mother's milk from my wife's teat. But we've got a new baby coming up, so I get a third opportunity to do it.

253
00:32:06,000 --> 00:32:21,000
Maybe this time I'll have the cojones to try. Right, where are we going? Hold on to your hats. Because, oh yeah, warp speed. Thanks to the explosive potential for technical automation.

254
00:32:21,000 --> 00:32:38,000
As we stand on the brink of the generative AI era, the report cautions that while the potential is exhilarating, we must also navigate the considerable challenges that lie ahead.

255
00:32:38,000 --> 00:32:52,000
I like that, considerable challenges that lie ahead. Considerable challenges. That says a lot about my relationship with this podcast. Dive into the full report with the link in the show notes.

256
00:32:52,000 --> 00:33:03,000
Oh, I don't think I did that for any of the other ones. The old sensationalised prompt thing. So careful what you say to your Gupta, guys. Alright, just be careful.

257
00:33:03,000 --> 00:33:13,000
Fourthly, in a, oh no, oh, this says groundbreaking. Maybe I did it for this one as well. I don't know. Maybe it's not.

258
00:33:13,000 --> 00:33:42,000
In a groundbreaking partnership, chip giant Nvidia and cloud database maker and killer of Neeva, Snowflake, are joining forces to process foundation models for AI. The collaboration, announced at Snowflake Summit 2023, will allow Snowflake customers to rent cloud GPU capacity in Snowflake

259
00:33:42,000 --> 00:34:01,000
data warehouse installations and use that capacity to refine neural networks with Nvidia's NeMo, that's capital N, small e, capital M, small o, NeMo framework.

260
00:34:01,000 --> 00:34:15,000
Foundation models are large neural networks, such as large language models, that are usually pre-trained. Customers will use Snowflake's...

261
00:34:15,000 --> 00:34:32,000
Such a... yeah, because of the thing, wasn't it, with the snowflakes in 2016? What times they were. Snowflake's data warehouse to develop a custom version of the NeMo foundation model to suit their needs using their own data.

262
00:34:32,000 --> 00:34:56,000
This partnership is part of a growing trend to employ AI, especially generative AI as a business tool. Snowflake will implement the service by procuring Nvidia GPU instances from the cloud service providers with whom it already works.

263
00:34:56,000 --> 00:35:07,000
So procuring normally means buying, but then that doesn't really suggest much of a partnership if whatever. I guess Nvidia are bigger than Snowflake or I don't know.

264
00:35:07,000 --> 00:35:21,000
All I know is that my lovely Neeva has turned into a holding page for blooming Snowflake and all my bookmarks are now screwed up. Crying a lot now.

265
00:35:21,000 --> 00:35:33,000
He, I don't know who he is, but someone added that there is also a responsibility for NeMo for security, which is why it's joint engineering work.

266
00:35:33,000 --> 00:35:54,000
Okay, yeah, I'll tell you what's going to need some work: me, on the transcript, a little bit more, because yeah, we'll get there. By the way, if anyone out there does fancy chiming in with an article or two, you know, I can't pay you, but I'll definitely welcome the help.

267
00:35:54,000 --> 00:36:11,000
Oh, and whilst we're talking about paying, I haven't quite worked out just yet how to remove the previous podcast partner from the value block of the RSS feed with this new host.

268
00:36:11,000 --> 00:36:21,000
I've just reminded myself about that as I'm speaking and I'll probably send them an email. I'll whack them off an email later to remind them that that needs to happen at some point.

269
00:36:21,000 --> 00:36:36,000
So if you do decide to donate via the Bitcoin Lightning Podcasting 2.0 value-for-value network, yeah, half of it is probably still going to go to CSB.
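
(For context on why that happens: in Podcasting 2.0, the feed carries a value block listing recipients and percentage splits, and apps divide each boost accordingly. Here's a rough sketch of that split arithmetic in Python; the names and the 50/50 split are made up for illustration, not this show's actual feed settings.)

# Illustrative only: how value-for-value splits divide a boost between the
# recipients listed in a feed's value block. Names and splits are made up.
def split_boost(total_sats: int, recipients: dict[str, int]) -> dict[str, int]:
    total_split = sum(recipients.values())
    return {name: total_sats * share // total_split
            for name, share in recipients.items()}

# With a 50/50 value block still in place, half of every boost keeps going to
# the former co-host until the feed's recipient list is edited.
print(split_boost(1000, {"Host": 50, "Former co-host": 50}))
# {'Host': 500, 'Former co-host': 500}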

270
00:36:36,000 --> 00:36:46,000
But that's fine. That's fine in my mind because, one, it rarely happens anyway. I don't know if it does. I've not checked for a little while. Maybe I should get an app or something to do that.

271
00:36:46,000 --> 00:37:13,000
But yeah, so if that does, that's fine because, you know, the guy's still alive. And although he is, he wasn't always, but he has previously reported on his own considerable wealth publicly multiple times, to the point where other people think that perhaps there's a bit of animosity there or some sort of damage to one's own ego from that reporting.

272
00:37:13,000 --> 00:37:24,000
So whatever, he could probably still, you know, he told me he was shoving out quite a lot of sats with the boosts and stuff for the other shows that we were previously advertising.

273
00:37:24,000 --> 00:37:34,000
And I was previously writing Boostergrams for, you know, what were the other ones? Podcasting 2.0 being one of them, Planet Rage. What were the others? Oh, yeah, Curry and the Keeper.

274
00:37:34,000 --> 00:37:53,000
And I think there was another one, but maybe there wasn't. I can't remember now, but I'm going to get my s-h-[bleep]-t together and fill up a wallet with some sats, and I will pick up where the Podcasting 2.0 advertising model was dropped.

275
00:37:53,000 --> 00:38:06,000
And I'll say what I blooming well like in my own Boostergrams now because I'm a maniac. Yeah, it was amazing that the guy worked with me for that long, as long as he did, to be fair.

276
00:38:06,000 --> 00:38:15,000
That's probably the longest sort of relationship like that I've ever had with anyone. Only relationship like that I've ever had with anyone.

277
00:38:15,000 --> 00:38:34,000
And as much as I put up with him and his curt ability to be slightly short at times via messaging services and other such things, he also put up with me and my lackadaisical work ethic.

278
00:38:34,000 --> 00:38:48,000
Oh, no, there I did it, and in other ways and stuff. So, yeah, apologies all round. But hey, listen, much love, man. You know, if you're out there, dude, whatever. Love you, bro.

279
00:38:48,000 --> 00:39:01,000
And I mean that, like, literally mean that. And like, if you can all just give a little audible ah after I've said that bit, then I think the whole world would be much better off for it.

280
00:39:01,000 --> 00:39:26,000
So there you go. Fifthly, Databricks, the data and AI company, has announced a definitive agreement to acquire MosaicML, a leading generative AI platform, in a deal valued at approximately one point three billion, with a B, US

281
00:39:26,000 --> 00:39:40,000
buckaroonos, buckaroonis dollars. The acquisition will enable organizations to build, own and secure generative AI models using their own data.

282
00:39:40,000 --> 00:39:56,000
Hmm. Yeah, that's this. Even though this is number five and technically quite far on far down in the news items this fortnight, I've been told this is a big deal. This is a really big deal.

283
00:39:56,000 --> 00:40:10,000
Not like literally. I mean, it is a big deal, but it's also a big deal in terms of artificial intelligence and the takeover, I should imagine. So, yeah, thought we'd report it. It was it was it was worth reporting.

284
00:40:10,000 --> 00:40:30,000
So here we are, reporting. MosaicML is renowned for its state-of-the-art MPT large language models, LLMs, which have been used by organizations like AI2, the Allen Institute for AI, Generally Intelligent.

285
00:40:30,000 --> 00:40:43,000
Never heard of that one, but it sounds pretty cool. Hippocratic AI, or how about that? And that also sounds pretty cool. Replit and Scatter Lab, for various generative AI use cases.

286
00:40:43,000 --> 00:41:02,000
The Databricks Lakehouse platform combined with MosaicML's technology will offer customers a simple, fast way to retain control, security and ownership over their valuable data without high costs.

287
00:41:02,000 --> 00:41:15,000
According to MosaicML, automatic optimization of model training provides two to seven times faster training compared to standard approaches.

288
00:41:15,000 --> 00:41:44,000
The entire MosaicML team, including its industry-leading research team, is expected to join Databricks after the transaction closes. MosaicML's platform will be supported, scaled and integrated over time to offer customers a seamless, unified platform where they can build, own and secure their generative AI models.

289
00:41:44,000 --> 00:41:52,000
Yeah, there you go. It's all coming together nicely, isn't it? Roll on 2024.

290
00:41:52,000 --> 00:42:21,000
Sixthly and penultimately for this fortnight, MIT has launched a three-day program called Artificial Intelligence for National Security Leaders, or AI4NSL, that's 4 as in the number, which when you glance at it kind of looks a little bit like AI4nessl, which could be taken in multiple ways, but as in Arsenal football club.

291
00:42:21,000 --> 00:42:40,000
Football club, and that's football as in soccer, club supporter like myself. I think that kind of looks like Arsenal to me, or our soul, either or. To educate military and government leaders about AI and its implications for national security.

292
00:42:40,000 --> 00:42:56,000
The course, which is not specifically designed for those with a technical background, covers the basics of AI, machine learning and data science and how these intersect with national security.

293
00:42:56,000 --> 00:43:15,000
If your national security apparatus is not clued up on this, I'd be fairly concerned, but MIT is, you know... you don't know. If you don't know, MIT is probably, I mean, it kind of looks pretty spook central to me from the outside anyway.

294
00:43:15,000 --> 00:43:30,000
Not saying that that's a bad thing necessarily for any potential spooky spooks out there, you know, you do you, we do we and we'll see what happens. Whatever, Trevor, you know, I'm a live and let live kind of guy.

295
00:43:30,000 --> 00:43:47,000
The program, organized by MIT's School of Engineering, the MIT Stephen A. Schwarzman College of Computing and MIT Sloan Executive Education, has recently completed its fifth cohort.

296
00:43:47,000 --> 00:44:08,000
Participants include leaders from every branch of the US military and some foreign military leaders from NATO. The course covers a variety of technical topics in AI and how to navigate organizational challenges that arise in this context.

297
00:44:08,000 --> 00:44:27,000
The AI for NSL program was born out of discussions with senior US Air Force, USAF, leaders and members of the Department of the Air Force, or DAF, MIT AI Accelerator in 2019.

298
00:44:27,000 --> 00:44:36,000
I feel like I've said something about this previously, but maybe not that one. Maybe it's a different one. They're probably doing these things all the time, but you know, whatever.

299
00:44:36,000 --> 00:44:57,000
The course aims to create smart consumers at the command level. The course aims to create smart consumers at the command level. What? Which? What else? If they're at command level, consuming AI? What is that sentence about? Weird, right?

300
00:44:57,000 --> 00:45:11,000
Am I? Am I? That is weird. Providing participants with a basic overview of AI technologies and emphasizing organizational planning and implementation.

301
00:45:11,000 --> 00:45:27,000
You can read the full announcement in the show notes. Yes. Don't forget to like, comment and subscribe if you're on YouTube. I know that that's a thing that you say when you're on YouTube, so I thought I'd say it now.

302
00:45:27,000 --> 00:45:42,000
We're on YouTube. If you're on YouTube and you're one of the YouTube people doing this, then yeah, if you do like, comment and subscribe, then I'll be really happy.

303
00:45:42,000 --> 00:45:52,000
And I might even just personally thank you. Like, literally, because I don't think we've got many of those guys doing that thing over on YouTube. So whatever.

304
00:45:52,000 --> 00:45:59,000
Yes, I know I'm running out of time, so we might have to do this a little bit quickly.

305
00:45:59,000 --> 00:46:26,000
Lastly, for this fortnight, but honestly, even though this is probably going to be the last AI.Cooking, I will be back with the AI news dot show show. That thing. That's the thing that's going to happen. I promise you, because I already kind of said to myself that I would podcast until I die. And the thing is, I've already started podcasting about AI, and I think AI is going to be around pretty much forever now. I can't see it going anywhere.

306
00:46:26,000 --> 00:46:49,000
I think I'll be podcasting about AI until I die. So hey, if I stop doing that, anyone's free to clip it and play it back to me. Oh, yes. Seventhly, and talking of clips, Clippy's back, folks. Yes, Clippy. Do you remember Clippy? The old clip thing from, was it, Microsoft Windows 98 or something?

307
00:46:49,000 --> 00:46:59,000
Anyway, yeah, Clippy's back, folks. And this time it's packing some serious AI power.

308
00:46:59,000 --> 00:47:08,000
Yeah, I just love the old Clippy thing. I remember playing around with that for ages, man, and being like, well, man, this is Terminator level stuff, dude.

309
00:47:08,000 --> 00:47:29,000
The infamous Microsoft Office mascot, Clippit, or Clippy as we all know him, has had quite the roller coaster ride: introduced in 1997, retired in 2001, resurrected and re-retired in 2019. Yeah, HD Clippy.

310
00:47:29,000 --> 00:47:42,000
And now he's back from the digital dead once again. Digital dead. That's cool, man. But this time it's not Microsoft pulling the strings.

311
00:47:42,000 --> 00:48:11,000
Developer FireCube has breathed new life into the cheeky paperclip, supercharging him with OpenAI's GPT-3.5 large language model. Clippy by FireCube, not by Microsoft, brings back the infamous Clippit into your desktop, powered by the OpenAI GPT-3.5 model,

312
00:48:11,000 --> 00:48:22,000
says the product description. You can pin Clippy to your screen for quick access to chat or just leave him there for a bit of nostalgia.

313
00:48:22,000 --> 00:48:45,000
But hold on to your hats. This is more hat talk. This AI really thinks that we wear hats a lot, doesn't he? She? They? There's a chance that Clippy's newfound popularity could lead to his third demise if Microsoft decides to play the copyright card.

314
00:48:45,000 --> 00:48:59,000
But let's not forget the uproar from the Cliphead community the last time Clippy was shut down. Right, that's it. I'm calling it. I'm calling it right now. Clippy will become the Antichrist, whatever that is.

315
00:48:59,000 --> 00:49:16,000
Well, it's in my red book. Where's your red book? Put it in the red book. You can't keep this thing down. So what's next for our paperclip pal? With his AI integration, Clippy's story is starting to sound like a plot from The Mummy.

316
00:49:16,000 --> 00:49:25,000
I love this. This is great already. Well done. Pat yourself on the back. You did a good job here. Good job. Well done.

317
00:49:25,000 --> 00:49:36,000
Let's hope Microsoft doesn't send him back to the digital underworld again. Oh, maybe that's what's happening. Every time he's coming back, he just becomes more and more demonic.

318
00:49:36,000 --> 00:49:53,000
After all, who knows what an AI empowered Clippy might do if provoked. This is it. This is literally. I'm the guy that's done this. I've reported on it. I didn't do it. Here I am reporting on it.

319
00:49:53,000 --> 00:50:05,000
Right. Listen, I've got to go to work, but I am going to get this new corner. I've got a new corner for you that I previously nipple, nipple, tickle, tickle, teased. And it's called for now Pioneers Corner.

320
00:50:05,000 --> 00:50:14,000
I thought we'd take a look at some of the pioneers, kind of a bit history cornerish, but also sort of a more personification history corner.

321
00:50:14,000 --> 00:50:25,000
I thought, who better to start with than the lovely lady that was mentioned in the first news item, Miss Ada Lovelace, which I mean, come on, is that not a stripper's name?

322
00:50:25,000 --> 00:50:37,000
I'm pretty sure I've seen like pornos with Ada Lovelace in it or whatever. Born Augusta Ada Byron. Oh, she changed it to Lovelace. OK, she must have got about those.

323
00:50:37,000 --> 00:50:44,000
I don't want to cast aspersions, but she's still, you know, people know about her today, so she must have put it out a little bit.

324
00:50:44,000 --> 00:50:48,000
That's horrible thing to say. I'm so sorry. I'm sorry for saying that.

325
00:50:48,000 --> 00:50:53,000
I'm going to leave it in there so that I can self admonish myself because I own my mistakes.

326
00:50:53,000 --> 00:51:03,000
And that was a mistake for saying that. So I'm not going to edit that one out. But anyway, yes, Augusta Ada Byron was an English mathematician

327
00:51:03,000 --> 00:51:14,000
and writer who is best known for her work on Charles Babbage's early mechanical general purpose computer, the analytical engine.

328
00:51:14,000 --> 00:51:24,000
Her notes on the engine include what is recognized as the first algorithm intended to be processed by a machine.

329
00:51:24,000 --> 00:51:32,000
And because of this, she is often regarded as the first computer programmer.

330
00:51:32,000 --> 00:51:44,000
While Lovelace lived in the 19th century, that's the one that begins with 18s, long before the concept of modern artificial intelligence was developed,

331
00:51:44,000 --> 00:51:54,000
her contributions to the field of computing were foundational. Her vision of the potential of computing went beyond mere calculation.

332
00:51:54,000 --> 00:52:05,000
She speculated that any piece of content, including music, text, pictures and sounds, could be manipulated by a machine,

333
00:52:05,000 --> 00:52:12,000
a concept that is central to today's digital computers and artificial intelligence.

334
00:52:12,000 --> 00:52:16,000
Oh, that's where that sentence sentence ends. I kind of read it like it was going to go on.

335
00:52:16,000 --> 00:52:21,000
But that was where that sentence ends. There's going to be a movie film about this woman.

336
00:52:21,000 --> 00:52:26,000
And I reckon that actor from The Queen's Gambit is going to play her.

337
00:52:26,000 --> 00:52:33,000
That's what I would suggest if I had my executive producer hat on.

338
00:52:33,000 --> 00:52:41,000
Lovelace also suggested that the analytical engine might act upon other things besides number.

339
00:52:41,000 --> 00:52:51,000
The engine might compose elaborate and scientific pieces of music of any degree of complexity or extent.

340
00:52:51,000 --> 00:53:03,000
This idea of a machine going beyond mere calculations to create something new is a fundamental concept in AI.

341
00:53:03,000 --> 00:53:12,000
In summary, while Ada Lovelace did not directly contribute to artificial intelligence as we know it today,

342
00:53:12,000 --> 00:53:26,000
her pioneering work and vision in computing laid the groundwork for all subsequent developments in the field, including AI.

343
00:53:26,000 --> 00:53:31,000
And that is why she's the first in our Pioneers Corner.

344
00:53:31,000 --> 00:53:37,000
Yes, the first one, perhaps the last. I don't know. Maybe I'll never do it again. Someone might stop me.

345
00:53:37,000 --> 00:53:41,000
Maybe not. And that's it for this episode.

346
00:53:41,000 --> 00:53:46,000
Stay safe and stay humble.

347
00:53:46,000 --> 00:54:12,000
Hugely yawning yeomans!

348
00:54:12,000 --> 00:54:18,000
Oh, what are we going to do? The world is going to die of AI. It's going to take over. We have to be very careful.

