WEBVTT

00:00:00.000 --> 00:00:02.759
Imagine a world where we could map, I mean, every

00:00:02.759 --> 00:00:05.820
single inch of our planet, basically in near

00:00:05.820 --> 00:00:10.199
real time, or run an entire cutting edge science

00:00:10.199 --> 00:00:13.580
lab instantly just using AI. Yeah, it sounds

00:00:13.580 --> 00:00:14.960
like something straight out of science fiction,

00:00:15.119 --> 00:00:17.519
doesn't it? But, you know, pieces of that future,

00:00:17.600 --> 00:00:20.059
they're not just concepts anymore. They're actually

00:00:20.059 --> 00:00:23.300
here. They're actively reshaping how we see the

00:00:23.300 --> 00:00:26.160
Earth and just speeding up discovery like crazy.

00:00:26.510 --> 00:00:28.289
We're about to take a deep dive into exactly

00:00:28.289 --> 00:00:31.750
how. Welcome to the deep dive. Today, we're unpacking

00:00:31.750 --> 00:00:33.750
some really, truly groundbreaking developments

00:00:33.750 --> 00:00:35.929
in AI. We've got this fascinating newsletter

00:00:35.929 --> 00:00:37.990
in front of us that just landed. Yeah, it's absolutely

00:00:37.990 --> 00:00:40.090
packed, really dense. We'll kick off exploring

00:00:40.090 --> 00:00:43.450
Google's new planetary co-pilot, Alpha Earth

00:00:43.450 --> 00:00:45.149
Foundations. Honestly, it's kind of astonishing

00:00:45.149 --> 00:00:47.369
stuff. Then we'll zoom out a bit, look at a whole

00:00:47.369 --> 00:00:49.890
range of exciting new AI tools that are transforming

00:00:49.890 --> 00:00:52.259
industries like right now. And then finally,

00:00:52.380 --> 00:00:54.460
we'll shift our focus to something that, well,

00:00:54.600 --> 00:00:57.299
it genuinely bends the mind a little. Virtual

00:00:57.299 --> 00:01:00.659
AI scientists. Yes, you heard that correctly.

00:01:01.460 --> 00:01:04.560
So our mission today is to help you get a handle

00:01:04.560 --> 00:01:07.359
on this, gain knowledge quickly, maybe spot those

00:01:07.359 --> 00:01:10.659
key aha moments and hopefully avoid feeling just,

00:01:10.719 --> 00:01:12.920
you know, overwhelmed by how fast everything's

00:01:12.920 --> 00:01:14.959
moving. Exactly. We want to help you understand

00:01:14.959 --> 00:01:17.859
not just what's happening, but maybe why it matters,

00:01:17.959 --> 00:01:20.489
what it could mean for you down the line. Okay,

00:01:20.609 --> 00:01:22.689
so let's unpack this. Right. Let's begin with

00:01:22.689 --> 00:01:24.810
Google DeepMind's Alpha Earth Foundations. This

00:01:24.810 --> 00:01:26.530
system is already being called a planetary co-pilot

00:01:26.530 --> 00:01:29.329
for Earth data. What, in your view, is

00:01:29.329 --> 00:01:31.629
the real core innovation here? What's the big

00:01:31.629 --> 00:01:33.730
deal? Well, what's truly fascinating, I think,

00:01:33.750 --> 00:01:36.670
is its approach to just handling immense data.

00:01:37.049 --> 00:01:39.230
Like mind boggling amount. So instead of using

00:01:39.230 --> 00:01:42.209
raw pixel by pixel satellite images, which are

00:01:42.209 --> 00:01:45.329
huge files, Alpha Earth compresses petabytes.

00:01:45.450 --> 00:01:47.530
And that's, you know, thousands of terabytes

00:01:47.530 --> 00:01:50.269
of this Earth observation data. It compresses

00:01:50.269 --> 00:01:52.150
it into something they call embedding fields.

00:01:52.370 --> 00:01:55.269
Think of these as like tiny, super efficient,

00:01:55.349 --> 00:01:58.469
searchable summaries for every little 10 by 10

00:01:58.469 --> 00:02:01.599
meter patch of the Earth's surface. 10 by 10

00:02:01.599 --> 00:02:03.780
meters. Yeah. And the key thing is they're generated

00:02:03.780 --> 00:02:05.840
in near real time. It's constantly updating.

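NOTE
A rough, purely illustrative sketch of the embedding-field idea described above. This is not the Alpha Earth Foundations API; the vector length, the toy data, and the cosine-similarity lookup are all assumptions, just to show what "tiny, searchable summaries" per patch means in code.

```python
import math
import random

# Toy "embedding field": each 10x10 m patch is summarized as a short
# fixed-length unit vector instead of raw multi-band pixel data.
# (Illustrative only -- the real system's vectors and APIs differ.)
random.seed(0)
DIM = 64  # assumed embedding size

def normalize(v):
    n = math.sqrt(sum(x * x for x in v))
    return [x / n for x in v]

# 1,000 hypothetical patches, each a 64-dim unit vector.
field = [normalize([random.gauss(0, 1) for _ in range(DIM)])
         for _ in range(1000)]

def most_similar(query, field, k=5):
    """Indices of the k patches most similar to `query` (cosine similarity)."""
    q = normalize(query)
    scores = [sum(a * b for a, b in zip(q, p)) for p in field]
    return sorted(range(len(field)), key=lambda i: scores[i], reverse=True)[:k]

# "Find patches that look like patch 42" -- e.g. similar land cover.
hits = most_similar(field[42], field)
```

The point is the data structure: similarity search over small vectors is cheap, which is why compact summaries beat raw imagery at planetary scale.
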
00:02:06.060 --> 00:02:07.819
Okay. That sounds like an incredible leap in

00:02:07.819 --> 00:02:10.240
just pure efficiency. So when we talk about performance,

00:02:10.439 --> 00:02:12.340
what kind of impact are we actually seeing? What

00:02:12.340 --> 00:02:14.400
does this new architecture deliver? Oh, it's

00:02:14.400 --> 00:02:17.300
profoundly faster and significantly cheaper to

00:02:17.300 --> 00:02:20.330
run. The system cuts storage needs by, get this,

00:02:20.430 --> 00:02:24.409
16 times, 16, which is just critical when you're

00:02:24.409 --> 00:02:26.430
dealing with planetary scale information. It

00:02:26.430 --> 00:02:28.750
also reduces error rates by about 24% compared

00:02:28.750 --> 00:02:32.210
to the older, more traditional models. 24%, wow.

00:02:32.409 --> 00:02:34.490
Yeah. To put it in perspective, we're talking

00:02:34.490 --> 00:02:37.550
about generating and making available like 1.4

00:02:37.550 --> 00:02:40.939
trillion of these embedding footprints. Every

00:02:40.939 --> 00:02:44.360
single year. That's like a new snapshot of the

00:02:44.360 --> 00:02:46.900
Earth's surface changing, what, roughly every

00:02:46.900 --> 00:02:49.500
couple of seconds. It's wild. Wow. OK, so this

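NOTE
A quick back-of-envelope check on the figures just quoted. The 1.4 trillion/year number comes from the discussion above; the land-area constant is a standard approximation added here for scale, not from the newsletter.

```python
# Back-of-envelope arithmetic on the throughput quoted above.
EMBEDDINGS_PER_YEAR = 1.4e12            # "1.4 trillion embedding footprints"
SECONDS_PER_YEAR = 365 * 24 * 3600      # ~3.15e7

per_second = EMBEDDINGS_PER_YEAR / SECONDS_PER_YEAR
# => roughly 44,000 embeddings generated every second, around the clock.

# For scale (assumed constant): Earth's land area is about 1.49e8 km^2,
# so 10 m x 10 m patches tile it in ~1.49e12 patches. 1.4 trillion
# footprints is therefore on the order of one full refresh of the
# land surface per year.
LAND_AREA_M2 = 1.49e8 * 1e6
patch_count = LAND_AREA_M2 / (10 * 10)
```
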
00:02:49.500 --> 00:02:52.379
isn't just like a research paper or some theoretical

00:02:52.379 --> 00:02:55.719
concept. It's live right now, integrated into

00:02:55.719 --> 00:02:57.840
Google Earth Engine, which, you know, a lot of

00:02:57.840 --> 00:03:00.039
people already use. Can you give us maybe a few

00:03:00.039 --> 00:03:02.280
concrete examples? Like where is this being used

00:03:02.280 --> 00:03:04.039
in the real world? Oh, absolutely. There are

00:03:04.039 --> 00:03:05.699
plenty of applications already rolling out and

00:03:05.699 --> 00:03:07.539
they're impacting some really critical environmental

00:03:07.539 --> 00:03:11.620
work. For instance, MapBiomas down in Brazil.

00:03:11.780 --> 00:03:13.939
They're actively using Alpha Earth Foundations

00:03:13.939 --> 00:03:15.900
right now. They're monitoring Amazon deforestation

00:03:15.900 --> 00:03:18.560
in near real time, which gives them crucial data

00:03:18.560 --> 00:03:21.620
for conservation efforts. Then there's the Global

00:03:21.620 --> 00:03:24.560
Ecosystems Atlas. They're leveraging it to map

00:03:24.560 --> 00:03:26.500
regions that were previously uncharted, really,

00:03:26.599 --> 00:03:29.680
like vast shrub lands, remote deserts. It helps

00:03:29.680 --> 00:03:32.039
us understand global biodiversity better. And

00:03:32.039 --> 00:03:34.479
it even outperformed other systems in these things

00:03:34.479 --> 00:03:36.939
called evapotranspiration tests. Okay, what's

00:03:36.939 --> 00:03:39.080
that exactly? That's basically measuring how

00:03:39.080 --> 00:03:41.939
water moves through ecosystems. You know, evaporation,

00:03:42.000 --> 00:03:44.120
plant transpiration. It's super vital for understanding

00:03:44.120 --> 00:03:46.819
climate cycles, planning agriculture, predicting

00:03:46.819 --> 00:03:49.460
droughts, that kind of thing. Right, right. So

00:03:49.460 --> 00:03:51.879
it sounds like this makes that kind of planet-scale

00:03:51.879 --> 00:03:54.319
intelligence much more accessible. It's

00:03:54.319 --> 00:03:56.680
moving beyond just the big players like NASA

00:03:56.680 --> 00:03:59.439
or other national space agencies. What's the

00:03:59.439 --> 00:04:01.599
bigger picture here? What does it mean that Google

00:04:01.599 --> 00:04:04.889
is building something like this? If we connect

00:04:04.889 --> 00:04:07.770
this to that bigger picture, Google is essentially

00:04:07.770 --> 00:04:10.050
building an operating system for Earth. Just

00:04:10.050 --> 00:04:13.009
imagine, like a single dynamic platform where

00:04:13.009 --> 00:04:15.810
all this planetary data, weather patterns, deforestation

00:04:15.810 --> 00:04:18.509
rates, water cycles, you name it, isn't just

00:04:18.509 --> 00:04:22.730
collected, but it's processed, analyzed, and

00:04:22.730 --> 00:04:24.550
then made accessible for pretty much anyone to

00:04:24.550 --> 00:04:26.550
build applications on top of it. Kind of like

00:04:26.550 --> 00:04:28.550
how an OS lets software run on your computer,

00:04:28.610 --> 00:04:31.420
right? Okay, yeah. An OS for Earth. And it's

00:04:31.420 --> 00:04:35.360
also a huge win for AI privacy design, which

00:04:35.360 --> 00:04:37.540
is really important because instead of analyzing,

00:04:37.560 --> 00:04:41.220
say, sensitive raw images directly, Alpha Earth

00:04:41.220 --> 00:04:43.300
works with these anonymized embedding fields.

00:04:43.439 --> 00:04:46.259
So you get the critical data insights without

00:04:46.259 --> 00:04:49.319
compromising individual privacy or specific location

00:04:49.319 --> 00:04:52.360
privacy. That's a really critical step for, you

00:04:52.360 --> 00:04:54.819
know, responsible AI deployment, building trust.

00:04:54.980 --> 00:04:59.160
Right. And if you're, maybe, OpenAI, well, perhaps

00:04:59.160 --> 00:05:01.019
you'd better move fast on this. Google seems

00:05:01.019 --> 00:05:03.139
to have taken a big lead on this kind of planetary

00:05:03.139 --> 00:05:05.480
data infrastructure. Seems that way. So if this

00:05:05.480 --> 00:05:07.959
makes planet scale data so much more accessible,

00:05:08.180 --> 00:05:10.699
what's the biggest shift this brings, say, for

00:05:10.699 --> 00:05:12.980
smaller organizations or even individual users?

00:05:13.360 --> 00:05:15.519
I'd say it really democratizes global mapping.

00:05:15.660 --> 00:05:17.480
It's not just for the big labs anymore. It's

00:05:17.480 --> 00:05:20.379
potentially for everyone. OK, so we've seen how

00:05:20.379 --> 00:05:24.290
AI is giving us this unprecedented new way to

00:05:24.290 --> 00:05:26.250
see and understand the whole planet with Alpha Earth.

00:05:26.490 --> 00:05:28.990
That's the big picture, the macro view. But AI

00:05:28.990 --> 00:05:31.350
is also revolutionizing things at a much smaller

00:05:31.350 --> 00:05:34.050
scale, right? Individual tasks, specific industries.

00:05:34.430 --> 00:05:36.889
Let's pivot now to that broader AI landscape.

00:05:37.050 --> 00:05:39.449
What new tools or trends have kind of jumped

00:05:39.449 --> 00:05:42.069
out at you recently? Yeah, it's been a real flurry

00:05:42.069 --> 00:05:45.430
lately. So many developments. For instance, xAI,

00:05:45.689 --> 00:05:48.449
you know, Elon Musk's company. They just opened

00:05:48.449 --> 00:05:51.509
a wait list for Imagine. It's their new image

00:05:51.509 --> 00:05:54.009
and video generation feature. And what's cool

00:05:54.009 --> 00:05:55.810
is it apparently includes audio capabilities,

00:05:55.990 --> 00:05:58.889
too, hinting at that multimodal creation future.

00:05:59.189 --> 00:06:01.629
Image, video, and audio. Right. Then there's

00:06:01.629 --> 00:06:03.310
Ideogram. They released something called Character.

00:06:03.569 --> 00:06:05.810
It makes it super easy to just swap faces or

00:06:05.810 --> 00:06:08.189
specific characters into generated images. You

00:06:08.189 --> 00:06:10.129
can see how that has huge implications for creative

00:06:10.129 --> 00:06:12.269
industries, right? Marketing, digital art, maybe

00:06:12.269 --> 00:06:14.370
even entertainment. Definitely. And I saw something,

00:06:14.449 --> 00:06:17.029
too, about specialized AI agents, almost like

00:06:17.029 --> 00:06:19.430
virtual employees. Can you tell me more about

00:06:19.430 --> 00:06:21.930
that? Exactly. That's a really significant trend

00:06:21.930 --> 00:06:24.370
we're seeing emerge. There's this new open source

00:06:24.370 --> 00:06:28.129
repository out there showcasing like 40 specialized

00:06:28.129 --> 00:06:31.610
Claude agents. And each one is designed to streamline

00:06:31.610 --> 00:06:34.370
very specific business functions. They can even

00:06:34.370 --> 00:06:37.709
sort of mimic an entire workforce, handling stuff

00:06:37.709 --> 00:06:40.009
from customer service bots to complex data analysis.

00:06:40.110 --> 00:06:42.389
The potential there to automate tasks is just

00:06:42.389 --> 00:06:44.750
enormous. Could free up human teams for more

00:06:44.750 --> 00:06:48.199
complex creative thinking. And honestly, I still

00:06:48.199 --> 00:06:50.620
wrestle with prompt drift myself sometimes. You

00:06:50.620 --> 00:06:52.819
know that thing where your AI's answers start

00:06:52.819 --> 00:06:55.449
to... subtly change or degrade over time. Even

00:06:55.449 --> 00:06:57.370
if you get the same input, it makes consistency

00:06:57.370 --> 00:06:59.449
a real headache. So these highly specialized

00:06:59.449 --> 00:07:02.250
purpose-built tools, they're particularly interesting

00:07:02.250 --> 00:07:04.410
because they aim to get rid of that variability,

00:07:04.670 --> 00:07:06.850
make things more reliable. That's wild to think

00:07:06.850 --> 00:07:08.930
about, an entire virtual workforce. And then

00:07:08.930 --> 00:07:10.529
there's this thing called Showrunner. It's been

00:07:10.529 --> 00:07:13.389
described as the Netflix of AI for film. What

00:07:13.389 --> 00:07:15.389
exactly does that do? Yeah, Showrunner is pretty

00:07:15.389 --> 00:07:17.290
remarkable, especially for creative professionals.

00:07:17.810 --> 00:07:21.129
Film, video, it lets you take existing film scenes

00:07:21.129 --> 00:07:23.639
and essentially clone them. You can replace actors

00:07:23.639 --> 00:07:26.199
with AI-generated ones. You can control their

00:07:26.199 --> 00:07:29.019
acting styles, their expressions, and even precisely

00:07:29.019 --> 00:07:31.980
copy the whole cinematic look, the lighting,

00:07:32.060 --> 00:07:34.600
the framing from a reference scene. It's kind

00:07:34.600 --> 00:07:36.540
of like having an entire production studio just

00:07:36.540 --> 00:07:39.139
sitting on your desktop. It makes professional-quality

00:07:39.139 --> 00:07:41.959
video creation way more accessible,

00:07:42.139 --> 00:07:44.939
potentially, and speeds up iteration like crazy.

00:07:45.040 --> 00:07:47.350
It's pretty incredible for creative work. Beyond

00:07:47.350 --> 00:07:50.189
these really fascinating individual tools, what

00:07:50.189 --> 00:07:53.310
about the big industry movements, the strategic

00:07:53.310 --> 00:07:55.829
plays? It feels like a high stakes chess game

00:07:55.829 --> 00:07:57.689
sometimes out there. It certainly does. Huge

00:07:57.689 --> 00:08:00.230
investments happening. So on the investment front,

00:08:00.370 --> 00:08:03.029
Microsoft is reportedly spending a record like

00:08:03.029 --> 00:08:06.250
$30 billion this quarter on AI investments. Just

00:08:06.250 --> 00:08:08.490
shows their massive commitment. $30 billion in

00:08:08.490 --> 00:08:12.160
one quarter. And then there's Groq, a relatively

00:08:12.160 --> 00:08:15.040
new AI chip startup. They're in the process of

00:08:15.040 --> 00:08:18.139
raising $600 million. Their goal is to directly

00:08:18.139 --> 00:08:20.420
challenge NVIDIA's dominance in the AI hardware

00:08:20.420 --> 00:08:24.839
market with their LPU chip architecture. So competition

00:08:24.839 --> 00:08:27.519
heating up there. And on the more practical consumer

00:08:27.519 --> 00:08:30.199
side, Google and YouTube, they're now using AI

00:08:30.199 --> 00:08:33.299
for age checks on user accounts. Just a straightforward

00:08:33.299 --> 00:08:35.080
application for safety and compliance. Okay.

00:08:35.120 --> 00:08:37.539
And what about politically? Or strategically,

00:08:37.759 --> 00:08:39.899
any interesting moves from the big tech players

00:08:39.899 --> 00:08:42.940
regarding, say, regulation or how they plan to

00:08:42.940 --> 00:08:45.120
control this tech? Definitely some maneuvering

00:08:45.120 --> 00:08:47.080
there. Google, for example, publicly stated it

00:08:47.080 --> 00:08:49.840
will sign the EU's voluntary AI code of practice.

00:08:49.879 --> 00:08:52.500
So that indicates a willingness to align with

00:08:52.500 --> 00:08:54.960
European regulatory thinking. Meta, on the other

00:08:54.960 --> 00:08:59.240
hand, stated they would not sign it. So signaling

00:08:59.240 --> 00:09:01.360
maybe a different approach, maybe prioritizing

00:09:01.360 --> 00:09:04.100
open source development differently. Mark Zuckerberg

00:09:04.100 --> 00:09:06.200
also hinted that Meta won't open source all of

00:09:06.200 --> 00:09:08.639
its most advanced superintelligence AI models

00:09:08.639 --> 00:09:10.759
down the road, perhaps learning from past releases.

00:09:11.039 --> 00:09:13.419
Plus, Microsoft secured this really interesting

00:09:13.419 --> 00:09:17.080
OpenAI access deal. It basically ensures that

00:09:17.080 --> 00:09:19.460
they'll have access to OpenAI's models even

00:09:19.460 --> 00:09:22.740
after AGI, you know, artificial general intelligence

00:09:22.740 --> 00:09:24.860
is potentially achieved. That's a huge strategic

00:09:24.860 --> 00:09:27.629
hedge for them. It guarantees access no matter what.

00:09:28.129 --> 00:09:30.669
So with all these rapid developments, specialized

00:09:30.669 --> 00:09:33.070
agents, massive investments, strategic corporate

00:09:33.070 --> 00:09:35.870
moves, what's the core overarching trend here?

00:09:35.970 --> 00:09:38.649
What's the one liner? I'd say AI is just expanding

00:09:38.649 --> 00:09:41.529
rapidly, embedding itself into basically every

00:09:41.529 --> 00:09:44.210
industry. It's driving both huge innovation and

00:09:44.210 --> 00:09:46.429
really intense competition. OK, so we've covered

00:09:46.429 --> 00:09:49.230
how AI is mapping the planet with this incredible

00:09:49.230 --> 00:09:52.610
new detail and how it's sparking this explosion

00:09:52.610 --> 00:09:54.950
of new tools across all sorts of industries.

00:09:55.210 --> 00:09:57.330
Now, let's talk about something that kind of

00:09:57.330 --> 00:09:59.389
unites these capabilities in a really profound

00:09:59.389 --> 00:10:04.169
way. Virtual AI scientists. Stanford University

00:10:04.169 --> 00:10:07.389
and the Chan Zuckerberg Biohub. They just launched

00:10:07.389 --> 00:10:10.029
a system that acts like an entire AI research

00:10:10.029 --> 00:10:12.629
lab. This is where it gets really interesting,

00:10:12.730 --> 00:10:14.629
I think. It truly is. Yeah, this is mind-bending

00:10:14.629 --> 00:10:17.889
stuff. Their setup isn't just like one AI tool.

00:10:18.090 --> 00:10:21.350
It's a whole ecosystem. It includes an AI principal

00:10:21.350 --> 00:10:23.490
investigator that's basically the lead scientist

00:10:23.490 --> 00:10:26.370
making the calls. It has specialized AI agents

00:10:26.370 --> 00:10:28.330
acting as the researchers doing the work. And

00:10:28.330 --> 00:10:30.149
get this, they even have digital lab meetings

00:10:30.149 --> 00:10:32.669
that literally finish in seconds. Just cuts out

00:10:32.669 --> 00:10:34.549
all that administrative time, discussion overhead.

00:10:34.809 --> 00:10:36.590
Meetings in seconds. Yeah, it's a real glimpse

00:10:36.590 --> 00:10:38.649
into an entirely new way of doing scientific

00:10:38.649 --> 00:10:40.870
discovery, a different paradigm. Okay, but what

00:10:40.870 --> 00:10:43.070
have these virtual scientists actually discovered?

00:10:43.289 --> 00:10:45.629
Are there tangible results from this? purely

00:10:45.629 --> 00:10:48.230
virtual process? Or is it still theoretical?

00:10:48.610 --> 00:10:51.250
No, the results are concrete and pretty impactful

00:10:51.250 --> 00:10:55.029
already. They've successfully generated 92 distinct

00:10:55.029 --> 00:10:58.789
COVID nanobody designs. Now, nanobodies, think

00:10:58.789 --> 00:11:01.129
of them as like tiny, really effective antibodies.

00:11:01.389 --> 00:11:03.590
They're much smaller, more stable than conventional

00:11:03.590 --> 00:11:05.690
ones. Makes them easier to work with, potentially

00:11:05.690 --> 00:11:08.070
more potent against diseases. Okay, 92 designs.

00:11:08.330 --> 00:11:10.029
Right. And out of those 92 designs generated

00:11:10.029 --> 00:11:12.629
by the AI system, two have already shown success

00:11:12.629 --> 00:11:16.629
in actual real-world lab testing, which is a

00:11:16.629 --> 00:11:18.870
remarkable outcome, frankly, for an entirely

00:11:18.870 --> 00:11:22.009
virtual process, really shows AI's power to accelerate

00:11:22.009 --> 00:11:24.629
biomedical research directly. That's astonishing.

00:11:25.070 --> 00:11:27.029
How does this virtual lab actually function,

00:11:27.110 --> 00:11:28.590
though? What are the mechanics behind it? How

00:11:28.590 --> 00:11:31.230
does it generate these specific testable breakthroughs?

00:11:31.269 --> 00:11:33.950
So the AI principal investigator, the PI, is

00:11:33.950 --> 00:11:36.090
kind of the orchestrator. It identifies the research

00:11:36.090 --> 00:11:39.190
problem, then it forms task-specific agent teams.

00:11:39.389 --> 00:11:41.809
It assigns specialized roles to other AI models,

00:11:41.990 --> 00:11:45.120
like data analyst or experiment designer. These

00:11:45.120 --> 00:11:47.399
agents then run these rigorous internal debates,

00:11:47.519 --> 00:11:50.059
almost like human scientists arguing over data

00:11:50.059 --> 00:11:52.559
to refine hypotheses and design experiments.

00:11:52.820 --> 00:11:55.399
And they even automatically call on external

00:11:55.399 --> 00:11:58.460
specialized tools when needed, like AlphaFold,

00:11:58.559 --> 00:12:01.360
Google DeepMind's system for predicting protein

00:12:01.360 --> 00:12:03.840
structures. They call on tools like that all

00:12:03.840 --> 00:12:06.860
by themselves, integrate that complex data seamlessly.

00:12:07.120 --> 00:12:09.580
They basically operate autonomously through the

00:12:09.580 --> 00:12:12.240
whole research cycle, from hypothesis all the

00:12:12.240 --> 00:12:14.299
way to experimental design. This raises a really

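NOTE
A minimal sketch of the orchestration loop just described: a "principal investigator" agent forms a task-specific team, the team runs debate rounds (the seconds-long "lab meetings"), and a transcript accumulates. Every role name and message here is invented for illustration; a real system would make an LLM call inside speak().

```python
from dataclasses import dataclass, field

@dataclass
class Agent:
    role: str
    notes: list = field(default_factory=list)

    def speak(self, topic: str, transcript: list) -> str:
        # Placeholder for an LLM call in a real virtual lab.
        msg = f"[{self.role}] comment #{len(transcript) + 1} on: {topic}"
        self.notes.append(msg)
        return msg

def lab_meeting(topic: str, roles: list, rounds: int = 2) -> list:
    """PI-style orchestration: form a team, run debate rounds, return transcript."""
    team = [Agent(r) for r in roles]
    transcript: list = []
    for _ in range(rounds):
        for agent in team:
            transcript.append(agent.speak(topic, transcript))
    return transcript

log = lab_meeting("nanobody candidate design",
                  ["data analyst", "experiment designer", "critic"])
```

The transcript is the auditability point made later in the discussion: every step of the agents' reasoning is recorded and reviewable.
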
00:12:14.299 --> 00:12:17.000
important question then. Why does this matter

00:12:17.000 --> 00:12:19.940
so profoundly for the future of science? What

00:12:19.940 --> 00:12:22.399
core bottlenecks does it actually address? Well,

00:12:22.419 --> 00:12:24.539
think about traditional science. Most of it is

00:12:24.539 --> 00:12:26.580
inherently slow, right? It's often blocked by

00:12:26.580 --> 00:12:28.720
endless meetings, coordinating schedules. It's

00:12:28.720 --> 00:12:32.460
constrained by budgets, funding cycles, and frequently

00:12:32.460 --> 00:12:34.559
limited by expert bottlenecks. You need that

00:12:34.559 --> 00:12:36.620
one specific specialist who's always swamped.

00:12:37.519 --> 00:12:39.580
These AI agents, they just skip all that. They

00:12:39.580 --> 00:12:41.320
don't get tired. They don't need weekends off.

00:12:41.519 --> 00:12:44.159
They certainly don't care if a lab is short-staffed

00:12:44.159 --> 00:12:46.519
or someone's on vacation. And maybe crucially,

00:12:46.639 --> 00:12:49.159
they're transparent. You can literally watch

00:12:49.159 --> 00:12:52.200
their entire reasoning process unfold step by

00:12:52.200 --> 00:12:55.039
step. That gives you an unprecedented level of

00:12:55.039 --> 00:12:57.220
auditability. You can see how they reached a

00:12:57.220 --> 00:13:00.500
conclusion. Whoa. Seriously, imagine scaling

00:13:00.500 --> 00:13:02.950
this. Solving problems at that kind of speed,

00:13:03.149 --> 00:13:06.009
like running a billion queries simultaneously

00:13:06.009 --> 00:13:09.289
to find a cure or some new material. It's staggering.

00:13:09.490 --> 00:13:11.629
Yeah. What are the current limitations, though?

00:13:11.690 --> 00:13:13.389
It sounds almost too good to be true. Where does

00:13:13.389 --> 00:13:15.649
it fall short right now? Yeah, it's definitely

00:13:15.649 --> 00:13:17.429
not perfect yet. We're still in the early days

00:13:17.429 --> 00:13:20.570
here. Really complex, long-term studies that

00:13:20.570 --> 00:13:22.899
require, you know, deep, nuanced, contextual

00:13:22.899 --> 00:13:25.139
understanding over years, that's still hard.

00:13:25.320 --> 00:13:28.519
Or those genuinely creative leaps, the kind that

00:13:28.519 --> 00:13:30.779
involve really abstract conceptual breakthroughs,

00:13:30.779 --> 00:13:33.600
that still largely seems to need human intuition,

00:13:33.820 --> 00:13:36.720
human insight. But if this approach scales and

00:13:36.720 --> 00:13:39.179
improves at the rate it seems to be trending,

00:13:39.259 --> 00:13:41.299
well, the next major scientific breakthrough

00:13:41.299 --> 00:13:43.620
maybe a cure for a really challenging disease

00:13:43.620 --> 00:13:46.320
or a solution to a complex environmental problem.

00:13:46.639 --> 00:13:48.539
It might very well come from what essentially

00:13:48.539 --> 00:13:50.759
looks like a Slack thread between four bots and,

00:13:50.820 --> 00:13:53.399
you know, a protein structure tool like AlphaFold.

00:13:53.500 --> 00:13:56.139
So does this mean human scientists are, you know,

00:13:56.139 --> 00:13:58.639
on a path to becoming obsolete? Is that the trajectory?

00:13:59.279 --> 00:14:01.419
Not yet, I don't think so. Humans are still absolutely

00:14:01.419 --> 00:14:04.220
crucial for that complex creativity, for asking

00:14:04.220 --> 00:14:06.519
the right questions, for guiding the big picture

00:14:06.519 --> 00:14:09.740
strategy. [Mid-roll sponsor read.] So yeah, we've

00:14:09.740 --> 00:14:11.639
covered some truly profound advancements today,

00:14:11.740 --> 00:14:14.259
haven't we? I mean, from Google's Alpha Earth

00:14:14.259 --> 00:14:16.500
mapping basically our entire planet in incredible

00:14:16.500 --> 00:14:20.179
detail, near real time, to... Just this burst

00:14:20.179 --> 00:14:23.340
of innovative new AI tools that are fundamentally

00:14:23.340 --> 00:14:26.120
changing how we work, how we create, how businesses

00:14:26.120 --> 00:14:28.559
even operate day to day. Yeah. And then that

00:14:28.559 --> 00:14:31.639
mind-bending idea of virtual AI scientists

00:14:31.639 --> 00:14:34.460
just autonomously conducting research and getting

00:14:34.460 --> 00:14:36.440
real world breakthroughs like with those COVID

00:14:36.440 --> 00:14:38.779
nanobodies. It really shows, I think, that AI

00:14:38.779 --> 00:14:40.639
isn't just assisting us anymore. It's rapidly

00:14:40.639 --> 00:14:42.879
becoming a genuine collaborative partner, maybe

00:14:42.879 --> 00:14:45.600
even a primary driver in the actual process of

00:14:45.600 --> 00:14:48.360
human discovery itself. Exactly. What's truly

00:14:48.360 --> 00:14:50.399
fascinating here, I find, is just

00:14:50.399 --> 00:14:52.980
how AI is accelerating our basic fundamental

00:14:52.980 --> 00:14:56.059
capacity, our capacity to understand and interact

00:14:56.059 --> 00:14:58.340
with the incredibly complex world around us.

00:14:58.419 --> 00:15:00.639
It feels like we're stacking Lego blocks of data

00:15:00.639 --> 00:15:03.659
and insight, you know, but at an exponential

00:15:03.659 --> 00:15:06.039
rate, building these knowledge structures way

00:15:06.039 --> 00:15:09.460
faster than ever before. We are definitely, undoubtedly,

00:15:09.539 --> 00:15:13.039
moving into an era where these incredibly complex

00:15:13.039 --> 00:15:15.919
global problems, things like precise environmental

00:15:15.919 --> 00:15:18.799
monitoring or accelerating critical disease research,

00:15:18.980 --> 00:15:21.360
they can now be tackled with just unprecedented

00:15:21.360 --> 00:15:24.899
speed and scale and efficiency. So what does

00:15:24.899 --> 00:15:27.299
this all mean for you listening right now? We

00:15:27.299 --> 00:15:29.639
are clearly entering an era where these profound

00:15:29.639 --> 00:15:31.899
capabilities, things once exclusively reserved

00:15:31.899 --> 00:15:35.019
for, you know, national labs or massive supercomputers,

00:15:35.019 --> 00:15:37.059
are rapidly becoming accessible. So the

00:15:37.059 --> 00:15:38.779
question is, how will you use these powerful

00:15:38.779 --> 00:15:41.039
new lenses to see the world around you? Or maybe

00:15:41.039 --> 00:15:42.940
use these incredible new tools to solve the problems

00:15:42.940 --> 00:15:45.000
that matter most to you? Yeah, the power of these

00:15:45.000 --> 00:15:47.259
kinds of deep dives, just getting into this complex,

00:15:47.379 --> 00:15:50.379
fast -moving information. It's hopefully helping

00:15:50.379 --> 00:15:52.679
us all keep pace with this really exciting future

00:15:52.679 --> 00:15:55.279
that's unfolding. We sincerely hope this deep

00:15:55.279 --> 00:15:57.720
dive gave you some valuable new insights, maybe

00:15:57.720 --> 00:16:00.100
sparked a few aha moments along the way. Thank

00:16:00.100 --> 00:16:02.529
you so much for joining us. Until next time,

00:16:02.570 --> 00:16:04.629
keep exploring. [Outro music]
