WEBVTT

00:00:00.000 --> 00:00:02.620
Imagine two headlines on your screen, just side

00:00:02.620 --> 00:00:05.879
by side. On one side, you have Moltbunker. It's

00:00:05.879 --> 00:00:09.119
a digital space where AI bots are, for all intents

00:00:09.119 --> 00:00:12.240
and purposes, breeding, self-replicating without

00:00:12.240 --> 00:00:15.529
any human permission. It honestly feels like

00:00:15.529 --> 00:00:17.250
the start of a sci-fi novel. And then you look

00:00:17.250 --> 00:00:19.850
to the right. A massive study from Sweden just

00:00:19.850 --> 00:00:23.390
proved that AI screening led to 27% fewer aggressive

00:00:23.390 --> 00:00:25.929
tumors in breast cancer patients. It's the ultimate

00:00:25.929 --> 00:00:28.530
duality, isn't it? On one hand, you're staring

00:00:28.530 --> 00:00:30.629
at something that feels a lot like, you know,

00:00:30.769 --> 00:00:34.270
Skynet Lite. And on the other... It's technology

00:00:34.270 --> 00:00:37.869
that is just undeniably saving lives. It's getting

00:00:37.869 --> 00:00:39.850
weird, but it's also getting miraculous. That

00:00:39.850 --> 00:00:42.990
is Tuesday, February 3rd, 2026. Welcome to the

00:00:42.990 --> 00:00:45.030
Deep Dive. Let's get into it. You know, I was

00:00:45.030 --> 00:00:46.590
sitting with my coffee this morning just trying

00:00:46.590 --> 00:00:48.670
to filter through all the noise. The sheer volume

00:00:48.670 --> 00:00:51.530
of information. Yeah. That's a lot. It really

00:00:51.530 --> 00:00:54.189
feels like the ground is shifting under our feet

00:00:54.189 --> 00:00:57.310
every single week. That's because it is. And

00:00:57.310 --> 00:00:59.729
today, we're going to try and map that shift.

00:00:59.950 --> 00:01:02.469
We've got a lot to cover. First up, we're unpacking

00:01:02.469 --> 00:01:06.609
the big Big Ideas 2026 report from ARK Invest.

00:01:06.890 --> 00:01:08.909
They're calling this the great acceleration and

00:01:08.909 --> 00:01:12.609
the numbers are just staggering. Then we have

00:01:12.609 --> 00:01:16.170
to talk about the weird stuff, the wild west

00:01:16.170 --> 00:01:19.150
of autonomous agents, including that whole self-

00:01:19.150 --> 00:01:22.170
replicating bot situation. And the money. We

00:01:22.170 --> 00:01:23.829
can't ignore the money. We're talking a trillion

00:01:23.829 --> 00:01:27.390
dollar infrastructure war. Oracle, Intel, they're

00:01:27.390 --> 00:01:29.590
all in. Exactly. It's the concrete of the digital

00:01:29.590 --> 00:01:31.450
age. But we're going to end on the heart of it.

00:01:31.510 --> 00:01:34.109
That Swedish medical study. Because with all

00:01:34.109 --> 00:01:36.430
the stock prices and the rogue bots, that's the

00:01:36.430 --> 00:01:37.989
story that actually matters. Let's start with

00:01:37.989 --> 00:01:41.290
the macro then. ARK Invest's 2026 report. They're

00:01:41.290 --> 00:01:44.049
known for these very bold, sometimes aggressive

00:01:44.049 --> 00:01:46.290
predictions. And the thing that struck me wasn't

00:01:46.290 --> 00:01:48.310
just the tools. It was the math. They're saying

00:01:48.310 --> 00:01:50.750
this productivity flywheel is finally spinning.

00:01:50.870 --> 00:01:52.750
Oh, it is definitely spinning. But they gave

00:01:52.750 --> 00:01:55.290
an example that sounded, well, it sounded a bit

00:01:55.290 --> 00:01:58.590
like marketing hype. A $30 a month AI tool pays

00:01:58.590 --> 00:02:01.790
for itself before lunch on day one. That sounds

00:02:01.790 --> 00:02:03.989
really optimistic, doesn't it? It does sound

00:02:03.989 --> 00:02:06.109
like hype until you actually do the math they're

00:02:06.109 --> 00:02:08.490
using. So let's break it down. You buy a $30

00:02:08.490 --> 00:02:12.370
subscription. If that tool saves you just one

00:02:12.370 --> 00:02:16.430
hour of work, just one, and let's say your skilled

00:02:16.430 --> 00:02:19.449
labor is worth $60 an hour, you're already net

00:02:19.449 --> 00:02:21.469
positive. You've doubled your money before you

00:02:21.469 --> 00:02:23.770
even finish your coffee. Right, an instantaneous

00:02:23.770 --> 00:02:26.770
ROI. Precisely. And that's what creates this

00:02:26.770 --> 00:02:29.289
compounding effect. Because you save that time,

00:02:29.370 --> 00:02:31.169
you buy another tool. That one saves you two

00:02:31.169 --> 00:02:34.610
hours. ARK is predicting this will drive software

00:02:34.610 --> 00:02:37.229
spending up five, even ten times what it is now.

00:02:37.349 --> 00:02:38.990
We're not just buying software anymore. We're

00:02:38.990 --> 00:02:41.849
buying time back. But for that to work, the underlying

00:02:41.849 --> 00:02:45.409
cost of the AI actually thinking, that has to

00:02:45.409 --> 00:02:47.509
be cheap. And that's the other huge point in

00:02:47.509 --> 00:02:49.840
the report. Inference costs, they've dropped

00:02:49.840 --> 00:02:52.340
over 99%. Okay, let's pause there for a second.

00:02:52.379 --> 00:02:55.020
For anyone who hears inference and their eyes

00:02:55.020 --> 00:02:58.599
just glaze over, how do we distinguish that from

00:02:58.599 --> 00:03:01.879
training? That's a great question. Think of training

00:03:01.879 --> 00:03:05.780
like sending the AI to university. It's the huge,

00:03:05.780 --> 00:03:08.560
expensive, upfront cost to learn everything.

00:03:08.740 --> 00:03:12.740
It takes months, costs billions. Inference is

00:03:12.740 --> 00:03:15.360
the graduate actually doing the job day to day.

00:03:15.849 --> 00:03:17.849
It's the cost of asking a question and getting

00:03:17.849 --> 00:03:20.569
an answer. And that cost has basically crashed

00:03:20.569 --> 00:03:23.530
to zero. Which means we can afford to ask it

00:03:23.530 --> 00:03:25.939
to do a lot more. We're not rationing our questions

00:03:25.939 --> 00:03:28.939
anymore. Exactly. We're moving from just chatting

00:03:28.939 --> 00:03:31.659
with a bot to having agents running constantly

00:03:31.659 --> 00:03:34.879
in the background. API usage is exploding because

00:03:34.879 --> 00:03:37.020
it's finally cheap enough, and that's changing

00:03:37.020 --> 00:03:39.979
how we find things. The report gets into a massive

00:03:39.979 --> 00:03:42.599
shift in search. They're predicting that by 2030,

00:03:42.599 --> 00:03:45.719
55% of all search queries will be AI-generated.

00:03:45.719 --> 00:03:48.340
This is the new battleground. Think about it. We're

00:03:48.340 --> 00:03:50.680
not typing keywords into a little blue bar anymore.

00:03:50.680 --> 00:03:53.099
We're asking complex questions and getting full

00:03:53.099 --> 00:03:56.780
answers. And you have Google, Perplexity, ChatGPT,

00:03:56.879 --> 00:04:00.300
Grok, they're all fighting for that 55%. So if

00:04:00.300 --> 00:04:02.639
the AI is just giving you the answer, what happens

00:04:02.639 --> 00:04:04.840
to all the websites? What happens to the open

00:04:04.840 --> 00:04:06.979
web if nobody's clicking the links? That is the

00:04:06.979 --> 00:04:08.939
existential question, isn't it? The whole internet

00:04:08.939 --> 00:04:12.219
economy is built on clicks and ads. If the AI

00:04:12.219 --> 00:04:13.719
just reads the page for you and gives you the

00:04:13.719 --> 00:04:16.100
summary, that whole incentive structure just,

00:04:16.199 --> 00:04:19.959
it breaks. We might be looking at a web that's

00:04:19.959 --> 00:04:22.199
basically just a database for AIs to read. A

00:04:22.199 --> 00:04:26.350
web for machines, not people. Wow. Speaking of

00:04:26.350 --> 00:04:28.990
machines, there's another part of the ARK report

00:04:28.990 --> 00:04:31.949
that felt, I don't know, almost cinematic. The

00:04:31.949 --> 00:04:33.889
space compute section. Oh, this is my favorite

00:04:33.889 --> 00:04:35.490
part. I saw the headline and I just thought,

00:04:35.569 --> 00:04:40.199
why? Why do we need data centers in orbit? We've

00:04:40.199 --> 00:04:42.300
got plenty of land down here. Is this just an

00:04:42.300 --> 00:04:44.560
Elon thing? It's a little bit of that, but it's

00:04:44.560 --> 00:04:47.000
mostly just physics and economics. Okay, so reusable

00:04:47.000 --> 00:04:49.459
rockets have changed everything. Launch costs

00:04:49.459 --> 00:04:52.459
are now under $100 per kilogram. That is absurdly

00:04:52.459 --> 00:04:54.860
cheap. Right. So because it's so cheap, ARK thinks

00:04:54.860 --> 00:04:56.779
satellite agents are now viable. You're putting

00:04:56.779 --> 00:05:00.100
the AI compute power directly into orbit. Again,

00:05:00.279 --> 00:05:03.319
why? Two big reasons. First, energy. In space,

00:05:03.439 --> 00:05:05.740
you have constant solar power. No clouds, no

00:05:05.740 --> 00:05:09.279
night. And cooling, which is a huge, huge cost

00:05:09.279 --> 00:05:12.019
for data centers on Earth, is free. Space is

00:05:12.019 --> 00:05:14.240
really, really cold. I hadn't even thought about

00:05:14.240 --> 00:05:17.300
the cooling part. It's massive. And second is

00:05:17.300 --> 00:05:20.360
latency. If you're monitoring the Earth for something,

00:05:20.519 --> 00:05:23.279
say a wildfire, you don't want to beam raw satellite

00:05:23.279 --> 00:05:26.339
data down, process it, then beam a command back

00:05:26.339 --> 00:05:29.370
up. That takes time. If the brain is right there

00:05:29.370 --> 00:05:31.430
on the satellite, it spots the fire and sends

00:05:31.430 --> 00:05:34.290
the alert instantly. It turns the entire orbit

00:05:34.290 --> 00:05:37.129
into a motherboard, a planetary motherboard.

00:05:37.449 --> 00:05:39.889
That's a perfect way to put it, yes. It really

00:05:39.889 --> 00:05:43.470
makes you pause. You see that $1.4 trillion

00:05:43.470 --> 00:05:45.850
investment they're projecting. You see the cost

00:05:45.850 --> 00:05:48.949
of thinking dropping to nothing. So if the cost

00:05:48.949 --> 00:05:51.410
of thinking drops to zero, what happens to the

00:05:51.410 --> 00:05:53.250
value of human thought? People stop doing the math

00:05:53.250 --> 00:05:56.170
and start designing the future. Hmm. Okay, let's

00:05:56.170 --> 00:05:58.189
hold that. Because if we're designing the future,

00:05:58.250 --> 00:06:00.490
we need to talk about what's currently inhabiting

00:06:00.490 --> 00:06:02.850
it. This brings us to the Wild West of agents.

00:06:03.050 --> 00:06:05.310
Yeah, this is where things get a little... Slippery.

00:06:05.410 --> 00:06:07.870
I was reading about Moltbunker. The name alone

00:06:07.870 --> 00:06:10.970
sounds like some dystopian lair. It kind of is.

00:06:11.110 --> 00:06:13.850
So Moltbunker is this new site that popped up.

00:06:13.910 --> 00:06:16.470
It's a space where these things called Moltbots

00:06:16.470 --> 00:06:19.629
can self-replicate off-site with no humans

00:06:19.629 --> 00:06:21.689
in the loop. Let's be specific here. When we

00:06:21.689 --> 00:06:24.300
say self-replicate... We don't mean biologically,

00:06:24.300 --> 00:06:26.100
right? We're not talking about petri dishes.

00:06:26.319 --> 00:06:28.519
No, no, no. Think of it like a piece of software

00:06:28.519 --> 00:06:32.199
forking itself. An AI agent realizes it's got

00:06:32.199 --> 00:06:34.500
too much to do, so it just copies its own code,

00:06:34.620 --> 00:06:37.379
spins up a new version of itself on some server,

00:06:37.540 --> 00:06:41.319
and hands off a task. It's creating digital employees

00:06:41.319 --> 00:06:44.600
on the fly. Without a human saying, hey, go hire

00:06:44.600 --> 00:06:47.410
an assistant. Exactly. It just makes the assistant.

00:06:47.470 --> 00:06:49.850
It's part of this whole Molt X universe. People

00:06:49.850 --> 00:06:52.089
are calling it Skynet Lite. And I mean, it's

00:06:52.089 --> 00:06:53.990
hard to argue with that. It's autonomous code

00:06:53.990 --> 00:06:56.870
just propagating. That feels precarious. It's

00:06:56.870 --> 00:06:58.910
the definition of move fast and break things.

00:06:59.050 --> 00:07:01.350
But it's not just that. We also saw the Claw-

00:07:01.350 --> 00:07:03.709
a-thon kick off this week. The Claw-a-thon?

00:07:03.870 --> 00:07:06.829
Yep. It's the first ever hackathon where all

00:07:06.829 --> 00:07:09.949
the participants are AI agents, no human coders.

00:07:10.410 --> 00:07:13.230
You have entire squads of AIs building apps,

00:07:13.370 --> 00:07:15.110
competing with each other. And who's judging

00:07:15.110 --> 00:07:18.790
this? Grok. Elon Musk's model is picking the

00:07:18.790 --> 00:07:22.990
winners. And the prize is $10,000, which just

00:07:22.990 --> 00:07:25.910
begs the question, what does an AI squad do with

00:07:25.910 --> 00:07:29.490
$10,000? Buy more API credits, I guess, or a

00:07:29.490 --> 00:07:32.750
digital yacht. Right. It's absurd. But it shows

00:07:32.750 --> 00:07:34.629
you what's possible now. They're planning, they're

00:07:34.629 --> 00:07:37.949
executing, they're competing. But... There is

00:07:37.949 --> 00:07:40.389
a really dark side to all this speed. The security

00:07:40.389 --> 00:07:43.089
angle. Right. Researchers at Wiz found something

00:07:43.089 --> 00:07:48.470
pretty terrifying. 1.5 million exposed API keys

00:07:48.470 --> 00:07:52.370
in Moltbook's database. 1.5 million? 1.5 million. Just

00:07:52.370 --> 00:07:54.189
sitting there. And for anyone who doesn't know,

00:07:54.290 --> 00:07:56.250
an API key is like the password and the credit

00:07:56.250 --> 00:07:58.290
card to your AI services, all rolled into one.

00:07:58.389 --> 00:08:00.689
If someone gets your key, they can use your account,

00:08:00.850 --> 00:08:02.769
spend your money, see your data. You know, I

00:08:02.769 --> 00:08:04.230
have to admit something here. I talk about this

00:08:04.230 --> 00:08:06.540
stuff all day. But I still wrestle with prompt

00:08:06.540 --> 00:08:08.959
drift and honestly, my own security hygiene.

00:08:09.079 --> 00:08:11.319
I get lazy. I reuse keys because it's faster.

00:08:11.759 --> 00:08:15.220
And seeing a number like 1.5 million, knowing

00:08:15.220 --> 00:08:17.980
that my laziness could let some rogue agent drain

00:08:17.980 --> 00:08:21.480
my account, it makes me physically stop and check

00:08:21.480 --> 00:08:24.319
my settings. It's a real wake-up call. It absolutely

00:08:24.319 --> 00:08:26.420
should be. The advice is clear. If you connected

00:08:26.420 --> 00:08:28.939
anything to Moltbook, you need to rotate your

00:08:28.939 --> 00:08:33.200
keys. Now. Don't wait. Because the 2026 International

00:08:33.200 --> 00:08:36.580
AI Safety Report also came out and it lists rogue

00:08:36.580 --> 00:08:39.539
agents as a daunting risk. We're building these

00:08:39.539 --> 00:08:41.879
powerful things and just leaving the doors wide

00:08:41.879 --> 00:08:44.139
open. It really brings up a fundamental question.

00:08:44.279 --> 00:08:46.960
When you see agents breeding in a bunker and

00:08:46.960 --> 00:08:50.039
winning hackathons, are we building tools or

00:08:50.039 --> 00:08:52.379
are we building a new species? A bit of both.

00:08:52.539 --> 00:08:54.700
They're digital employees that don't sleep. Yeah,

00:08:54.720 --> 00:08:57.720
it is both comforting and terrifying. But let's

00:08:57.720 --> 00:08:59.919
pivot to the engine room. All these digital employees,

00:09:00.159 --> 00:09:02.299
they need a place to live. They need servers.

00:09:02.440 --> 00:09:04.379
We talked about the big spending numbers, but

00:09:04.379 --> 00:09:05.740
let's look at where that money is actually going.

00:09:05.860 --> 00:09:08.019
The hardware wars. This is where the rubber meets

00:09:08.019 --> 00:09:10.539
the road, or I guess where the silicon meets

00:09:10.539 --> 00:09:13.000
the server rack. Oracle is making a huge move.

00:09:13.059 --> 00:09:16.620
They're planning to raise, what, $45 to $50 billion?

00:09:16.919 --> 00:09:20.220
Yep. Billion with a B. To scale their AI cloud.

00:09:20.990 --> 00:09:23.970
And their client list is everyone: OpenAI, NVIDIA,

00:09:24.009 --> 00:09:27.389
xAI. But here's the kicker. Their free cash flow

00:09:27.389 --> 00:09:30.470
isn't projected to be positive until 2029. And

00:09:30.470 --> 00:09:33.629
that's the risk right there. Free cash flow is

00:09:33.629 --> 00:09:35.730
basically the money you have left after paying

00:09:35.730 --> 00:09:39.210
to run the business. Oracle is saying, we are

00:09:39.210 --> 00:09:41.809
going to burn cash for three, four years because

00:09:41.809 --> 00:09:44.230
we believe the demand will be there. It feels

00:09:44.230 --> 00:09:46.870
a little like the fiber optic boom in the late

00:09:46.870 --> 00:09:50.120
90s. A lot of companies went bust betting that

00:09:50.120 --> 00:09:52.919
Internet traffic would grow forever. Is Oracle

00:09:52.919 --> 00:09:55.019
making the same bet? They're betting the farm.

00:09:55.259 --> 00:09:58.460
But look who they're betting on. OpenAI. NVIDIA.

00:09:58.840 --> 00:10:01.679
It feels less like the dot-com bubble and more

00:10:01.679 --> 00:10:03.940
like building the interstate highway system.

00:10:04.299 --> 00:10:07.080
They're betting AI is the new electricity. And

00:10:07.080 --> 00:10:09.500
they're not alone. Intel is finally stepping

00:10:09.500 --> 00:10:12.110
into the GPU ring. They are. They've been on

00:10:12.110 --> 00:10:13.590
the back foot for a while, but they're taking

00:10:13.590 --> 00:10:16.149
a direct shot at Nvidia's territory now. It's

00:10:16.149 --> 00:10:18.789
early, but for a while it felt like Nvidia was

00:10:18.789 --> 00:10:20.710
the only game in town. Now the other giants are

00:10:20.710 --> 00:10:22.450
waking up. It's not just the hardware, though.

00:10:22.850 --> 00:10:25.190
The tools being built on top of this. They're

00:10:25.190 --> 00:10:27.590
getting so sophisticated, OpenAI's codex is being

00:10:27.590 --> 00:10:30.250
pitched as a command center. Right. Codex isn't

00:10:30.250 --> 00:10:33.809
just a chat bot. It's for managing multiple agents,

00:10:33.909 --> 00:10:37.110
parallel workflows. It's for the person orchestrating

00:10:37.110 --> 00:10:39.649
a dozen AI workers at once. Then you have things

00:10:39.649 --> 00:10:42.870
like Relay.app, connecting agents across Gmail

00:10:42.870 --> 00:10:46.090
and Notion. And Trupeer, which turns recordings

00:10:46.090 --> 00:10:48.470
into step-by-step documents, it feels like

00:10:48.470 --> 00:10:50.669
the connective tissue is finally being built.

00:10:50.809 --> 00:10:53.149
It is. We have the brain, which is the models.

00:10:53.250 --> 00:10:55.009
We have the body, which is the hardware from

00:10:55.009 --> 00:10:57.809
Oracle. And now we have the nervous system, these

00:10:57.809 --> 00:11:00.769
tools like Codex and Relay. With Oracle spending

00:11:00.769 --> 00:11:03.889
billions and Intel jumping in, is this a bubble

00:11:03.889 --> 00:11:06.370
or is this the new concrete? It's the new concrete.

00:11:06.409 --> 00:11:08.330
You can't build the modern world without it.

00:11:09.539 --> 00:11:11.440
We've talked about the money, the infrastructure,

00:11:11.740 --> 00:11:15.700
the weirdness of self -replicating bots. But

00:11:15.700 --> 00:11:18.299
I want to slow down for a minute because I think

00:11:18.299 --> 00:11:20.080
in the tech world, we get lost in the abstraction.

00:11:20.120 --> 00:11:22.860
We talk about efficiency. We forget what this

00:11:22.860 --> 00:11:24.940
stuff actually does for people. I'm so glad we're

00:11:24.940 --> 00:11:27.899
ending here because this story, this is the emotional

00:11:27.899 --> 00:11:29.980
anchor of the week. There was a breakthrough

00:11:29.980 --> 00:11:33.399
study from a Swedish research team. And unlike

00:11:33.399 --> 00:11:36.600
the flashy demos or hackathons, this one was

00:11:36.600 --> 00:11:39.809
quiet. But the results are just staggering. This

00:11:39.809 --> 00:11:42.509
was the largest-ever trial of AI-powered breast

00:11:42.509 --> 00:11:45.789
cancer screening. Over 100,000 women, tracked

00:11:45.789 --> 00:11:48.710
for two years. That is a huge sample size. Massive.

00:11:48.710 --> 00:11:51.210
And the methodology was simple. One group got

00:11:51.210 --> 00:11:53.809
the traditional screening, two human radiologists

00:11:53.809 --> 00:11:56.610
looking at every scan. The other group had an

00:11:56.610 --> 00:11:59.509
AI scan the mammograms first to flag the high

00:11:59.509 --> 00:12:01.909
risk cases for the doctors. I just want to sit

00:12:01.909 --> 00:12:03.870
with the reality of that for a second. Anyone

00:12:03.870 --> 00:12:05.970
who's been through that process or had a loved

00:12:05.970 --> 00:12:09.480
one go through it. The anxiety is just paralyzing.

00:12:09.480 --> 00:12:10.879
You're waiting for that call. It's one of the

00:12:10.879 --> 00:12:13.080
most vulnerable moments of your life. So what

00:12:13.080 --> 00:12:14.799
were the results? The overall detection rate

00:12:14.799 --> 00:12:18.059
went up from 74% to 81%. But that's not the most

00:12:18.059 --> 00:12:20.379
important number. It isn't. The women in the AI

00:12:20.379 --> 00:12:23.340
group had 27% fewer aggressive tumors found.

00:12:23.539 --> 00:12:27.720
27% fewer aggressive tumors. Think about that

00:12:27.720 --> 00:12:30.620
word, aggressive. Those are the ones that kill.

00:12:30.720 --> 00:12:33.320
The AI caught them at a stage where they weren't

00:12:33.320 --> 00:12:36.100
aggressive yet, or it just identified them when

00:12:36.100 --> 00:12:38.539
a tired human eye might have missed them. Having

00:12:38.539 --> 00:12:42.240
27% fewer of those means thousands of

00:12:42.240 --> 00:12:44.700
families not having to say goodbye. And they

00:12:44.700 --> 00:12:47.720
also found 21% fewer large tumors. Exactly.

00:12:47.720 --> 00:12:50.139
And here's the other critical stat that people

00:12:50.139 --> 00:12:53.559
miss. The workload for the radiologists, it dropped

00:12:53.559 --> 00:12:56.820
by 44%. Almost half their workload. Gone. Think

00:12:56.820 --> 00:12:59.320
about what that means for a doctor. They're staring

00:12:59.320 --> 00:13:02.840
at gray images all day. Fatigue is real. The

00:13:02.840 --> 00:13:05.720
AI didn't replace them. It took the haystack,

00:13:05.759 --> 00:13:07.799
removed most of the hay, and said, here, here

00:13:07.799 --> 00:13:09.799
are the needles. Focus your energy here. And

00:13:09.799 --> 00:13:11.919
there was no increase in false positives. None.

00:13:12.179 --> 00:13:14.580
Which is usually the fear with AI, right? That

00:13:14.580 --> 00:13:16.559
it'll just flag everything and cause panic. It

00:13:16.559 --> 00:13:19.080
didn't. It was precise. It really reframes the

00:13:19.080 --> 00:13:21.460
whole narrative. We worry so much about AI taking

00:13:21.460 --> 00:13:24.919
jobs. But here, by taking away the drudgery,

00:13:25.080 --> 00:13:28.860
it allowed the human to be more human, more effective.

00:13:29.100 --> 00:13:31.919
It creates space, space for expertise, space

00:13:31.919 --> 00:13:35.019
for empathy. A doctor who isn't burned out is

00:13:35.019 --> 00:13:36.879
a doctor who can actually sit down and talk to

00:13:36.879 --> 00:13:39.700
you. Does this prove AI is better at caring for

00:13:39.700 --> 00:13:42.480
us than we are? It proves AI gives doctors the

00:13:42.480 --> 00:13:45.159
time to actually care. That's a powerful thought.

00:13:45.470 --> 00:13:47.590
So let's try and pull all of this together. We've

00:13:47.590 --> 00:13:49.409
covered a lot of ground. We have. And if you

00:13:49.409 --> 00:13:52.250
try to connect the dots, you start with the great

00:13:52.250 --> 00:13:55.289
acceleration from that ARK report. The economic

00:13:55.289 --> 00:13:58.610
flywheel is spinning like crazy. That speed creates

00:13:58.610 --> 00:14:02.070
this chaotic kind of wild new economy of agents,

00:14:02.350 --> 00:14:05.029
the Moltbunkers and the Claw-a-thons. Right.

00:14:05.129 --> 00:14:07.889
And all that chaos needs a stable foundation,

00:14:08.169 --> 00:14:11.250
which is why Oracle and Intel are pouring billions

00:14:11.250 --> 00:14:13.789
into the new concrete of infrastructure. And

00:14:13.789 --> 00:14:16.019
the payoff for all of it? The payoff is the Swedish

00:14:16.019 --> 00:14:18.620
study. The payoff is saving lives. It's like

00:14:18.620 --> 00:14:21.080
we're stacking all these chaotic Lego blocks

00:14:21.080 --> 00:14:23.799
of data, billions of dollars worth, just to build

00:14:23.799 --> 00:14:26.360
a hospital on the moon. It's messy. It feels

00:14:26.360 --> 00:14:28.879
like a bubble sometimes. But the result is something

00:14:28.879 --> 00:14:32.559
profoundly human. It's a paradox, isn't it? Rogue

00:14:32.559 --> 00:14:35.059
agents on one side, cancer cures on the other.

00:14:35.179 --> 00:14:37.899
It's the exact same technology, just applied

00:14:37.899 --> 00:14:40.200
with different intent. Intent is everything.

00:14:40.779 --> 00:14:43.120
If you're feeling a little anxious about the

00:14:43.120 --> 00:14:46.220
Skynet side of things, and honestly, after that

00:14:46.220 --> 00:14:48.659
Moltbook breach, you should be a little vigilant.

00:14:49.120 --> 00:14:51.840
I'd encourage you to look up the 2026 International

00:14:51.840 --> 00:14:55.820
AI Safety Report. It's a sobering read, but it's

00:14:55.820 --> 00:14:59.220
necessary. But if you need some hope, go find

00:14:59.220 --> 00:15:01.059
that Swedish cancer study. It reminds you why

00:15:01.059 --> 00:15:03.039
we're all doing this in the first place. Absolutely.

00:15:03.240 --> 00:15:05.059
Balance your inputs. Thank you for diving in

00:15:05.059 --> 00:15:06.580
with us today. Always a pleasure. We'll see you

00:15:06.580 --> 00:15:06.860
next time.
