WEBVTT

00:00:00.000 --> 00:00:01.480
Okay, wait, hold on a second. I just need to

00:00:01.480 --> 00:00:03.459
make sure I heard you right in the pre-show. You

00:00:03.459 --> 00:00:07.160
said Sam Altman's kids were supposed to run the

00:00:07.160 --> 00:00:10.500
Mars colony. Yes, that is exactly what the leaked

00:00:10.500 --> 00:00:13.880
email said. It's part of the whole lawsuit discovery.

00:00:14.259 --> 00:00:17.500
Altman proposed this $80 billion project for

00:00:17.500 --> 00:00:20.539
a Mars colony. And yeah, the text implies that...

00:00:20.839 --> 00:00:24.480
there's a governance structure that involves his children.

00:00:24.579 --> 00:00:26.760
His biological children or his AGI children?

00:00:27.000 --> 00:00:29.800
You know, knowing that whole circle of tech philosophy,

00:00:29.920 --> 00:00:32.060
genuinely hard to tell. Maybe they see them as

00:00:32.060 --> 00:00:33.979
the same thing eventually. But here's the real

00:00:33.979 --> 00:00:36.939
kicker. Elon Musk's reply to this whole pitch

00:00:36.939 --> 00:00:40.210
was just, it was brutal. I can only imagine.

00:00:40.429 --> 00:00:42.869
He just replied with a single statistic. He called

00:00:42.869 --> 00:00:46.170
the probability of success zero percent. Zero.

00:00:46.270 --> 00:00:48.770
Zero. Okay. Welcome back to the Deep Dive. I'm

00:00:48.770 --> 00:00:51.009
really glad you're here with us. We usually spend

00:00:51.009 --> 00:00:52.929
our time trying to filter out all the noise,

00:00:53.090 --> 00:00:54.729
you know, find the signal in these huge data

00:00:54.729 --> 00:00:58.609
dumps. But today. I mean, the noise is incredibly

00:00:58.609 --> 00:01:01.130
loud. It is loud. But if you listen really closely,

00:01:01.270 --> 00:01:04.409
there's a very distinct rhythm underneath it

00:01:04.409 --> 00:01:06.549
all. Right. And we aren't just talking about

00:01:06.549 --> 00:01:08.730
the usual chatbot headlines today. We're looking

00:01:08.730 --> 00:01:11.549
at a fundamental shift in competence. We're talking

00:01:11.549 --> 00:01:14.209
about AI putting on a lab coat at Stanford. Yeah.

00:01:14.290 --> 00:01:17.030
To cure diseases. We're talking about AI literally

00:01:17.030 --> 00:01:20.420
breaking out of the screen. To what? Fix your

00:01:20.420 --> 00:01:22.840
dishwasher. And maybe most significantly, we're

00:01:22.840 --> 00:01:25.040
talking about a massive geopolitical shift in

00:01:25.040 --> 00:01:27.420
hardware. Something coming out of China that

00:01:27.420 --> 00:01:30.579
just changes the entire board. It feels like

00:01:30.579 --> 00:01:35.280
the theme for this dive is doing work. We're

00:01:35.280 --> 00:01:37.840
moving past the era of, you know, generating

00:01:37.840 --> 00:01:40.200
interesting text and into the era of executing

00:01:40.200 --> 00:01:43.700
complex tasks from the biology lab to the factory

00:01:43.700 --> 00:01:45.319
floor and then all the way to the global supply

00:01:45.319 --> 00:01:47.700
chain. There's so much to unpack here. Let's

00:01:47.700 --> 00:01:49.680
start with the scientists. Let's do it. So we

00:01:49.680 --> 00:01:51.599
have this really fascinating study from Anthropic.

00:01:51.599 --> 00:01:54.319
It's titled The AI Frontier: Scientific Research

00:01:54.319 --> 00:01:56.760
and Physical Evolution. And I think for a lot

00:01:56.760 --> 00:01:58.099
of people listening, this is going to require

00:01:58.099 --> 00:02:01.420
a real mental shift about what an LLM actually

00:02:01.420 --> 00:02:04.769
is. Ugh, a huge shift. Because usually when we

00:02:04.769 --> 00:02:07.349
talk about large language models, you know, Claude,

00:02:07.430 --> 00:02:09.969
GPT, whatever, we think of them as writing assistants.

00:02:10.169 --> 00:02:13.969
Like, Claude, summarize this PDF. Or, hey, write

00:02:13.969 --> 00:02:16.650
me a polite email to my boss. That kind of thing.

00:02:16.830 --> 00:02:18.930
Exactly. It's a retrieval tool, a formatting

00:02:18.930 --> 00:02:21.729
tool. It's basically a sophisticated autocomplete.

00:02:21.930 --> 00:02:24.909
Right. But what Anthropic did was profile three

00:02:24.909 --> 00:02:27.509
different labs, and these are serious, high-level

00:02:27.509 --> 00:02:29.930
research labs, where they're using Claude to

00:02:29.930 --> 00:02:32.909
do work that... That normally takes human scientists

00:02:32.909 --> 00:02:37.729
months, and the AI is doing it in minutes. And

00:02:37.729 --> 00:02:39.669
it's not just doing the grunt work. It's actually

00:02:39.669 --> 00:02:42.210
doing the thinking. I mean, take Stanford's Biomni

00:02:42.210 --> 00:02:44.610
project as the prime example. Biomni is fascinating.

00:02:44.710 --> 00:02:46.710
They call this a research agent. Okay, before

00:02:46.710 --> 00:02:48.490
we go further, we should probably clarify that

00:02:48.490 --> 00:02:50.569
term. When we say agent here, we're not just

00:02:50.569 --> 00:02:52.270
talking about a chatbot with a personality, are

00:02:52.270 --> 00:02:54.729
we? No, definitely not. A research agent is a

00:02:54.729 --> 00:02:57.449
system that doesn't just chat. It uses tools

00:02:57.449 --> 00:03:01.139
to execute tasks. Think of it this way. A chatbot

00:03:01.139 --> 00:03:05.090
is a brain in a jar. It can talk to you. An agent

00:03:05.090 --> 00:03:08.270
is a brain that's been given digital hands that

00:03:08.270 --> 00:03:12.710
can open files, run code, search databases. It

00:03:12.710 --> 00:03:15.270
can execute a whole workflow on its own. So it

00:03:15.270 --> 00:03:17.270
has agency. It can actually interact with its

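NOTE The "digital hands" loop described here can be sketched in a few lines. Everything below is illustrative only: model_step is a scripted stand-in for a real LLM call, and the tool names are hypothetical, not any actual agent framework's API.

```python
# Minimal sketch of an agent loop: a "brain" (model_step) repeatedly picks
# a tool, the tool runs, and the result is appended to the history the
# model sees on its next turn. model_step is a scripted stand-in for an LLM.

def search_db(query):
    """Hypothetical database-search tool."""
    fake_db = {"rna pathways": ["PATHWAY_1", "PATHWAY_2"]}
    return fake_db.get(query, [])

def run_code(expr):
    """Hypothetical code-execution tool (restricted arithmetic eval)."""
    return eval(expr, {"__builtins__": {}})

TOOLS = {"search_db": search_db, "run_code": run_code}

def model_step(history):
    """Scripted policy standing in for the model's next decision."""
    if not history:
        return ("search_db", "rna pathways")
    if len(history) == 1:
        return ("run_code", "6 * 7")
    return ("final", None)  # done: stop and hand back what was gathered

def run_agent(max_steps=5):
    history = []
    for _ in range(max_steps):
        action, arg = model_step(history)
        if action == "final":
            return history
        history.append((action, TOOLS[action](arg)))
    return history

result = run_agent()
print(result)  # [('search_db', ['PATHWAY_1', 'PATHWAY_2']), ('run_code', 42)]
```

The point of the sketch is the shape of the loop, not the toy tools: the model proposes an action, the environment executes it, and the observation feeds back in.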
00:03:17.270 --> 00:03:20.289
environment. Exactly. And Biomni is working across

00:03:20.289 --> 00:03:23.569
25 different scientific fields. It's building

00:03:23.569 --> 00:03:26.610
this memory bank of skills. It learns expert

00:03:26.610 --> 00:03:28.689
workflows so it doesn't have to start from scratch

00:03:28.689 --> 00:03:31.430
every single time. See, that's the key differentiator.

00:03:31.509 --> 00:03:33.569
It's not just retrieving data. It's figuring

00:03:33.569 --> 00:03:35.909
out how to do the science. Yeah. But the example

00:03:35.909 --> 00:03:37.610
that really stood out to me, the one that really

00:03:37.610 --> 00:03:40.389
made me pause, was from MIT. Oh, the Matsuram

00:03:40.389 --> 00:03:42.729
Project. Such a great name, by the way. Scientists

00:03:42.729 --> 00:03:44.789
love a good pun. They really do. So they were

00:03:44.789 --> 00:03:47.270
using it for CRISPR screens. For anyone listening

00:03:47.270 --> 00:03:49.550
who might not be a biologist, CRISPR is basically

00:03:49.550 --> 00:03:53.490
gene editing. And these screens, they're usually

00:03:53.490 --> 00:03:56.000
a brute force process, right? Oh, incredibly

00:03:56.000 --> 00:03:58.979
brute force. You basically have to disable thousands

00:03:58.979 --> 00:04:01.360
of genes one by one just to see what breaks.

00:04:01.539 --> 00:04:04.699
It's tedious. It's expensive. And it's so prone

00:04:04.699 --> 00:04:06.979
to human error because you're just staring at

00:04:06.979 --> 00:04:09.960
these massive data sets for weeks. And this is

00:04:09.960 --> 00:04:12.900
where the story gets a little wild. Because Claude

00:04:12.900 --> 00:04:15.039
didn't just speed up the spreadsheet work. No.

00:04:15.319 --> 00:04:17.339
This is the part that just stopped me in my tracks.

00:04:17.439 --> 00:04:20.709
I mean, whoa. Yeah. Get this. Claude actually

00:04:20.709 --> 00:04:23.850
analyzed the data and it spotted an RNA modification

00:04:23.850 --> 00:04:26.870
pathway that the human scientists had completely

00:04:26.870 --> 00:04:29.769
missed. Wait, hold on. Let's pause right there.

00:04:29.870 --> 00:04:32.470
It found a biological reality that the human

00:04:32.470 --> 00:04:35.389
experts overlooked. Yes. When they benchmarked

00:04:35.389 --> 00:04:37.410
it, you know, against other models and against

00:04:37.410 --> 00:04:40.050
the human analysis, Claude identified a pathway

00:04:40.050 --> 00:04:42.269
that wasn't obvious. It saw a pattern in the

00:04:42.269 --> 00:04:44.490
noise that the humans just didn't see. That feels

00:04:44.490 --> 00:04:46.810
like a pivotal moment. It's one thing to say

00:04:46.810 --> 00:04:49.829
AI can calculate faster than me. We're used to

00:04:49.829 --> 00:04:52.069
that. It's a totally different thing to say the

00:04:52.069 --> 00:04:54.430
AI noticed something about biology that I didn't.

00:04:54.449 --> 00:04:56.490
It completely changes the value proposition.

00:04:56.610 --> 00:04:59.170
It moves from assistant to collaborator. And

00:04:59.170 --> 00:05:00.790
there's a third example, right? It touched on

00:05:00.790 --> 00:05:03.149
the economics of all this. The Lindbergh lab

00:05:03.149 --> 00:05:06.779
at Stanford. Right. This is all about cost. Gene

00:05:06.779 --> 00:05:09.259
screens are incredibly expensive. We're talking

00:05:09.259 --> 00:05:12.259
something like $20,000 per screen. 20 grand.

00:05:12.459 --> 00:05:14.480
Yeah. So you can't just test everything. You

00:05:14.480 --> 00:05:16.420
have to be picky. You have to place your bets

00:05:16.420 --> 00:05:19.759
very, very carefully. Usually, scientists use

00:05:19.759 --> 00:05:22.899
a mix of spreadsheets, some intuition, and existing

00:05:22.899 --> 00:05:25.819
research to guess which genes to target. It's

00:05:25.819 --> 00:05:29.819
an educated guess, but at $20,000 a pop, it's

00:05:29.819 --> 00:05:32.540
still a big gamble. So how did the agent change

00:05:32.540 --> 00:05:35.980
that gamble? Well, they had Claude navigate a

00:05:35.980 --> 00:05:38.879
molecule relationship map. So instead of just

00:05:38.879 --> 00:05:41.759
looking at lists, the AI analyzed this whole

00:05:41.759 --> 00:05:44.540
network of connections between molecules, and

00:05:44.540 --> 00:05:46.939
it suggested the targets based on those relationships.

00:05:47.259 --> 00:05:49.540
So it's replacing intuition with network analysis.

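NOTE The idea of picking targets from a relationship map by connectivity can be shown in miniature. This is an editorial sketch of the concept, not the Stanford lab's actual pipeline; the graph and gene names below are invented.

```python
# Toy sketch: rank candidate genes by how connected they are in a
# molecule relationship map, then spend the limited screening budget
# on the best-connected ones. The graph here is entirely made up.

relationship_map = {
    "GENE_A": ["GENE_B", "GENE_C", "GENE_D"],
    "GENE_B": ["GENE_A", "GENE_C"],
    "GENE_C": ["GENE_A", "GENE_B", "GENE_D"],
    "GENE_D": ["GENE_A", "GENE_C"],
    "GENE_E": ["GENE_B"],
}

def rank_targets(graph, budget):
    """Return the `budget` most-connected genes: at ~$20,000 per screen,
    spend it where one result says the most about the network."""
    by_degree = sorted(graph, key=lambda g: len(graph[g]), reverse=True)
    return by_degree[:budget]

print(rank_targets(relationship_map, 2))  # ['GENE_A', 'GENE_C']
```

Degree counting is the simplest possible proxy; the appeal of the approach is that the ranking comes from network structure rather than gut feel.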
00:05:50.000 --> 00:05:52.660
Precisely. And the study emphasizes this point

00:05:52.660 --> 00:05:55.139
over and over again. None of these breakthroughs

00:05:55.139 --> 00:05:56.939
came from just chatting with the bot. Right.

00:05:57.000 --> 00:05:59.420
That goes back to your agent definition. Yeah,

00:05:59.480 --> 00:06:01.500
the breakthrough is connecting Claude to tools,

00:06:01.680 --> 00:06:04.779
giving it a custom workflow, adding guardrails

00:06:04.779 --> 00:06:06.620
so it doesn't just hallucinate and make things

00:06:06.620 --> 00:06:08.660
up. It's like the difference between asking a

00:06:08.660 --> 00:06:11.860
carpenter to describe a table and giving a carpenter

00:06:11.860 --> 00:06:14.519
a hammer and wood and saying, build it. That's

00:06:14.519 --> 00:06:16.519
a perfect analogy. So this brings up the big

00:06:16.519 --> 00:06:19.540
question for me here. If AI can cut the time

00:06:19.540 --> 00:06:21.339
from months to minutes and find things that we

00:06:21.339 --> 00:06:25.459
miss, does this make science cheaper? Or does

00:06:25.459 --> 00:06:27.519
it just mean we start asking much, much harder

00:06:27.519 --> 00:06:30.480
questions? It removes the grunt work so brains

00:06:30.480 --> 00:06:32.420
can focus on the breakthrough. So we just level

00:06:32.420 --> 00:06:34.300
up the difficulty because we have a smarter teammate.

00:06:34.519 --> 00:06:36.480
We have to. All the easy problems are already

00:06:36.480 --> 00:06:39.759
solved. All right. Let's move from the sterile

00:06:39.759 --> 00:06:41.720
lab environment to something a little messier.

00:06:41.959 --> 00:06:45.319
The kitchen, the garage, the factory floor. The

00:06:45.319 --> 00:06:47.579
real world. Exactly. We're talking about physical

00:06:47.579 --> 00:06:50.529
AI. Yeah. Now, I have to admit, when I first

00:06:50.529 --> 00:06:53.069
saw that term in the source material, my marketing

00:06:53.069 --> 00:06:55.889
alarm bells went off. Physical AI just sounds

00:06:55.889 --> 00:06:57.649
like a buzzword somebody made up to sell more

00:06:57.649 --> 00:07:00.350
chips. It definitely has that ring to it. I get

00:07:00.350 --> 00:07:03.350
it. But if you look at the actual capital expenditure,

00:07:03.649 --> 00:07:06.910
the money is telling a different story. The source

00:07:06.910 --> 00:07:10.550
material is adamant about this. 2026 won't be

00:07:10.550 --> 00:07:13.649
on screens. That is a very bold claim considering

00:07:13.649 --> 00:07:15.910
how much of our lives are on screens. It is.

00:07:16.069 --> 00:07:18.189
But look at the investment stack. We're looking

00:07:18.189 --> 00:07:23.170
at a $123 billion ecosystem. Yeah. NVIDIA, Tesla,

00:07:23.329 --> 00:07:26.649
the entire robotics supply chain, they are all

00:07:26.649 --> 00:07:29.029
racing towards this. So when they say physical

00:07:29.029 --> 00:07:32.050
AI, what are they actually building? It's really

00:07:32.050 --> 00:07:34.689
about embodiment. For the last few years, AI

00:07:34.689 --> 00:07:37.269
has been a brain in a jar, right? It's a server

00:07:37.269 --> 00:07:39.670
farm in Virginia. It outputs text or images.

00:07:40.189 --> 00:07:42.370
Physical AI is about giving that brain a body.

00:07:42.470 --> 00:07:44.149
We're putting that brain inside the bodies we

00:07:44.149 --> 00:07:46.790
already have. You know, our cars, our appliances.

00:07:47.110 --> 00:07:49.649
Right. The source material had the slightly hilarious

00:07:49.649 --> 00:07:51.970
example of a future where your fridge is negotiating

00:07:51.970 --> 00:07:54.629
with your dishwasher. I saw that. Yeah. And honestly,

00:07:54.750 --> 00:07:56.670
it sounds like a bad cartoon. Why do I need my

00:07:56.670 --> 00:07:58.779
appliances to gossip with each other? I know,

00:07:58.899 --> 00:08:01.680
it sounds ridiculous. But think about the internet

00:08:01.680 --> 00:08:04.459
of things. We've been hearing about IoT for a

00:08:04.459 --> 00:08:07.079
decade. Your toaster will talk to the internet.

00:08:07.339 --> 00:08:10.399
And for 10 years, all that really meant was that

00:08:10.399 --> 00:08:12.459
your toaster needed a firmware update and had

00:08:12.459 --> 00:08:15.220
a security vulnerability. It was just dumb connectivity.

00:08:15.600 --> 00:08:20.240
Exactly. Physical AI is about competence. It's

00:08:20.240 --> 00:08:23.259
not just connected. It has agency. It's about

00:08:23.259 --> 00:08:25.800
the fridge, analyzing what's inside, realizing

00:08:25.800 --> 00:08:28.399
you're out of milk, checking the dishwasher cycle

00:08:28.399 --> 00:08:30.839
to coordinate energy use to the cheapest time

00:08:30.839 --> 00:08:33.539
of day, and maybe ordering groceries based on

00:08:33.539 --> 00:08:35.399
your diet plan. So it's the fulfillment of the

00:08:35.399 --> 00:08:38.179
smart home promise, but like actually smart this

00:08:38.179 --> 00:08:40.580
time. And it's about the supply chain. If you're

00:08:40.580 --> 00:08:43.419
a consumer or a creator, you are now part of

00:08:43.419 --> 00:08:45.970
this loop. This isn't just about cool robots

00:08:45.970 --> 00:08:48.309
doing backflips. It's about the entire physical

00:08:48.309 --> 00:08:51.149
world getting a new operating system. That is

00:08:51.149 --> 00:08:53.509
both exciting and, frankly, a little terrifying.

00:08:53.789 --> 00:08:55.710
I'm just picturing my fridge judging my

00:08:55.710 --> 00:08:57.789
late-night snack choices. Oh, it absolutely will.

00:08:57.889 --> 00:08:59.509
I'm sorry, Dave. I can't open the door. You've

00:08:59.509 --> 00:09:01.710
exceeded your calorie limit for the day. See,

00:09:01.750 --> 00:09:03.649
that's where I pull the plug. But a serious question,

00:09:03.769 --> 00:09:07.370
then. Yeah. Are we ready for our appliances to

00:09:07.370 --> 00:09:10.409
have agency? Or do we just want better automation?

00:09:10.830 --> 00:09:12.809
We want them to do the dishes, not argue about

00:09:12.809 --> 00:09:15.500
the soap. Right. We want the labor, not the debate.

00:09:15.820 --> 00:09:18.779
Exactly. Okay. Let's pivot from the domestic

00:09:18.779 --> 00:09:22.240
to the geopolitical. Because while the U .S.

00:09:22.240 --> 00:09:25.000
is focusing on robots, there's this massive story

00:09:25.000 --> 00:09:28.019
coming out of China about the chips that run

00:09:28.019 --> 00:09:30.620
them. Hardware wars. We have a story here about

00:09:30.620 --> 00:09:33.899
Zhipu AI launching a new model. It's called GLM

00:09:33.899 --> 00:09:36.559
Image. And on the surface, okay, it's an image

00:09:36.559 --> 00:09:39.500
generator, like Midjourney or DALL-E. Right. It's

00:09:39.500 --> 00:09:43.200
a 16 billion parameter model. Decent size, pretty

00:09:43.200 --> 00:09:45.539
capable. But the specs of the model aren't the

00:09:45.539 --> 00:09:48.059
real headline here. The headline is the silicon

00:09:48.059 --> 00:09:50.100
it was trained on. Exactly. Usually when you

00:09:50.100 --> 00:09:51.679
read a paper like this, you scroll down to the

00:09:51.679 --> 00:09:53.139
hardware section and it always says, you know,

00:09:53.159 --> 00:09:55.980
trained on NVIDIA H100 clusters. Industry standard.

00:09:56.179 --> 00:09:59.960
But this one, no NVIDIA, no AMD, no U.S. tech at

00:09:59.960 --> 00:10:02.620
all. This was trained entirely on Huawei hardware.

00:10:03.120 --> 00:10:05.679
That is a huge, huge signal. We've been talking

00:10:05.679 --> 00:10:08.179
about the chip bans for years. The U.S. government

00:10:08.179 --> 00:10:11.120
placed these really strict restrictions to try

00:10:11.120 --> 00:10:14.759
and limit China's access to high-end AI training

00:10:14.759 --> 00:10:17.759
chips. And the prevailing theory in Washington

00:10:17.759 --> 00:10:20.080
was that this would, you know, cripple their

00:10:20.080 --> 00:10:22.740
ability to train foundation models. It was supposed

00:10:22.740 --> 00:10:26.480
to be a chokehold. But Zhipu AI just proved that

00:10:26.480 --> 00:10:28.860
that chokehold might be slipping. They built

00:10:28.860 --> 00:10:31.259
a full domestic stack. This is what China has

00:10:31.259 --> 00:10:34.259
been chasing for years. Independence. Let's look

00:10:34.259 --> 00:10:36.039
under the hood for a second, because the way

00:10:36.039 --> 00:10:38.179
they did this is technically pretty interesting.

00:10:38.320 --> 00:10:41.000
It's not just a copy-paste of a U.S. model.

00:10:41.139 --> 00:10:42.960
No, the architecture is quite clever, actually.

00:10:43.000 --> 00:10:46.080
They use a two-stage system. OK. Break that down

00:10:46.080 --> 00:10:50.080
for us. So stage one uses what's called an autoregressive

00:10:50.080 --> 00:10:52.039
transformer. All right, jargon alert. Let's define

00:10:52.039 --> 00:10:54.299
autoregressive transformer for everyone listening.

00:10:54.500 --> 00:10:56.940
Yeah, the simplest definition is it's a model

00:10:56.940 --> 00:10:59.559
that predicts the next piece of data in a sequence.

00:10:59.860 --> 00:11:02.179
Kind of like how ChatGPT predicts the next word

00:11:02.179 --> 00:11:05.309
in a sentence. Exactly like that. But here it's

00:11:05.309 --> 00:11:07.470
predicting the layout of an image. It's kind

00:11:07.470 --> 00:11:09.570
of acting like an architect. So it creates the

00:11:09.570 --> 00:11:11.870
blueprint first. Right. It predicts what they

00:11:11.870 --> 00:11:15.029
call semantic VQ tokens. Basically, it figures

00:11:15.029 --> 00:11:16.929
out the meaning and the structure of the image

00:11:16.929 --> 00:11:19.730
first. That's a nine billion parameter model

00:11:19.730 --> 00:11:22.529
doing the planning. And then stage two. Stage

00:11:22.529 --> 00:11:25.450
two is a diffusion transformer. This is the painter.

00:11:25.629 --> 00:11:28.169
It takes that blueprint and it renders the final

00:11:28.169 --> 00:11:31.549
pixels. So logic first, then aesthetics. Yes.

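NOTE The architect/painter split just described can be mimicked in miniature: an autoregressive stage plans a sequence of semantic tokens, then a diffusion-style stage iteratively refines "pixels" toward that plan. Token names and dynamics below are invented for illustration; this is not GLM-Image's actual architecture or code.

```python
# Toy two-stage pipeline. Stage 1 (architect): greedily predict the next
# semantic token, building a layout plan. Stage 2 (painter): start from
# "noise" (zeros) and repeatedly denoise each region toward the value
# its token calls for. All tokens and targets are made up.

NEXT = {"<start>": "sky", "sky": "text", "text": "logo", "logo": "<end>"}

def plan_tokens(start="<start>"):
    """Stage 1: autoregressive next-token prediction of the layout."""
    tokens, tok = [], start
    while NEXT[tok] != "<end>":
        tok = NEXT[tok]
        tokens.append(tok)
    return tokens

TARGET = {"sky": 0.2, "text": 0.9, "logo": 0.5}  # per-token "pixel" value

def render(tokens, steps=50):
    """Stage 2: each step moves every region 20% of the way from its
    current value toward the target its semantic token specifies."""
    pixels = [0.0] * len(tokens)
    for _ in range(steps):
        pixels = [p + 0.2 * (TARGET[t] - p) for p, t in zip(pixels, tokens)]
    return pixels

tokens = plan_tokens()   # ['sky', 'text', 'logo']
image = render(tokens)   # values converge to roughly [0.2, 0.9, 0.5]
```

The separation is the point: the planner decides what goes where (which is why text layout benefits), and the refiner only has to make those regions look right.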
00:11:32.429 --> 00:11:34.769
Because of this split approach, it's really good

00:11:34.769 --> 00:11:37.230
at text-heavy images, which has been a weak

00:11:37.230 --> 00:11:39.190
point for some other models. Now, the source

00:11:39.190 --> 00:11:42.929
material does mention that in terms of pure artistic

00:11:42.929 --> 00:11:47.690
vibes, it might not beat the absolute top-tier

00:11:47.690 --> 00:11:50.149
Western models yet. You know, models like Nano

00:11:50.149 --> 00:11:52.990
Banana or Seedream. Maybe not yet, no. But it

00:11:52.990 --> 00:11:55.710
proves the concept. It proves you can train a

00:11:55.710 --> 00:11:58.370
massive functional commercial -grade model without

00:11:58.370 --> 00:12:01.340
ever touching American silicon. So if the hardware

00:12:01.340 --> 00:12:03.879
bans were meant to stop them, did they just

00:12:03.879 --> 00:12:05.740
force them to accelerate their own development?

00:12:06.299 --> 00:12:08.419
Necessity is the mother of invention. We forced

00:12:08.419 --> 00:12:10.740
them to build their own. And now that stack exists.

00:12:10.980 --> 00:12:12.779
And once it exists, you can't un-invent it.

00:12:12.899 --> 00:12:15.399
The divergence is real now. We basically have

00:12:15.399 --> 00:12:18.460
two separate hardware ecosystems developing in

00:12:18.460 --> 00:12:21.799
parallel. Which complicates, well, everything.

00:12:22.139 --> 00:12:24.100
All right, let's take a quick breath. That was

00:12:24.100 --> 00:12:27.039
a lot of heavy lifting. Science agents, robotic

00:12:27.039 --> 00:12:30.480
supply chains, and semiconductor geopolitics.

00:12:30.559 --> 00:12:33.279
The trifecta of modern anxiety. Let's move to

00:12:33.279 --> 00:12:36.679
our rapid fire segment. Culture, chaos, and what

00:12:36.679 --> 00:12:38.759
we're calling cursed phrases. Oh, I love this

00:12:38.759 --> 00:12:42.860
list. First up, Leo Tolstoy. Poor Tolstoy. Can't

00:12:42.860 --> 00:12:45.759
even rest in peace. People on X, formerly Twitter,

00:12:45.960 --> 00:12:48.899
are accusing Leo Tolstoy of using AI. Because

00:12:48.899 --> 00:12:51.440
of the em dashes. Apparently using em dashes,

00:12:51.440 --> 00:12:53.700
that long dash you use for a break in a sentence,

00:12:53.860 --> 00:12:56.679
is now considered a sign of AI writing. Never

00:12:56.679 --> 00:12:58.399
mind that War and Peace was published in the

00:12:58.399 --> 00:13:01.320
1860s. Right. Or that, you know, he died in 1910.

00:13:01.500 --> 00:13:03.639
The paranoia is real. People are seeing ghosts

00:13:03.639 --> 00:13:05.539
in the machine everywhere. It's becoming a witch

00:13:05.539 --> 00:13:08.559
hunt. If you use good grammar or complex punctuation,

00:13:08.779 --> 00:13:11.000
you must be a bot. Which leads us right to the

00:13:11.000 --> 00:13:13.980
next item, the cursed phrases list. Ah, yes.

00:13:14.679 --> 00:13:17.379
There's a new list of words circulating that

00:13:17.379 --> 00:13:21.860
supposedly out you as an AI. Words that the models

00:13:21.860 --> 00:13:25.039
are statistically more likely to use. I actually

00:13:25.039 --> 00:13:26.860
checked this list before we started recording.

00:13:27.100 --> 00:13:29.480
And did you pass? I have to be honest. Yeah.

00:13:29.580 --> 00:13:32.320
I failed miserably. I use "delve" and "tapestry"

00:13:32.320 --> 00:13:34.860
all the time. You're a bot, confirmed. I know.

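NOTE For the curious, a self-check like the one described is only a few lines of Python. The word list below is a made-up sample, not the actual list circulating online.

```python
# Toy "cursed phrase" check: count how often supposedly AI-flavored words
# appear in a text. The CURSED set here is an invented sample list.

import re
from collections import Counter

CURSED = {"delve", "tapestry", "landscape", "testament"}

def cursed_report(text):
    """Return counts of flagged words found in `text` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    return dict(Counter(w for w in words if w in CURSED))

sample = "Let's delve into the rich tapestry of the AI landscape."
print(cursed_report(sample))  # {'delve': 1, 'tapestry': 1, 'landscape': 1}
```

Of course, as the hosts point out, a hit rate above zero proves nothing about authorship; that's exactly the problem with the witch hunt.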
00:13:34.919 --> 00:13:37.809
I felt so called out. But it's a real problem

00:13:37.809 --> 00:13:39.649
for writers now. I actually still wrestle with

00:13:39.649 --> 00:13:41.509
this. I find myself pruning my own vocabulary,

00:13:41.950 --> 00:13:44.789
deleting words I love just because I'm afraid

00:13:44.789 --> 00:13:47.090
someone will think I used ChatGPT to write it.

00:13:47.169 --> 00:13:49.730
It's this weird feedback loop of anxiety. It's

00:13:49.730 --> 00:13:52.029
a crisis of authenticity. We're stripping down

00:13:52.029 --> 00:13:53.929
our language just to prove we're human, which

00:13:53.929 --> 00:13:56.690
is so ironic. Item three. We touched on this in

00:13:56.690 --> 00:13:59.090
the intro. Elon versus Sam. The lawsuit that

00:13:59.090 --> 00:14:01.210
keeps on giving. We mentioned the Mars colony

00:14:01.210 --> 00:14:03.009
email, but there was another detail that really

00:14:03.009 --> 00:14:06.129
stuck out to me. Elon Musk, in that correspondence,

00:14:06.129 --> 00:14:10.470
called OpenAI 0% likely to succeed before he

00:14:10.470 --> 00:14:12.750
left. That is such a definitive statement, not

00:14:12.750 --> 00:14:15.509
unlikely, zero. It really just highlights the

00:14:15.509 --> 00:14:17.669
split in vision, doesn't it? You have Altman

00:14:17.669 --> 00:14:21.889
with this expansive, almost hallucinogenic optimism,

00:14:22.210 --> 00:14:27.409
and then Musk with this brutal probability-based

00:14:27.409 --> 00:14:30.210
skepticism. And yet, strangely, they're both

00:14:30.210 --> 00:14:32.629
pushing the frontier harder than anyone else.

00:14:32.889 --> 00:14:36.480
True. Item four. Energy. The elephant in the

00:14:36.480 --> 00:14:38.379
room. The Trump administration is reportedly

00:14:38.379 --> 00:14:42.679
asking tech giants for $15 billion. For power

00:14:42.679 --> 00:14:44.799
plants. Power plants they might never use. They

00:14:44.799 --> 00:14:46.879
want an upfront commitment. And the tech companies

00:14:46.879 --> 00:14:48.460
aren't thrilled. They say they weren't even consulted

00:14:48.460 --> 00:14:50.679
on the price tag. But it just shows the scale

00:14:50.679 --> 00:14:52.259
of the infrastructure problem. We're not talking

00:14:52.259 --> 00:14:53.980
about coding anymore. We're talking about AI

00:14:53.980 --> 00:14:56.539
consuming electricity on a nation state level.

00:14:56.720 --> 00:14:58.539
The constraint isn't software anymore. It's

00:14:58.539 --> 00:15:02.899
physics. It's electrons. And finally, the bear

00:15:02.899 --> 00:15:06.330
case. Michael Burry. The Big Short guy himself.

00:15:06.710 --> 00:15:09.129
He's quoting Warren Buffett, and he's predicting

00:15:09.129 --> 00:15:12.490
the AI stock boom, specifically NVIDIA, is going

00:15:12.490 --> 00:15:15.350
to end badly. He thinks it's a bubble. He thinks

00:15:15.350 --> 00:15:18.090
the hype has just completely outpaced the reality.

00:15:18.409 --> 00:15:20.750
So is Burry right, or is he just early again?

00:15:20.950 --> 00:15:23.149
He's betting against the momentum of the entire

00:15:23.149 --> 00:15:27.649
world. That's a bold move. Very bold. OK, we're

00:15:27.649 --> 00:15:29.570
back. Let's try to pull all of these threads

00:15:29.570 --> 00:15:31.830
together before we go. We covered a lot of ground

00:15:31.830 --> 00:15:34.649
today. We did. But I think there's a pretty clear

00:15:34.649 --> 00:15:37.289
through line here. I agree. It starts with competence.

00:15:37.570 --> 00:15:39.730
That's Claude. It's not just chatting. It's doing

00:15:39.730 --> 00:15:43.009
real science at MIT and Stanford. It's discovering

00:15:43.009 --> 00:15:45.889
biology that we missed. Then you have embodiment,

00:15:45.970 --> 00:15:49.769
the whole physical layer. That $123 billion stack,

00:15:50.029 --> 00:15:53.830
the fridge, the car, the factory. AI is growing

00:15:53.830 --> 00:15:56.129
legs and entering the supply chain. And then

00:15:56.129 --> 00:15:58.730
divergence. China building a non-U.S. stack

00:15:58.730 --> 00:16:01.690
with Zhipu and Huawei. The hardware world is splitting

00:16:01.690 --> 00:16:03.889
in two, and it's proving that bans might just

00:16:03.889 --> 00:16:06.809
accelerate independence. And finally, hype versus

00:16:06.809 --> 00:16:10.450
reality. Mars colonies run by kids versus the

00:16:10.450 --> 00:16:13.149
reality of crumbling power grids and these $15

00:16:13.149 --> 00:16:15.970
billion demands for electricity. It really feels

00:16:15.970 --> 00:16:18.289
like the play phase is over. Yeah, it's not a

00:16:18.289 --> 00:16:20.610
toy anymore. Now the hard work and the hardware

00:16:20.610 --> 00:16:23.929
begins. So here's my question for you, listening

00:16:23.929 --> 00:16:26.529
to this. We talked about research agents and

00:16:26.529 --> 00:16:31.139
physical AI. Are you using these tools to chat

00:16:31.139 --> 00:16:34.960
or are you using them to build workflows? That's

00:16:34.960 --> 00:16:37.500
the shift you really need to make. Because if

00:16:37.500 --> 00:16:40.000
you're just chatting, you might be missing the

00:16:40.000 --> 00:16:41.879
whole point. Oh, and definitely check the show

00:16:41.879 --> 00:16:44.379
notes for that cursed phrases list. Save yourself.

00:16:44.480 --> 00:16:47.039
Don't be a tapestry person. Please don't. I want

00:16:47.039 --> 00:16:48.899
to leave you with one final thought. We spent

00:16:48.899 --> 00:16:50.840
a lot of time on the code and the chips today.

00:16:51.519 --> 00:16:55.039
But think about that $15 billion demand for power

00:16:55.039 --> 00:16:58.149
plants. It's a massive number. The ultimate constraint

00:16:58.149 --> 00:17:00.710
on AI isn't code anymore. It isn't even really

00:17:00.710 --> 00:17:03.230
chips, as China just proved. It's electricity.

00:17:03.750 --> 00:17:05.890
The smartest model in the world is completely

00:17:05.890 --> 00:17:08.150
useless if you can't plug it in. Keep diving.

00:17:08.309 --> 00:17:08.930
See you next time.
