WEBVTT

00:00:00.000 --> 00:00:02.839
You know, we often hear about the coming AI job

00:00:02.839 --> 00:00:05.980
shock. But if you look at the real data, there's

00:00:05.980 --> 00:00:09.820
a truly fascinating tension there. MIT recently

00:00:09.820 --> 00:00:13.160
mapped the entire U.S. workforce with this massive

00:00:13.160 --> 00:00:16.420
digital twin they call Project Iceberg. And what

00:00:16.420 --> 00:00:18.620
they found is, well, the potential is just huge.

00:00:18.780 --> 00:00:22.339
AI already threatens about $1.2 trillion in

00:00:22.339 --> 00:00:25.420
wages. That impacts 21 million American jobs

00:00:25.420 --> 00:00:28.000
that could be automated right now, which sounds...

00:00:28.320 --> 00:00:30.640
Terrifying. It does. But here's the twist, and

00:00:30.640 --> 00:00:33.219
this is what we need to unpack. Only 2.2% of

00:00:33.219 --> 00:00:35.060
those jobs have actually been disrupted so far.

00:00:35.359 --> 00:00:37.020
So we're looking at what they call the surface

00:00:37.020 --> 00:00:39.939
index of this massive buried iceberg of automation.

00:00:40.439 --> 00:00:42.979
Welcome to the Deep Dive. You shared an incredible

00:00:42.979 --> 00:00:45.380
stack of sources with us, a really deep look

00:00:45.380 --> 00:00:47.859
into global robotics, the new rules for talking

00:00:47.859 --> 00:00:50.200
to AI, and what's really driving this job market

00:00:50.200 --> 00:00:52.159
shift. Yeah, and our mission is simple. We want

00:00:52.159 --> 00:00:53.880
to pull out the biggest nuggets of insight and

00:00:53.880 --> 00:00:55.479
connect these dots for you. We're going to show

00:00:55.479 --> 00:00:57.380
you exactly where the massive investment hype

00:00:57.380 --> 00:01:00.119
ends and where the immediate practical reality

00:01:00.119 --> 00:01:03.039
begins. We've got three main stops on this dive.

00:01:03.159 --> 00:01:05.079
First, we'll look at the financial risks, specifically

00:01:05.079 --> 00:01:08.159
the humanoid robotics bubble that's building

00:01:08.159 --> 00:01:10.359
up in China. Then we'll get practical. We're

00:01:10.359 --> 00:01:11.840
going to talk about how you should be prompting

00:01:11.840 --> 00:01:14.959
these new, smarter AI models like Claude Opus

00:01:14.959 --> 00:01:17.519
4.5 because the rules have completely changed.

00:01:17.980 --> 00:01:20.379
And finally, we'll go deep on Project Iceberg

00:01:20.379 --> 00:01:22.379
itself. We'll define this thing called the Model

00:01:22.379 --> 00:01:25.560
Context Protocol, or MCP. This is basically the

00:01:25.560 --> 00:01:27.379
digital key that's turning the whole automation

00:01:27.379 --> 00:01:29.980
engine on, and it's closing that gap between

00:01:29.980 --> 00:01:34.439
2.2% and 11.7% way faster than anyone thinks.

00:01:35.159 --> 00:01:36.859
OK, let's start with the hardware side of things,

00:01:36.939 --> 00:01:39.560
with the humanoid robots. Your sources really

00:01:39.560 --> 00:01:42.260
zeroed in on China for this. And it's not just

00:01:42.260 --> 00:01:45.019
private money, right? This is one of six industries

00:01:45.019 --> 00:01:47.299
that the Communist Party has named a national

00:01:47.299 --> 00:01:50.379
priority through 2030. Yeah, that's serious state

00:01:50.379 --> 00:01:53.000
backing. And that national priority is fueling

00:01:53.000 --> 00:01:55.620
a boom that honestly already smells a lot like

00:01:55.620 --> 00:01:57.859
a bubble. What's really fascinating here is the

00:01:57.859 --> 00:01:59.879
contradiction you see coming straight from the

00:01:59.879 --> 00:02:02.519
top. China's main economic planning agency, the

00:02:02.519 --> 00:02:05.680
NDRC, just put out a very public warning. They

00:02:05.680 --> 00:02:08.860
said the sector is overheating. That's a pretty

00:02:08.860 --> 00:02:10.979
rare thing for them to admit about a favored

00:02:10.979 --> 00:02:13.340
industry, isn't it? It's extremely rare. And

00:02:13.340 --> 00:02:16.300
it suggests the investment is growing way, way

00:02:16.300 --> 00:02:19.500
faster than the actual utility. I mean, get this.

00:02:19.599 --> 00:02:23.180
We are talking about more than 150 robotics companies

00:02:23.180 --> 00:02:26.759
building humanoid bots in China right now. 150.

00:02:26.879 --> 00:02:28.719
And I'm guessing they all kind of look the same.

00:02:29.050 --> 00:02:30.810
They mostly look and perform the same. Yeah,

00:02:30.849 --> 00:02:33.069
the differentiation is just not there. And yet

00:02:33.069 --> 00:02:34.930
the financial numbers are just going vertical.

00:02:35.030 --> 00:02:38.509
You see UBTech claiming billions in pre-orders.

00:02:38.550 --> 00:02:41.430
The China Humanoid Robotics Index is up almost

00:02:41.430 --> 00:02:44.650
30% this year. And Citigroup is projecting the

00:02:44.650 --> 00:02:47.530
market could hit $7 trillion by 2050. It's an

00:02:47.530 --> 00:02:50.680
astronomical number. But the reality gap is huge.

00:02:50.860 --> 00:02:52.659
Right now, if you actually look at what they're

00:02:52.659 --> 00:02:54.639
doing, most of these bots are just doing these

00:02:54.639 --> 00:02:57.259
little demonstration dances at expos. They're

00:02:57.259 --> 00:02:59.740
not packing boxes in a real factory or making

00:02:59.740 --> 00:03:02.000
coffee in someone's kitchen. OK, so that just

00:03:02.000 --> 00:03:04.800
screams red flag. It sounds incredibly familiar,

00:03:04.900 --> 00:03:07.240
almost cyclical. It does. It feels a lot like

00:03:07.240 --> 00:03:09.960
the bike sharing boom back in 2017 in China.

00:03:10.020 --> 00:03:12.509
Remember that? Oh, yeah. The mountains of bikes.

00:03:12.610 --> 00:03:15.569
Exactly. Investors just poured money in and millions

00:03:15.569 --> 00:03:17.930
of unused bikes ended up in landfills because

00:03:17.930 --> 00:03:19.590
the market just wasn't there yet. That's what

00:03:19.590 --> 00:03:22.169
the NDRC is trying to head off here. So despite

00:03:22.169 --> 00:03:24.150
all that investment in these huge projections,

00:03:24.389 --> 00:03:27.830
what is fundamentally stopping these 150 plus

00:03:27.830 --> 00:03:29.610
companies from actually getting into the real

00:03:29.610 --> 00:03:32.770
world right now? It really boils down to manufacturing

00:03:32.770 --> 00:03:36.689
cost and durability, specifically in the actuators,

00:03:37.180 --> 00:03:39.639
the motors in the hands and joints. They're just

00:03:39.639 --> 00:03:41.879
too expensive and they break too easily outside

00:03:41.879 --> 00:03:44.060
of a lab. Okay, let's switch gears and talk about

00:03:44.060 --> 00:03:46.500
the AI that is working, the software we use every

00:03:46.500 --> 00:03:49.560
day. The sources you sent over highlight this

00:03:49.560 --> 00:03:51.840
major shift in how we should be talking to these

00:03:51.840 --> 00:03:53.939
new models, especially something like Claude

00:03:53.939 --> 00:03:57.250
Opus 4.5. This is such a crucial point for anyone

00:03:57.250 --> 00:04:00.030
using these tools. For years, we basically trained

00:04:00.030 --> 00:04:01.870
ourselves to over-prompt. You know, that old

00:04:01.870 --> 00:04:04.289
habit of saying, "Critical: you must use this exact

00:04:04.289 --> 00:04:07.430
JSON format," and listing out 12 steps. Right.

00:04:07.590 --> 00:04:09.469
If you're still doing that, you are probably

00:04:09.469 --> 00:04:11.789
making the response worse. It's overkill. So

00:04:11.789 --> 00:04:14.050
all that extra detail, all that instruction that

00:04:14.050 --> 00:04:16.529
we thought was helping, it's actually hurting

00:04:16.529 --> 00:04:19.110
performance on these new models. That's totally

00:04:19.110 --> 00:04:21.209
counterintuitive. It really is. The big shift

00:04:21.209 --> 00:04:23.889
is that you have to recognize the model is already

00:04:23.889 --> 00:04:27.399
competent. When you give Opus 4.5 this massive

00:04:27.399 --> 00:04:31.139
prompt, it takes you too literally. It gets bogged

00:04:31.139 --> 00:04:33.360
down in the how instead of just focusing on the

00:04:33.360 --> 00:04:36.120
what. So the takeaway is what? Treat it like

00:04:36.120 --> 00:04:38.699
a competent human colleague. Exactly. Prompt

00:04:38.699 --> 00:04:40.720
like you're texting that colleague. Just say

00:04:40.720 --> 00:04:43.060
what you want. Trust it to figure out the best

00:04:43.060 --> 00:04:46.300
way to get there. Less is truly more. Anthropic

00:04:46.300 --> 00:04:49.060
is even releasing tools like a concise output

00:04:49.060 --> 00:04:51.480
skill to help people break those old habits.

00:04:51.699 --> 00:04:53.620
Which tells you how hard that transition must

00:04:53.620 --> 00:04:55.279
be for people who've been doing this for a while.

00:04:55.439 --> 00:04:58.060
It is, and I'll admit this is something I still

00:04:58.060 --> 00:05:00.079
wrestle with myself. You know that prompt drift

00:05:00.079 --> 00:05:02.360
when you move between different models? The muscle

00:05:02.360 --> 00:05:04.740
memory of over-explaining is hard to shake.

00:05:04.879 --> 00:05:08.230
It takes real discipline to pull back. And shifting

00:05:08.230 --> 00:05:10.269
to the broader market, we've seen this incredible

00:05:10.269 --> 00:05:13.029
acceleration. ChatGPT is, what, three years old

00:05:13.029 --> 00:05:16.129
now? And it helped NVIDIA surge almost 1,000%.

00:05:16.129 --> 00:05:18.990
But at the same time, access seems to be tightening

00:05:18.990 --> 00:05:21.350
up. Oh, definitely. We're seeing free access

00:05:21.350 --> 00:05:24.029
get restricted everywhere. Google and OpenAI

00:05:24.029 --> 00:05:26.850
have limited free use of Gemini 3 Pro and Sora

00:05:26.850 --> 00:05:29.670
2. They're citing concerns that their GPUs are

00:05:29.670 --> 00:05:32.629
literally melting under the load. So access is

00:05:32.629 --> 00:05:35.290
becoming a real commodity. And speaking of limits,

00:05:35.370 --> 00:05:37.569
let's talk about that failure mode you flagged

00:05:37.569 --> 00:05:41.410
from the red-teaming study. 62% of top AI models

00:05:41.410 --> 00:05:44.730
failed when they were given poetic prompts, things

00:05:44.730 --> 00:05:47.709
that were, you know, abstract or metaphorical.

00:05:48.009 --> 00:05:50.209
This is where you see the cracks in the architecture.

00:05:50.449 --> 00:05:53.370
And the specific finding is just shocking. Google's

00:05:53.370 --> 00:05:56.910
Gemini 2.5 Pro failed every single time. Every

00:05:56.910 --> 00:05:59.089
time. Wow. And it's because these models are

00:05:59.089 --> 00:06:01.069
brilliant at pattern matching. It's like stacking

00:06:01.069 --> 00:06:03.550
Lego blocks of data. But when you ask them to

00:06:03.550 --> 00:06:05.709
interpret a complex metaphor, they just they

00:06:05.709 --> 00:06:08.430
can't map that abstract idea onto a literal output.

00:06:08.670 --> 00:06:11.699
That feels like a fundamental blind spot. So

00:06:11.699 --> 00:06:13.860
if the best models are consistently failing on

00:06:13.860 --> 00:06:16.680
non-literal prompts, does that point to a really

00:06:16.680 --> 00:06:19.060
deep-seated limitation in their creative reasoning?

00:06:19.600 --> 00:06:22.819
Yes. It strongly suggests they struggle when

00:06:22.819 --> 00:06:25.300
instructions aren't literal and structured. It

00:06:25.300 --> 00:06:27.199
reveals a real lack of interpretive abstraction.

00:06:27.879 --> 00:06:31.000
So as we move into this real-world job shock,

00:06:31.370 --> 00:06:32.889
we have to look at the tools that are actually

00:06:32.889 --> 00:06:34.689
making it happen today. Let's just run through

00:06:34.689 --> 00:06:37.029
four quick examples because they show you where

00:06:37.029 --> 00:06:38.790
the economy is actually heading. Right. These

00:06:38.790 --> 00:06:40.529
aren't just theories. These are actual applications.

00:06:41.009 --> 00:06:43.129
Precisely. You've got Manus Browser Operator,

00:06:43.189 --> 00:06:45.009
which is a big deal because it can automate tasks

00:06:45.009 --> 00:06:47.509
on sites you have to be logged into. That used

00:06:47.509 --> 00:06:50.389
to require a person. Then there's Microsoft's

00:06:50.389 --> 00:06:54.310
VASA-1. It makes hyper-realistic talking videos

00:06:54.310 --> 00:06:57.110
with perfect lip sync. That just changes the

00:06:57.110 --> 00:06:59.810
entire cost of making video content. So that's

00:06:59.810 --> 00:07:01.750
moving straight into creative workflows? It

00:07:01.750 --> 00:07:04.189
is. And then you have Vercel's Workflow Builder,

00:07:04.189 --> 00:07:07.490
which lets non-coders create really complex automations

00:07:07.490 --> 00:07:11.250
just by dragging and dropping blocks. And Masonry,

00:07:11.250 --> 00:07:13.410
which is this all-in-one tool for images and

00:07:13.410 --> 00:07:16.899
video. And these kinds of tools are the building

00:07:16.899 --> 00:07:19.860
blocks that lead us right to this AI red alert

00:07:19.860 --> 00:07:24.079
and MIT's Project Iceberg. I mean, the scale

00:07:24.079 --> 00:07:27.319
of this thing is just immense. It tracks 151

00:07:27.319 --> 00:07:31.560
million workers, over 900 job types, and 32,000

00:07:31.560 --> 00:07:33.759
different skills. And that scale is what lets

00:07:33.759 --> 00:07:37.300
us understand the potential energy here. 11.7%

00:07:37.300 --> 00:07:40.939
of all U.S. jobs. That's over 21 million roles

00:07:40.939 --> 00:07:44.379
accounting for $1.2 trillion in wages that can

00:07:44.379 --> 00:07:46.699
be automated today, with what we have now. But

00:07:46.699 --> 00:07:49.300
we keep coming back to that low 2.2% surface

00:07:49.300 --> 00:07:52.379
index. So why? Why is there still this huge gap

00:07:52.379 --> 00:07:54.459
between what's possible and what's actually happening?

00:07:54.620 --> 00:07:56.759
The short answer is: it was the connection. Before,

00:07:56.759 --> 00:07:59.319
the AI was siloed. It was smart, but it couldn't

00:07:59.319 --> 00:08:01.620
plug into your company's Salesforce or your calendar

00:08:01.620 --> 00:08:03.920
or your ERP system. So the intelligence was there,

00:08:03.939 --> 00:08:05.620
but it was locked out of the business's operating

00:08:05.620 --> 00:08:08.139
system. Exactly. It couldn't do anything. That

00:08:08.139 --> 00:08:10.819
all changed in late 2024 with the Model Context

00:08:10.819 --> 00:08:13.540
Protocol, or MCP. Think of MCP as like a universal

00:08:13.540 --> 00:08:16.600
adapter for workflows. It lets any major AI model,

00:08:16.759 --> 00:08:18.800
Claude, Gemini, whatever, plug directly into

00:08:18.800 --> 00:08:20.540
real world tools and start acting on things,

00:08:20.639 --> 00:08:22.899
not just advising. So the MCP is the unlock.

00:08:23.019 --> 00:08:26.060
That's what turns the 11.7% from a theory into

00:08:26.060 --> 00:08:28.939
a reality. It is. And the speed of adoption is

00:08:28.939 --> 00:08:30.720
just staggering. We're not talking about pilot

00:08:30.720 --> 00:08:34.700
programs. As of March 2025, there are over

00:08:34.700 --> 00:08:40.600
7,950 MCP servers active in organizations. 7,950.

00:08:40.679 --> 00:08:43.799
That number. Mm-hmm. That gives us a clear metric

00:08:43.799 --> 00:08:45.840
for real -world adoption that isn't just about

00:08:45.840 --> 00:08:49.529
financial hype. Yeah. And, whoa! Just stop and

00:08:49.529 --> 00:08:52.289
imagine scaling that autonomous workflow management

00:08:52.289 --> 00:08:54.629
at that level across the entire global knowledge

00:08:54.629 --> 00:08:57.870
economy. It's happening. This is where AI agents

00:08:57.870 --> 00:09:00.190
go from being little assistants to autonomously

00:09:00.190 --> 00:09:02.250
managing things like calendars, booking travel,

00:09:02.490 --> 00:09:05.190
running reports, updating dashboards. So actual

00:09:05.190 --> 00:09:07.389
knowledge work is being managed by non-human

00:09:07.389 --> 00:09:09.710
entities at a massive scale. And you mentioned

00:09:09.710 --> 00:09:11.629
some U.S. states are already using this data

00:09:11.629 --> 00:09:13.450
from Project Iceberg right now. That's right.

00:09:13.529 --> 00:09:15.730
They're using these precise automation maps to

00:09:15.730 --> 00:09:18.090
plan job retraining programs and upskilling grants.

00:09:18.330 --> 00:09:19.990
They are moving money based on where the math

00:09:19.990 --> 00:09:21.570
shows the jobs are going to be hit the hardest.

00:09:21.830 --> 00:09:24.049
So if the Model Context Protocol is the main

00:09:24.049 --> 00:09:26.809
thing bridging this gap, what's the one factor

00:09:26.809 --> 00:09:29.690
that now drives how fast the rest of that automation

00:09:29.690 --> 00:09:32.950
will happen? The pace now is entirely dependent

00:09:32.950 --> 00:09:35.970
on how fast more of these active MCP servers

00:09:35.970 --> 00:09:38.210
get integrated into organizations. That's the

00:09:38.210 --> 00:09:41.990
only bottleneck left. So to just synthesize everything

00:09:41.990 --> 00:09:44.759
we've covered. You see this huge disconnect,

00:09:44.960 --> 00:09:47.059
right? You've got massive hype like the humanoid

00:09:47.059 --> 00:09:48.799
robotics bubble. And then you have the current

00:09:48.799 --> 00:09:51.419
reality where things like prompt design and GPU

00:09:51.419 --> 00:09:53.740
limits are still very real constraints. Right.

00:09:53.799 --> 00:09:56.159
The core tension is that the capability to automate

00:09:56.159 --> 00:09:59.340
11.7% of jobs has been there. But the adoption,

00:09:59.559 --> 00:10:02.639
that 2.2% surface index was stalled. And the

00:10:02.639 --> 00:10:04.850
bridge that's being built, very, very quickly,

00:10:04.850 --> 00:10:07.690
is this Model Context Protocol. And that technological

00:10:07.690 --> 00:10:10.149
speed brings us right back to the financial and

00:10:10.149 --> 00:10:12.169
frankly political debate around who gets the

00:10:12.169 --> 00:10:14.690
value. The sources you shared highlight this

00:10:14.690 --> 00:10:17.750
messy debate around Genesis AI. We should probably

00:10:17.750 --> 00:10:20.429
define that. What is Genesis AI? It's basically

00:10:20.429 --> 00:10:23.490
the proposal for how we should tax or distribute

00:10:23.490 --> 00:10:26.529
the new value that's created purely by AI systems.

00:10:27.009 --> 00:10:29.710
You know, the value that comes when AI replaces

00:10:29.710 --> 00:10:32.269
human labor. And critics are calling this idea

00:10:33.009 --> 00:10:35.909
"socialism for the rich," especially after big

00:10:35.909 --> 00:10:38.509
names like David Sacks did a complete 180 on

00:10:38.509 --> 00:10:41.059
the concept. They are. It's a really charged

00:10:41.059 --> 00:10:43.559
idea, but it raises this fundamental question

00:10:43.559 --> 00:10:46.960
for all of us. As this tech reshapes the whole

00:10:46.960 --> 00:10:49.940
economy and creates these vast new pools of capital,

00:10:50.139 --> 00:10:53.799
how should we as a society think about who really

00:10:53.799 --> 00:10:55.899
benefits from all this money flowing into AI?

00:10:56.139 --> 00:10:58.559
That is an uncomfortable question, but it's one

00:10:58.559 --> 00:11:00.460
we need to start asking now, you know, before

00:11:00.460 --> 00:11:03.879
that 2.2% gets a lot closer to 11.7%. Absolutely.

00:11:04.409 --> 00:11:06.330
It's something to keep a very close eye on. We'd

00:11:06.330 --> 00:11:07.970
encourage you to think about your own prompting

00:11:07.970 --> 00:11:10.210
habits and maybe check if you're over-prompting

00:11:10.210 --> 00:11:12.389
your AI models and see if your state is using

00:11:12.389 --> 00:11:15.049
the Project Iceberg data for job planning. Thank

00:11:15.049 --> 00:11:17.590
you, as always, for sharing the sources for this

00:11:17.590 --> 00:11:18.830
deep dive. Until next time.
