WEBVTT

00:00:00.000 --> 00:00:01.820
I want you to visualize two very different things.

00:00:01.899 --> 00:00:06.620
On one side, you have this plan for a gigawatt

00:00:06.620 --> 00:00:09.240
scale supercomputer in India. They're calling

00:00:09.240 --> 00:00:12.779
it Stargate. It's massive. City-level power

00:00:12.779 --> 00:00:14.980
consumption. Exactly. And then on the other side,

00:00:15.099 --> 00:00:17.399
you just have a tweet, a warning from Andrew

00:00:17.399 --> 00:00:21.399
Yang about the, quote, disemboweling of white

00:00:21.399 --> 00:00:24.239
collar jobs. It feels like two completely separate

00:00:24.239 --> 00:00:27.260
worlds. It does. But they're not. They're actually

00:00:27.260 --> 00:00:29.940
the same story. That huge piece of infrastructure

00:00:29.940 --> 00:00:32.320
is being built to automate the exact kind of

00:00:32.320 --> 00:00:35.020
cognitive work that keeps our offices running.

00:00:35.899 --> 00:00:39.520
Welcome to the Deep Dive. It is Wednesday, February

00:00:39.520 --> 00:00:42.719
18th, 2026. If you're listening, you're probably

00:00:42.719 --> 00:00:44.700
what we call the learner, someone who wants to

00:00:44.700 --> 00:00:46.500
see the mechanics underneath the machine, not

00:00:46.500 --> 00:00:48.439
just the headlines. And the headlines are just...

00:00:48.700 --> 00:00:50.840
They're deafening right now. I was looking at

00:00:50.840 --> 00:00:52.479
our sources for today, and it really feels like

00:00:52.479 --> 00:00:54.719
the tectonic plates are shifting. We're talking

00:00:54.719 --> 00:00:57.100
about physical AI infrastructure in India, a

00:00:57.100 --> 00:00:59.539
total change in how software thinks, and then,

00:00:59.539 --> 00:01:02.420
well, the economic fallout when all that hits

00:01:02.420 --> 00:01:04.140
the job market. Yeah, it's a chain reaction.

00:01:04.319 --> 00:01:06.459
So here's our plan. We're going to start in Delhi

00:01:06.459 --> 00:01:09.459
with this whole Stargate thing and what sovereign

00:01:09.459 --> 00:01:11.439
AI even means. Yeah, then we'll hit the agentic

00:01:11.439 --> 00:01:13.980
shift when tools start doing things on their

00:01:13.980 --> 00:01:15.939
own. And the cost of that intelligence, which

00:01:15.939 --> 00:01:18.140
is... Honestly, it's pretty staggering. Yeah.

00:01:18.200 --> 00:01:19.980
Then we'll land on that warning from Andrew Yang

00:01:19.980 --> 00:01:23.200
about the end of the middle manager. It's a lot,

00:01:23.200 --> 00:01:25.260
but you can draw a straight line through all

00:01:25.260 --> 00:01:26.519
of it. Okay, so let's start with the hardware.

00:01:27.000 --> 00:01:29.799
OpenAI in India, we hear these big user numbers

00:01:29.799 --> 00:01:32.739
all the time, but what's the actual scale we're

00:01:32.739 --> 00:01:34.500
talking about here? The scale is kind of hard

00:01:34.500 --> 00:01:37.239
to wrap your head around. It's 100 million weekly

00:01:37.239 --> 00:01:41.659
ChatGPT users. Just in India? 100 million. Yeah.

00:01:41.719 --> 00:01:43.939
I mean, that's like a third of the entire U.S.

00:01:43.959 --> 00:01:46.640
population. But that's not even the big news.

00:01:46.799 --> 00:01:49.719
The real story is the partnership between OpenAI

00:01:49.719 --> 00:01:52.359
and the Tata Group. Which is basically the industrial

00:01:52.359 --> 00:01:55.379
backbone of India. Right. Specifically, Tata

00:01:55.379 --> 00:01:58.599
Consultancy Services, TCS, they've announced

00:01:58.599 --> 00:02:01.079
the Stargate Initiative. And this isn't just

00:02:01.079 --> 00:02:03.780
a cool name. They are literally building the

00:02:03.780 --> 00:02:06.700
infrastructure. OpenAI is the first customer

00:02:06.700 --> 00:02:09.439
for their new HyperVault data center. And they

00:02:09.439 --> 00:02:12.460
are starting at 100 megawatts of capacity. For

00:02:12.460 --> 00:02:15.939
anyone listening who's not an engineer, 100 megawatts.

00:02:15.939 --> 00:02:18.819
Is that a lot? It's huge for a starting point.

00:02:18.919 --> 00:02:21.000
Your typical big data center might be, I don't

00:02:21.000 --> 00:02:25.259
know, 20 or 30 megs. But the plan here is to

00:02:25.259 --> 00:02:29.400
scale to one gigawatt. A gigawatt. That's a nuclear

00:02:29.400 --> 00:02:31.699
reactor. It is. It's the kind of power that lights

00:02:31.699 --> 00:02:34.539
up a major city. And all of that energy is going

00:02:34.539 --> 00:02:38.530
to be dedicated to one thing. Thinking. Processing

00:02:38.530 --> 00:02:40.930
tokens. Which brings us to the key concept here,

00:02:41.110 --> 00:02:44.030
sovereign AI. I keep seeing that term, sovereign

00:02:44.030 --> 00:02:46.069
AI. It sounds very official, very diplomatic.

00:02:46.430 --> 00:02:48.550
What does it actually mean, though? Think of

00:02:48.550 --> 00:02:50.449
it like a national border for digital thought.

00:02:50.590 --> 00:02:53.370
For years, if you used a big AI model, your data

00:02:53.370 --> 00:02:56.069
was probably being sent to a server in Virginia

00:02:56.069 --> 00:02:58.509
or Dublin. Right. But if you're the Indian government

00:02:58.509 --> 00:03:01.270
or a huge bank there, you just can't have your

00:03:01.270 --> 00:03:03.150
sensitive data leaving the country. That's a

00:03:03.150 --> 00:03:05.530
non-starter. So sovereign AI means the whole

00:03:05.530 --> 00:03:08.349
stack, the computers, the storage, the model.

00:03:08.560 --> 00:03:11.199
It all lives inside India's borders. Exactly.

00:03:11.300 --> 00:03:13.800
And that suddenly unlocks all these use cases

00:03:13.800 --> 00:03:16.180
that were impossible before. Mission critical

00:03:16.180 --> 00:03:18.080
government work, for example. You couldn't do

00:03:18.080 --> 00:03:20.340
that before because of security. Right. Now you

00:03:20.340 --> 00:03:23.960
can. So you see Tata rolling out ChatGPT Enterprise

00:03:23.960 --> 00:03:27.060
to hundreds of thousands of their own people.

00:03:27.460 --> 00:03:29.780
AIIMS New Delhi is using it for medical education.

00:03:29.979 --> 00:03:33.159
The data stays local. The intelligence is world

00:03:33.159 --> 00:03:35.560
class. So it's about nations controlling their

00:03:35.560 --> 00:03:38.889
own, what, cognitive infrastructure? That's the

00:03:38.889 --> 00:03:40.849
perfect phrase for it. Yeah. Just like you need

00:03:40.849 --> 00:03:43.110
roads and bridges, nations now feel they need

00:03:43.110 --> 00:03:45.729
their own sovereign AI capacity. Otherwise, you're

00:03:45.729 --> 00:03:47.509
just renting intelligence from somebody else.

00:03:47.669 --> 00:03:49.669
Okay. So you have this massive engine being built

00:03:49.669 --> 00:03:52.550
in the East. Let's pivot to the software that

00:03:52.550 --> 00:03:54.909
runs on these things. The big news right now

00:03:54.909 --> 00:03:57.509
is what's being called the agentic shift. We

00:03:57.509 --> 00:04:03.030
saw xAI drop Grok 4.20 into public beta. And,

00:04:03.050 --> 00:04:05.949
you know, aside from the classic Elon Musk version

00:04:05.949 --> 00:04:07.889
number, what's actually different under the hood?

00:04:08.069 --> 00:04:10.110
The whole workflow is different. I mean, until

00:04:10.110 --> 00:04:13.129
now, we've treated AI like a very smart encyclopedia.

00:04:13.409 --> 00:04:15.610
You ask a question, it gives you an answer. It's

00:04:15.610 --> 00:04:19.610
one-to-one. Grok 4.20 changes that. When you

00:04:19.610 --> 00:04:22.290
give it a complex task, it doesn't just answer.

00:04:22.410 --> 00:04:25.370
It actually spins up four separate AI agents

00:04:25.370 --> 00:04:28.209
that work in parallel. So like one agent is doing

00:04:28.209 --> 00:04:30.189
research while another one is reasoning through

00:04:30.189 --> 00:04:31.769
the logic. Exactly. It's like a little committee.

00:04:32.329 --> 00:04:35.209
One researches, one reasons, one might draft

00:04:35.209 --> 00:04:37.569
some code and another one reviews it. It's a

00:04:37.569 --> 00:04:40.170
shift from just chatting to actually collaborating.

00:04:40.529 --> 00:04:42.649
That makes sense. But my first thought is cost.

00:04:42.949 --> 00:04:45.769
If I ask one question and it kicks off four different

00:04:45.769 --> 00:04:48.470
processes, haven't I just like quadrupled my

00:04:48.470 --> 00:04:51.050
compute cost? You absolutely have. And that is

00:04:51.050 --> 00:04:53.430
the elephant in the room. You know, we saw Apple

00:04:53.430 --> 00:04:56.230
integrating ChatGPT and Claude into CarPlay,

00:04:56.350 --> 00:04:59.209
which is great. Tap to ask is way better than

00:04:59.209 --> 00:05:01.939
yelling at your phone. Sure. But the ubiquity

00:05:01.939 --> 00:05:04.600
of it all hides the incredible expense. I was

00:05:04.600 --> 00:05:06.519
just reading the projections for Anthropic. The

00:05:06.519 --> 00:05:08.620
numbers, they're just eye-watering. They're

00:05:08.620 --> 00:05:11.660
heavy. Anthropic is projected to pay over $80

00:05:11.660 --> 00:05:14.860
billion. That's billion with a B. $80 billion.

00:05:15.000 --> 00:05:18.240
To Amazon, Google, and Microsoft by 2029. And

00:05:18.240 --> 00:05:19.839
that is just the cloud computing bill. That's

00:05:19.839 --> 00:05:22.639
the rent they pay just to exist. $80 billion

00:05:22.639 --> 00:05:25.060
just to keep the lights on. And look, their sales

00:05:25.060 --> 00:05:28.060
are exploding. Projected to hit like $6.4 billion.

00:05:28.180 --> 00:05:31.259
That's amazing growth. But just... Do the math.

00:05:31.560 --> 00:05:34.259
The cost of generating this intelligence is astronomical.

00:05:34.639 --> 00:05:37.420
That explains moves like Mistral acquiring Coyab.

00:05:37.540 --> 00:05:40.500
Right. The French AI lab. They're trying to build

00:05:40.500 --> 00:05:42.959
their own infrastructure muscle because renting

00:05:42.959 --> 00:05:45.439
compute forever is just not sustainable. This

00:05:45.439 --> 00:05:47.540
really isn't a software bubble then. It's an

00:05:47.540 --> 00:05:50.519
infrastructure arms race. Totally. Okay. So we

00:05:50.519 --> 00:05:53.500
have these huge expensive models. Yeah. How is

00:05:53.500 --> 00:05:55.699
that actually trickling down into the tools that

00:05:55.699 --> 00:05:57.629
you and I use every day? It's trickling down

00:05:57.629 --> 00:05:59.850
fast, and it's changing what the tools do. Look

00:05:59.850 --> 00:06:02.509
at something like WordPress. For 20 years, it

00:06:02.509 --> 00:06:05.310
was a blank page. You had to do the work. Now

00:06:05.310 --> 00:06:08.209
they've launched a built-in AI assistant. It

00:06:08.209 --> 00:06:10.730
can rewrite your text, change the layout, all

00:06:10.730 --> 00:06:12.889
without you leaving the dashboard. So the tool

00:06:12.889 --> 00:06:15.910
is no longer passive. No. Or look at Flixier

00:06:15.910 --> 00:06:18.089
for video editing. It automates trimming and

00:06:18.089 --> 00:06:21.980
cutting. Or Figma, which is using Claude Code

00:06:21.980 --> 00:06:24.980
to turn a UI design directly into editable code

00:06:24.980 --> 00:06:27.879
layers. The tool is becoming an active partner.

00:06:28.180 --> 00:06:30.399
I have to. I have to make a bit of a vulnerable

00:06:30.399 --> 00:06:32.899
admission here. I honestly still wrestle with

00:06:32.899 --> 00:06:35.540
prompt drift myself. Oh, yeah. As these models

00:06:35.540 --> 00:06:38.160
get smarter, I find myself kind of struggling.

00:06:38.279 --> 00:06:41.019
Like, I get used to prompting Claude 4.5, I

00:06:41.019 --> 00:06:43.480
learn its little quirks, and then 4.6 drops,

00:06:43.639 --> 00:06:46.060
which it just did, and all my old prompts, my

00:06:46.060 --> 00:06:48.819
spells, they don't work the same. The model just

00:06:48.819 --> 00:06:51.459
thinks differently now. It's so different from

00:06:51.459 --> 00:06:54.060
any other software, right? When Microsoft updates

00:06:54.060 --> 00:06:58.319
Excel, SUM still means SUM. But when an AI model

00:06:58.319 --> 00:07:00.779
updates, its whole reasoning path can shift.

00:07:00.959 --> 00:07:02.560
It's like your colleague got a brain transplant

00:07:02.560 --> 00:07:04.319
over the weekend. Yeah, you have to relearn how

00:07:04.319 --> 00:07:06.519
to talk to them. And that leads us right to the

00:07:06.519 --> 00:07:08.939
heaviest part of this. We've got hardware scaling

00:07:08.939 --> 00:07:11.800
to gigawatts. We have software becoming agentic.

00:07:11.819 --> 00:07:14.589
And that brings us to Andrew Yang. Yeah. That

00:07:14.589 --> 00:07:16.910
piece he wrote on X about the end of the office,

00:07:17.089 --> 00:07:19.370
it went absolutely viral. And the language he

00:07:19.370 --> 00:07:23.089
uses, it's visceral. He calls it the great disemboweling

00:07:23.089 --> 00:07:25.290
of white collar jobs. Disemboweling. He's not

00:07:25.290 --> 00:07:27.529
pulling any punches. No. He's trying to shake

00:07:27.529 --> 00:07:29.629
people up. His whole argument is that we are

00:07:29.629 --> 00:07:31.790
witnessing the extinction of a very specific

00:07:31.790 --> 00:07:34.529
kind of worker. The middle manager. The coordinator.

00:07:34.910 --> 00:07:37.639
Exactly. But why the middle? For a decade, we

00:07:37.639 --> 00:07:39.399
heard that robots were coming for the blue collar

00:07:39.399 --> 00:07:42.079
jobs, for truck drivers and factory workers.

00:07:42.579 --> 00:07:45.779
Why is the target now the person in middle management?

00:07:46.079 --> 00:07:48.120
It all comes down to what he calls cost logic.

00:07:48.660 --> 00:07:50.879
Replacing a truck driver is actually really,

00:07:50.959 --> 00:07:54.139
really hard. You need robotics, sensors, insurance.

00:07:55.399 --> 00:07:57.639
It's a mess. Right. Physical world problems.

00:07:57.779 --> 00:08:01.639
But a middle manager, what is their primary output?

00:08:01.839 --> 00:08:04.480
It's not a physical thing. In theory, it's strategy

00:08:04.480 --> 00:08:07.649
and guidance. In practice, it's a lot of emails.

00:08:07.850 --> 00:08:10.310
It's summarizing reports from one team to send

00:08:10.310 --> 00:08:12.569
up to the next level. It's high cost coordination.

00:08:12.810 --> 00:08:15.290
It's routing information. And Yang's point is

00:08:15.290 --> 00:08:17.009
that the AI agents we were just talking about,

00:08:17.110 --> 00:08:20.029
they are infinitely cheaper and faster at routing

00:08:20.029 --> 00:08:22.589
information than a human who is earning 150 grand

00:08:22.589 --> 00:08:25.810
a year. The math is just, it's brutal. It's a

00:08:25.810 --> 00:08:28.050
spreadsheet decision. He says if five managers

00:08:28.050 --> 00:08:30.649
can be replaced by one operator with a fleet

00:08:30.649 --> 00:08:33.330
of AI agents, the market will force that to happen.

00:08:33.720 --> 00:08:35.679
You know, this gets to what I call the CEO's

00:08:35.679 --> 00:08:38.600
dilemma. I think about this a lot. Let's say you're

00:08:38.600 --> 00:08:41.220
a CEO and you want to do the right thing. You

00:08:41.220 --> 00:08:43.759
don't want to fire people, but your main competitor

00:08:43.759 --> 00:08:47.059
just automated their entire middle layer and

00:08:47.059 --> 00:08:50.440
their profit margins just jumped 20%. Suddenly

00:08:50.440 --> 00:08:52.600
your investors are on the phone screaming at

00:08:52.600 --> 00:08:54.700
you. They don't care about your morals. They

00:08:54.700 --> 00:08:57.019
care about the returns. You're trapped. You have

00:08:57.019 --> 00:08:59.419
to make the same cuts just to survive. That's

00:08:59.419 --> 00:09:01.919
the trap. It becomes a race to the bottom on

00:09:01.919 --> 00:09:05.620
labor costs. And the ripple effects, they're

00:09:05.620 --> 00:09:07.980
frightening. If you hollow out that whole middle

00:09:07.980 --> 00:09:10.580
layer, what happens to entry-level jobs? There's

00:09:10.580 --> 00:09:13.659
no next step. You hire a junior analyst to eventually

00:09:13.659 --> 00:09:16.700
become a manager. But if that manager role doesn't

00:09:16.700 --> 00:09:19.679
exist, why hire the junior in the first place?

00:09:19.919 --> 00:09:22.019
And what does that do to the value of a degree?

00:09:22.509 --> 00:09:25.090
Yang makes the point that if knowledge is basically

00:09:25.090 --> 00:09:27.950
free and reasoning is cheap, does a generic business

00:09:27.950 --> 00:09:30.570
degree even hold its value? We could see some

00:09:30.570 --> 00:09:33.330
colleges, the weaker ones, just shut down. It's

00:09:33.330 --> 00:09:36.090
possible. We're moving from an economy of credentialism

00:09:36.090 --> 00:09:39.769
to an economy of capability. It's not about what

00:09:39.769 --> 00:09:42.250
you know. It's about whether you can direct the

00:09:42.250 --> 00:09:45.309
machine. It's a heavy thought. It really shakes

00:09:45.309 --> 00:09:47.350
the foundation of what we've all been told is

00:09:47.350 --> 00:09:50.110
a safe career. It does. It means the safe middle

00:09:50.110 --> 00:09:52.669
is now the most dangerous place to be. And that's

00:09:52.669 --> 00:09:55.009
the thing that keeps me up at night. OK, let's

00:09:55.009 --> 00:09:57.230
try to tie all this together. We started with

00:09:57.230 --> 00:10:00.450
gigawatts of power in India and we ended with

00:10:00.450 --> 00:10:03.029
the death of the middle manager. How do those

00:10:03.029 --> 00:10:05.870
two things connect? They are the hardware and

00:10:05.870 --> 00:10:08.789
the software of the exact same phenomenon. That

00:10:08.789 --> 00:10:11.330
Stargate initiative, that massive amount of compute.

00:10:11.929 --> 00:10:14.570
That is the engine. Right. It's being built specifically

00:10:14.570 --> 00:10:17.850
to run the kind of heavy reasoning models that

00:10:17.850 --> 00:10:20.129
can perform that high-level coordination work.

00:10:20.429 --> 00:10:23.950
So Tata and OpenAI are literally laying the physical

00:10:23.950 --> 00:10:26.490
groundwork. For the exact economic shift Andrew

00:10:26.490 --> 00:10:28.789
Yang is warning us about, it's a move from AI

00:10:28.789 --> 00:10:31.009
as an assistant, something that helps you work

00:10:31.009 --> 00:10:33.990
to AI as an operator, something that does the

00:10:33.990 --> 00:10:36.509
work. The infrastructure is being built to support

00:10:36.509 --> 00:10:39.009
the replacement of cognitive labor. That's the

00:10:39.009 --> 00:10:42.029
big idea. That's it. And for you, the learner

00:10:42.029 --> 00:10:44.649
listening to this, it's not about panic. It's

00:10:44.649 --> 00:10:47.090
about recognizing that the middle is not a safe

00:10:47.090 --> 00:10:49.830
place to be anymore. You either need to be building

00:10:49.830 --> 00:10:52.529
the machine or directing the machine. Which leads

00:10:52.529 --> 00:10:54.110
to the final thought I want to leave you with.

00:10:54.409 --> 00:10:57.289
If that middle part of the corporate ladder just

00:10:57.289 --> 00:11:01.429
disappears, what does a career path even look

00:11:01.429 --> 00:11:05.110
like in 2030? Do you just go from intern straight

00:11:05.110 --> 00:11:08.750
to executive? Or do we need a completely new

00:11:08.750 --> 00:11:11.250
structure for how people contribute value? That

00:11:11.250 --> 00:11:12.929
might be the question of the decade. Maybe it's

00:11:12.929 --> 00:11:14.490
not a ladder anymore. Maybe it's more like a

00:11:14.490 --> 00:11:17.230
network of specialists. But the days of just

00:11:17.230 --> 00:11:20.370
climbing the rungs, those are probably over.

00:11:20.690 --> 00:11:22.649
So I want you to think about your own job this

00:11:22.649 --> 00:11:25.730
week. Look at your calendar. How much of what

00:11:25.730 --> 00:11:27.789
you do is coordination, just moving information

00:11:27.789 --> 00:11:30.909
around, and how much is actual creation or direction?

00:11:31.649 --> 00:11:33.649
Because the coordination part is being solved.

00:11:33.750 --> 00:11:35.789
That's the work. Thank you for diving in with

00:11:35.789 --> 00:11:38.629
us. It's a complicated new world, but at least

00:11:38.629 --> 00:11:40.350
we can try to parse it together. See you in the

00:11:40.350 --> 00:11:41.330
deep end. See you next time.
