WEBVTT

00:00:00.000 --> 00:00:02.200
The world of software is changing so fast that

00:00:02.200 --> 00:00:04.759
even the top developers are saying they feel

00:00:04.759 --> 00:00:08.160
completely lost. Absolutely. The game isn't about

00:00:08.160 --> 00:00:10.679
writing perfect code anymore. No. It's about

00:00:10.679 --> 00:00:15.919
orchestrating this unpredictable AI layer. So

00:00:15.919 --> 00:00:18.820
we have to ask, what does that new baseline for

00:00:18.820 --> 00:00:21.649
a professional actually look like? Welcome to the

00:00:21.649 --> 00:00:23.929
Deep Dive. Today we're looking at sources that

00:00:23.929 --> 00:00:26.370
really map out the foundation of next-gen AI.

00:00:26.750 --> 00:00:30.010
We're talking about the literal energy that powers

00:00:30.010 --> 00:00:32.750
it, the automation flows that build it, and,

00:00:32.829 --> 00:00:35.369
well, the identity crisis hitting the people

00:00:35.369 --> 00:00:37.729
who make it all work. And our mission today is

00:00:37.729 --> 00:00:40.689
to unpack these three huge interconnected shifts.

00:00:41.289 --> 00:00:43.350
We're going to start with Google's massive bet

00:00:43.350 --> 00:00:46.210
on clean power using CO2 batteries. I mean,

00:00:46.229 --> 00:00:48.460
it's almost sci-fi. Then we'll drill down into

00:00:48.460 --> 00:00:51.079
the surprising secret behind building AI agents,

00:00:51.200 --> 00:00:53.299
which is really just focusing on a handful of

00:00:53.299 --> 00:00:55.179
core automation nodes. And finally, we're going

00:00:55.179 --> 00:00:57.820
to dissect the brutally honest truth that Andrej

00:00:57.820 --> 00:01:00.200
Karpathy laid out about the new programming paradigm.

00:01:00.759 --> 00:01:04.140
It's forcing developers to become managers of

00:01:04.140 --> 00:01:06.260
chaos. Okay, let's unpack this. Let's do it.

00:01:06.340 --> 00:01:08.519
So our first source gets right to the heart of

00:01:08.519 --> 00:01:11.359
the problem. For any company trying to build

00:01:11.359 --> 00:01:15.659
AGI. Energy. Energy. You need massive nonstop.

00:01:16.109 --> 00:01:19.790
Carbon-free compute. 24/7. And the old ways,

00:01:19.969 --> 00:01:23.909
you know, lithium, rare earth minerals, they

00:01:23.909 --> 00:01:26.569
just can't scale. Not cleanly enough, not fast

00:01:26.569 --> 00:01:28.569
enough. Yeah, and the supply chain for that stuff

00:01:28.569 --> 00:01:31.890
is a huge headache. It's volatile, geopolitically

00:01:31.890 --> 00:01:34.709
tricky, just a mess. Which brings us to Google's

00:01:34.709 --> 00:01:37.969
solution with this company, Energy Dome. They're

00:01:37.969 --> 00:01:40.829
developing these huge-scale CO2 batteries.

00:01:40.969 --> 00:01:43.109
And this is the brilliant part. The system uses

00:01:43.109 --> 00:01:45.750
only CO2, steel, and water. Things that are

00:01:45.750 --> 00:01:47.810
cheap, abundant. Exactly. You can source them

00:01:47.810 --> 00:01:49.810
anywhere. It's a completely closed system, which

00:01:49.810 --> 00:01:51.650
is why it lasts so long. Okay, so how does it

00:01:51.650 --> 00:01:53.870
actually work? Well, when you have excess solar

00:01:53.870 --> 00:01:56.430
or wind power during the day, that electricity

00:01:56.430 --> 00:01:58.870
is used to compress carbon dioxide gas into a

00:01:58.870 --> 00:02:01.810
liquid. And that stores the energy. Like a giant

00:02:01.810 --> 00:02:04.810
battery. A giant silent battery. Then at night,

00:02:04.870 --> 00:02:07.049
or whenever the grid needs a boost, you just

00:02:07.049 --> 00:02:09.229
release the liquid CO2. It turns back into

00:02:09.229 --> 00:02:12.870
a gas. Spins a turbine and generates power. It's

00:02:12.870 --> 00:02:15.430
basically a thermo-mechanical storage system. And the metrics

00:02:15.430 --> 00:02:17.430
here are why Google is jumping on this. This

00:02:17.430 --> 00:02:19.550
isn't just some research project. No, not at

00:02:19.550 --> 00:02:23.250
all. We're talking about a 75% round-trip efficiency.

00:02:23.550 --> 00:02:26.490
Very competitive. But the longevity is what gets

00:02:26.490 --> 00:02:29.849
me. Zero degradation. Over 30 years. And it's

00:02:29.849 --> 00:02:32.150
about half the cost of a lithium-ion equivalent.

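The charge/discharge cycle and the 75% figure just described can be sketched as a quick back-of-the-envelope calculation. The efficiency number comes from the discussion; the charge amount is purely hypothetical.

```python
# Illustrative round-trip economics for a CO2 battery: charge with surplus
# renewable power, recover 75% of it on discharge (figure from the discussion).

def recoverable_energy(charged_mwh, roundtrip_efficiency=0.75):
    """Energy recoverable after one full charge/discharge cycle."""
    return charged_mwh * roundtrip_efficiency

# Charge with 100 MWh of surplus solar during the day...
# ...and recover 75 MWh when the liquid CO2 is re-expanded at night.
print(recoverable_energy(100.0))  # 75.0
```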
00:02:32.610 --> 00:02:35.090
So instead of waiting a decade to build, say,

00:02:35.250 --> 00:02:38.000
a nuclear plant. Right. They can just drop one

00:02:38.000 --> 00:02:39.860
of these massive domes. They honestly look like

00:02:39.860 --> 00:02:41.360
they're from Mars out in the desert and get going.

00:02:41.500 --> 00:02:44.240
It completely changes the strategy. It's modular.

00:02:44.439 --> 00:02:47.639
It's fast. And it's clean. Whoa. Yeah. Imagine

00:02:47.639 --> 00:02:50.400
scaling to a billion queries on energy that's

00:02:50.400 --> 00:02:53.500
just that clean and stable. It's transformative

00:02:53.500 --> 00:02:56.060
for AGI development. That kind of reliability

00:02:56.060 --> 00:02:58.759
is the dream. So these domes are obviously huge,

00:02:58.919 --> 00:03:01.280
though. We're talking football-field-sized. What's

00:03:01.280 --> 00:03:04.500
the main tradeoff for all this clean, scalable

00:03:04.500 --> 00:03:06.780
energy? The tradeoff against speed and efficiency

00:03:06.780 --> 00:03:10.280
is simply the sheer physical size, the land it

00:03:10.280 --> 00:03:13.020
requires. Okay, so we have this incredibly stable

00:03:13.020 --> 00:03:16.060
bedrock of power. Now let's move up the stack

00:03:16.060 --> 00:03:20.219
to the actual AI agents. Automation. Right. And

00:03:20.219 --> 00:03:22.900
if you've ever opened a platform like n8n, it

00:03:22.900 --> 00:03:26.960
can look like just chaos. It's a firehose of

00:03:26.960 --> 00:03:29.800
information, yeah. 100-plus nodes, arrows going

00:03:29.800 --> 00:03:32.509
everywhere. It's overwhelming. But the key insight

00:03:32.509 --> 00:03:35.370
from our sources is that the experts, the people

00:03:35.370 --> 00:03:38.830
building really complex agents, they aren't memorizing

00:03:38.830 --> 00:03:41.490
all 100 of those tools. Not even close. They're

00:03:41.490 --> 00:03:43.870
focusing on just a handful. Exactly. They rely

00:03:43.870 --> 00:03:47.150
on just 17 core nodes for about 80% of all their

00:03:47.150 --> 00:03:49.810
automations. So the goal isn't to master every

00:03:49.810 --> 00:03:53.669
tool. It's to recognize the situation, the pattern,

00:03:53.669 --> 00:03:56.530
and the flow. Precisely. And a node here is just

00:03:56.530 --> 00:03:58.530
a functional block. Think of it like a digital

00:03:58.530 --> 00:04:01.139
Lego piece that does one single thing. Like checking

00:04:01.139 --> 00:04:04.259
some data or running a condition. Yeah, an if/else

00:04:04.259 --> 00:04:06.539
decision, a loop to process a list, that kind

00:04:06.539 --> 00:04:08.800
of thing. Okay. Once you really know those 17

00:04:08.800 --> 00:04:11.479
core functions, building a workflow becomes almost

00:04:11.479 --> 00:04:14.159
plug and play. You stop staring at the huge menu.

00:04:14.379 --> 00:04:16.339
And you start thinking. You start thinking, okay,

00:04:16.399 --> 00:04:18.720
I need a loop right here, or this calls for an

00:04:18.720 --> 00:04:21.139
execute code block. So that's the real skill.

00:04:21.220 --> 00:04:23.660
Not the tool knowledge, but thinking in flows.

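The "thinking in flows" idea above can be sketched with the same handful of building blocks the hosts mention: a loop over a list, an if/else branch, and a code step. The node names and routing logic here are illustrative, not n8n's actual node identifiers.

```python
# A minimal "flow" sketch: loop over incoming items, branch each one on a
# condition, and hand it to a small code step. Purely illustrative.

def handle_urgent(item):
    return {"id": item["id"], "route": "urgent"}

def handle_normal(item):
    return {"id": item["id"], "route": "normal"}

def run_flow(items):
    results = []
    for item in items:               # "Loop" block: process a list
        if item.get("priority"):     # "IF" block: branch on a condition
            results.append(handle_urgent(item))
        else:
            results.append(handle_normal(item))
    return results                   # output feeds downstream blocks

print(run_flow([{"id": 1, "priority": True}, {"id": 2}]))
```

The point is the pattern, not the tool: once you recognize "this needs a loop, then a branch," the specific node is almost interchangeable.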
00:04:23.819 --> 00:04:26.459
That's the mindset shift. It defines the current

00:04:26.459 --> 00:04:29.529
AI skills gap. The person who can look at a business

00:04:29.529 --> 00:04:32.050
problem and map out that flow. They're the most

00:04:32.050 --> 00:04:33.949
valuable person in the room. Without a doubt.

00:04:34.050 --> 00:04:36.970
They bridge the logic and the execution. So why

00:04:36.970 --> 00:04:39.970
is recognizing that flow more valuable than just

00:04:39.970 --> 00:04:42.329
mastering every single tool you can find? It

00:04:42.329 --> 00:04:44.709
defines you as an orchestrator. And that lets

00:04:44.709 --> 00:04:47.569
you bridge the growing AI skills gap. Okay, so

00:04:47.569 --> 00:04:50.629
we have stable power, efficient workflows. But

00:04:50.629 --> 00:04:53.769
how are people in the real world actually reacting

00:04:53.769 --> 00:04:57.420
to all this? This is where it gets tricky. Public

00:04:57.420 --> 00:05:00.079
trust is a massive friction point, especially

00:05:00.079 --> 00:05:02.800
with something as visible as advertising. I can

00:05:02.800 --> 00:05:05.180
imagine. Yeah. The data is pretty stark. If you

00:05:05.180 --> 00:05:08.160
mark an ad as AI-generated, the number of clicks

00:05:08.160 --> 00:05:12.300
it gets drops by about 31%. Wow, 31%. That's

00:05:12.300 --> 00:05:14.660
huge. It's a commercial failure. Yeah. And it

00:05:14.660 --> 00:05:16.079
makes perfect sense when you remember all the

00:05:16.079 --> 00:05:18.360
AI ads that got roasted last year. Oh, yeah.

00:05:18.560 --> 00:05:20.980
Meta's AI granny that looked like she was melting.

00:05:21.040 --> 00:05:24.339
Right. Or the weird Coca-Cola trucks that defied

00:05:24.339 --> 00:05:26.779
physics. Yeah. And that McDonald's Christmas

00:05:26.779 --> 00:05:30.379
ad. A total mess. That uncanny valley effect

00:05:30.379 --> 00:05:32.600
is so powerful. If something feels off, people

00:05:32.600 --> 00:05:35.019
just pull back. Exactly. And that suspicion is

00:05:35.019 --> 00:05:37.540
reflected in the numbers. But, you know, while

00:05:37.540 --> 00:05:40.300
the public is cautious, AI deployment in other

00:05:40.300 --> 00:05:42.560
areas is still moving at lightning speed. Right.

00:05:42.620 --> 00:05:45.139
Like NVIDIA just dropped a new free guide on

00:05:45.139 --> 00:05:48.339
how to fine-tune your own LLMs locally. And an

00:05:48.339 --> 00:05:50.730
LLM is just a computer model that... generates

00:05:50.730 --> 00:05:52.829
human-like text. So that's happening at the

00:05:52.829 --> 00:05:54.949
developer level. And then you have the really

00:05:54.949 --> 00:05:57.949
weird stuff. Like China's AgiBot? Yeah, renting

00:05:57.949 --> 00:06:00.329
out humanoid robots for weddings and concerts.

00:06:00.410 --> 00:06:02.449
The future is definitely here, ready or not.

00:06:02.569 --> 00:06:04.689
And the money is still there. We saw that in

00:06:04.689 --> 00:06:07.069
India, AI startups raised over $600 million.

00:06:07.509 --> 00:06:10.550
So the investment is flowing, but... Investors

00:06:10.550 --> 00:06:14.170
are getting pickier. They want to see practical...

00:06:14.430 --> 00:06:17.569
real-world value, not just blue-sky promises.

00:06:17.850 --> 00:06:20.110
Which circles back to the ad problem. They need

00:06:20.110 --> 00:06:22.990
agents that work, that are reliable, and that

00:06:22.990 --> 00:06:26.329
don't creep people out. So with this negative

00:06:26.329 --> 00:06:29.089
reaction to ads, does that mean consumer facing

00:06:29.089 --> 00:06:31.649
AI applications are going to slow down? I don't

00:06:31.649 --> 00:06:34.089
think they'll slow down, but it will force much

00:06:34.089 --> 00:06:36.629
higher standards for transparency and practical

00:06:36.629 --> 00:06:39.110
value. That pressure for reliability brings us

00:06:39.110 --> 00:06:42.220
to our last segment, the human element. This

00:06:42.220 --> 00:06:44.860
programming identity crisis. Yeah, we have to

00:06:44.860 --> 00:06:47.300
talk about Andrej Karpathy's post. For anyone

00:06:47.300 --> 00:06:49.079
who doesn't know, he's one of the most respected

00:06:49.079 --> 00:06:52.899
minds in AI. And he posted on X just being brutally

00:06:52.899 --> 00:06:55.220
honest, saying he's never felt so left behind

00:06:55.220 --> 00:06:57.579
as a programmer. And millions of developers immediately

00:06:57.579 --> 00:07:00.259
responded saying, yes, that's exactly how I feel.

00:07:00.379 --> 00:07:03.639
The issue is this quiet takeover of a new programming

00:07:03.639 --> 00:07:06.500
paradigm. The job has fundamentally changed.

00:07:06.819 --> 00:07:10.259
The old way was so predictable, so logical. Write

00:07:10.259 --> 00:07:13.779
code, test it. Debug the syntax, deploy. You

00:07:13.779 --> 00:07:16.240
are in complete control. Now, the job is almost unrecognizable.

00:07:16.420 --> 00:07:18.680
You're prompting LLMs, orchestrating agents,

00:07:18.980 --> 00:07:21.160
wrangling all these different tools and APIs.

00:07:21.660 --> 00:07:23.920
The work isn't debugging your code syntax anymore.

00:07:24.120 --> 00:07:26.899
It's debugging the AI's behavior. And that's

00:07:26.899 --> 00:07:29.480
because of this new abstraction layer that tools

00:07:29.480 --> 00:07:32.180
like ChatGPT agents have created. Right. And

00:07:32.180 --> 00:07:34.560
because of that layer, the whole vocabulary is

00:07:34.560 --> 00:07:37.000
changed overnight. If you don't know terms like

00:07:37.000 --> 00:07:40.240
MCP or LSP. You're already behind. Exactly. And

00:07:40.240 --> 00:07:42.120
MCP is just the Model Context Protocol, which standardizes

00:07:42.120 --> 00:07:44.600
how the AI connects to tools and data. Yeah. LSP is the Language

00:07:44.600 --> 00:07:46.839
Server Protocol. And then there's context and

00:07:46.839 --> 00:07:49.360
memory. Context being the info for a single task

00:07:49.360 --> 00:07:51.779
and memory being the history over a whole conversation.

00:07:52.240 --> 00:07:55.060
But it's not just learning new words. It's the

00:07:55.060 --> 00:07:58.079
unpredictability of it all. That's the core challenge.

00:07:58.500 --> 00:08:01.319
You are no longer writing code that does exactly

00:08:01.319 --> 00:08:03.800
what you tell it to do. You're managing this

00:08:03.800 --> 00:08:06.740
living, unpredictable system. One that suffers

00:08:06.740 --> 00:08:08.959
from things like prompt drift, where the AI's

00:08:08.959 --> 00:08:11.600
output shifts over time even if your prompt stays the same.

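The drift problem just described can be guarded against with a simple replay check: keep a baseline answer for a fixed prompt and flag runs that diverge from it. This is a minimal sketch; `call_model` is a hypothetical stand-in for whatever LLM client you actually use.

```python
# A minimal drift check: replay a fixed prompt and compare against a stored
# baseline output. `call_model` is a hypothetical stand-in for an LLM client.

def detect_drift(call_model, prompt, baseline):
    """Return True if the model's output no longer matches the baseline."""
    return call_model(prompt) != baseline

# With a deterministic fake model, no drift is detected:
fake_model = lambda p: p.upper()
print(detect_drift(fake_model, "hello", "HELLO"))  # False
```

In practice you'd compare more loosely than exact string equality, since some output variation is normal even without drift.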
00:08:11.800 --> 00:08:14.180
Yeah. I still wrestle with prompt drift myself.

00:08:14.439 --> 00:08:16.860
It feels like the foundation just shifts every

00:08:16.860 --> 00:08:19.040
month. It adds this layer of anxiety to the whole

00:08:19.040 --> 00:08:21.480
process. That anxiety is the psychological toll.

00:08:22.120 --> 00:08:24.959
The honest truth is managing that unpredictable

00:08:24.959 --> 00:08:28.279
behavior is the job now. So if a top developer

00:08:28.279 --> 00:08:30.779
like Karpathy feels left behind by all this,

00:08:31.000 --> 00:08:33.840
what is the single most important skill for a

00:08:33.840 --> 00:08:36.659
new developer to build right now? It has to be

00:08:36.659 --> 00:08:39.779
critical thinking. Strong diagnostic skills to

00:08:39.779 --> 00:08:42.799
debug AI behavior and manage that unpredictability.

00:08:43.000 --> 00:08:44.899
So we've moved from the physical infrastructure

00:08:44.899 --> 00:08:46.860
all the way to the psychological impact today.

00:08:47.039 --> 00:08:49.580
Quite a journey. Let's recap the three big shifts

00:08:49.580 --> 00:08:52.590
we saw in the sources. Okay, first. AI infrastructure

00:08:52.590 --> 00:08:57.190
is moving to massive, reliable, low -carbon solutions

00:08:57.190 --> 00:08:59.750
like those CO2 batteries. They're built

00:08:59.750 --> 00:09:02.649
for 30 years of stability to meet AGI's energy

00:09:02.649 --> 00:09:05.429
demands. Second, mastering automation isn't about

00:09:05.429 --> 00:09:07.590
the tools anymore. It's about recognizing flow

00:09:07.590 --> 00:09:10.610
patterns, those 17 core nodes, so you can be

00:09:10.610 --> 00:09:12.970
an orchestrator, not just a user. And third,

00:09:13.399 --> 00:09:15.740
the job of a programmer has fundamentally changed.

00:09:16.000 --> 00:09:18.820
It's now about orchestrating and debugging this

00:09:18.820 --> 00:09:21.860
unpredictable AI layer. It's a shift from being

00:09:21.860 --> 00:09:24.679
a craftsman to being a manager of chaos. Knowledge

00:09:24.679 --> 00:09:26.980
is power, but it seems like all the sources suggest

00:09:26.980 --> 00:09:29.899
success now comes from being that orchestrator.

00:09:30.019 --> 00:09:33.139
We saw that continuous, stable energy is critical

00:09:33.139 --> 00:09:36.519
for AGI. And yet... The developers building on

00:09:36.519 --> 00:09:39.419
top of it are struggling with this new, unpredictable

00:09:39.419 --> 00:09:42.399
software layer. Which brings us to a final thought

00:09:42.399 --> 00:09:44.720
for you to take with you. Yeah. If the physical

00:09:44.720 --> 00:09:47.080
infrastructure for AI is being solved with these

00:09:47.080 --> 00:09:50.879
massive, unchanging 30-year batteries, why is

00:09:50.879 --> 00:09:53.019
the software layer still so incredibly volatile?

00:09:53.399 --> 00:09:55.659
That's a great question. What happens when those

00:09:55.659 --> 00:09:58.320
two radically different speeds, the rock-solid

00:09:58.320 --> 00:10:00.919
stability of physics and the utter chaos of code, collide?

00:10:01.979 --> 00:10:04.240
Thanks for diving deep with us. We hope this

00:10:04.240 --> 00:10:05.759
knowledge gives you a competitive edge.
