WEBVTT

00:00:00.000 --> 00:00:03.299
So a $38 billion deal with AWS just landed. That's

00:00:03.299 --> 00:00:09.220
huge, securing massive compute power. And for

00:00:09.220 --> 00:00:12.240
a company looking at, what, over $1.4 trillion

00:00:12.240 --> 00:00:15.140
on infrastructure over the next decade? Yeah,

00:00:15.160 --> 00:00:17.859
$1.4 trillion. It sounds absolutely staggering.

00:00:18.140 --> 00:00:21.100
It is. And that's not just spending money. It

00:00:21.100 --> 00:00:23.260
feels like they're building the literal foundations

00:00:23.260 --> 00:00:27.420
for whatever AI becomes next. It really is the

00:00:27.420 --> 00:00:30.280
price of admission these days, isn't it? Welcome,

00:00:30.399 --> 00:00:32.719
everyone, to the Deep Dive. Our whole mission

00:00:32.719 --> 00:00:35.259
here is to get you past those splashy headlines

00:00:35.259 --> 00:00:38.420
and really dive into the core strategy. What's

00:00:38.420 --> 00:00:40.640
really going on in this, let's call it the Great

00:00:40.640 --> 00:00:43.320
Compute War. Okay. So today, yeah, we're going

00:00:43.320 --> 00:00:45.320
to unpack these massive deals, look at some of

00:00:45.320 --> 00:00:48.719
the flashpoints, you know, layoffs blamed on

00:00:48.719 --> 00:00:51.039
AI, where the legal lines are being drawn, and

00:00:51.039 --> 00:00:52.740
then we'll get to the really wild stuff like

00:00:52.740 --> 00:00:55.600
data centers in space. Seriously. Seriously.

00:00:55.740 --> 00:00:57.579
So, yeah, this is going to be your shortcut to

00:00:57.579 --> 00:00:59.810
really getting the strategic landscape of AI

00:00:59.810 --> 00:01:01.710
right now. All right, let's start with that $38

00:01:01.710 --> 00:01:04.530
billion AWS deal then. Because for a while, it

00:01:04.530 --> 00:01:06.750
really felt like Microsoft kind of had OpenAI

00:01:06.750 --> 00:01:08.569
locked down, didn't it? It did seem that way.

00:01:08.670 --> 00:01:13.069
They put in $13 billion early on, had that first

00:01:13.069 --> 00:01:15.629
dibs thing on infrastructure spend. Exactly.

00:01:15.629 --> 00:01:17.689
But well, that old agreement, it's basically

00:01:17.689 --> 00:01:20.409
done, gone. What's really interesting, though,

00:01:20.469 --> 00:01:24.430
is the type of money involved. How so? Well,

00:01:24.530 --> 00:01:26.659
yes, they still have this enormous... commitment

00:01:26.659 --> 00:01:30.180
with Azure. We're talking like $250 billion.

00:01:30.180 --> 00:01:33.480
Huge number. A quarter trillion. Yeah, but a lot of

00:01:33.480 --> 00:01:35.099
that is in cloud credits, and it's spread out

00:01:35.099 --> 00:01:37.819
over quite a few years. Okay, so it's not like

00:01:37.819 --> 00:01:39.920
a giant cash payment up front that stops them

00:01:39.920 --> 00:01:41.939
looking elsewhere. Precisely. That's the key.
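That freedom to look elsewhere can be sketched as a tiny provider-selection problem: pick the cheapest GPU capacity that still meets a latency cap. Everything below is hypothetical; the quotes, prices, latencies, and the `pick_provider` helper are invented purely for illustration.

```python
# Toy sketch of multi-cloud capacity shopping: choose the cheapest
# provider whose quoted latency meets a cap. All numbers are made up.

def pick_provider(quotes, max_latency_ms):
    """Return the cheapest quote under the latency cap, or None."""
    eligible = [q for q in quotes if q["latency_ms"] <= max_latency_ms]
    return min(eligible, key=lambda q: q["usd_per_gpu_hr"], default=None)

quotes = [
    {"provider": "aws",       "usd_per_gpu_hr": 4.10, "latency_ms": 12},
    {"provider": "azure",     "usd_per_gpu_hr": 3.90, "latency_ms": 25},
    {"provider": "gcp",       "usd_per_gpu_hr": 4.00, "latency_ms": 14},
    {"provider": "coreweave", "usd_per_gpu_hr": 3.50, "latency_ms": 40},
]

best = pick_provider(quotes, max_latency_ms=15)
print(best["provider"])  # cheapest option that still meets the 15 ms cap
```

The point of the sketch: once no single contract locks you in, compute becomes a market you can query, not a dependency you inherit.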

00:01:42.060 --> 00:01:44.379
So this AWS deal, plus the fact they're working

00:01:44.379 --> 00:01:47.780
with Google and Oracle and even CoreWeave, it

00:01:47.780 --> 00:01:50.939
shows they're absolutely multi-cloud now. They're

00:01:50.939 --> 00:01:53.140
free to shop around. They can look for the best

00:01:53.140 --> 00:01:56.560
GPUs, the lowest latency, specific network setups,

00:01:56.739 --> 00:01:59.379
whatever they need. It's smart. Stops them getting

00:01:59.379 --> 00:02:01.579
locked into one vendor. It almost feels like...

00:02:02.500 --> 00:02:04.799
Like a geopolitical strategy, not just business.

00:02:05.540 --> 00:02:07.700
Spreading the risk, spreading the power. Making

00:02:07.700 --> 00:02:10.039
sure no single partner controls their most vital

00:02:10.039 --> 00:02:14.039
resource, which is compute. That's exactly the

00:02:14.039 --> 00:02:16.580
strategic play. And, you know, this makes Amazon's

00:02:16.580 --> 00:02:19.580
position fascinating. How is that? Well, Amazon

00:02:19.580 --> 00:02:22.599
is the main backer of Anthropic, right? One of

00:02:22.599 --> 00:02:25.360
OpenAI's biggest rivals in the LLM space. Yeah,

00:02:25.379 --> 00:02:27.740
a direct competitor. And yet AWS is now hosting

00:02:27.740 --> 00:02:31.210
both of them. On their servers. Whoa. So Amazon

00:02:31.210 --> 00:02:34.909
is trying to be like the neutral ground, the

00:02:34.909 --> 00:02:38.550
Switzerland of AI clouds hosting everyone, even

00:02:38.550 --> 00:02:40.169
the company competing with their own investment.

00:02:40.490 --> 00:02:43.909
That's bold. It's a massive gamble because controlling

00:02:43.909 --> 00:02:46.449
that underlying compute infrastructure, you know,

00:02:46.449 --> 00:02:49.030
where the actual models run, that's rapidly becoming

00:02:49.030 --> 00:02:51.629
the new critical global infrastructure, maybe

00:02:51.629 --> 00:02:53.409
even like a sovereign resource down the line.

00:02:53.449 --> 00:02:55.050
And Amazon just wants to own the highway. Doesn't

00:02:55.050 --> 00:02:56.879
matter who's driving the trucks. Pretty much.

00:02:56.939 --> 00:02:59.719
And the spending? It's just ramping up. This $38

00:02:59.719 --> 00:03:03.560
billion to AWS is just one more step towards

00:03:03.560 --> 00:03:06.219
that massive $1.4 trillion goal over the next

00:03:06.219 --> 00:03:09.680
decade. That number. It's still hard to really

00:03:09.680 --> 00:03:11.699
wrap your head around. How do you even plan for

00:03:11.699 --> 00:03:14.500
needing over a trillion dollars worth of servers

00:03:14.500 --> 00:03:18.819
and power? Slowly. Whoa. Imagine scaling that

00:03:18.819 --> 00:03:21.800
infrastructure to handle like a billion queries.

00:03:23.199 --> 00:03:26.349
Consistently. So why does this big shift to multi

00:03:26.349 --> 00:03:28.830
-cloud actually matter for future innovation?

00:03:29.289 --> 00:03:31.909
Well, thinking about it, it means less reliance

00:03:31.909 --> 00:03:34.330
on just one partner, right? And way more freedom

00:03:34.330 --> 00:03:36.270
to shop around for pure speed and capability.

00:03:36.469 --> 00:03:38.530
Exactly. Less reliance, more freedom to chase

00:03:38.530 --> 00:03:40.750
performance. But, you know, scrambling for the

00:03:40.750 --> 00:03:43.129
hardware for the compute, that's only one side

00:03:43.129 --> 00:03:46.030
of this. The other side is how society, how the

00:03:46.030 --> 00:03:48.669
law, is trying to keep up with what this power

00:03:48.669 --> 00:03:50.990
means when it actually gets deployed. Let's talk

00:03:50.990 --> 00:03:52.289
about those flashpoints happening right now.

00:03:52.349 --> 00:03:55.150
And that deployment is moving fast. OpenAI's

00:03:55.150 --> 00:03:57.969
video tool, Sora, it's now live on Android across

00:03:57.969 --> 00:04:01.129
the US, Canada, Japan, Asia. Big rollout. And

00:04:01.129 --> 00:04:04.090
we're seeing a whole new flood of AI-generated

00:04:04.090 --> 00:04:06.449
videos hitting places like TikTok. It's gone

00:04:06.449 --> 00:04:09.490
viral again. But at the same time, you see these

00:04:09.490 --> 00:04:13.150
headlines about layoffs. Big numbers. Over 60,000

00:04:13.150 --> 00:04:16.949
jobs cut at huge companies, Amazon, UPS,

00:04:17.269 --> 00:04:19.629
Target. Yep. And the official reason often given

00:04:19.629 --> 00:04:22.889
is AI automation. Right. But the experts, they

00:04:22.889 --> 00:04:24.370
often point to something else, something they

00:04:24.370 --> 00:04:27.300
call AI washing. AI washing. Can you break that

00:04:27.300 --> 00:04:29.240
down for us? Yeah, sure. AI washing is basically,

00:04:29.379 --> 00:04:31.639
well, it's a convenient story for corporations.

00:04:31.860 --> 00:04:35.899
It means blaming massive job cuts on AI, you

00:04:35.899 --> 00:04:38.459
know, this futuristic force, even when the reality

00:04:38.459 --> 00:04:40.519
is it's not purely because of automation. It

00:04:40.519 --> 00:04:42.959
might be standard cost cutting or market changes

00:04:42.959 --> 00:04:46.519
or restructuring. But AI is an easy scapegoat.

00:04:46.600 --> 00:04:49.199
Yeah. Makes it sound inevitable almost. Yeah,

00:04:49.259 --> 00:04:50.720
I can see that. You know, I still wrestle with

00:04:50.720 --> 00:04:53.139
prompt drift myself. Like you tweak one word

00:04:53.139 --> 00:04:54.879
in a prompt and suddenly the AI gives you something

00:04:54.879 --> 00:04:56.509
totally different. It's unpredictable sometimes.
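That sensitivity, one tweaked word producing a very different answer, can be mimicked with a toy stand-in for a model. The `toy_model` below is obviously not a real LLM; it just hashes the prompt into one of a few canned outputs, to show how even a deterministic map can be wildly sensitive to tiny input changes.

```python
import hashlib

def toy_model(prompt: str) -> str:
    """Stand-in 'model': deterministically maps a prompt to one of a few
    canned answers via a hash. Real LLMs are far richer, but share the
    property that nearby prompts need not give nearby outputs."""
    answers = ["a sonnet", "a bullet list", "a refusal", "a SQL query"]
    digest = hashlib.sha256(prompt.encode()).hexdigest()
    return answers[int(digest, 16) % len(answers)]

a = toy_model("Summarize the quarterly report briefly.")
b = toy_model("Summarize the quarterly report concisely.")
print(a, "|", b)  # one word changed; the output bucket can land anywhere
```

Same prompt, same answer, every time; change one word and all bets are off. That is prompt drift in miniature.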

00:04:56.629 --> 00:04:59.449
And this AI washing thing just highlights how

00:04:59.449 --> 00:05:02.370
much confusion there still is. Confusion about

00:05:02.370 --> 00:05:05.310
what the tech can actually do versus, you know,

00:05:05.329 --> 00:05:08.290
the hype around it. And that confusion. It's

00:05:08.290 --> 00:05:09.829
right at the heart of the whole public trust

00:05:09.829 --> 00:05:12.209
issue. Why do some people love AI and others

00:05:12.209 --> 00:05:14.670
really fear it? It often comes down to how our

00:05:14.670 --> 00:05:17.529
brains handle risk and, frankly, how much we

00:05:17.529 --> 00:05:19.629
trust something we don't fully understand. If

00:05:19.629 --> 00:05:22.329
it feels confusing, it feels riskier. Makes sense.

00:05:22.529 --> 00:05:24.750
And the legal system is definitely being forced

00:05:24.750 --> 00:05:27.790
to react now. We're already seeing some clear

00:05:27.790 --> 00:05:30.860
boundaries emerge. Like what? Well, like you

00:05:30.860 --> 00:05:33.639
can ask GPT for health questions, although seriously,

00:05:33.779 --> 00:05:36.399
be careful. Double-check everything. Absolutely. Disclaimer

00:05:36.399 --> 00:05:39.420
needed there. But you cannot currently build

00:05:39.420 --> 00:05:42.379
a regulated business that offers, say, legal

00:05:42.379 --> 00:05:45.300
advice. using that model. That's a really important

00:05:45.300 --> 00:05:47.480
line in the sand for commercial uses right now.

00:05:47.560 --> 00:05:50.040
And the legal heat is definitely rising. I mean,

00:05:50.040 --> 00:05:52.259
talk about drama. Did you see that moment when

00:05:52.259 --> 00:05:54.480
a guy actually walked up and served Sam Altman

00:05:54.480 --> 00:05:57.879
a real legal subpoena? When was this? During

00:05:57.879 --> 00:06:00.120
a live talk Altman was having with Steve Kerr,

00:06:00.120 --> 00:06:02.319
the basketball coach. It was apparently related

00:06:02.319 --> 00:06:06.759
to some anti-AI protest case. Wow. That's intense.

00:06:06.939 --> 00:06:08.839
It shows this isn't just academic anymore. It's

00:06:08.839 --> 00:06:10.939
hitting the real world legal system and all.

00:06:11.040 --> 00:06:13.259
For sure. But even with all that tension, the

00:06:13.259 --> 00:06:15.439
money keeps pouring in for specific applications.

00:06:15.620 --> 00:06:18.040
Look at Hippocratic AI. That's a health care

00:06:18.040 --> 00:06:20.399
focused model. Right. I saw that. They just raised

00:06:20.399 --> 00:06:23.000
$126 million. Big

00:06:23.000 --> 00:06:25.540
names backing them. Andreessen Horowitz, Google

00:06:25.540 --> 00:06:28.060
Ventures. They're valued at, what, $3.5

00:06:28.060 --> 00:06:30.100
billion now. Shows there's still

00:06:30.100 --> 00:06:33.259
a huge appetite for specialized AI. OK, so given

00:06:33.259 --> 00:06:37.060
all this legal stuff, this tension. How quickly

00:06:37.060 --> 00:06:39.740
is the legal system actually adapting to these

00:06:39.740 --> 00:06:42.879
non-negotiable AI limits you mentioned? Honestly,

00:06:43.100 --> 00:06:45.899
slowly. It seems like it requires really clear

00:06:45.899 --> 00:06:48.620
boundaries, like avoiding using AI for regulated

00:06:48.620 --> 00:06:50.540
stuff like health or legal businesses directly.

00:06:50.779 --> 00:06:52.540
Yeah, slow and steady seems to be the pace for

00:06:52.540 --> 00:06:54.579
law. Oh. Okay, let's shift gears a bit. From

00:06:54.579 --> 00:06:57.600
the serious stuff in boardrooms and courtrooms

00:06:57.600 --> 00:07:00.100
to the culture around AI, it feels like we've

00:07:00.100 --> 00:07:03.170
reached peak AI meme season. Oh yeah, definitely.

00:07:03.430 --> 00:07:05.389
The vibe check is strong right now. AI memes

00:07:05.389 --> 00:07:07.829
are everywhere. And honestly, the collective roasting

00:07:07.829 --> 00:07:10.629
of the big tech players, OpenAI, Anthropic, Google,

00:07:10.870 --> 00:07:13.129
Apple, it feels kind of healthy, you know? A

00:07:13.129 --> 00:07:14.790
way to process all the competition and maybe

00:07:14.790 --> 00:07:17.269
some of the frustration. It is pretty funny sometimes.

00:07:17.569 --> 00:07:21.449
But then, then there's the weird stuff. The spine

00:07:21.449 --> 00:07:24.050
check trends, as the source called it. Apparently,

00:07:24.370 --> 00:07:29.050
there's this bizarre new wave of... AI chiropractors

00:07:29.050 --> 00:07:32.370
fueled by video tools like Sora generating realistic

00:07:32.370 --> 00:07:36.470
looking adjustments. Wait, what? AI chiropractors?

00:07:36.470 --> 00:07:38.370
Yeah. And we really need to say this clearly.

00:07:38.550 --> 00:07:40.870
Please, please do not use AI for medical advice

00:07:40.870 --> 00:07:43.269
and especially not for anything like spinal adjustments

00:07:43.269 --> 00:07:45.509
based on a generated video. Just don't. Okay.

00:07:45.529 --> 00:07:47.629
Yeah. That's definitely a do not try this at

00:07:47.629 --> 00:07:51.089
home situation. Wow. But that absurdity, it actually

00:07:51.089 --> 00:07:52.930
highlights something important, doesn't it? What's

00:07:52.930 --> 00:07:55.930
that? When the outputs look this convincing visually.

00:07:56.800 --> 00:07:58.860
Some people will try anything, which is maybe

00:07:58.860 --> 00:08:00.899
why we're seeing creators push back in other

00:08:00.899 --> 00:08:04.439
ways. Oh, so? Like sharing really detailed AI

00:08:04.439 --> 00:08:06.399
performance charts, adding their own analysis

00:08:06.399 --> 00:08:08.860
of what these models can really do. It feels

00:08:08.860 --> 00:08:11.220
like a push for more transparency, for community

00:08:11.220 --> 00:08:13.860
driven benchmarks, not just corporate hype. That

00:08:13.860 --> 00:08:15.839
makes sense. That need for reliable info, for

00:08:15.839 --> 00:08:17.819
real metrics. Yeah. It connects right into all

00:08:17.819 --> 00:08:20.120
the new tools popping up, too. This feels like

00:08:20.120 --> 00:08:21.819
where the democratization starts to kick in.

00:08:21.959 --> 00:08:24.420
Totally. We're seeing this explosion of really

00:08:24.420 --> 00:08:28.810
focused tools. Like Alai. Its whole purpose is

00:08:28.810 --> 00:08:31.930
making high-quality presentations using AI. Takes

00:08:31.930 --> 00:08:34.110
away a lot of that painful slide-building time.

00:08:34.110 --> 00:08:37.070
Yeah, I could use that. And Loosen, for video? Right,

00:08:37.070 --> 00:08:39.429
you just chat with the AI, tell it what you want,

00:08:39.429 --> 00:08:41.730
and it generates edited video. That cuts out a

00:08:41.730 --> 00:08:44.669
ton of manual editing. Exactly. Then there's Skyvern.

00:08:44.669 --> 00:08:47.370
That one sounds interesting. Automating workflows

00:08:47.370 --> 00:08:49.970
you do in your web browser. Think of all those

00:08:49.970 --> 00:08:53.129
repetitive clicks and data entry tasks in offices.

00:08:53.129 --> 00:08:56.480
Oh yeah, automating the boring stuff. Huge potential

00:08:56.480 --> 00:08:58.659
there. And my favorite from this list has to

00:08:58.659 --> 00:09:01.659
be Atom: text-to-3D models. Yeah. And apparently

00:09:01.659 --> 00:09:03.879
with sliders, so you can tweak them easily after

00:09:03.879 --> 00:09:06.600
generation. Like stacking Lego blocks of data

00:09:06.600 --> 00:09:09.419
almost. You just describe what you want and poof,

00:09:09.419 --> 00:09:11.480
a 3D object appears that you can actually adjust.
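"Adjust after generation" usually means the output is parametric rather than a frozen mesh: the sliders just re-run a generator with new values. A minimal sketch of that idea (nothing here reflects Atom's actual internals; the cylinder generator is invented for illustration):

```python
import math

def cylinder_vertices(radius: float, height: float, segments: int = 16):
    """Generate a parametric cylinder: re-run with new 'slider' values
    (radius, height) to adjust the shape after generation."""
    verts = []
    for z in (0.0, height):                 # bottom ring, then top ring
        for i in range(segments):
            theta = 2 * math.pi * i / segments
            verts.append((radius * math.cos(theta),
                          radius * math.sin(theta), z))
    return verts

small = cylinder_vertices(radius=1.0, height=2.0)
tall = cylinder_vertices(radius=1.0, height=5.0)   # one "slider" moved
print(len(small), tall[-1][2])  # 32 vertices; top ring now at z = 5.0
```

Because the shape is a function of its parameters, tweaking is cheap and lossless, which is exactly what makes slider-style editing possible.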

00:09:11.879 --> 00:09:15.299
Really cool. So thinking about the culture, these

00:09:15.299 --> 00:09:19.179
tools. What do those weird trends like the AI

00:09:19.179 --> 00:09:22.480
chiropractors really tell us about the risk when

00:09:22.480 --> 00:09:24.360
this stuff gets deployed? I mean, it tells me

00:09:24.360 --> 00:09:26.639
people will try absolutely anything, right? So

00:09:26.639 --> 00:09:29.059
any regulation, any safety measures, they really

00:09:29.059 --> 00:09:31.480
need to focus super hard on user safety first

00:09:31.480 --> 00:09:34.039
and foremost. Because the assumption of sensible

00:09:34.039 --> 00:09:37.490
use? Probably not safe. Good point. User safety

00:09:37.490 --> 00:09:39.370
first. And this brings us kind of full circle

00:09:39.370 --> 00:09:42.450
back to infrastructure, but on a way bigger scale,

00:09:42.490 --> 00:09:45.629
a cosmic scale maybe. We know these massive AI

00:09:45.629 --> 00:09:48.230
models are already straining Earth's power grids,

00:09:48.269 --> 00:09:51.769
and crucially, water. They need so much water

00:09:51.769 --> 00:09:54.409
for cooling those giant server farms. That strain,

00:09:54.549 --> 00:09:56.629
that resource pressure, it's forcing the big

00:09:56.629 --> 00:09:59.330
players to think, well, radically differently.

00:09:59.389 --> 00:10:02.470
Which brings us to Google's Project Suncatcher.

00:10:02.889 --> 00:10:05.570
Possibly their wildest plan yet. Which is? Launching

00:10:05.570 --> 00:10:08.350
data centers into space. Okay. You mentioned

00:10:08.350 --> 00:10:10.450
this earlier. Data centers in orbit. Tell us

00:10:10.450 --> 00:10:13.610
more. Where exactly? When? So the plan is pretty

00:10:13.610 --> 00:10:16.149
ambitious. They want to deploy over 80 satellites,

00:10:16.409 --> 00:10:18.570
each one packed with AI processors powered by

00:10:18.570 --> 00:10:21.330
solar panels. And they're aiming for low Earth

00:10:21.330 --> 00:10:26.629
orbit, about 400 miles up. The target date: by 2027.
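For scale, a roughly 400-mile orbit means each satellite circles the Earth in about an hour and a half. A quick back-of-envelope with Kepler's third law (standard constants; only the altitude comes from the figure quoted here):

```python
import math

# Orbital period T = 2*pi*sqrt(a^3 / mu) for a circular orbit.
MU_EARTH = 3.986004418e14      # Earth's gravitational parameter, m^3/s^2
R_EARTH = 6_371_000.0          # mean Earth radius, m
altitude_m = 400 * 1_609.344   # ~400 miles, as quoted, in meters

a = R_EARTH + altitude_m
period_min = 2 * math.pi * math.sqrt(a**3 / MU_EARTH) / 60
print(round(period_min, 1))    # roughly 97-98 minutes per orbit
```

Which is part of why constellations need many satellites: each one is only over a given ground station for minutes at a time.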

00:10:26.929 --> 00:10:30.169
2027. That's soon. But why space? It just sounds

00:10:30.169 --> 00:10:31.909
incredibly expensive and maybe environmentally

00:10:31.909 --> 00:10:34.289
tricky, too. Launching rockets isn't exactly

00:10:34.289 --> 00:10:36.669
green. Right. There are definitely costs, but

00:10:36.669 --> 00:10:38.669
the efficiency gains they're chasing are potentially

00:10:38.669 --> 00:10:41.289
enormous. That's the justification. Efficiency

00:10:41.289 --> 00:10:44.129
how? Solar panels. In space, they're roughly

00:10:44.129 --> 00:10:46.649
eight times more efficient than on Earth. Eight

00:10:46.649 --> 00:10:51.840
times. Wow. Why? Simple. No clouds, no atmosphere

00:10:51.840 --> 00:10:54.379
scattering the light, no nighttime cutting off

00:10:54.379 --> 00:10:57.299
power, no shade from buildings or trees, just

00:10:57.299 --> 00:10:59.899
pure constant sunlight. Okay, that's a big energy

00:10:59.899 --> 00:11:03.019
advantage. Huge. Plus, the vacuum of space means

00:11:03.019 --> 00:11:05.000
you don't need vast amounts of land for server

00:11:05.000 --> 00:11:07.240
farms. And critically, you don't need millions

00:11:07.240 --> 00:11:09.799
of gallons of water for cooling. Space is its

00:11:09.799 --> 00:11:12.279
own cooling system, basically. So you bypass

00:11:12.279 --> 00:11:14.480
the biggest resource bottlenecks we have down

00:11:14.480 --> 00:11:16.580
here, power generation, consistency, and water

00:11:16.580 --> 00:11:18.720
usage. That's right. Instantly. That's the idea.
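That "roughly eight times" figure can be sanity-checked with back-of-envelope numbers. All figures below are illustrative assumptions, not Google's: about 1361 W/m² of sunlight above the atmosphere, near-continuous sun in a dawn-dusk orbit, versus about 1000 W/m² peak irradiance and a ~17% capacity factor for a decent ground site.

```python
# Back-of-envelope: average power per m^2 of panel, space vs. ground.
# Assumed figures (illustrative, not from the Suncatcher announcement):
SOLAR_CONSTANT = 1361.0    # W/m^2 above the atmosphere
SUN_FRACTION_ORBIT = 0.99  # dawn-dusk orbits see sun almost continuously
PEAK_GROUND = 1000.0       # W/m^2, standard test-condition irradiance
CAPACITY_FACTOR = 0.17     # night, clouds, angle losses at a good site

space_avg_w = SOLAR_CONSTANT * SUN_FRACTION_ORBIT   # ~1347 W/m^2
ground_avg_w = PEAK_GROUND * CAPACITY_FACTOR        # ~170 W/m^2
print(round(space_avg_w / ground_avg_w, 1))         # ~7.9x, "roughly 8x"
```

The ratio is dominated by the capacity factor: on the ground, a panel spends most of the day producing well below its peak, while in the right orbit it almost never stops.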

00:11:19.149 --> 00:11:20.889
Extreme efficiency. It's actually making the

00:11:20.889 --> 00:11:23.409
economics look surprisingly possible. Launch

00:11:23.409 --> 00:11:25.850
costs are dropping dramatically. Yeah, SpaceX

00:11:25.850 --> 00:11:28.509
and others are making it cheaper. Exactly. So

00:11:28.509 --> 00:11:31.750
some projections suggest that by the 2030s, the

00:11:31.750 --> 00:11:34.549
actual operational cost of a space data center

00:11:34.549 --> 00:11:38.730
might match or maybe even beat the cost of running

00:11:38.730 --> 00:11:41.309
one on Earth. Still, that counterpoint you raised,

00:11:41.429 --> 00:11:45.460
launching rockets does create CO2. A lot of it.

00:11:45.600 --> 00:11:47.679
Are we just trading one environmental headache

00:11:47.679 --> 00:11:49.960
for another? That's the big question, isn't it?

00:11:50.220 --> 00:11:52.799
Launching is polluting. And astronomers are really

00:11:52.799 --> 00:11:54.600
worried, too. They're concerned about all these

00:11:54.600 --> 00:11:56.879
new satellites cluttering up the sky, interfering

00:11:56.879 --> 00:11:59.799
with telescopes. They call it the bugs on a windshield

00:11:59.799 --> 00:12:03.389
effect. Bugs on a windshield. Vivid. And Google's

00:12:03.389 --> 00:12:04.889
not the only one thinking about this, right?

00:12:05.009 --> 00:12:07.029
This space compute race. Oh, definitely not.

00:12:07.389 --> 00:12:10.269
Elon Musk with Starlink and SpaceX is obviously

00:12:10.269 --> 00:12:12.950
a massive player in space infrastructure already.

00:12:13.210 --> 00:12:15.009
And there are startups jumping in, too, like

00:12:15.009 --> 00:12:16.809
StarCloud. Apparently, they're planning to send

00:12:16.809 --> 00:12:19.110
NVIDIA chips up this month just for testing.

00:12:19.210 --> 00:12:21.549
This month. Wow. OK, so the race is already underway.

00:12:21.750 --> 00:12:24.149
It seems so. So final thought on this space angle.

00:12:24.480 --> 00:12:27.179
Will the environmental hit from all those launches

00:12:27.179 --> 00:12:30.820
end up canceling out the long term energy savings

00:12:30.820 --> 00:12:32.679
they're promising? That's the trillion dollar

00:12:32.679 --> 00:12:35.620
question. Maybe literally. It might. But the companies

00:12:35.620 --> 00:12:37.919
pushing this are focusing hard on those

00:12:37.919 --> 00:12:41.019
huge long term efficiency gains and getting rid

00:12:41.019 --> 00:12:43.919
of the strain on Earth's water and power grids.

00:12:44.000 --> 00:12:48.559
Right. It's a long term bet. OK. So we've covered

00:12:48.559 --> 00:12:51.019
a lot of ground in this deep dive today. What's

00:12:51.019 --> 00:12:52.980
the big picture here? What should we take away?

00:12:53.389 --> 00:12:56.009
I think the core findings are pretty clear. First,

00:12:56.190 --> 00:12:58.870
that compute war. It's leading to power decentralizing,

00:12:58.909 --> 00:13:01.070
moving away from just one single infrastructure

00:13:01.070 --> 00:13:03.990
partner. It's a strategic necessity for companies

00:13:03.990 --> 00:13:07.610
like OpenAI. Okay. Decentralization. Yes. Second,

00:13:07.870 --> 00:13:10.429
public trust in AI is really fragile. It's under

00:13:10.429 --> 00:13:12.470
constant pressure from things like that AI washing

00:13:12.470 --> 00:13:14.389
we talked about and all these legal challenges

00:13:14.389 --> 00:13:17.889
popping up. Trust is a major issue. Right. Fragile

00:13:17.889 --> 00:13:20.500
trust. And the third thing. The third thing is

00:13:20.500 --> 00:13:23.919
just the sheer, almost unbelievable demand for

00:13:23.919 --> 00:13:26.600
compute power. It's so intense, it's literally

00:13:26.600 --> 00:13:30.279
pushing the industry to look off planet, to space,

00:13:30.419 --> 00:13:33.059
to orbit, to figure out how to power its future.

00:13:33.980 --> 00:13:36.779
Decentralization, fragile trust, and looking

00:13:36.779 --> 00:13:39.690
to space for power. Got it. And knowing this

00:13:39.690 --> 00:13:42.029
stuff, it's really vital for you listening now

00:13:42.029 --> 00:13:44.289
to understand the strategies behind these huge

00:13:44.289 --> 00:13:47.230
tech companies. Their decisions today are genuinely

00:13:47.230 --> 00:13:49.669
going to shape how pretty much every digital

00:13:49.669 --> 00:13:52.230
service works over the next 10 years or more.

00:13:52.450 --> 00:13:54.850
Absolutely. Okay, so here's a final thought for

00:13:54.850 --> 00:13:56.590
you to chew on, something to consider after we

00:13:56.590 --> 00:13:59.350
wrap up. Will any of these big cloud providers,

00:13:59.509 --> 00:14:02.929
AWS, Azure, Google Cloud, will any of them truly

00:14:02.929 --> 00:14:06.220
stay neutral, the Switzerland, once hosting the

00:14:06.220 --> 00:14:09.879
core AI models becomes seen as the new essential

00:14:09.879 --> 00:14:12.700
global infrastructure? Ooh, that's a great question.

00:14:13.000 --> 00:14:15.440
Can neutrality hold when the stakes are that

00:14:15.440 --> 00:14:17.299
high? Something to think about in a world that

00:14:17.299 --> 00:14:19.600
seems to be changing at, well, orbital speed.

00:14:20.039 --> 00:14:21.919
Thanks so much for joining us for this deep dive

00:14:21.919 --> 00:14:23.940
into the sources today. We really hope you found

00:14:23.940 --> 00:14:25.919
it useful. Keep exploring. Keep questioning.

00:14:26.179 --> 00:14:27.700
Yeah. Thanks, everyone. We'll see you next time.
