WEBVTT

00:00:00.000 --> 00:00:02.799
So when we talk about artificial intelligence,

00:00:02.980 --> 00:00:05.660
the conversation often gets stuck in these two

00:00:05.660 --> 00:00:08.060
really simple boxes. Yeah, it's either the super

00:00:08.060 --> 00:00:11.880
sunny view, instant utopia, robots doing everything.

00:00:11.919 --> 00:00:13.980
Or it's the complete opposite: total disaster,

00:00:14.279 --> 00:00:16.519
societal collapse looming. Right. And that black

00:00:16.519 --> 00:00:19.679
and white thinking, it kind of misses the reality

00:00:19.679 --> 00:00:23.000
that, well, the sources we looked at, the folks

00:00:23.000 --> 00:00:25.039
actually building this stuff, are really focused

00:00:25.039 --> 00:00:27.940
on. It's not about perfection or destruction

00:00:27.940 --> 00:00:29.660
happening overnight, it's about the timeline.

00:00:30.260 --> 00:00:34.219
We're heading into, well, a pretty messy, uncomfortable

00:00:34.219 --> 00:00:37.520
transition, maybe 10, 15 years. And that transition,

00:00:37.659 --> 00:00:39.460
that's what we're calling the tough years. And

00:00:39.460 --> 00:00:41.700
if there's one really big warning buried in all

00:00:41.700 --> 00:00:44.899
the expert analysis we read, it's this. You really,

00:00:44.899 --> 00:00:47.259
really do not want to be financially vulnerable

00:00:47.259 --> 00:00:50.200
when this period hits peak instability.

00:00:50.879 --> 00:00:54.179
Welcome to the deep dive. So our mission today

00:00:54.179 --> 00:00:56.920
is basically to pull out the practical economic

00:00:56.920 --> 00:00:59.100
strategy from all that noise. We're going to

00:00:59.100 --> 00:01:01.780
set aside the hype, set aside the fear and focus

00:01:01.780 --> 00:01:04.670
on how you can build some money safety, like

00:01:04.670 --> 00:01:08.150
actually owning durable assets before AI makes

00:01:08.150 --> 00:01:10.709
catching up feel almost impossible for the average

00:01:10.709 --> 00:01:13.010
person. OK, so let's unpack this. It seems like

00:01:13.010 --> 00:01:15.730
a really crucial path forward. First, we need

00:01:15.730 --> 00:01:19.030
to nail down why that 15-year mess seems, well,

00:01:19.430 --> 00:01:22.730
inevitable and why banking on a safe job is actually

00:01:22.730 --> 00:01:26.950
a dangerous idea. Then we'll pinpoint those three

00:01:26.950 --> 00:01:30.239
kinds of core assets that AI just can't replicate.

00:01:30.379 --> 00:01:32.560
Yeah, the stuff it can't copy. And finally, we

00:01:32.560 --> 00:01:34.900
get to what sounds like the most powerful strategy,

00:01:35.319 --> 00:01:37.599
how to actually control the gateways in this

00:01:37.599 --> 00:01:40.739
new AI-powered economy. Let's do it. OK, starting

00:01:40.739 --> 00:01:43.859
with this unavoidable 15-year reality. We mentioned

00:01:43.859 --> 00:01:46.319
the two common camps, the optimists and the pessimists,

00:01:46.420 --> 00:01:49.959
but this third view, the more, let's say, sober

00:01:49.959 --> 00:01:52.000
perspective, it suggests we're looking at maybe

00:01:52.000 --> 00:01:54.200
a decade and a half where AI is smart enough

00:01:54.200 --> 00:01:58.299
to take jobs, but society just isn't ready structurally

00:01:58.299 --> 00:02:01.359
for what comes next. And that gap, that difference

00:02:01.359 --> 00:02:04.719
in speed? That's the absolute key. Technology

00:02:04.719 --> 00:02:07.500
moves like code. It's exponential. It copies

00:02:07.500 --> 00:02:11.020
itself, improves almost instantly. But governments,

00:02:11.439 --> 00:02:14.259
big institutions, they move at the speed of,

00:02:14.259 --> 00:02:17.400
well, bureaucracy. Committees, debates, voting,

00:02:17.979 --> 00:02:20.689
it's slow. And that pace gap is what creates

00:02:20.689 --> 00:02:23.150
the crisis, isn't it? Like past tech shifts,

00:02:23.629 --> 00:02:26.069
think Uber replacing taxis. Right, it just shifted

00:02:26.069 --> 00:02:29.229
workers sideways. It created a new similar job.

00:02:29.469 --> 00:02:32.069
Exactly. But the analysis we saw makes it really

00:02:32.069 --> 00:02:35.330
clear, AI ends jobs. It doesn't just change them.

00:02:35.469 --> 00:02:37.090
Once self-driving trucks are everywhere, that

00:02:37.090 --> 00:02:39.780
long-haul driver job, poof, it's just gone. Yeah.

00:02:39.939 --> 00:02:41.819
No new driving job replaces it for that person.

00:02:41.879 --> 00:02:44.500
And that's the economic problem. If millions

00:02:44.500 --> 00:02:46.280
suddenly lose their income. They lose their buying

00:02:46.280 --> 00:02:48.560
power. Right. And if demand drops off a cliff,

00:02:48.620 --> 00:02:51.740
the whole consumer economy could just seize up,

00:02:52.020 --> 00:02:54.300
collapse even. Which is where something like

00:02:54.300 --> 00:02:57.460
universal basic income, UBI, starts to look less

00:02:57.460 --> 00:02:59.740
like a political nice-to-have. And more like an

00:02:59.740 --> 00:03:02.740
economic necessity. Exactly. It might be the

00:03:02.740 --> 00:03:04.900
only way that people without jobs can keep buying

00:03:04.900 --> 00:03:07.819
anything, keeping enough demand alive so the

00:03:07.819 --> 00:03:10.169
businesses that do survive. Don't just fold.

00:03:10.909 --> 00:03:13.250
So UBI becomes this kind of mandatory safety

00:03:13.250 --> 00:03:16.250
net for the whole structure. Pretty much. But,

00:03:16.530 --> 00:03:18.469
and this is the crucial part, because institutions

00:03:18.469 --> 00:03:21.449
move so slowly, society won't just flip a switch

00:03:21.449 --> 00:03:24.370
and implement UBI overnight. No. And that lag,

00:03:24.509 --> 00:03:27.469
that gap between the problem hitting and the

00:03:27.469 --> 00:03:30.069
solution being ready, that's what creates these

00:03:30.069 --> 00:03:32.229
tough years. OK. And those tough years, that

00:03:32.229 --> 00:03:35.289
10 to 15 year window, that's basically the last

00:03:35.289 --> 00:03:38.389
chance for regular people to build real financial

00:03:38.389 --> 00:03:40.849
security the old way. That seems to be the argument.

00:03:41.150 --> 00:03:43.430
Because after that, once AI is generating most

00:03:43.430 --> 00:03:46.189
of the new wealth, if you don't already own assets,

00:03:46.870 --> 00:03:48.430
catching up is going to be exponentially harder.

00:03:48.620 --> 00:03:51.280
So if that transition feels inevitable, just

00:03:51.280 --> 00:03:54.240
how critical is getting something like UBI in

00:03:54.240 --> 00:03:57.219
place to stop a really catastrophic economic

00:03:57.219 --> 00:03:59.860
failure globally? It's framed as necessary, really,

00:04:00.180 --> 00:04:02.319
to keep consumer demand from collapsing entirely.

00:04:02.539 --> 00:04:05.340
OK. That inherent uncertainty means just relying

00:04:05.340 --> 00:04:08.400
on a steady job income feels, well, fundamentally

00:04:08.400 --> 00:04:11.319
risky now. Totally. Which brings us to the next

00:04:11.319 --> 00:04:13.870
point: this really dangerous fantasy people have

00:04:13.870 --> 00:04:16.810
about certain jobs being safe from AI. Right,

00:04:16.889 --> 00:04:19.290
people are looking for shelter, naturally, but

00:04:19.290 --> 00:04:22.310
the data seems to debunk these hopes pretty systematically.

00:04:22.490 --> 00:04:25.410
Yeah, let's tackle them. Lie number one, creative

00:04:25.410 --> 00:04:28.670
jobs are safe, you know, because they need feeling

00:04:28.670 --> 00:04:32.290
or originality. But the research suggests maybe

00:04:32.290 --> 00:04:36.129
only the very top, like 1% of artists, the ones

00:04:36.129 --> 00:04:39.439
with truly unique signature styles might be okay.

00:04:39.759 --> 00:04:42.240
Because AI is getting incredibly good at handling

00:04:42.240 --> 00:04:45.040
the average creative task. Need a basic logo,

00:04:45.500 --> 00:04:47.800
a standard product description, background music

00:04:47.800 --> 00:04:49.879
for a corporate video. AI can do it, faster,

00:04:50.079 --> 00:04:52.360
cheaper. And often good enough for 90% of needs.

00:04:52.860 --> 00:04:55.500
We're seeing tools now like Suno AI. You give

00:04:55.500 --> 00:04:57.339
it one sentence and it spits out a full song.

00:04:57.560 --> 00:05:01.120
Lyrics, music, vocals, broadcast quality. It's

00:05:01.120 --> 00:05:04.220
wild. Wow. Okay, lie number two. Caring jobs

00:05:04.220 --> 00:05:06.790
are safe. You know, teachers, nurses. AI lacks

00:05:06.790 --> 00:05:09.129
empathy, right? True. It lacks genuine feeling.

00:05:09.670 --> 00:05:12.629
But the sources show AI is poised to absorb up

00:05:12.629 --> 00:05:14.649
to 80% of the administrative and support burdens

00:05:14.649 --> 00:05:17.029
in these fields. 80%. So how does that play out?

00:05:17.290 --> 00:05:19.910
Take teaching. If AI handles all the grading,

00:05:20.430 --> 00:05:22.509
creates personalized lesson plans for each student,

00:05:23.250 --> 00:05:25.879
manages the endless paperwork. Then the human

00:05:25.879 --> 00:05:29.600
teacher's role changes. Massively. Suddenly one

00:05:29.600 --> 00:05:31.699
human teacher might effectively manage a class

00:05:31.699 --> 00:05:35.040
of 100 students, not 30. So you just need fewer

00:05:35.040 --> 00:05:37.560
teachers overall, even if that human connection

00:05:37.560 --> 00:05:39.680
part remains vital for the kids who get it. I

00:05:39.680 --> 00:05:43.399
see. And in medicine. Similar shift. Highly skilled

00:05:43.399 --> 00:05:45.439
doctors might transition from doing the scan

00:05:45.439 --> 00:05:48.560
analysis to managing the AI that reads thousands

00:05:48.560 --> 00:05:52.379
of scans 24/7 with incredible accuracy. Again,

00:05:52.680 --> 00:05:54.819
efficiency leads to needing fewer humans for

00:05:54.819 --> 00:05:57.160
the same output. OK. And the third big one, lie

00:05:57.160 --> 00:06:00.259
number three, manual jobs are safe. You know,

00:06:00.319 --> 00:06:03.279
the classic AI can't swing a hammer or fix my

00:06:03.279 --> 00:06:05.399
leaky faucet. Yeah. But we got to remember the

00:06:05.399 --> 00:06:08.649
simple equation: AI is the brain. Robots are

00:06:08.649 --> 00:06:11.170
the body. Exactly. This shift will likely take

00:06:11.170 --> 00:06:13.509
longer, no doubt. Getting dexterous, adaptable

00:06:13.509 --> 00:06:16.310
robots is harder, but the progress is just dizzying.

00:06:16.490 --> 00:06:19.189
Like Boston Dynamics. Yeah, look at them. Or

00:06:19.189 --> 00:06:21.490
robot arms in kitchens cooking a thousand perfect

00:06:21.490 --> 00:06:24.089
meals without a break. Construction bots building

00:06:24.089 --> 00:06:26.850
walls faster, straighter than humans. The cost

00:06:26.850 --> 00:06:29.790
is dropping, the capability is rising. They will

00:06:29.790 --> 00:06:32.490
displace manual labor eventually, globally. It

00:06:32.490 --> 00:06:35.050
feels... Honestly, it feels incredibly challenging

00:06:35.050 --> 00:06:37.310
to keep up with how fast everything is changing.

00:06:37.949 --> 00:06:40.550
I'll admit, I still wrestle sometimes with how

00:06:40.550 --> 00:06:42.970
to advise people on learning new skills when

00:06:42.970 --> 00:06:45.649
the target seems to shift every six months, you

00:06:45.649 --> 00:06:49.170
know, trying to find a safe job in this environment.

00:06:49.949 --> 00:06:52.170
One source described it like trying to hide on

00:06:52.170 --> 00:06:54.310
a tiny rubber dinghy during a tsunami. It just

00:06:54.310 --> 00:06:57.569
feels futile. So if pretty much every job category

00:06:57.569 --> 00:07:00.240
is unstable to some degree, what's the single

00:07:00.240 --> 00:07:02.860
most crucial quality an asset needs to have right

00:07:02.860 --> 00:07:05.550
now to survive this transition? Durability, definitely.

00:07:05.649 --> 00:07:07.649
And it must be something AI can't easily make

00:07:07.649 --> 00:07:10.029
or copy itself. Right. So if the job is being

00:07:10.029 --> 00:07:12.230
the rower on the boat and the boat's sinking...

00:07:12.230 --> 00:07:14.589
You need to own the boat itself. Or maybe a lifeboat.

00:07:14.889 --> 00:07:16.670
Exactly. So let's talk about owning the boat.

00:07:17.310 --> 00:07:20.149
These specific assets, the real things, AI can't

00:07:20.149 --> 00:07:22.850
just conjure up. The core strategy seems to be

00:07:22.850 --> 00:07:26.029
investing in kind of boring stuff. Assets with

00:07:26.029 --> 00:07:28.829
a long lifespan that this new, super-efficient

00:07:28.829 --> 00:07:31.589
AI economy has to use. Okay, like what? Group

00:07:31.589 --> 00:07:34.860
one. Physical spaces, but with strict rules attached.

00:07:35.319 --> 00:07:37.699
The key insight here isn't the building itself.

00:07:37.980 --> 00:07:40.939
It's the legal stuff. The licenses, the regulations,

00:07:41.120 --> 00:07:43.560
the zoning laws. Ah, because government moves

00:07:43.560 --> 00:07:47.360
slowly. Glacially. AI moves at light speed, but

00:07:47.360 --> 00:07:49.980
getting permits? That takes time, money, lawyers.

00:07:50.339 --> 00:07:52.600
That regulatory bottleneck becomes the valuable

00:07:52.600 --> 00:07:54.920
asset. OK, give me an example. All right. Imagine

00:07:54.920 --> 00:07:57.839
AI instantly creates 100 amazing new cake brands

00:07:57.839 --> 00:08:00.600
overnight. Fantastic. But where do they legally

00:08:00.600 --> 00:08:02.839
bake those cakes for sale? They need a licensed

00:08:02.839 --> 00:08:05.639
commercial kitchen. Bingo. If you own that licensed

00:08:05.639 --> 00:08:08.079
kitchen, all 100 AI cake companies might need to

00:08:08.079 --> 00:08:10.899
rent space from you. Or, say an AI discovers

00:08:10.899 --> 00:08:13.180
a revolutionary new drug formula. Still needs

00:08:13.180 --> 00:08:16.560
testing. FDA approval. Right. It must pay your

00:08:16.560 --> 00:08:20.120
licensed lab to do the required FDA trials. The

00:08:20.120 --> 00:08:22.019
value isn't the lab equipment nearly as much

00:08:22.019 --> 00:08:23.920
as the license protected by the Food and Drug

00:08:23.920 --> 00:08:26.939
Administration. AI can't just code its way around

00:08:26.939 --> 00:08:29.759
that. So the asset isn't the physical thing,

00:08:29.819 --> 00:08:32.279
it's the bureaucratic moat around it. The months

00:08:32.279 --> 00:08:35.360
of paperwork, the fees, the audits. AI can't

00:08:35.360 --> 00:08:38.500
fast -track that. Precisely. Okay, group two.

00:08:39.320 --> 00:08:42.019
Human services, but based on trust and status.

00:08:42.500 --> 00:08:44.879
This feels counterintuitive if everything gets

00:08:44.879 --> 00:08:47.340
automated and cheap. Humans will suddenly pay

00:08:47.340 --> 00:08:49.820
a massive premium for genuine human connection,

00:08:50.000 --> 00:08:52.639
reassurance, and, well, peace of mind, status.

00:08:52.740 --> 00:08:55.399
The trust economy. Yeah. AI might give you a

00:08:55.399 --> 00:08:58.419
medical diagnosis in seconds, for free, but wealthy

00:08:58.419 --> 00:09:01.460
clients. They'll still pay, say, $20,000 a year

00:09:01.460 --> 00:09:03.759
for concierge medicine. Why? For the relationship,

00:09:03.940 --> 00:09:06.240
the trusted human doctor they can call any time.

00:09:06.500 --> 00:09:09.279
Exactly. They're buying access and trust. Or

00:09:09.279 --> 00:09:11.279
think of an elite coach. They aren't just selling

00:09:11.279 --> 00:09:13.519
a workout plan AI could generate. They're selling

00:09:13.519 --> 00:09:16.360
confidence, accountability, that personal connection,

00:09:16.860 --> 00:09:19.460
bespoke relationship management. That makes sense

00:09:19.460 --> 00:09:22.009
for the super rich. But how does the average

00:09:22.009 --> 00:09:24.809
person tap into that trust economy besides just

00:09:24.809 --> 00:09:26.690
being the one providing the high -end service?

00:09:26.769 --> 00:09:28.730
Is it just about selling your personal connection?

00:09:29.149 --> 00:09:31.190
Well, partly, yes, building that reputation.

00:09:31.470 --> 00:09:32.850
Yeah. But think bigger about the value shift

00:09:32.850 --> 00:09:36.889
itself. It's kind of profound. Whoa. Yeah. Imagine

00:09:36.889 --> 00:09:39.629
a world where almost everything material is abundant

00:09:39.629 --> 00:09:42.250
and cheap thanks to AI and robots. Yeah. And

00:09:42.250 --> 00:09:47.070
yet people willingly pay huge sums just for a

00:09:47.070 --> 00:09:49.700
trusted human hand on the shoulder, for authentic

00:09:49.700 --> 00:09:52.379
connection, that willingness, that need for genuine

00:09:52.379 --> 00:09:54.620
human interaction in a hyper-efficient world,

00:09:55.299 --> 00:09:57.940
that's a deep value. Okay. That leads into group

00:09:57.940 --> 00:10:00.799
three, physical goods with a story. Right. If

00:10:00.799 --> 00:10:03.799
AI can manufacture anything new, perfectly on

00:10:03.799 --> 00:10:06.100
demand, then mass-produced items lose their

00:10:06.100 --> 00:10:08.759
cachet, authenticity becomes rare. Exactly. People

00:10:08.759 --> 00:10:10.620
will turn back to things with history, rarity,

00:10:10.840 --> 00:10:13.720
a verifiable story, a human touch, the signature.

00:10:13.840 --> 00:10:15.980
Like that 50-year-old malt whiskey example.

00:10:16.399 --> 00:10:19.740
Perfect example. AI could mimic the taste, maybe

00:10:19.740 --> 00:10:22.720
perfectly, but it can't sell that specific bottle

00:10:22.720 --> 00:10:26.080
for $50,000. Why? It doesn't have the story.

00:10:26.340 --> 00:10:28.779
It didn't sit in that barrel for five decades.

00:10:29.240 --> 00:10:31.360
People pay for the history. The uniqueness AI

00:10:31.360 --> 00:10:33.980
can't replicate, or like handmade furniture from

00:10:33.980 --> 00:10:37.080
a famous artisan. The signature matters. OK,

00:10:37.240 --> 00:10:40.820
so beyond just scarcity or history, how can we

00:10:40.820 --> 00:10:43.120
really test if these asset types will hold up

00:10:43.120 --> 00:10:45.519
long term through the whole transition? Apply

00:10:45.519 --> 00:10:48.919
the after UBI test. Ask yourself, once people

00:10:48.919 --> 00:10:50.519
have their basic needs met through something

00:10:50.519 --> 00:10:52.500
like UBI, and they don't have to work just to

00:10:52.500 --> 00:10:55.299
survive, will they still voluntarily choose to

00:10:55.299 --> 00:10:56.899
spend their money on this thing, this asset,

00:10:57.019 --> 00:11:00.039
this service? Ah, the after UBI test. That forces

00:11:00.039 --> 00:11:02.440
a focus on what people truly desire beyond mere

00:11:02.440 --> 00:11:05.320
survival, connection, status, novelty, history.

00:11:05.600 --> 00:11:07.840
Exactly. It filters for durable human desires.

00:11:08.360 --> 00:11:10.679
But we can refine the strategy even more. Owning

00:11:10.679 --> 00:11:13.539
a single asset is good, but controlling a gateway?

00:11:13.899 --> 00:11:15.879
That's even better. A gateway, like a choke point

00:11:15.879 --> 00:11:18.600
everyone has to pass through. Yes, be the gatekeeper.

00:11:18.879 --> 00:11:20.600
It's often better than just owning one nice house

00:11:20.600 --> 00:11:22.659
on the street. Okay, what kinds of gateways are

00:11:22.659 --> 00:11:25.700
there? Gateway one is infrastructure. Think of

00:11:25.700 --> 00:11:28.679
it as the pipe. The goal here is to own some

00:11:28.679 --> 00:11:31.840
physical necessity that all these new AI -driven

00:11:31.840 --> 00:11:34.940
businesses need, no matter what specific product

00:11:34.940 --> 00:11:36.480
they're selling. So you're not competing with

00:11:36.480 --> 00:11:39.740
the hundred AI t-shirt companies? No way. Instead,

00:11:39.799 --> 00:11:41.919
you own the specialized fabric printing factory

00:11:41.919 --> 00:11:45.000
they all have to use. Or you own the key temperature-

00:11:45.000 --> 00:11:47.179
controlled warehouse in the region that all

00:11:47.179 --> 00:11:49.820
the AI gourmet food delivery services rely on.

00:11:49.980 --> 00:11:52.220
And you charge everyone a fee to use your pipe.

00:11:52.360 --> 00:11:55.129
You got it. You're tolling the AI traffic. Gateway

00:11:55.129 --> 00:11:58.889
two. Distribution, the door. Right. AI can create

00:11:58.889 --> 00:12:01.450
the perfect personalized movie or app instantly?

00:12:02.129 --> 00:12:05.190
Great. But how does that amazing creation actually

00:12:05.190 --> 00:12:08.450
reach 100 million potential users? It needs a

00:12:08.450 --> 00:12:11.730
distribution channel. A door. We see this already,

00:12:12.009 --> 00:12:14.669
right? Apple's App Store. It's a classic gateway.

00:12:15.049 --> 00:12:16.809
They don't make most of the apps, but they control

00:12:16.809 --> 00:12:18.970
the main door to the users, and they take their

00:12:18.970 --> 00:12:22.529
30% cut. A toll for access. Exactly. In the

00:12:22.529 --> 00:12:24.870
physical world, this could be controlling the

00:12:24.870 --> 00:12:27.629
main logistics hubs, the last mile delivery networks

00:12:27.629 --> 00:12:30.929
in a big city. Power concentrates at that access

00:12:30.929 --> 00:12:33.149
point. And the third gateway, this sounds like

00:12:33.149 --> 00:12:36.509
maybe the strongest one, regulation or the rules.

00:12:36.789 --> 00:12:38.929
Yeah, this one's powerful because it's protected

00:12:38.929 --> 00:12:42.240
by government inertia. AI can't lobby Congress,

00:12:42.720 --> 00:12:45.279
it can't vote, it can't rewrite zoning laws itself.

00:12:45.460 --> 00:12:49.039
So if you hold a specific, hard-to-get license.

00:12:49.240 --> 00:12:51.460
Like a license to operate in a highly regulated

00:12:51.460 --> 00:12:54.379
industry. Handling specialized medical waste,

00:12:54.840 --> 00:12:57.059
importing certain controlled goods, running a

00:12:57.059 --> 00:12:59.539
specific type of bank, having mineral rights?

00:12:59.820 --> 00:13:02.200
Then any AI company wanting to operate in that

00:13:02.200 --> 00:13:04.460
space. They have to partner with you or maybe

00:13:04.460 --> 00:13:06.399
acquire you. They can't just code a workaround

00:13:06.399 --> 00:13:08.700
for the legal requirement. The license is the

00:13:08.700 --> 00:13:10.600
unavoidable gate. And because getting or changing

00:13:10.600 --> 00:13:12.899
those licenses takes so long through government

00:13:12.899 --> 00:13:15.379
channels. They offer incredible stability, potentially

00:13:15.379 --> 00:13:18.100
for decades, a real financial moat during the

00:13:18.100 --> 00:13:20.679
tough years. That makes a lot of sense. But isn't

00:13:20.679 --> 00:13:23.860
there a risk in relying on bureaucracy? Couldn't

00:13:23.860 --> 00:13:27.240
a future government just regulate away your license,

00:13:27.279 --> 00:13:29.299
your monopoly? Oh, absolutely. All regulation

00:13:29.299 --> 00:13:31.980
carries political risk. But the argument in the

00:13:31.980 --> 00:13:34.879
sources is about timelines. The time it takes

00:13:34.879 --> 00:13:37.860
government to dismantle a critical embedded license,

00:13:38.039 --> 00:13:41.259
say, for a major resource is likely vastly longer

00:13:41.259 --> 00:13:43.919
than the time it takes AI to make a specific

00:13:43.919 --> 00:13:46.759
human skill obsolete. You're betting on inertia

00:13:46.759 --> 00:13:49.700
winning in the medium term. OK. As we look even

00:13:49.700 --> 00:13:51.220
further ahead, the source has brought up a couple

00:13:51.220 --> 00:13:53.779
of really fascinating, almost strange ideas about

00:13:53.779 --> 00:13:56.940
how value itself might shift based on these principles.

00:13:57.240 --> 00:13:59.120
Yeah, two really interesting thought experiments.

00:13:59.480 --> 00:14:03.019
Strange idea one: homes become super cheap, the

00:14:03.019 --> 00:14:05.440
actual structure anyway. How so? Well, imagine

00:14:05.440 --> 00:14:08.779
AI designs flawless optimized houses and robots

00:14:08.779 --> 00:14:11.100
using advanced materials build them on site in,

00:14:11.100 --> 00:14:13.600
say, three days. The cost of the building itself

00:14:13.600 --> 00:14:15.659
could plummet towards almost zero. So the house

00:14:15.659 --> 00:14:17.940
itself isn't worth much anymore. Right. The standard

00:14:17.940 --> 00:14:19.799
housing market might collapse. So where does

00:14:19.799 --> 00:14:21.759
the value go? It shifts entirely to the land.

00:14:22.039 --> 00:14:24.940
Location, location, location. Exactly. You can't

00:14:24.940 --> 00:14:28.000
3D print more beachfront land. You can't AI generate

00:14:28.000 --> 00:14:30.940
more acres in a prime historic district, or right

00:14:30.940 --> 00:14:33.179
next to that critical gateway warehouse we talked

00:14:33.179 --> 00:14:36.820
about. Unique, well -located land becomes the

00:14:36.820 --> 00:14:39.759
scarce, expensive thing. Wow. Okay, strange idea

00:14:39.759 --> 00:14:43.019
two involves a whole industry potentially vanishing,

00:14:43.759 --> 00:14:46.820
like farming. Yeah, imagine this. An AI perfects

00:14:46.820 --> 00:14:49.120
a food printer, like a Star Trek replicator.

00:14:49.320 --> 00:14:52.539
It uses cheap, basic nutritional powder, algae,

00:14:52.840 --> 00:14:55.200
insect protein, whatever, and synthesizes meals

00:14:55.200 --> 00:14:57.539
that are delicious, perfectly healthy, and customized

00:14:57.539 --> 00:14:59.840
to your needs. Okay, so world hunger solved,

00:15:00.120 --> 00:15:02.559
maybe? Potentially. But what happens to traditional

00:15:02.559 --> 00:15:05.019
agriculture? Farming, long-haul food transport,

00:15:05.120 --> 00:15:06.820
maybe even grocery stores as we know them. They

00:15:06.820 --> 00:15:09.340
could largely disappear, or shrink massively.

00:15:09.379 --> 00:15:11.879
But we're still human, right? Even with perfect

00:15:11.879 --> 00:15:14.120
nutrient paste available, what becomes a luxury?

00:15:14.360 --> 00:15:17.480
Eating real food, like an actual strawberry grown

00:15:17.480 --> 00:15:21.480
in the dirt. Bingo. Eating a messy, imperfect,

00:15:21.899 --> 00:15:24.740
authentic, ground-grown strawberry suddenly

00:15:24.740 --> 00:15:27.639
becomes a status symbol, a luxury experience.

00:15:28.179 --> 00:15:30.240
So you might see the emergence of luxury farming,

00:15:30.799 --> 00:15:33.759
small-scale, high-end, focused entirely on

00:15:33.759 --> 00:15:36.019
authenticity and the story, selling to people

00:15:36.019 --> 00:15:38.519
who want the real thing. The value shifts completely

00:15:38.519 --> 00:15:41.980
from necessity to narrative and status. You got

00:15:41.980 --> 00:15:45.139
it. Okay, let's try to wrap up this deep dive

00:15:45.139 --> 00:15:48.559
into navigating the financial side of the tough

00:15:48.559 --> 00:15:50.679
years. The road ahead looks messy, definitely

00:15:50.679 --> 00:15:53.940
unstable, but the analysis suggests it is navigable.

00:15:54.019 --> 00:15:55.899
Yeah, if you act now. Well, the current economic

00:15:55.899 --> 00:15:58.259
rules are still, you know, mostly familiar. The

00:15:58.259 --> 00:16:00.980
goal isn't to try and outsmart or outcompete

00:16:00.980 --> 00:16:03.879
AI at its own game of speed and efficiency. No,

00:16:04.080 --> 00:16:06.200
your job is to build a kind of financial fortress.

00:16:06.460 --> 00:16:08.379
something that can withstand that 10 to 15 year

00:16:08.379 --> 00:16:11.039
storm before the new safety nets like UBI are

00:16:11.039 --> 00:16:13.100
fully in place and working. And that seems to

00:16:13.100 --> 00:16:15.559
boil down to three key actions right now. One,

00:16:15.940 --> 00:16:18.019
act now while things are still relatively stable.

00:16:18.080 --> 00:16:20.500
Two, own the gateway, that infrastructure, that

00:16:20.500 --> 00:16:22.940
regulatory license, that distribution choke point.

00:16:23.159 --> 00:16:26.720
And three, focus only on assets that pass that

00:16:26.720 --> 00:16:29.940
after UBI test, things that tap into deep human

00:16:29.940 --> 00:16:33.320
needs for connection, status, history, authenticity.

00:16:33.710 --> 00:16:36.490
So if you manage to prepare successfully, if

00:16:36.490 --> 00:16:39.289
you build that resilience, and AI does eventually

00:16:39.289 --> 00:16:42.710
take over all the necessary boring work for humanity.

00:16:43.029 --> 00:16:45.389
It leads to a really big kind of philosophical

00:16:45.389 --> 00:16:47.909
question for you, the listener, to think about.

00:16:47.909 --> 00:16:50.710
If you no longer have to work just to survive,

00:16:51.610 --> 00:16:54.309
what will you choose to create? What truly human

00:16:54.309 --> 00:16:56.590
project, what passion, what authentic connection

00:16:56.590 --> 00:16:59.149
will you pursue? What becomes your purpose then?

00:16:59.960 --> 00:17:02.039
That's a powerful thought. So we really encourage

00:17:02.039 --> 00:17:04.200
you listening. Start looking now for the unique

00:17:04.200 --> 00:17:06.440
gateways in your own field, in your local area.

00:17:06.500 --> 00:17:08.880
Yeah, where are those hidden licenses? The essential

00:17:08.880 --> 00:17:11.099
warehouses. Yeah. The transport bottlenecks.

00:17:11.140 --> 00:17:13.539
Find the gate. And then figure out how you can

00:17:13.539 --> 00:17:14.000
get the key.
