WEBVTT

00:00:00.000 --> 00:00:01.740
So I want you to imagine something for a second.

00:00:02.020 --> 00:00:05.580
You're standing on a beach, and it's perfect.

00:00:05.900 --> 00:00:07.860
The sun is out. The air is warm. You can smell

00:00:07.860 --> 00:00:11.519
the salt. But then you notice something a little

00:00:11.519 --> 00:00:14.119
strange. The water pulls back. I mean, it goes

00:00:14.119 --> 00:00:16.239
further out than you've ever seen. And the sand

00:00:16.239 --> 00:00:20.940
is suddenly just dry and exposed. It gets really

00:00:20.940 --> 00:00:23.260
quiet. And if you don't know what that means,

00:00:23.320 --> 00:00:25.940
you might just think, huh, low tide. But that

00:00:25.940 --> 00:00:29.420
quiet, that drawback, that's the warning sign

00:00:29.420 --> 00:00:34.140
for a tsunami, a wall of water moving at 800 kilometers

00:00:34.140 --> 00:00:36.939
per hour. And that image, that exact moment? That's

00:00:36.939 --> 00:00:39.700
where the job market is right now. Last year, 1

00:00:39.700 --> 00:00:43.560
.1 million jobs gone. At the exact same time, companies

00:00:43.560 --> 00:00:46.939
are just pouring billions into AI. It's not a

00:00:46.939 --> 00:00:49.960
wave. It is a wall of water, and most people are

00:00:49.960 --> 00:00:51.000
you know, they're just standing there on the

00:00:51.000 --> 00:00:53.600
sand looking at the seashells. It's a really powerful

00:00:53.600 --> 00:00:56.520
image. And welcome to the Deep Dive. Today, we

00:00:56.520 --> 00:00:58.619
are not just going to talk about AI technology.

00:00:58.619 --> 00:01:00.380
We're not talking about the newest chatbot. We're

00:01:00.380 --> 00:01:02.600
looking at a blueprint for, well, for your own

00:01:02.600 --> 00:01:05.319
survival, for thriving, really, in a world that

00:01:05.319 --> 00:01:08.159
is shifting right under our feet, right? And

00:01:08.159 --> 00:01:10.640
our mission for this deep dive is super specific.

00:01:10.640 --> 00:01:13.120
We want to move you from what the source material

00:01:13.120 --> 00:01:16.840
calls the flood zone. And that's where, like, 99

00:01:16.840 --> 00:01:20.620
% of people are, to the high ground. The top 1

00:01:20.620 --> 00:01:24.019
% who adapt. Exactly. So we have a roadmap. First,

00:01:24.099 --> 00:01:25.700
we're going to look at something called the RAIL

00:01:25.700 --> 00:01:27.920
test to figure out how safe you are right now.

00:01:28.120 --> 00:01:31.260
A quick diagnostic. Then a framework called script

00:01:31.260 --> 00:01:34.379
versus strategy for your actual work day. And

00:01:34.379 --> 00:01:36.620
finally, we'll get into the specific tools and

00:01:36.620 --> 00:01:39.120
the three Rs you need to, you know, build a career

00:01:39.120 --> 00:01:41.840
fortress. We have to get tactical because that

00:01:41.840 --> 00:01:43.989
water is coming back. And it is coming back fast.

00:01:44.170 --> 00:01:45.870
So let's stay with that metaphor for a second.

00:01:46.230 --> 00:01:49.010
The flood zone versus the high ground, it feels,

00:01:49.569 --> 00:01:51.349
well, it feels a bit dramatic, doesn't it? Is

00:01:51.349 --> 00:01:53.730
it really that black and white? It feels dramatic

00:01:53.730 --> 00:01:56.030
until you look at the economics of it. I mean,

00:01:56.150 --> 00:01:59.549
the source puts it so bluntly. If your job can

00:01:59.549 --> 00:02:02.930
be done 100% on a computer, you are in a direct

00:02:02.930 --> 00:02:05.969
fight with AI. And here's the really hard truth.

00:02:06.450 --> 00:02:08.389
You're fighting a robot that costs what? Pennies

00:02:08.389 --> 00:02:11.180
to operate. And it never sleeps. You know, I

00:02:11.180 --> 00:02:15.199
have to admit, when I read that, it really gave

00:02:15.199 --> 00:02:19.259
me pause. I think we all have that sort of

00:02:19.259 --> 00:02:21.580
late night worry, right? I know I do. I'm sitting

00:02:21.580 --> 00:02:24.639
there wondering, could a machine do my research

00:02:24.639 --> 00:02:27.319
faster? Could it structure my thoughts better?

00:02:27.360 --> 00:02:29.800
It's a very human feeling. Oh, totally. I think

00:02:29.800 --> 00:02:32.120
everyone feels that little hum of anxiety. But

00:02:32.120 --> 00:02:35.060
the high ground, it isn't about becoming some

00:02:35.060 --> 00:02:37.819
computer genius overnight. It's not about learning

00:02:37.819 --> 00:02:40.800
to code. It's about positioning. Positioning?

00:02:40.840 --> 00:02:43.080
Yeah. The people who get wiped out are the ones

00:02:43.080 --> 00:02:45.379
who try to compete with the machine on the machine's

00:02:45.379 --> 00:02:47.419
terms. The high ground is where you do what the

00:02:47.419 --> 00:02:49.900
machine just can't. So how does a listener know

00:02:49.900 --> 00:02:52.259
where they stand right now? You know, am I on

00:02:52.259 --> 00:02:53.939
the beach or am I on the hill? That's where the

00:02:53.939 --> 00:02:56.099
RAIL test comes in. It's like a 50-second diagnostic.

00:02:56.180 --> 00:03:00.159
Okay. R-A-I-L. Yep. Four questions, and if

00:03:00.159 --> 00:03:03.639
you answer no to two or more of them, you're

00:03:03.639 --> 00:03:05.379
in the danger zone. Okay. Let's walk through

00:03:05.379 --> 00:03:08.419
them. What's R? R is for revenue. Does your work

00:03:08.419 --> 00:03:10.819
make money from real customers right now? So

00:03:10.819 --> 00:03:14.039
not potential money or future money. Exactly.

00:03:14.039 --> 00:03:16.099
We've all been on those projects that are in

00:03:16.099 --> 00:03:18.979
eternal beta, right? Or the department that's just

00:03:18.979 --> 00:03:22.319
researching. In this new economy, if you aren't

00:03:22.319 --> 00:03:24.419
tied to actual money coming in the door, you're

00:03:24.419 --> 00:03:28.039
vulnerable. AI lets companies cut the fat with

00:03:28.039 --> 00:03:31.620
like, surgical precision. That seems harsh, especially

00:03:31.620 --> 00:03:34.539
for people in, say, R&D or long-term strategy.

00:02:34.539 --> 00:02:38.000
It is harsh. But the window for just pure research

00:03:38.000 --> 00:03:41.379
without a clear path to application, it's shrinking

00:03:41.379 --> 00:03:45.580
fast. OK, that brings us to A, acceleration.

00:03:45.919 --> 00:03:49.860
Right. Can you deliver value to a user in two

00:03:49.860 --> 00:03:52.120
weeks? Two weeks? Wow. I mean, that seems incredibly

00:03:52.120 --> 00:03:55.319
fast for most big companies. It is. But speed

00:03:55.319 --> 00:03:57.360
is a new currency. If you're on a six-month

00:03:57.360 --> 00:03:59.919
cycle to ship a simple update, some AI-assisted

00:03:59.919 --> 00:04:02.120
competitor is going to do it in six days. If

00:04:02.120 --> 00:04:04.259
you can't move fast, you are standing still.

00:04:04.259 --> 00:04:07.099
And if you're standing still, the wave hits you.

00:04:07.460 --> 00:04:10.020
Right. OK. So speed matters. What's I? I is for

00:04:10.020 --> 00:04:12.259
in-market. Is the thing you're working on actually

00:04:12.259 --> 00:04:14.219
being used, or is it just sitting on a shelf?

00:04:14.319 --> 00:04:16.139
Is it a hobby, or is it a real business? Let

00:04:16.139 --> 00:04:19.160
me push back on that a little. What if my market

00:04:19.160 --> 00:04:22.399
is internal? You know, my boss is the consumer

00:04:22.399 --> 00:04:25.000
of my weekly report. Does that count? Ooh, that's

00:04:25.000 --> 00:04:27.720
a dangerous gray area. The source argues that

00:04:27.720 --> 00:04:31.139
internal validation is, well, it's almost political.

00:04:31.279 --> 00:04:33.899
It's polite. Oh, great report. Thanks. But real

00:04:33.899 --> 00:04:36.959
market feedback from a paying customer, that's

00:04:36.959 --> 00:04:40.279
honest. It's brutal. If you're only serving internal

00:04:40.279 --> 00:04:43.000
people, you are much closer to the flood zone

00:04:43.000 --> 00:04:46.199
than you think. OK. And finally, L. Learning.

00:04:46.860 --> 00:04:49.019
Are you learning from real customer mistakes

00:04:49.019 --> 00:04:52.810
every single day? Every day. Yeah. This is so

00:04:52.810 --> 00:04:55.850
crucial. AI models update constantly. If you're

00:04:55.850 --> 00:04:57.990
working off a playbook from three years ago or

00:04:57.990 --> 00:05:00.009
even six months ago, you're becoming obsolete.

00:05:00.189 --> 00:05:02.370
You have to be a learning machine yourself. Revenue

00:05:02.370 --> 00:05:05.170
acceleration, in-market, learning. It's a pretty

00:05:05.170 --> 00:05:07.750
tough filter. So if that test comes back red,

00:05:08.670 --> 00:05:11.949
if you realize you're in that danger zone, where's

00:05:11.949 --> 00:05:13.610
the safe ground? You have to climb. You got to

00:05:13.610 --> 00:05:15.250
get to what the source calls layer three. Layer

00:05:15.250 --> 00:05:17.029
three. OK, walk us through this map, because

00:05:17.029 --> 00:05:18.910
when people talk about AI, it's usually all about

00:05:18.910 --> 00:05:21.240
the big models or the chips. All right, so picture

00:05:21.240 --> 00:05:24.399
the AI industry like a pyramid. Three layers.

00:05:25.259 --> 00:05:27.180
Layer one, the foundation. That's infrastructure.

00:05:27.300 --> 00:05:30.319
The hardware. The hardware. The chips, the data

00:05:30.319 --> 00:05:34.120
centers. This is Nvidia's world. The trillion

00:05:34.120 --> 00:05:36.860
dollar club. So we should ignore that? Unless

00:05:36.860 --> 00:05:38.199
you have a few billion dollars lying around.

00:05:38.259 --> 00:05:41.160
Yeah. Just ignore it. It's too expensive. OK,

00:05:41.319 --> 00:05:43.920
fair enough. Layer two. Layer two is the models.

00:05:44.379 --> 00:05:47.279
This is where your OpenAI, your Google,

00:05:47.459 --> 00:05:50.120
your Anthropic. This is where they live. They're

00:05:50.120 --> 00:05:52.920
building the brains. ChatGPT, Claude. And it

00:05:52.920 --> 00:05:54.620
feels like that's where everyone wants to be,

00:05:54.740 --> 00:05:56.240
right? Everyone wants to build their own model.

00:05:56.300 --> 00:05:59.279
And that is the trap. It is so crowded. And it's

00:05:59.279 --> 00:06:01.360
so hard to win. You're fighting absolute giants.

00:06:01.519 --> 00:06:03.339
The margins there are just going to race to zero.

00:06:03.839 --> 00:06:06.480
But then there's layer three. The application

00:06:06.480 --> 00:06:08.740
layer. The app. This is it. This is the big chance.

00:06:09.199 --> 00:06:11.240
This is where you use the intelligence from layer

00:06:11.240 --> 00:06:14.199
two to solve a really boring, specific problem

00:06:14.649 --> 00:06:17.470
for a normal person. Why is that safer? Why isn't

00:06:17.470 --> 00:06:20.050
it just as competitive? Because context is everything.

00:06:20.769 --> 00:06:22.970
See, OpenAI knows a little bit about everything,

00:06:23.110 --> 00:06:26.329
generally. But it doesn't know the specific zoning

00:06:26.329 --> 00:06:29.550
laws for some small town in Ohio. It doesn't

00:06:29.550 --> 00:06:32.009
know the workflow of a dental practice in London.

00:06:32.769 --> 00:06:34.850
Layer 3 is all about taking that general brain

00:06:34.850 --> 00:06:38.470
and applying it to a very specific mess. So the

00:06:38.470 --> 00:06:41.209
opportunity isn't building the brain. It's like

00:06:41.209 --> 00:06:43.089
being the therapist who helps the brain deal

00:06:43.089 --> 00:06:45.310
with a specific patient's problem. Precisely.

00:06:45.389 --> 00:06:48.129
And there are three huge opportunities here.

00:06:48.410 --> 00:06:51.810
First is specific apps, solving those deep niche

00:06:51.810 --> 00:06:55.490
problems. Second is service and setup, just being

00:06:55.490 --> 00:06:57.629
the person who holds an old school company's

00:06:57.629 --> 00:06:59.889
hand and teaches them how to use AI. And third

00:06:59.889 --> 00:07:02.519
is data help, just cleaning up the mess of information

00:07:02.519 --> 00:07:04.839
so the AI can even use it. It really reframes

00:07:04.839 --> 00:07:06.660
it. You don't have to build the rocket ship.

00:07:06.740 --> 00:07:08.800
You just have to know how to fly it for someone.

00:07:09.019 --> 00:07:11.100
Whoa. I mean, just imagine the scale of that.

00:07:11.300 --> 00:07:13.019
We're not fighting the model builders. We're

00:07:13.019 --> 00:07:14.579
literally standing on their shoulders. You can

00:07:14.579 --> 00:07:16.660
take the smartest intelligence in human history

00:07:16.660 --> 00:07:18.839
and use it to help a local shop owner. That's

00:07:18.839 --> 00:07:21.500
massive. So once we're in that right layer, in

00:07:21.500 --> 00:07:24.959
layer three, how does our day-to-day work change?

00:07:25.519 --> 00:07:28.560
We have to separate script from strategy. Okay,

00:07:28.560 --> 00:07:30.699
the script versus strategy idea. I want to dig

00:07:30.699 --> 00:07:33.180
in here because most jobs are a mix, aren't they?

00:07:33.240 --> 00:07:35.160
Oh yeah, it's a mix, but the ratio is shifting,

00:07:35.160 --> 00:07:38.600
like, violently. The script parts? Those are the

00:07:38.600 --> 00:07:40.819
boring, repetitive tasks. Yeah. You know, typing

00:07:40.819 --> 00:07:43.240
data into a spreadsheet, writing the same basic

00:07:43.240 --> 00:07:45.939
email update over and over, summarizing meeting

00:07:45.939 --> 00:07:48.139
notes. It's anything that follows a set of rules.

00:07:48.279 --> 00:07:50.420
And strategy? Strategy is the human stuff.

00:07:50.699 --> 00:07:54.500
It's creativity, empathy. Figuring out why a

00:07:54.500 --> 00:07:56.399
client is upset, not just reading their email.

00:07:57.019 --> 00:07:58.980
It's building trust. It's connecting dots that

00:07:58.980 --> 00:08:01.500
don't seem related. And the argument is that

00:08:01.500 --> 00:08:04.680
AI just... it eats the script for breakfast.

00:08:04.879 --> 00:08:07.579
It creates this split reality. If you spend eight

00:08:07.579 --> 00:08:09.819
hours a day just following its script, you are

00:08:09.819 --> 00:08:11.920
in so much trouble. The robot does it faster,

00:08:12.079 --> 00:08:14.199
cheaper, and it never complains. So the advice

00:08:14.199 --> 00:08:17.180
is... to fire yourself. That sounds like career

00:08:17.180 --> 00:08:19.540
suicide. It's the big paradox of this whole thing.

00:08:19.560 --> 00:08:21.079
You have to fire yourself from the script to

00:08:21.079 --> 00:08:23.459
save your career. Okay, walk me through the practical

00:08:23.459 --> 00:08:26.139
steps. How do you fire yourself? It's a three

00:08:26.139 --> 00:08:29.740
-step plan. Step one, check your work. Seriously,

00:08:29.879 --> 00:08:33.559
make two columns. Script versus strategy. And

00:08:33.559 --> 00:08:35.039
you have to be honest, if a machine could learn

00:08:35.039 --> 00:08:36.940
to do it, it goes in the script column. And step

00:08:36.940 --> 00:08:41.279
two? Step two is the firing. You use AI to automate

00:08:41.279 --> 00:08:43.889
everything in that left column. You don't wait

00:08:43.889 --> 00:08:45.750
for your boss to do it, you don't wait for IT.

00:08:46.330 --> 00:08:48.330
You become the first person to automate your

00:08:48.330 --> 00:08:51.330
own job. But okay, isn't the big fear here, if

00:08:51.330 --> 00:08:53.370
I automate my job, isn't my boss just gonna see

00:08:53.370 --> 00:08:56.169
that I'm doing less and like, you know, fire

00:08:56.169 --> 00:08:58.950
me for real? That's the fear, 100%. But you have

00:08:58.950 --> 00:09:01.029
to think about the alternative. If you don't

00:09:01.029 --> 00:09:04.049
do it, a competitor will. Or a consultant will

00:09:04.049 --> 00:09:06.049
come in and do it for you. By doing it yourself,

00:09:06.250 --> 00:09:08.379
you buy yourself time for step three. The human

00:09:08.379 --> 00:09:11.259
side. You use all that time you saved for what

00:09:11.259 --> 00:09:14.340
they call deep work. Right. The walking, reading,

00:09:14.720 --> 00:09:17.360
thinking. Exactly. While everyone else is just

00:09:17.360 --> 00:09:19.759
drowning in their inbox, you're the one who's

00:09:19.759 --> 00:09:22.000
actually looking at the horizon, solving the

00:09:22.000 --> 00:09:24.539
really expensive problems. So if we automate

00:09:24.539 --> 00:09:27.320
all those tasks, what are the human traits that

00:09:27.320 --> 00:09:29.519
protect us in the long run? It's what the source

00:09:29.519 --> 00:09:32.159
calls the three Rs. The things computers just

00:09:32.159 --> 00:09:34.200
can't replace. Let's hear them. OK, first is

00:09:34.200 --> 00:09:38.399
rigor. Deep, deep expertise. AI is a fantastic

00:09:38.399 --> 00:09:41.299
liar. It hallucinates. It gives you plausible-

00:09:41.299 --> 00:09:44.080
sounding nonsense. Rigor is your ability to

00:09:44.080 --> 00:09:47.480
look at an AI output and say, nope, that's wrong.

00:09:47.500 --> 00:09:49.399
So you have to be smarter than the machine just

00:09:49.399 --> 00:09:52.379
to use the machine properly. You do. AI exposes

00:09:52.379 --> 00:09:55.940
laziness. Second R is relationships. Trust is

00:09:55.940 --> 00:09:58.700
the ultimate currency. An AI can simulate empathy,

00:09:58.820 --> 00:10:00.879
but it can't sit across from you, share a real

00:10:00.879 --> 00:10:03.039
vulnerability, buy you a coffee, and just listen.

00:10:03.379 --> 00:10:06.059
That human bond. That's what keeps clients. And

00:10:06.059 --> 00:10:08.759
the third R. Resilience. It's the raw ability

00:10:08.759 --> 00:10:10.519
to change. The tools we're talking about today

00:10:10.519 --> 00:10:12.200
might be gone in six months. The whole market

00:10:12.200 --> 00:10:14.259
could shift. If you crumble when your workflow

00:10:14.259 --> 00:10:17.340
changes, you lose. Rigor, relationships, resilience.

00:10:17.580 --> 00:10:19.779
OK, we promised some secret weapons. What are

00:10:19.779 --> 00:10:22.759
the tools? Three specific ones. First up, Perplexity

00:10:22.759 --> 00:10:25.840
AI. Right, for sourcing news and answers. It's

00:10:25.840 --> 00:10:28.940
for sourcing truth. See, unlike a normal chatbot,

00:10:29.220 --> 00:10:31.480
it actually cites its sources. It shows you where

00:10:31.480 --> 00:10:34.960
the information came from. In an age of deepfakes,

00:10:35.299 --> 00:10:38.059
Perplexity is your anchor for rigor. Got it.

00:10:38.159 --> 00:10:41.220
OK, number two. Cursor. Now, don't tune out if

00:10:41.220 --> 00:10:43.279
you're not a coder. It lets you build little

00:10:43.279 --> 00:10:46.399
apps or fix data just by asking for it in plain

00:10:46.399 --> 00:10:48.440
English. It helps you pass that acceleration

00:10:48.440 --> 00:10:50.600
test we were talking about. And the third tool.

00:10:51.419 --> 00:10:54.149
Gamma. It makes presentations. You just type

00:10:54.149 --> 00:10:56.330
in a topic, and it builds the whole slide deck:

00:10:56.330 --> 00:10:58.769
formatting, images, everything. It stops you

00:10:58.769 --> 00:11:00.950
from wasting hours just moving text boxes around.

00:11:01.230 --> 00:11:03.269
It frees you from the script. But it's not just

00:11:03.269 --> 00:11:05.049
having the tool, is it? It's how you talk to

00:11:05.049 --> 00:11:07.350
it. That's everything. It's the prompt. And the

00:11:07.350 --> 00:11:09.409
source has this consultant prompt that I just

00:11:09.409 --> 00:11:11.730
love. How does that one go? So you explain your

00:11:11.730 --> 00:11:13.730
problem to the AI, and then you add this line,

00:11:14.149 --> 00:11:17.210
act like a smart consultant. Tell me why those

00:11:17.210 --> 00:11:19.490
ways might fail so I can prepare a backup plan.

00:11:19.610 --> 00:11:23.309
Tell me why I might fail. That feels counterintuitive.

00:11:23.490 --> 00:11:26.149
It's brilliant. It forces the AI to be critical.

00:11:26.190 --> 00:11:28.269
You don't want a cheerleader. You want a sparring

00:11:28.269 --> 00:11:33.070
partner. So after all this, does the fear just

00:11:33.070 --> 00:11:36.929
go away? No. But the fear turns into action.

00:11:37.190 --> 00:11:39.769
So let's recap this whole journey. We started

00:11:39.769 --> 00:11:42.149
with the tsunami, that threat of the job market

00:11:42.149 --> 00:11:44.690
pulling back. Yep. Then we diagnosed ourselves

00:11:44.690 --> 00:11:47.850
with that RAIL test. Revenue, acceleration, in

00:11:47.850 --> 00:11:50.129
-market, and learning. And if you failed that

00:11:50.129 --> 00:11:52.610
test, the advice is to move to layer three. You

00:11:52.610 --> 00:11:55.789
build apps, offer services, clean up data. You

00:11:55.789 --> 00:11:58.029
don't try to fight the giants. Right. And you

00:11:58.029 --> 00:12:00.730
fire yourself from the script, all those repetitive

00:12:00.730 --> 00:12:03.710
tasks, so you can double down on strategy and

00:12:03.710 --> 00:12:06.889
the three Rs. Rigor, relationships, and resilience.

00:12:07.289 --> 00:12:09.750
And you use tools like Perplexity, Cursor, and

00:12:09.750 --> 00:12:11.970
Gamma to stay on that high ground. You got it.

00:12:12.070 --> 00:12:13.870
The source ends with this metaphor that I think

00:12:13.870 --> 00:12:15.830
is a great place to leave people. It says, you

00:12:15.830 --> 00:12:17.850
are not a small wave that breaks. You are the

00:12:17.850 --> 00:12:21.340
ocean. I like that. Your value isn't your job

00:12:21.340 --> 00:12:24.000
title, it's your ability to adapt. So tomorrow

00:12:24.000 --> 00:12:26.899
morning, try that consultant prompt on a hard

00:12:26.899 --> 00:12:29.019
problem. See what happens. See you on the high

00:12:29.019 --> 00:12:30.120
ground. Thanks for listening.
