WEBVTT

00:00:00.000 --> 00:00:02.500
Okay, hey there, and welcome to the deep dive.

00:00:02.720 --> 00:00:06.190
Yay. We are jumping into a really fascinating

00:00:06.190 --> 00:00:08.949
stack of sources you shared with us today, all

00:00:08.949 --> 00:00:11.970
about the fast-moving world of AI. It's kind

00:00:11.970 --> 00:00:13.650
of everywhere right now. It really is. Everything

00:00:13.650 --> 00:00:17.750
from warnings about jobs, like really stark ones,

00:00:17.949 --> 00:00:21.170
to some surprisingly personal stuff people are

00:00:21.170 --> 00:00:23.649
doing with it. Exactly. So our mission here is

00:00:23.649 --> 00:00:27.250
to unpack all this material you gave us. We want

00:00:27.250 --> 00:00:29.350
to pull out the most important insights, maybe

00:00:29.350 --> 00:00:31.449
some surprising facts, and just get beyond the

00:00:31.449 --> 00:00:33.530
headlines and figure out what all this actually

00:00:33.530 --> 00:00:35.810
means for you, our listener. Sounds good. Let's

00:00:35.810 --> 00:00:39.329
dive in. So kicking off with what might be the

00:00:39.329 --> 00:00:41.689
most striking finding from your sources, this

00:00:41.689 --> 00:00:44.829
survey, it shows nearly half of Gen Z workers

00:00:44.829 --> 00:00:48.189
are relying more on ChatGPT than their own boss

00:00:48.189 --> 00:00:50.649
for work questions. Like, wow, right? Right.

00:00:50.729 --> 00:00:54.810
It's a resume survey based on like over 8,600

00:00:54.810 --> 00:00:58.200
full-time U.S. workers. Pretty big sample.

00:00:58.460 --> 00:01:00.259
Yeah. And, yeah, the numbers are kind of wild

00:01:00.259 --> 00:01:02.119
when you dig in. Kind of wild is definitely putting

00:01:02.119 --> 00:01:07.379
it mildly. Uh-huh. Like 51% of Gen Z sees ChatGPT

00:01:07.379 --> 00:01:10.260
as, you know, a co-worker or an assistant, which

00:01:10.260 --> 00:01:11.900
makes sense, I guess. Okay, yeah, that part tracks.

00:01:12.140 --> 00:01:14.500
But it goes way beyond that, right? Yeah. 36

00:01:14.500 --> 00:01:17.599
% see it as entertainment. 32% as a companion.

00:01:18.200 --> 00:01:21.519
And get this: 21% as a therapist. A therapist? Seriously?

00:01:21.519 --> 00:01:24.219
So they're not just asking, like, how do I format

00:01:24.219 --> 00:01:26.260
the spreadsheet. They're, like, really maybe talking

00:01:26.260 --> 00:01:28.739
about personal stuff. Yeah, exactly. Some even spend

00:01:28.739 --> 00:01:32.700
like an hour or more, um, just chatting or playing

00:01:32.700 --> 00:01:35.799
games with it during the workday. An hour? The

00:01:35.799 --> 00:01:38.079
source calls it this all-in-one resource they

00:01:38.079 --> 00:01:40.560
lean on for personal and even like emotional

00:01:40.560 --> 00:01:43.060
support. Okay, that's... that's different. Unexpected.

00:01:43.060 --> 00:01:47.560
So why... why prefer an AI over, you know, a human

00:01:47.560 --> 00:01:49.719
manager. What's the thinking there? Well, the

00:01:49.719 --> 00:01:52.799
survey broke that down, too, for 49% of Gen

00:01:52.799 --> 00:01:54.980
Z and actually 47% of millennials, too. So it's

00:01:54.980 --> 00:01:57.079
not just Gen Z. Oh, OK. Interesting. Yeah, it's

00:01:57.079 --> 00:01:59.939
simply faster. It's nonjudgmental. You know,

00:01:59.939 --> 00:02:02.079
it's always available 24/7. Right. And you don't

00:02:02.079 --> 00:02:05.700
feel awkward asking what might seem like maybe

00:02:05.700 --> 00:02:07.859
a basic or even a dumb question to a person.

00:02:07.939 --> 00:02:10.039
Right. No fear of judgment from the AI. That

00:02:10.039 --> 00:02:12.680
makes a lot of sense, actually. And the source

00:02:12.680 --> 00:02:18.139
connects this to maybe weakening trust in managers

00:02:18.139 --> 00:02:20.699
especially with remote work where people might

00:02:20.699 --> 00:02:22.780
feel, you know, a bit disconnected. Is that part

00:02:22.780 --> 00:02:25.060
of it? Precisely. It says Gen Z is actually building,

00:02:25.060 --> 00:02:27.659
yeah, and they use the phrase, a real relationship

00:02:27.659 --> 00:02:31.020
with AI. Whoa, a real relationship? Okay, that's

00:02:31.020 --> 00:02:32.879
where it gets really interesting. It's not just

00:02:32.879 --> 00:02:36.199
a tool anymore; it's becoming something else entirely.

00:02:36.199 --> 00:02:40.699
And, uh, the why-it-matters point from the source

00:02:40.699 --> 00:02:43.259
is pretty significant here. It predicts that

00:02:43.259 --> 00:02:45.960
in just two or three years, these kinds of tools,

00:02:46.080 --> 00:02:48.439
like ChatGPT, they're going to be integrated

00:02:48.439 --> 00:02:51.159
as official workplace mentors or assistants,

00:02:51.560 --> 00:02:54.860
even formal mental health support options, especially

00:02:54.860 --> 00:02:57.620
in hybrid or remote companies. Official mentors,

00:02:57.819 --> 00:03:00.039
like assigned by the company. That's the prediction.

00:03:00.199 --> 00:03:04.460
It's quietly changing who we turn to when we

00:03:04.460 --> 00:03:07.319
need help or advice, a really fundamental shift.

00:03:07.520 --> 00:03:12.330
Yeah. Wow. Okay. So from potentially ditching

00:03:12.330 --> 00:03:16.030
your boss for work questions to AI as your therapist.

00:03:16.090 --> 00:03:20.389
That's huge. Okay, let's broaden out a bit. Moving

00:03:20.389 --> 00:03:22.509
to some of the other significant points raised

00:03:22.509 --> 00:03:25.949
in your sources, and not all of them are quite

00:03:25.949 --> 00:03:30.189
so personal, I guess. No, definitely not. The

00:03:30.189 --> 00:03:32.090
sources also highlighted some pretty serious

00:03:32.090 --> 00:03:35.009
potential downsides. One stark warning came from

00:03:35.009 --> 00:03:37.889
the Anthropic CEO, for instance. Oh, yeah? What

00:03:37.889 --> 00:03:41.379
did he say? He suggested AI could wipe out 50

00:03:41.379 --> 00:03:45.039
% of entry-level white-collar jobs in just

00:03:45.039 --> 00:03:48.879
five years. 50% in five years? That's a massive

00:03:48.879 --> 00:03:51.460
number. It is. He even put a figure on potential

00:03:51.460 --> 00:03:53.599
unemployment, suggesting it could push it way

00:03:53.599 --> 00:03:56.259
up, like to 20%. 20% unemployment, like the

00:03:56.259 --> 00:03:59.250
Great Depression. Yeah, similar levels. A really

00:03:59.250 --> 00:04:01.090
serious prediction to grapple with. Puts the whole

00:04:01.090 --> 00:04:02.729
thing in perspective. Yeah, no kidding. That's

00:04:02.729 --> 00:04:04.550
a different kind of impact altogether. Totally.

00:04:04.610 --> 00:04:06.689
And, you know, while AI is pushing boundaries

00:04:06.689 --> 00:04:09.270
creatively, like with that WSJ film, My Robot

00:04:09.270 --> 00:04:11.150
and Me. Oh, I saw that. They made it using AI

00:04:11.150 --> 00:04:14.650
tools. Right. No cameras. Exactly. Google Veo 3,

00:04:14.870 --> 00:04:18.730
Runway AI, over a thousand clips generated. But

00:04:18.730 --> 00:04:21.449
the source points out that slop still exists.

00:04:21.790 --> 00:04:24.269
Slop. What do they mean by that? Like imperfections

00:04:24.269 --> 00:04:26.970
or glitches? Exactly. The equivalent of maybe

00:04:26.970 --> 00:04:30.149
a weird visual artifact or dialogue that doesn't

00:04:30.149 --> 00:04:32.910
quite land right. It's not perfectly polished

00:04:32.910 --> 00:04:35.430
yet. Needs that human touch. OK. So the tools

00:04:35.430 --> 00:04:37.850
are powerful, but you still need that human oversight

00:04:37.850 --> 00:04:41.790
to fix things up. Makes sense. Right. And speaking

00:04:41.790 --> 00:04:44.709
of things not being perfect, the sources also

00:04:44.709 --> 00:04:48.629
brought up Grok, Elon Musk's AI. Yeah. What about

00:04:48.629 --> 00:04:51.870
it? Apparently repeating climate denial narratives.

00:04:52.009 --> 00:04:55.250
Yeah. It was citing debunked and fringe viewpoints

00:04:55.250 --> 00:04:57.709
in the name of providing, you know, balance.

00:04:57.990 --> 00:05:00.350
Hmm. Citing debunked views for balance. That

00:05:00.350 --> 00:05:02.529
really raises some big questions about the objectivity

00:05:02.529 --> 00:05:04.610
of these tools, doesn't it? Like, where did they

00:05:04.610 --> 00:05:06.449
draw the line? Is it starting to sound more like

00:05:06.449 --> 00:05:08.769
Musk than science, as the source kind of hinted?

00:05:08.889 --> 00:05:10.970
Absolutely. It's a huge challenge. And this leads

00:05:10.970 --> 00:05:12.649
us to another crucial point the sources make,

00:05:12.750 --> 00:05:14.970
which is really about the human element behind

00:05:14.970 --> 00:05:17.310
all of this tech. It highlighted the ghost AI

00:05:17.310 --> 00:05:21.189
raters: the labelers, the moderators, the people

00:05:21.189 --> 00:05:23.290
actually doing the work behind the scenes for

00:05:23.290 --> 00:05:25.579
things like Google's Gemini. Okay. The people

00:05:25.579 --> 00:05:28.199
who are doing the really painstaking work of

00:05:28.199 --> 00:05:30.879
training and refining the AI, what about them?

00:05:31.060 --> 00:05:33.600
Exactly. And they're fighting to unionize because

00:05:33.600 --> 00:05:37.720
of like low pay, no perks, facing layoffs. Yeah.

00:05:37.860 --> 00:05:39.699
Even their team leaders apparently get fired

00:05:39.699 --> 00:05:42.120
pretty easily. Wow. So the people who are literally

00:05:42.120 --> 00:05:44.540
making the AI function, making sure it understands

00:05:44.540 --> 00:05:47.240
the world correctly, are facing these kinds of

00:05:47.240 --> 00:05:49.860
conditions. It's that often invisible human labor

00:05:49.860 --> 00:05:53.019
enabling the tech we all use. That's exactly

00:05:53.019 --> 00:05:56.139
it. It's kind of the hidden human cost, you could

00:05:56.139 --> 00:05:58.899
say, of this rapidly advancing technology. The

00:05:58.899 --> 00:06:00.740
sources also touched on a few other quick developments.

00:06:00.920 --> 00:06:03.540
Oh, yeah. Like what? Let's see. Google Photos

00:06:03.540 --> 00:06:05.639
hitting its 10th anniversary with new tools.

00:06:06.199 --> 00:06:09.120
Superblocks' Clark for enterprise AI app building.

00:06:09.339 --> 00:06:12.259
Okay. Humain raising a huge $10 billion venture

00:06:12.259 --> 00:06:15.519
fund backed by NVIDIA and AWS, highlighting Saudi

00:06:15.519 --> 00:06:18.160
Arabia's growing role in AI. $10 billion. Wow.

00:06:18.500 --> 00:06:22.430
Yeah. And xAI paying Telegram $300 million for

00:06:22.430 --> 00:06:25.050
Grok distribution. Duke University giving students

00:06:25.050 --> 00:06:28.860
free GPT-4o access. So lots of money and access

00:06:28.860 --> 00:06:31.560
changing hands. Definitely. And Anthropic reversed

00:06:31.560 --> 00:06:34.680
its ban on job applicants using AI, which is

00:06:34.680 --> 00:06:36.759
interesting given the CEO's warning. And Google

00:06:36.759 --> 00:06:39.420
teased something called SignGemma for translating

00:06:39.420 --> 00:06:41.759
sign language to text. Sign language translation.

00:06:41.980 --> 00:06:44.399
That's pretty cool. So a real mix of things happening

00:06:44.399 --> 00:06:46.699
across the board. Yeah, a lot going on. Okay.

00:06:46.720 --> 00:06:49.560
So we've looked at the personal connection, the

00:06:49.560 --> 00:06:52.120
potential job disruption, the challenges of bias

00:06:52.120 --> 00:06:54.199
and imperfection, and the human labor involved.

00:06:55.180 --> 00:06:57.360
Let's shift gears a bit now and look at some

00:06:57.360 --> 00:06:59.300
of the practical applications, the kind of new

00:06:59.300 --> 00:07:01.259
tools popping up that your sources highlighted.

00:07:01.480 --> 00:07:03.220
Right. Yeah. The sources listed several examples

00:07:03.220 --> 00:07:07.120
of new playbooks and tools that show how AI is

00:07:07.120 --> 00:07:10.180
being applied to specific everyday tasks right

00:07:10.180 --> 00:07:12.459
now, things people can actually use. Like that

00:07:12.459 --> 00:07:15.100
vibe coding playbook, an AI coding approach inspired

00:07:15.100 --> 00:07:18.500
by like YC founders. That sounds pretty niche.

00:07:18.740 --> 00:07:20.959
Exactly. It's taking AI and focusing it on a

00:07:20.959 --> 00:07:25.600
particular skill set. Or the Gamma app, for creating

00:07:25.600 --> 00:07:27.920
content quickly, you know, slides, websites,

00:07:28.100 --> 00:07:30.800
social posts, without needing heavy design skills.

00:07:30.879 --> 00:07:33.079
Oh, yeah. Making content creation way faster.

00:07:33.220 --> 00:07:35.620
I can see the appeal there. And they mentioned

00:07:35.620 --> 00:07:38.480
TapFlow 2.0 for turning documents into sellable

00:07:38.480 --> 00:07:41.339
guides. Uh-huh. Or... Jog.ai for whipping up

00:07:41.339 --> 00:07:43.639
product photo or video ads instantly. Instant

00:07:43.639 --> 00:07:46.939
ads. Wow. And what else? Tools like Kledo to

00:07:46.939 --> 00:07:49.579
search like over 200 million profiles for sales

00:07:49.579 --> 00:07:52.879
or hiring needs. 200 million. Jeez. Koso.ai

00:07:52.879 --> 00:07:54.899
for automatically creating social posts from

00:07:54.899 --> 00:07:57.259
branded content. And Underlord, which is apparently

00:07:57.259 --> 00:07:59.819
an AI editor agent specifically for Descript,

00:07:59.899 --> 00:08:02.779
the editing software. Okay, so these are just

00:08:02.779 --> 00:08:05.100
quick snapshots from your sources showing how

00:08:05.100 --> 00:08:07.899
AI is being used right now to automate or really

00:08:07.899 --> 00:08:10.980
assist with very specific practical jobs. It

00:08:10.980 --> 00:08:12.600
really shows the breadth of what's happening,

00:08:12.680 --> 00:08:15.079
doesn't it? From code to content to sales. It

00:08:15.079 --> 00:08:17.300
absolutely does. It's embedding itself into workflows

00:08:17.300 --> 00:08:20.740
all over the place. And speaking of how these

00:08:20.740 --> 00:08:22.319
tools work and where they get their information,

00:08:22.740 --> 00:08:26.600
this next point was one of the most surprising

00:08:26.600 --> 00:08:28.680
insights from your sources, I thought, about

00:08:28.680 --> 00:08:32.049
something maybe people figured was... Well, on

00:08:32.049 --> 00:08:34.129
its way out. Ah, you must be talking about SEO,

00:08:34.509 --> 00:08:37.490
search engine optimization. That section title

00:08:37.490 --> 00:08:40.409
really grabbed me. SEO is not dead. Yeah, exactly.

00:08:40.970 --> 00:08:44.389
It really debunks the whole myth that AI killed

00:08:44.389 --> 00:08:47.110
SEO or made it irrelevant. Based on research

00:08:47.110 --> 00:08:50.909
analyzing like 25,000 real AI search queries,

00:08:51.009 --> 00:08:54.070
they found this pretty surprising pattern. Okay,

00:08:54.149 --> 00:08:55.649
I'm intrigued. What's the pattern they found?

00:08:55.789 --> 00:08:59.029
Okay, so get this. If your website ranks number

00:08:59.029 --> 00:09:00.950
one on Google for a particular search query,

00:09:01.090 --> 00:09:03.809
you have about a 25% chance of being chosen

00:09:03.809 --> 00:09:07.389
as a source by AI platforms like ChatGPT, Perplexity,

00:09:07.429 --> 00:09:10.230
and even Google's own AI Overviews. Wait, hold

00:09:10.230 --> 00:09:12.990
on. So the AI is like actively pulling its information,

00:09:13.090 --> 00:09:15.230
its source material, directly from the top

00:09:15.230 --> 00:09:17.110
Google results, even when it's generating its

00:09:17.110 --> 00:09:18.789
own unique response? Precisely. That's what the

00:09:18.789 --> 00:09:20.450
research indicates. The source explains it, and

00:09:20.450 --> 00:09:22.509
apparently some insights came out during Google's

00:09:22.509 --> 00:09:25.710
big antitrust trial. Oh, really? How does it

00:09:25.710 --> 00:09:28.039
work then? Well, it's kind of like a three-step

00:09:28.039 --> 00:09:31.720
thing. Step one, the AI selects the top documents

00:09:31.720 --> 00:09:34.379
based on Google's highest-ranking pages for that

00:09:34.379 --> 00:09:37.500
query. Yeah, so it looks at the top of the SERP

00:09:37.500 --> 00:09:41.340
first. Yep. Step two, it pulls the relevant answers

00:09:41.340 --> 00:09:43.899
from those specific top sources it just selected.

00:09:44.019 --> 00:09:46.679
Gotcha. Finds the juicy bits. And step three.

00:09:47.200 --> 00:09:50.000
The models then generate a cohesive answer using

00:09:50.000 --> 00:09:52.299
that information they just gathered from those

00:09:52.299 --> 00:09:54.960
top-ranked pages. Okay, so Google rankings are

00:09:54.960 --> 00:09:58.899
still the foundational layer, the starting point

00:09:58.899 --> 00:10:01.139
for where AI gets its knowledge. That's actually

00:10:01.139 --> 00:10:02.740
kind of counterintuitive, isn't it? That's what

00:10:02.740 --> 00:10:04.919
the research strongly suggests, yeah. The top

00:10:04.919 --> 00:10:07.500
10 results on Google still win. And the higher

00:10:07.500 --> 00:10:09.679
your rank within that top 10, the higher your

00:10:09.679 --> 00:10:11.659
visibility is going to be in these AI platforms.
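
NOTE The three-step flow described here (select top-ranked documents, pull relevant passages, synthesize an answer) is essentially retrieval-augmented generation. A minimal Python sketch of that idea follows; the data shape, function name, and the trivial string join standing in for the model's synthesis step are illustrative assumptions, not any platform's actual pipeline.

```python
def generate_answer(ranked_results, query, top_n=10):
    """Toy sketch of the three-step flow: select, pull, synthesize."""
    # Step 1: select the top documents, in the search engine's ranked order.
    top_docs = ranked_results[:top_n]
    # Step 2: pull passages relevant to the query from those documents
    # (a naive substring match here; real systems score relevance).
    passages = [
        p
        for doc in top_docs
        for p in doc["passages"]
        if query.lower() in p.lower()
    ]
    # Step 3: synthesize one cohesive answer from the gathered passages.
    # A real system would hand these passages to a language model here.
    return " ".join(passages)
```

Whatever happens in steps 2 and 3, the candidate pool in step 1 is still set by the search ranking, which is the sources' point about SEO.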

00:10:11.799 --> 00:10:14.580
Wow. It's like maybe less about having the best

00:10:14.580 --> 00:10:17.279
overall pages anymore and more about having the

00:10:17.279 --> 00:10:20.120
best specific answers that rank high on Google.

00:10:20.759 --> 00:10:22.679
The best answers that rank high. That makes sense.

00:10:22.779 --> 00:10:24.580
And the source mentioned something called micro

00:10:24.580 --> 00:10:27.559
content here, right? Like niche content. Yeah,

00:10:27.620 --> 00:10:30.860
exactly. Micro content: niche, hyper-focused

00:10:30.860 --> 00:10:33.720
content that directly answers very specific questions.

00:10:33.879 --> 00:10:36.179
That's becoming like the most valuable digital

00:10:36.179 --> 00:10:38.820
asset. Why is that? Because that's what these

00:10:38.820 --> 00:10:40.919
AI models are looking for within those top ranked

00:10:40.919 --> 00:10:44.000
pages when they're pulling information to synthesize

00:10:44.000 --> 00:10:47.159
an answer. They want clear, direct answers. OK,

00:10:47.240 --> 00:10:49.240
so the big takeaway there is, you know, building

00:10:49.240 --> 00:10:53.500
that kind of modular, intent-first content that

00:10:53.500 --> 00:10:55.899
answers specific questions and ranks high on

00:10:55.899 --> 00:10:58.919
Google is still absolutely key, maybe even more

00:10:58.919 --> 00:11:01.070
important now. Exactly. It helps you dominate

00:11:01.070 --> 00:11:03.990
both traditional search and this new world of

00:11:03.990 --> 00:11:06.350
AI visibility. It's kind of the underlying layer

00:11:06.350 --> 00:11:08.710
for both, it seems. Fascinating stuff. Yeah,

00:11:08.789 --> 00:11:10.450
absolutely. We covered a lot of ground here,

00:11:10.490 --> 00:11:13.570
didn't we? From Gen Z potentially getting career

00:11:13.570 --> 00:11:15.809
advice and maybe even therapy from AI. Right.

00:11:15.870 --> 00:11:18.169
The personal connection. To these stark warnings

00:11:18.169 --> 00:11:21.389
about potential job shifts, the often unseen

00:11:21.389 --> 00:11:24.769
human labor that's actually powering these tools.

00:11:25.129 --> 00:11:27.710
And the challenges like the AI not always being

00:11:27.710 --> 00:11:30.509
objective, the slop. And then wrapping up with

00:11:30.509 --> 00:11:33.330
this really surprising point about how AI still

00:11:33.330 --> 00:11:35.789
relies so heavily on the existing web structure.

00:11:35.909 --> 00:11:38.049
Yes. Specifically those top Google rankings.

00:11:38.149 --> 00:11:40.850
It's a super complex picture, right? This mix

00:11:40.850 --> 00:11:45.669
of personal reliance on AI for support, the potential

00:11:45.669 --> 00:11:48.690
for massive economic disruption, the human cost

00:11:48.690 --> 00:11:51.429
behind the technology that enables it all. Yeah.

00:11:51.470 --> 00:11:53.669
And then the tech itself relying on existing

00:11:53.669 --> 00:11:56.370
human curated systems like web pages and rankings.

00:11:56.710 --> 00:12:00.519
It's all tangled up. It really is. So thinking

00:12:00.519 --> 00:12:03.080
about all that, this whole tangled web of reliance,

00:12:03.159 --> 00:12:05.419
you know, humans depending on AI, AI depending

00:12:05.419 --> 00:12:07.580
on human systems and human generated content,

00:12:07.659 --> 00:12:09.899
and even the human labor that trains the AI.

00:12:10.960 --> 00:12:13.320
What does that ultimately mean for the future

00:12:13.320 --> 00:12:16.000
of work and how we access information? What does

00:12:16.000 --> 00:12:18.519
it really mean to trust an AI coworker or rely

00:12:18.519 --> 00:12:21.179
on AI results knowing it's built on this foundational

00:12:21.179 --> 00:12:23.460
human effort and existing web structures? And

00:12:23.460 --> 00:12:25.340
knowing, as we saw with Grok, it can even spread

00:12:25.340 --> 00:12:27.919
misinformation sometimes. Yeah, that's the big

00:12:27.919 --> 00:12:30.159
layered question, isn't it? How does that complex

00:12:30.159 --> 00:12:33.480
interplay shape everything? There's no easy answer

00:12:33.480 --> 00:12:35.200
there. It's definitely something to really mull

00:12:35.200 --> 00:12:38.019
over how all those pieces fit together and influence

00:12:38.019 --> 00:12:40.139
each other going forward. Definitely food for

00:12:40.139 --> 00:12:42.779
thought. Well, thanks for taking this deep dive

00:12:42.779 --> 00:12:45.259
into these sources with us today. Really interesting

00:12:45.259 --> 00:12:47.539
stuff. My pleasure. Always fascinating to unpack

00:12:47.539 --> 00:12:47.919
this.
