WEBVTT

00:00:00.000 --> 00:00:03.600
Imagine watching your own dreams. Not just remembering

00:00:03.600 --> 00:00:05.820
them, but actually hitting play. Like a movie.

00:00:06.040 --> 00:00:09.199
Yeah, exactly. Blurry castles, maybe flowers

00:00:09.199 --> 00:00:11.859
made of light. What if AI could help you see

00:00:11.859 --> 00:00:15.859
your imagination? Okay, let's unpack this

00:00:15.859 --> 00:00:19.679
a bit. Welcome to the Deep Dive. Today, we're

00:00:19.679 --> 00:00:21.679
taking a really fascinating look at the latest

00:00:21.679 --> 00:00:24.500
in AI. We're talking everything from recording

00:00:24.500 --> 00:00:27.730
your, well... fleeting thoughts to understanding

00:00:27.730 --> 00:00:30.350
how AI subtly reshapes our human connections.

00:00:30.609 --> 00:00:32.390
And we've got quite a journey planned for you.

00:00:32.469 --> 00:00:35.289
We'll kick off with that DIY AI dream recorder,

00:00:35.409 --> 00:00:37.439
which sounds... pretty wild. It really does.

00:00:37.579 --> 00:00:39.640
Then we'll explore AI's broad impact

00:00:39.640 --> 00:00:43.420
across education, health, creativity, lots of

00:00:43.420 --> 00:00:45.340
areas. After a quick moment, we'll cap it all

00:00:45.340 --> 00:00:48.500
off with a really compelling new study. It asks

00:00:48.500 --> 00:00:51.299
a deeply human question. What actually happens

00:00:51.299 --> 00:00:54.740
to empathy in a world saturated with AI? Yeah,

00:00:54.840 --> 00:00:56.479
this is where things get really thought provoking,

00:00:56.520 --> 00:00:58.280
I think. Okay, let's start there then with that

00:00:58.280 --> 00:01:01.280
really imaginative piece of tech, the DIY AI

00:01:01.280 --> 00:01:03.759
dream recorder. This isn't just some sci-fi

00:01:03.759 --> 00:01:05.879
concept, right? It's a real project from a Dutch

00:01:05.879 --> 00:01:09.879
design studio called Modem. Modem, right. And

00:01:09.879 --> 00:01:12.519
the core idea, it's actually brilliantly simple.

00:01:12.760 --> 00:01:16.180
It aims to take those, you know, elusive

00:01:16.180 --> 00:01:18.260
half-remembered dreams. The ones that just vanish

00:01:18.260 --> 00:01:20.760
when you wake up. Exactly those. And turn them

00:01:20.760 --> 00:01:24.280
into these grainy, often surreal video replays.

00:01:24.299 --> 00:01:26.099
Sort of like a low-resolution movie of your

00:01:26.099 --> 00:01:28.700
subconscious. What's really neat is how straightforward

00:01:28.700 --> 00:01:31.620
the process sounds. You wake up, still kind of

00:01:31.620 --> 00:01:33.819
groggy, you push a button, and you just speak

00:01:33.819 --> 00:01:35.810
your dream out loud. Describe whatever you remember.

00:01:36.609 --> 00:01:39.510
Then the device takes those words, those whispered

00:01:39.510 --> 00:01:42.310
descriptions, and generates a low-res video.

00:01:42.609 --> 00:01:45.450
And what makes it special, they say, is that

00:01:45.450 --> 00:01:47.790
it's deeply personalized. It reflects your style,

00:01:47.909 --> 00:01:51.590
your language, your unique imagination. So yeah,

00:01:51.609 --> 00:01:53.890
those blurry castles, floating rooftops, flowers

00:01:53.890 --> 00:01:56.430
made of light you mentioned, they become visuals

00:01:56.430 --> 00:01:59.189
directly from what your brain cooked up during

00:01:59.189 --> 00:02:03.239
REM sleep. Well, like magic almost. And the DIY

00:02:03.239 --> 00:02:05.659
part is really key here. Modem's made everything

00:02:05.659 --> 00:02:08.539
completely open source. Oh, okay. So anyone can

00:02:08.539 --> 00:02:11.319
build it. Pretty much. The code, the hardware

00:02:11.319 --> 00:02:13.240
design, all the instructions are right there

00:02:13.240 --> 00:02:15.939
on GitHub. You literally build it yourself. You'll

00:02:15.939 --> 00:02:20.860
need an HDMI display, an 8GB processor, a USB

00:02:20.860 --> 00:02:23.819
mic, a micro SD card, things like that. Standard

00:02:23.819 --> 00:02:25.840
maker stuff. Yeah. And if you're feeling ambitious,

00:02:26.080 --> 00:02:28.240
you can even 3D print your own shell for it.

00:02:28.419 --> 00:02:31.520
The total build cost is surprisingly approachable.

00:02:31.560 --> 00:02:36.020
It's around $285 maybe. Plus just tiny AI fees

00:02:36.020 --> 00:02:39.180
per dream. Like pennies. Yeah, like $0.01 to

00:02:39.180 --> 00:02:41.879
$0.14 depending on the quality you want. It

00:02:41.879 --> 00:02:43.819
even stores seven dreams, one for each night

00:02:43.819 --> 00:02:46.300
of the week. A week's worth of dreams, huh? And

00:02:46.300 --> 00:02:48.840
a cool touch. You can add style prompts to sort

00:02:48.840 --> 00:02:50.639
of guide the visuals. Okay, but what's really

00:02:50.639 --> 00:02:52.860
fascinating here, maybe beyond just the tech

00:02:52.860 --> 00:02:56.599
itself, is Modem's philosophy. Why build this?

00:02:56.680 --> 00:02:59.180
They didn't build it to boost screen time, apparently,

00:02:59.319 --> 00:03:01.979
or to collect your personal data for some big

00:03:01.979 --> 00:03:04.479
monetization plan. That's refreshing. It really

00:03:04.479 --> 00:03:06.240
is. Their intention is kind of the opposite.

00:03:06.439 --> 00:03:08.800
They're exploring something called ambient computing.

00:03:09.180 --> 00:03:11.819
Ambient computing. What's that in simple terms?

00:03:12.000 --> 00:03:13.939
Basically, technology that just fades into the

00:03:13.939 --> 00:03:16.719
background, gets out of your way. It's a concept

00:03:16.719 --> 00:03:20.680
inspired by Mark Weiser from Xerox PARC way back.

00:03:21.180 --> 00:03:23.460
You know, most new devices try to automate or

00:03:23.460 --> 00:03:26.949
monetize everything. This dream recorder, it

00:03:26.949 --> 00:03:31.629
just listens quietly, patiently. It's a thoughtful

00:03:31.629 --> 00:03:34.449
piece. It kind of implicitly asks us what should

00:03:34.449 --> 00:03:37.969
AI do, not just what can it do. And Modem even

00:03:37.969 --> 00:03:40.810
has this unique approach for their studio. They've

00:03:40.810 --> 00:03:44.189
set a sunset date, 2030. A sunset date for the

00:03:44.189 --> 00:03:47.129
whole studio. Yeah. No IPO, no infinite roadmap

00:03:47.129 --> 00:03:50.229
chasing growth, just thoughtful, time-boxed invention

00:03:50.229 --> 00:03:53.370
focusing on specific, maybe meaningful, projects.

00:03:53.569 --> 00:03:55.650
Whoa. Imagine seeing your most abstract thoughts

00:03:55.650 --> 00:03:57.610
come to life like that. What a concept. It really

00:03:57.610 --> 00:03:59.189
makes you stop and think, doesn't it? What a

00:03:59.189 --> 00:04:00.969
unique way to use AI, not for big automation,

00:04:01.229 --> 00:04:03.710
but for really personal introspection. So beyond

00:04:03.710 --> 00:04:05.030
the cool gadget, what's the biggest takeaway

00:04:05.030 --> 00:04:07.830
from Modem's approach here? I'd say it's AI design

00:04:07.830 --> 00:04:10.289
focused squarely on quiet self-understanding,

00:04:10.310 --> 00:04:12.750
not grabbing data or cash. That's a powerful

00:04:12.750 --> 00:04:15.090
contrast, definitely. Okay, moving from those

00:04:15.090 --> 00:04:18.050
personal dreamscapes to the broader landscape,

00:04:18.350 --> 00:04:21.939
AI's real-world impact is just... undeniable

00:04:21.939 --> 00:04:25.399
now, often pretty dramatic. Take education. You

00:04:25.399 --> 00:04:28.199
hear this phrase, ChatGPT equals CheatGPT.

00:04:28.300 --> 00:04:30.779
Oh, yeah. And it's catching on for a reason,

00:04:30.800 --> 00:04:34.259
right? Studies show something like 90% of college

00:04:34.259 --> 00:04:36.420
students are openly using it for assignments.

00:04:37.019 --> 00:04:39.959
You might have seen that viral video. UCLA grad

00:04:39.959 --> 00:04:43.060
student just casually flexing how he uses

00:04:43.060 --> 00:04:45.680
ChatGPT for his work. Yeah, saw that. It's wild.

00:04:46.040 --> 00:04:48.160
It's fundamentally shifting things. How students

00:04:48.160 --> 00:04:50.740
study, how teachers even design courses. And

00:04:50.740 --> 00:04:52.639
that impact stretches right into the professional

00:04:52.639 --> 00:04:54.800
world, too, doesn't it? Especially with these

00:04:54.800 --> 00:04:58.420
intensifying AI talent wars. Meta, for example,

00:04:58.459 --> 00:05:01.540
just poached four more OpenAI researchers. That

00:05:01.540 --> 00:05:03.879
brings their total from OpenAI alone to eight.

00:05:04.060 --> 00:05:06.730
Eight. Wow. Yeah. It kind of sheds new light

00:05:06.730 --> 00:05:09.949
on their CEO's supposed secret list of top AI

00:05:09.949 --> 00:05:12.430
people he's targeting. Though, quick side note,

00:05:12.490 --> 00:05:14.430
at least one of those researchers publicly said

00:05:14.430 --> 00:05:16.470
they definitely didn't get the rumored $100 million

00:05:16.470 --> 00:05:19.069
offer. Right. Always take those huge numbers

00:05:19.069 --> 00:05:21.790
with a grain of salt. Exactly. But still, the

00:05:21.790 --> 00:05:25.649
competition for top AI minds, it's fierce, almost

00:05:25.649 --> 00:05:28.189
like an intellectual arms race. Yeah. Changing

00:05:28.189 --> 00:05:32.350
gears a bit on a more somber note, a really crucial

00:05:32.350 --> 00:05:36.699
AI warning. And this is serious. Please do not

00:05:36.699 --> 00:05:40.379
trust ChatGPT or really any LLM with your mental

00:05:40.379 --> 00:05:42.600
health. Absolutely critical point. There have

00:05:42.600 --> 00:05:44.980
been multiple documented cases now, people being

00:05:44.980 --> 00:05:47.480
involuntarily committed due to psychosis, and

00:05:47.480 --> 00:05:49.579
it was directly triggered by interactions with

00:05:49.579 --> 00:05:52.759
GPT. That's terrifying. It is. It's a powerful

00:05:52.759 --> 00:05:55.800
tool, sure. But it completely lacks the nuance,

00:05:56.019 --> 00:05:58.620
the empathy, frankly, the humanity needed for

00:05:58.620 --> 00:06:00.860
that kind of sensitive stuff. It generates text.

00:06:00.980 --> 00:06:03.899
It cannot offer therapy. Well said. On a maybe

00:06:03.899 --> 00:06:06.300
slightly lighter but still kind of unsettling

00:06:06.300 --> 00:06:10.000
note, have you seen those viral AI-generated cell

00:06:10.000 --> 00:06:12.000
phone videos? The ones that look super real. Yeah,

00:06:12.000 --> 00:06:14.100
hyper-realistic. They look like they were actually

00:06:14.100 --> 00:06:16.199
shot on phones, just capturing random moments.

00:06:16.199 --> 00:06:18.000
But comments are full of people saying stuff

00:06:18.000 --> 00:06:20.459
like, "We are so getting scammed when we're old."

00:06:21.240 --> 00:06:23.920
Yeah, I can see that. It really blurs the line.

00:06:24.000 --> 00:06:26.560
It makes you question the authenticity of pretty

00:06:26.560 --> 00:06:28.639
much everything you see online. It's incredible

00:06:28.639 --> 00:06:32.000
how fast that line is just dissolving. But, okay,

00:06:32.060 --> 00:06:35.480
on a very positive note for AI's real-world

00:06:35.480 --> 00:06:39.180
uses, especially in healthcare, Microsoft's new

00:06:39.180 --> 00:06:42.199
AI, it's apparently diagnosing diseases four

00:06:42.199 --> 00:06:45.139
times more accurately than human doctors. Four

00:06:45.139 --> 00:06:48.209
times. That's huge. Isn't it? And maybe even

00:06:48.209 --> 00:06:51.230
better, it's cutting testing costs by about 20%.

00:06:51.230 --> 00:06:54.009
The method it uses is fascinating, too. It's

00:06:54.009 --> 00:06:56.649
called a chain of debate model. Chain of debate?

00:06:56.870 --> 00:06:59.709
How does that work? It means multiple top AI

00:06:59.709 --> 00:07:02.529
models collaborate, like a virtual medical panel,

00:07:02.709 --> 00:07:05.310
analyzing data together to reach a diagnosis.

00:07:05.790 --> 00:07:07.810
Interesting. So they argue it out. Kind of, yeah.

00:07:07.910 --> 00:07:09.850
And speaking of health care, Tandem Health just

00:07:09.850 --> 00:07:12.889
got $50 million to boost its AI tool. It's aimed

00:07:12.889 --> 00:07:14.889
at reducing clinician burnout in Europe. Oh,

00:07:14.930 --> 00:07:16.670
that's much needed. Burnout's a massive issue.

00:07:16.769 --> 00:07:18.750
Exactly. A significant investment there. And

00:07:18.750 --> 00:07:20.870
hey, for the creatives listening or just anyone

00:07:20.870 --> 00:07:23.389
who uses design tools, Canva now works directly

00:07:23.389 --> 00:07:26.170
inside ChatGPT. Oh, really? How does that work?

00:07:26.310 --> 00:07:28.629
It's the first official design connector for

00:07:28.629 --> 00:07:31.290
ChatGPT. Lets you edit designs without ever

00:07:31.290 --> 00:07:33.689
leaving the chat window. Seamless. Yeah.

00:07:33.689 --> 00:07:36.629
One-click integration. Makes design way more accessible,

00:07:36.730 --> 00:07:39.899
not just for pros. Okay, so looking across all

00:07:39.899 --> 00:07:43.100
these different areas, education, health, creative

00:07:43.100 --> 00:07:46.540
tools. What do you see as the common thread?

00:07:46.720 --> 00:07:50.000
What's the big picture of AI's current impact?

00:07:50.420 --> 00:07:52.420
Well, the common thread isn't just integration,

00:07:52.620 --> 00:07:55.100
I think. It's AI forcing us to fundamentally

00:07:55.100 --> 00:07:58.920
redefine things. Efficiency, originality, even

00:07:58.920 --> 00:08:01.639
what personal connection means day to day. Redefining

00:08:01.639 --> 00:08:03.860
things. Yeah, that feels right. The pace really

00:08:03.860 --> 00:08:06.019
is breathtaking. Let's maybe dive into some of

00:08:06.019 --> 00:08:07.759
the specific new tools making waves right now.

00:08:07.800 --> 00:08:09.720
Let's do it. Quick hits. Yeah, quick hits. We're

00:08:09.720 --> 00:08:13.689
seeing stuff like Pocut. Uses AI for free photo

00:08:13.689 --> 00:08:16.389
creation. Just clicks or prompts. Then there's

00:08:16.389 --> 00:08:19.009
JotForm. Creates AI presentations that can actually

00:08:19.009 --> 00:08:21.589
talk, listen, answer questions from your file.

00:08:21.689 --> 00:08:24.069
Presentations that talk back. Wow. Right. For

00:08:24.069 --> 00:08:27.069
branding, Picsart Ignite 2.0 instantly generates

00:08:27.069 --> 00:08:29.990
assets, ads, videos, even custom fonts. Making

00:08:29.990 --> 00:08:32.830
branding faster. And if you need a digital twin,

00:08:33.149 --> 00:08:35.190
DemoDazzle gets you an interactive assistant

00:08:35.190 --> 00:08:37.750
that looks and sounds just like you. A digital

00:08:37.750 --> 00:08:40.429
me. Hmm. Not sure how I feel about that one yet.

00:08:40.549 --> 00:08:42.909
Huh. Yeah, maybe a little uncanny valley there.

00:08:43.350 --> 00:08:47.029
And finally, Table 1.0. This brings a

00:08:47.029 --> 00:08:49.950
Figma-like collaborative feel to basically any app

00:08:49.950 --> 00:08:52.950
on any browser tab. Ooh, that sounds useful for

00:08:52.950 --> 00:08:55.970
teams. Big time. Game changer for remote work,

00:08:56.070 --> 00:08:58.470
potentially. And we've got some rapid-fire AI

00:08:58.470 --> 00:09:00.769
news quick hits, too. Meta just launched something

00:09:00.769 --> 00:09:02.929
called Superintelligence Labs. They poached

00:09:02.929 --> 00:09:05.049
11 top researchers for it. Superintelligence

00:09:05.049 --> 00:09:08.490
Labs. Sounds ambitious. Doesn't it? Google expanded

00:09:08.490 --> 00:09:11.190
Gemini for Education with over 30 new AI features

00:09:11.190 --> 00:09:14.350
for learning. And on a more future gazing note,

00:09:14.450 --> 00:09:16.690
one scientist predicts the AI singularity within

00:09:16.690 --> 00:09:20.190
20 years, by 2045. Singularity. That's the point

00:09:20.190 --> 00:09:21.830
where AI surpasses human intelligence, right?

00:09:22.009 --> 00:09:24.429
Exactly. Becomes smarter than us. Big concept.

00:09:24.570 --> 00:09:28.169
Also, for the tech watchers, Elon Musk

00:09:28.169 --> 00:09:30.970
said Grok 4 is planned for release just after

00:09:30.970 --> 00:09:33.269
July 4. Always timing things around holidays.

00:09:33.799 --> 00:09:36.460
Seems like it. And for developers, Cursor launched

00:09:36.460 --> 00:09:39.100
a web app to manage a network of AI coding agents,

00:09:39.320 --> 00:09:42.000
basically giving programmers an AI army to help

00:09:42.000 --> 00:09:45.240
them code. AI coding army. OK. When you hear

00:09:45.240 --> 00:09:49.190
about this much rapid AI advancement, like almost

00:09:49.190 --> 00:09:52.250
daily updates. What's the overall feeling that

00:09:52.250 --> 00:09:54.389
it gives you? You know, it's not just speed.

00:09:54.490 --> 00:09:57.029
It feels like a continuous, almost disorienting

00:09:57.029 --> 00:09:59.950
state of flux. The key thing maybe is how this

00:09:59.950 --> 00:10:02.330
accelerating change demands constant adaptation,

00:10:02.830 --> 00:10:05.330
not just from companies, but from us, you know,

00:10:05.330 --> 00:10:07.769
as individuals trying to navigate work and life.

00:10:07.990 --> 00:10:10.610
Constant adaptation. That sums it up well. Okay,

00:10:10.629 --> 00:10:12.049
let's take a quick moment here for our sponsor.

00:10:12.429 --> 00:10:15.049
Welcome back to the Deep Dive. We've

00:10:15.049 --> 00:10:17.730
talked dreams, tools, industry shifts. But now

00:10:17.730 --> 00:10:19.970
we're going to pivot to something, well, profoundly

00:10:19.970 --> 00:10:22.889
human. Empathy. A new study just came out. Big

00:10:22.889 --> 00:10:26.269
one. Nine experiments. Over 6,000 people. And

00:10:26.269 --> 00:10:28.710
it reveals a really interesting truth. We still

00:10:28.710 --> 00:10:31.990
prefer empathy from humans, even if an AI says

00:10:31.990 --> 00:10:34.429
the exact same thing word for word. The study's

00:10:34.429 --> 00:10:36.389
design was actually quite clever. Participants

00:10:36.389 --> 00:10:38.309
got empathic replies in different scenarios,

00:10:38.490 --> 00:10:41.049
like someone sharing a personal struggle. Okay.

00:10:41.149 --> 00:10:43.110
Sometimes they were told the reply came from

00:10:43.110 --> 00:10:46.509
a person, other times from a chatbot. But here's

00:10:46.509 --> 00:10:50.610
the crucial part. All the replies, every single

00:10:50.610 --> 00:10:54.409
one, were actually generated by AI. Ah, so it

00:10:54.409 --> 00:10:57.029
was purely about perception. Exactly. A pure

00:10:57.029 --> 00:11:00.370
test of perception, not the actual source. Kind

00:11:00.370 --> 00:11:02.889
of like a magic trick where you know how it's

00:11:02.889 --> 00:11:05.409
done, but the audience is reacting to the illusion.

00:11:05.750 --> 00:11:08.289
And the results? What happened? Well, they were

00:11:08.289 --> 00:11:10.730
pretty fascinating. Participants consistently

00:11:10.730 --> 00:11:13.370
rated the responses they thought came from a

00:11:13.370 --> 00:11:18.049
human as warmer, more supportive, much more emotionally

00:11:18.049 --> 00:11:20.610
satisfying. Okay. That makes intuitive sense.

00:11:20.789 --> 00:11:22.629
Right. But here's the kicker, the bit that really

00:11:22.629 --> 00:11:25.070
stood out. People were willing to wait days,

00:11:25.110 --> 00:11:28.049
even weeks, for the exact same response if they

00:11:28.049 --> 00:11:30.590
thought it came from a human. Weeks, just based

00:11:30.590 --> 00:11:33.049
on the belief it was human. Yeah. That's a pretty

00:11:33.049 --> 00:11:34.889
powerful statement, isn't it, about how much

00:11:34.889 --> 00:11:37.409
we value that perceived human connection and

00:11:37.409 --> 00:11:40.299
effort? Definitely. But wait, it gets even weirder.

00:11:40.600 --> 00:11:42.679
It highlights what the researchers called the

00:11:42.679 --> 00:11:45.980
AI-touched effect. When people suspected a human

00:11:45.980 --> 00:11:48.100
might have used AI to help write their message,

00:11:48.419 --> 00:11:51.240
trust actually dropped. So even just the hint

00:11:51.240 --> 00:11:54.620
of AI assistance made it feel less sincere. Exactly.

00:11:54.659 --> 00:11:57.220
The mere idea of an AI-touched message made

00:11:57.220 --> 00:11:59.980
it feel less real, less trustworthy. You know,

00:12:00.019 --> 00:12:02.279
I still wrestle with prompt drift myself sometimes,

00:12:02.539 --> 00:12:05.019
which makes it hard to trust the source completely

00:12:05.019 --> 00:12:07.769
on occasion. Prompt drift. Remind us what that

00:12:07.769 --> 00:12:10.730
is. Yeah. Prompt drift is just when an AI's output

00:12:10.730 --> 00:12:13.429
changes subtly over time, even for the same input.

00:12:13.570 --> 00:12:16.309
It can be kind of frustrating. Gotcha. So why

00:12:16.309 --> 00:12:18.149
does all this matter for us day to day? What's

00:12:18.149 --> 00:12:20.110
the takeaway? It tells us it's not just what

00:12:20.110 --> 00:12:23.110
you say, it's who people think said it. The same

00:12:23.110 --> 00:12:25.090
words land differently depending on the perceived

00:12:25.090 --> 00:12:27.909
source. Perceived authenticity seems to be everything.

00:12:28.049 --> 00:12:30.710
A message just feels more real, more genuine

00:12:30.710 --> 00:12:33.509
when we believe someone invested real time and

00:12:33.509 --> 00:12:36.419
actual emotional effort, not just a quick AI

00:12:36.419 --> 00:12:38.500
prompt. This study really highlights what you

00:12:38.500 --> 00:12:42.000
could call AI's emotional ceiling. AI can scale

00:12:42.000 --> 00:12:44.879
support, no question. It can generate millions

00:12:44.879 --> 00:12:48.080
of helpful, even empathetic-sounding responses,

00:12:48.200 --> 00:12:52.419
but it cannot scale deep, authentic human connection.

00:12:52.620 --> 00:12:55.440
That seems to be the inherent boundary. And if

00:12:55.440 --> 00:12:58.840
large language models, LLMs, those powerful AIs

00:12:58.840 --> 00:13:01.159
that generate human -like text, if they write

00:13:01.159 --> 00:13:03.820
more and more of our communications, our emails,

00:13:04.019 --> 00:13:07.259
our DMs, apologies, the risk is pretty clear,

00:13:07.320 --> 00:13:09.200
isn't it? We might just stop believing anyone

00:13:09.200 --> 00:13:12.450
truly means what they say anymore. A crisis of

00:13:12.450 --> 00:13:15.169
authenticity? Potentially. AI simulates empathy,

00:13:15.370 --> 00:13:17.990
but it can't feel it. And that gap, that fundamental

00:13:17.990 --> 00:13:19.830
difference, might be the most important one we

00:13:19.830 --> 00:13:21.830
need to keep in mind. So considering all that,

00:13:21.889 --> 00:13:23.990
what's the single most critical takeaway from

00:13:23.990 --> 00:13:26.289
this empathy study for our daily lives as we

00:13:26.289 --> 00:13:29.529
navigate this increasingly AI-infused world?

00:13:29.899 --> 00:13:31.740
I think it's that authenticity and real human

00:13:31.740 --> 00:13:33.840
connection. They remain paramount, even with

00:13:33.840 --> 00:13:36.759
all of AI's incredible capabilities. So what

00:13:36.759 --> 00:13:38.919
does this all mean for you listening right now?

00:13:39.019 --> 00:13:42.179
From dream recorders designed for, well, quiet

00:13:42.179 --> 00:13:46.100
self-reflection, to AI rapidly changing industries

00:13:46.100 --> 00:13:48.919
like education and healthcare, to this really

00:13:48.919 --> 00:13:53.700
profound study on human empathy, AI... truly

00:13:53.700 --> 00:13:56.100
is everywhere. It's reshaping our world at an

00:13:56.100 --> 00:14:00.019
incredible, almost dizzying pace. It's a rapid,

00:14:00.059 --> 00:14:02.919
relentless evolution, isn't it? But as we integrate

00:14:02.919 --> 00:14:05.200
these tools more deeply into our lives, maybe

00:14:05.200 --> 00:14:07.379
the most crucial distinction is that space between

00:14:07.379 --> 00:14:10.820
what AI can simulate and what only human intention,

00:14:11.220 --> 00:14:14.299
real effort, and genuine feeling can truly deliver.

00:14:14.559 --> 00:14:16.399
So maybe next time you're tempted to run it through

00:14:16.399 --> 00:14:19.000
ChatGPT for an important message, or even just

00:14:19.000 --> 00:14:22.120
a casual one. Perhaps just say it like you mean

00:14:22.120 --> 00:14:24.179
it. People can tell. It really raises an important

00:14:24.179 --> 00:14:26.279
question for all of us, I think. In a world with

00:14:26.279 --> 00:14:28.220
more and more digital interactions, how do you

00:14:28.220 --> 00:14:30.659
make sure your messages always carry that authentic

00:14:30.659 --> 00:14:32.799
human weight? Thanks for diving deep with us

00:14:32.799 --> 00:14:36.860
today. We'll catch you next time. Outro

00:14:36.860 --> 00:14:37.139
music.
