WEBVTT

00:00:00.000 --> 00:00:01.879
What if the biggest opportunity in artificial

00:00:01.879 --> 00:00:04.719
intelligence isn't really about building the

00:00:04.719 --> 00:00:07.500
smartest chatbot or, you know, simply the fastest

00:00:07.500 --> 00:00:10.220
search engine? Yeah. What if it's actually about

00:00:10.220 --> 00:00:12.820
building a truly genuine, evolving relationship

00:00:12.820 --> 00:00:16.620
with technology itself? I mean,

00:00:16.640 --> 00:00:18.460
just think about that for a second. We're talking

00:00:18.460 --> 00:00:20.730
about an emotional operating system. Welcome

00:00:20.730 --> 00:00:23.050
to the Deep Dive. And today we're diving deep

00:00:23.050 --> 00:00:25.890
into a really thought-provoking perspective

00:00:25.890 --> 00:00:29.370
from Kirsten Green. She's a legendary consumer

00:00:29.370 --> 00:00:31.570
venture capitalist over at Forerunner Ventures.

00:00:31.829 --> 00:00:34.729
She's backed some truly iconic brands, companies

00:00:34.729 --> 00:00:37.090
you definitely know, probably use every day.

00:00:37.250 --> 00:00:41.130
And her bold claim is that right now, well, everyone

00:00:41.130 --> 00:00:43.869
is building the wrong AI products. Her vision,

00:00:43.929 --> 00:00:45.869
as you'll hear, it stretches far beyond just

00:00:45.869 --> 00:00:48.250
smarter chatbots. We're talking about an entirely

00:00:48.250 --> 00:00:51.810
new era, really, one defined by deep personal

00:00:51.810 --> 00:00:54.229
connections with AI. It's a pretty fundamental

00:00:54.229 --> 00:00:57.030
shift. Totally. And our mission for you, the

00:00:57.030 --> 00:01:00.189
listener, is to really grasp this profound shift.

00:01:00.549 --> 00:01:02.990
We'll explore why relationships are kind of emerging

00:01:02.990 --> 00:01:06.200
as the new frontier for AI. What exactly makes

00:01:06.200 --> 00:01:09.420
these connections possible and what this all

00:01:09.420 --> 00:01:12.340
means for building the truly groundbreaking products

00:01:12.340 --> 00:01:14.359
that will shape our future? It's pretty exciting

00:01:14.359 --> 00:01:16.719
stuff. OK, so let's unpack this core idea she

00:01:16.719 --> 00:01:20.640
has. Green defines major tech eras not just by

00:01:20.640 --> 00:01:23.280
the technology itself, but by the fundamental

00:01:23.280 --> 00:01:26.420
human behavior they enable. Right. So, for example,

00:01:26.500 --> 00:01:29.620
the early Internet was all about outcomes. Think

00:01:29.620 --> 00:01:32.670
about it. Buying a book online. Booking a flight,

00:01:32.870 --> 00:01:35.349
finding information. It was about getting a specific

00:01:35.349 --> 00:01:39.189
task done, often instantly. Efficiency. Right.

00:01:39.290 --> 00:01:42.530
Get in, get out. And then the mobile era fundamentally

00:01:42.530 --> 00:01:45.129
shifted us to attention. With a supercomputer

00:01:45.129 --> 00:01:47.549
now in every pocket, the whole game became about

00:01:47.549 --> 00:01:49.989
capturing and holding your focus. It was feeds,

00:01:50.010 --> 00:01:52.890
notifications, endless on-demand content. Every

00:01:52.890 --> 00:01:56.010
app was like locked in this digital gladiatorial

00:01:56.010 --> 00:01:58.290
arena battling for every second of your pocket

00:01:58.290 --> 00:02:00.150
time. And here's where Green argues the current

00:02:00.150 --> 00:02:03.370
AI cycle is profoundly different. It's not primarily

00:02:03.370 --> 00:02:05.810
about faster outcomes or even just more attention

00:02:05.810 --> 00:02:08.430
grabbing. Yeah. She believes this era is really

00:02:08.430 --> 00:02:11.569
about relationships and maybe surprisingly, even

00:02:11.569 --> 00:02:14.120
affection. It's not just about creating better

00:02:14.120 --> 00:02:17.219
tools, but fostering genuine, evolving connections

00:02:17.219 --> 00:02:19.819
with technology itself. That's a big leap. It

00:02:19.819 --> 00:02:23.219
is. Exactly. That's the bedrock. Consider ChatGPT's

00:02:23.219 --> 00:02:26.300
explosive growth. Yes, the underlying tech was

00:02:26.300 --> 00:02:28.840
incredibly impressive, sure. But Green points

00:02:28.840 --> 00:02:32.240
out the real reason it just blew up was the

00:02:32.240 --> 00:02:34.960
human-like conversation. Right. It felt familiar,

00:02:35.120 --> 00:02:37.419
conversational, almost like chatting with a friend.

00:02:37.870 --> 00:02:40.689
As she puts it, it was taking a behavior that

00:02:40.689 --> 00:02:43.370
is fundamental and now making it possible online.

00:02:43.469 --> 00:02:46.349
It just clicked with us on a human level. So

00:02:46.349 --> 00:02:48.310
when we talk about tech, what's the core difference

00:02:48.310 --> 00:02:51.169
between building for a one-time outcome versus

00:02:51.169 --> 00:02:53.650
cultivating an ongoing relationship? Well, an

00:02:53.650 --> 00:02:55.370
outcome is pretty much a transaction, right?

00:02:55.469 --> 00:02:57.449
A relationship, though, it builds dynamically

00:02:57.449 --> 00:03:00.110
over time through interaction. Okay, so for these

00:03:00.110 --> 00:03:03.009
AI relationships to truly exist, to feel meaningful

00:03:03.009 --> 00:03:06.840
and, you know... evolve with us. Green points

00:03:06.840 --> 00:03:10.280
to three crucial unlocks, things that she says

00:03:10.280 --> 00:03:12.560
haven't really come together like this before.

00:03:12.699 --> 00:03:15.539
The first and maybe the most foundational is

00:03:15.539 --> 00:03:17.879
memory that actually matters. Yeah. And this

00:03:17.879 --> 00:03:19.860
isn't just about data recall, like a computer

00:03:19.860 --> 00:03:22.699
perfectly remembering a file name from 10 years

00:03:22.699 --> 00:03:24.919
ago. It's so much deeper than that. Right. It's

00:03:24.919 --> 00:03:27.939
about context that builds over time and intuition

00:03:27.939 --> 00:03:31.240
that moves forward. Imagine an AI that actually

00:03:31.240 --> 00:03:33.840
remembers that half-formed business idea you

00:03:33.840 --> 00:03:36.280
mentioned six months ago. Wow. Or that fleeting

00:03:36.280 --> 00:03:38.680
thought you had about a career change last Tuesday.

00:03:38.919 --> 00:03:42.419
And then proactively it connects that to new

00:03:42.419 --> 00:03:44.580
market trends it's observing or maybe your evolving

00:03:44.580 --> 00:03:47.199
goals you've discussed since then. So it's synthesizing.

00:03:47.199 --> 00:03:50.240
It's synthesizing information. Yeah. Like a human

00:03:50.240 --> 00:03:52.719
friend who truly knows you and can anticipate

00:03:52.719 --> 00:03:55.900
your needs. That's a total game changer. The

00:03:55.900 --> 00:03:58.099
second unlock, she calls it data in a continuous

00:03:58.099 --> 00:04:00.580
learning loop. We've heard for years that data

00:04:00.580 --> 00:04:03.520
is a moat, right? Meaning it gives companies

00:04:03.520 --> 00:04:06.259
a competitive advantage. But here, it's different.

00:04:06.400 --> 00:04:09.060
Totally different. Every single interaction you

00:04:09.060 --> 00:04:11.939
have, every preference you express, every goal

00:04:11.939 --> 00:04:16.360
you state, it builds this incredibly rich, evolving

00:04:16.360 --> 00:04:19.379
understanding of you as an individual. Exactly.

00:04:19.519 --> 00:04:22.019
This isn't just a Netflix recommendation based

00:04:22.019 --> 00:04:24.519
on your past viewing history, which is kind of

00:04:24.519 --> 00:04:28.000
static, like a snapshot. This data is actively

00:04:28.000 --> 00:04:32.060
used in real time to deepen and move the relationship

00:04:32.060 --> 00:04:34.860
forward. It makes it richer, more personalized

00:04:34.860 --> 00:04:37.720
with every single conversation. Okay. And that's

00:04:37.720 --> 00:04:39.980
incredibly profound because the switching cost

00:04:39.980 --> 00:04:42.920
for you, the user, suddenly becomes incredibly

00:04:42.920 --> 00:04:45.879
high. Right. You're not just abandoning a piece

00:04:45.879 --> 00:04:47.860
of software for another, you're abandoning a

00:04:47.860 --> 00:04:50.990
relationship. You've invested time, thought,

00:04:51.149 --> 00:04:53.790
and maybe even a little bit of your heart in

00:04:53.790 --> 00:04:55.689
building. You'd have to start all over. You'd

00:04:55.689 --> 00:04:57.910
have to start completely over with a new AI that

00:04:57.910 --> 00:04:59.790
knows absolutely nothing about you. That's a

00:04:59.790 --> 00:05:02.149
huge barrier. And the third unlock, this one

00:05:02.149 --> 00:05:05.250
is truly fascinating. Voice is the great accelerator

00:05:05.250 --> 00:05:08.930
of intimacy. When we type, we often self-edit,

00:05:08.930 --> 00:05:11.410
right? We're precise. We curate our thoughts.

00:05:11.470 --> 00:05:14.120
Maybe we're even a little performative. But when

00:05:14.120 --> 00:05:16.680
we speak. Oh, we're so much less guarded. It's

00:05:16.680 --> 00:05:19.959
true. We ramble, we reveal our tone, our hesitations,

00:05:19.959 --> 00:05:22.000
our half-formed thoughts, even our emotions

00:05:22.000 --> 00:05:25.480
leak through. This richer, more human, more intimate

00:05:25.480 --> 00:05:28.379
voice data, when you combine it with that deep

00:05:28.379 --> 00:05:30.879
contextual memory and the continuous learning

00:05:30.879 --> 00:05:35.319
loop. Wow. It builds an incredibly deep and powerful

00:05:35.319 --> 00:05:37.800
foundation for relationship building. It's like

00:05:37.800 --> 00:05:40.980
stacking these nuanced Lego blocks of data, each

00:05:40.980 --> 00:05:44.040
one adding a new layer of understanding. So how

00:05:44.040 --> 00:05:46.519
does this memory that matters really differ from

00:05:46.519 --> 00:05:50.180
traditional computer memory, just beyond recalling

00:05:50.180 --> 00:05:53.199
a file? It's dynamic context and intuition, not

00:05:53.199 --> 00:05:56.160
just static recall. It builds an evolving understanding

00:05:56.160 --> 00:05:59.160
of you. Okay. So Green describes where we are

00:05:59.160 --> 00:06:02.240
right now in the evolution of AI products as...

00:06:02.540 --> 00:06:05.639
a messy creative stage. She says it's a really

00:06:05.639 --> 00:06:08.500
fun place, but also one where we need a lot more

00:06:08.500 --> 00:06:10.759
freedom to experiment. Her prediction is pretty

00:06:10.759 --> 00:06:12.720
clear. They're going to be all new. We won't

00:06:12.720 --> 00:06:15.060
just see smarter chatbots kind of bolted inside

00:06:15.060 --> 00:06:17.879
our existing familiar old apps. She believes

00:06:17.879 --> 00:06:19.980
we'll see entirely new product categories, completely

00:06:19.980 --> 00:06:22.600
fresh user experiences built from the ground

00:06:22.600 --> 00:06:24.639
up around this whole paradigm of relationships.

00:06:25.040 --> 00:06:27.600
This means bold, almost wild experimentation

00:06:27.600 --> 00:06:31.480
is absolutely key. We're truly in the, like,

00:06:31.480 --> 00:06:34.800
primordial soup of a new technological era. It's

00:06:34.800 --> 00:06:37.600
exciting, but messy. Now, when it comes to distribution

00:06:37.600 --> 00:06:41.420
and marketing, founders always ask VCs for the

00:06:41.420 --> 00:06:44.399
secret sauce, right? The magic bullet. Oh, yeah,

00:06:44.459 --> 00:06:46.920
every time. Green's harsh reality response to

00:06:46.920 --> 00:06:48.980
that very common question. You've got to do everything.

00:06:49.279 --> 00:06:51.819
Huh. Yeah, there's no single magic bullet, no

00:06:51.819 --> 00:06:54.579
one trick. It's about creating a mosaic of messages.

00:06:55.180 --> 00:06:57.819
But the absolute foundation, the simplest truth,

00:06:57.959 --> 00:07:00.449
is this. You have to build something that people

00:07:00.449 --> 00:07:02.610
actually need and that people actually like.

00:07:02.649 --> 00:07:05.730
Seems obvious, but... But it's hard. As she

00:07:05.730 --> 00:07:07.810
notes, with maybe a bit of a sigh, you can't

00:07:07.810 --> 00:07:09.790
market products that are bad. It's so easy to

00:07:09.790 --> 00:07:12.110
get lost in the dazzling tech, the algorithms,

00:07:12.230 --> 00:07:15.490
the models. But that messy human element, what

00:07:15.490 --> 00:07:17.930
people truly need, what they truly like, that's

00:07:17.930 --> 00:07:19.449
where the real challenge is for founders. And

00:07:19.449 --> 00:07:21.170
honestly, it's something I still grapple with

00:07:21.170 --> 00:07:23.029
myself sometimes. Getting that human connection

00:07:23.029 --> 00:07:26.120
right is really tough. Why then is this considered

00:07:26.120 --> 00:07:29.540
such a messy time for creating new AI products?

00:07:29.720 --> 00:07:31.980
What makes it messy? Because the rules are totally

00:07:31.980 --> 00:07:35.000
unwritten, bold experimentation is really needed

00:07:35.000 --> 00:07:37.560
for these new kinds of experiences. Okay, so

00:07:37.560 --> 00:07:39.560
here's where it connects to the past in a really

00:07:39.560 --> 00:07:42.899
insightful way. Green's investment in Dollar

00:07:42.899 --> 00:07:46.439
Shave Club. It's like a masterclass in seeing

00:07:46.439 --> 00:07:50.139
beyond the obvious product. On the surface, it

00:07:50.139 --> 00:07:53.410
was just razors. In an unbelievably crowded

00:07:53.410 --> 00:07:56.050
market totally dominated by giants like Gillette,

00:07:56.050 --> 00:07:59.430
it seemed crazy. Right. But the founder, as she

00:07:59.430 --> 00:08:02.370
observed, saw a much deeper cultural shift happening.

00:08:02.649 --> 00:08:05.350
Men were actively entering a broader personal

00:08:05.350 --> 00:08:08.189
care conversation, you know, beyond just the

00:08:08.189 --> 00:08:10.370
razor. The razor itself was just the vehicle,

00:08:10.449 --> 00:08:12.889
not the destination. Plus, you had new business

00:08:12.889 --> 00:08:14.930
models enabled by the social web, meaning they

00:08:14.930 --> 00:08:17.589
could ship directly, bypassing all those traditional

00:08:17.589 --> 00:08:20.350
retail gatekeepers. It was like a perfect storm

00:08:20.350 --> 00:08:22.579
of cultural and technological shifts happening

00:08:22.579 --> 00:08:24.600
at the same time. It's kind of genius, really,

00:08:24.680 --> 00:08:26.500
when you look back. So what does this all mean

00:08:26.500 --> 00:08:29.860
for the future of AI then? Despite ChatGPT's

00:08:29.860 --> 00:08:33.019
incredible, almost overwhelming dominance right

00:08:33.019 --> 00:08:35.460
now, Green doesn't believe everything will just

00:08:35.460 --> 00:08:38.740
end up living in one single universal chat window.

00:08:39.019 --> 00:08:41.769
No. Definitely not. She argues there's relevance

00:08:41.769 --> 00:08:43.950
in the long tail. There's relevance in specialty.

00:08:44.169 --> 00:08:46.669
And people want to have different kinds of experiences

00:08:46.669 --> 00:08:49.210
at the category level. Yeah, that makes sense.

00:08:49.409 --> 00:08:51.330
Think about it. How you'd want to manage your

00:08:51.330 --> 00:08:53.629
most sensitive personal finances versus how you

00:08:53.629 --> 00:08:56.789
might interact with, say, a mental health companion

00:08:56.789 --> 00:08:59.870
AI. Completely different. Totally. You want and

00:08:59.870 --> 00:09:02.769
you absolutely need tailored interfaces, specialized

00:09:02.769 --> 00:09:06.409
experiences. General purpose tools like ChatGPT

00:09:06.409 --> 00:09:08.950
are fantastic for inspiration, for general

00:09:08.970 --> 00:09:11.029
knowledge, sure, but specialized experiences

00:09:11.029 --> 00:09:14.610
are where the real deep value lies for very specific

00:09:14.610 --> 00:09:17.490
needs. So what was that underlying cultural shift

00:09:17.490 --> 00:09:19.710
that Dollar Shave Club really tapped into beyond

00:09:19.710 --> 00:09:22.870
just selling razors cheaply? Men were increasingly

00:09:22.870 --> 00:09:25.330
embracing personal care and the razor was just

00:09:25.330 --> 00:09:27.570
an entry point to that broader conversation.

00:09:27.850 --> 00:09:30.830
Got it. OK, so Forerunner Ventures, they've identified

00:09:30.830 --> 00:09:34.009
two massive, overarching consumer trends that

00:09:34.009 --> 00:09:36.809
they see as absolutely ripe for this kind of

00:09:36.809 --> 00:09:40.330
AI innovation. Yeah. Huge trends already reshaping

00:09:40.330 --> 00:09:43.190
how we live. First, the proactive wellness revolution.

00:09:43.570 --> 00:09:46.190
Yeah, this is enormous. We're really shifting

00:09:46.190 --> 00:09:48.129
away from just, you know, going to the doctor

00:09:48.129 --> 00:09:51.610
when we're sick. Now it's about daily optimization,

00:09:51.970 --> 00:09:54.669
personalized health, preventing issues before

00:09:54.669 --> 00:09:58.269
they even start. The AI opportunity here is truly

00:09:58.269 --> 00:10:01.710
incredible. Creating an MD and a PhD in your

00:10:01.710 --> 00:10:04.730
pocket, literally. Personalized guidance based

00:10:04.730 --> 00:10:07.470
on all your unique data. Your sleep patterns,

00:10:07.649 --> 00:10:09.590
your food choices, your genetics, your mood,

00:10:09.669 --> 00:10:12.169
all feeding into a system that gives you deeply

00:10:12.169 --> 00:10:14.450
tailored, proactive advice, not just generic

00:10:14.450 --> 00:10:16.779
tips you read online. And the second big trend,

00:10:16.940 --> 00:10:19.779
the quest for personal security. This isn't just

00:10:19.779 --> 00:10:21.539
about physical safety, though that's part of

00:10:21.539 --> 00:10:23.200
it, obviously. Right. It's broader. It's about

00:10:23.200 --> 00:10:26.019
securing your entire life, finding a stable career

00:10:26.019 --> 00:10:28.960
in a rapidly changing world, managing finances

00:10:28.960 --> 00:10:31.940
wisely, maintaining a general sense of life stability

00:10:31.940 --> 00:10:34.659
in what often feels like a very uncertain future.

00:10:35.200 --> 00:10:38.320
Absolutely. AI can build profoundly impactful

00:10:38.320 --> 00:10:40.600
tools to help people navigate this uncertainty

00:10:40.600 --> 00:10:44.159
and truly flourish. Imagine an AI career coach

00:10:44.159 --> 00:10:47.019
that helps you adapt to future job markets. That'd

00:10:47.019 --> 00:10:49.000
be beautiful. Or a financial advisor that genuinely

00:10:49.000 --> 00:10:51.620
understands your unique life goals and helps

00:10:51.620 --> 00:10:54.139
you actually work towards them. This brings us

00:10:54.139 --> 00:10:56.840
right back to Green's biggest, most audacious

00:10:56.840 --> 00:10:59.679
prediction. The dawn of the emotional operating

00:10:59.679 --> 00:11:02.899
system. The emotional operating system. That's

00:11:02.899 --> 00:11:04.460
the phrase that really sticks with you, isn't

00:11:04.460 --> 00:11:07.220
it? It really does. Any area of our lives where

00:11:07.220 --> 00:11:09.440
we inherently need a relationship, whether it's

00:11:09.440 --> 00:11:11.559
with our health, our finances, our learning,

00:11:11.639 --> 00:11:14.580
maybe even companionship, is now ripe for AI

00:11:14.580 --> 00:11:17.860
transformation. It's moving AI beyond just pure

00:11:17.860 --> 00:11:20.179
function to build genuine, evolving connections.

00:11:20.600 --> 00:11:24.000
And the real kicker here. She argues that large

00:11:24.000 --> 00:11:26.120
established platforms are going to struggle to

00:11:26.120 --> 00:11:28.700
simply bolt this kind of relationship-centric

00:11:28.700 --> 00:11:31.000
AI onto their existing architectures. Why is

00:11:31.000 --> 00:11:33.909
that? Well, their entire design, their company

00:11:33.909 --> 00:11:36.409
culture, it just wasn't built for it. It's hard

00:11:36.409 --> 00:11:38.690
to retrofit that kind of deep relationship focus.

00:11:39.169 --> 00:11:42.470
This creates a massive opening for entirely new

00:11:42.470 --> 00:11:44.990
companies to be built from the ground up on relationships.

00:11:45.330 --> 00:11:47.830
Whoa. Yeah, whoa is right. Imagine the depth

00:11:47.830 --> 00:11:50.330
of understanding and connection an AI could truly

00:11:50.330 --> 00:11:53.169
build, evolving with you over years, becoming

00:11:53.169 --> 00:11:55.970
an almost irreplaceable part of your life. This

00:11:55.970 --> 00:11:58.419
could genuinely be the new standard. It's kind

00:11:58.419 --> 00:12:01.320
of mind blowing. So how does this emotional operating

00:12:01.320 --> 00:12:04.860
system fundamentally shift what AI can actually

00:12:04.860 --> 00:12:07.259
do for us? Like what's the core change? It moves

00:12:07.259 --> 00:12:10.340
AI beyond mere function, creating genuine, evolving

00:12:10.340 --> 00:12:13.080
and deeply personal relationships with us. OK,

00:12:13.120 --> 00:12:15.539
so for founders, for creators looking to build

00:12:15.539 --> 00:12:17.779
in this new landscape, Green offers some really

00:12:17.779 --> 00:12:20.080
practical advice beyond the foundational questions

00:12:20.080 --> 00:12:22.899
like why do people need this or why will they

00:12:22.899 --> 00:12:25.639
come back? What's specific to building in this

00:12:25.639 --> 00:12:28.779
AI era? OK, good question. First, she says build

00:12:28.779 --> 00:12:31.139
with the LLM tailwind. Don't try to fight it,

00:12:31.200 --> 00:12:33.580
meaning don't try to build a better foundational

00:12:33.580 --> 00:12:36.720
language model yourself. leverage their incredible

00:12:36.720 --> 00:12:40.159
power as a core engine, right? That frees you

00:12:40.159 --> 00:12:42.139
up to innovate on the interface, the specific

00:12:42.139 --> 00:12:44.580
knowledge, and importantly, that relationship

00:12:44.580 --> 00:12:48.379
layer. Then... Focus your efforts squarely on

00:12:48.379 --> 00:12:50.419
relationship opportunities in areas like health,

00:12:50.539 --> 00:12:53.559
finance, education, and companionship. These

00:12:53.559 --> 00:12:56.159
are inherently relational domains. Makes sense.

00:12:56.379 --> 00:12:59.559
And finally, don't be afraid to experiment far

00:12:59.559 --> 00:13:02.860
beyond just a simple chat bar. The winning interfaces

00:13:02.860 --> 00:13:06.200
for these new relationship-based AIs, they probably

00:13:06.200 --> 00:13:08.200
haven't even been invented yet. And that's the

00:13:08.200 --> 00:13:10.080
really exciting part, the discovery. And then

00:13:10.080 --> 00:13:12.919
her perspective on marketing, which is both simple

00:13:12.919 --> 00:13:15.629
and incredibly profound. Yeah. Your product is

00:13:15.629 --> 00:13:18.529
your marketing. Authenticity is absolutely key

00:13:18.529 --> 00:13:20.889
here. Marketing shouldn't feel like some separate

00:13:20.889 --> 00:13:23.750
glossy layer you slap on at the end. It should

00:13:23.750 --> 00:13:26.409
organically extend the product's inherent value

00:13:26.409 --> 00:13:30.009
and its personality. Consumers today are incredibly

00:13:30.009 --> 00:13:33.019
savvy. They see right through fakeness.

00:13:33.279 --> 00:13:36.659
Being early in a totally new category like consumer

00:13:36.659 --> 00:13:40.379
AI gives you a temporary timing advantage, a

00:13:40.379 --> 00:13:43.340
unique chance to truly surprise and delight people

00:13:43.340 --> 00:13:46.409
before the space gets super crowded. But that

00:13:46.409 --> 00:13:49.129
initial wow factor, that timing advantage, it

00:13:49.129 --> 00:13:52.409
fades eventually, right? So for long-term defensibility,

00:13:52.789 --> 00:13:54.950
founders really need to consider what she calls

00:13:54.950 --> 00:13:58.789
the network effect question, which is, what about

00:13:58.789 --> 00:14:00.669
your product gets better for one user because

00:14:00.669 --> 00:14:03.110
you bring other people into the experience? Yeah,

00:14:03.129 --> 00:14:05.509
and in AI, this is often deeply data-driven.

00:14:05.669 --> 00:14:07.769
More users interacting means the underlying model

00:14:07.769 --> 00:14:09.830
learns faster, it becomes more nuanced, more

00:14:09.830 --> 00:14:11.889
capable, and ultimately more personalized for

00:14:11.889 --> 00:14:14.210
everyone. It's a kind of collective intelligence

00:14:14.210 --> 00:14:20.929
that ultimately benefits everyone. So what then is the

00:14:20.929 --> 00:14:22.990
most important advice for founders trying to

00:14:22.990 --> 00:14:25.389
build a truly lasting AI product in this new

00:14:25.389 --> 00:14:28.470
relationship era? I'd say leverage the big LLMs,

00:14:28.570 --> 00:14:31.049
focus intensely on relationship building in key

00:14:31.049 --> 00:14:33.590
areas, experiment bravely with the interface,

00:14:33.850 --> 00:14:36.250
and let your product itself be your most powerful,

00:14:36.330 --> 00:14:39.879
authentic marketing sponsor. So what does this

00:14:39.879 --> 00:14:43.320
all mean for us as we try to navigate this rapidly

00:14:43.320 --> 00:14:46.379
evolving tech landscape? Kirsten Green's core

00:14:46.379 --> 00:14:50.039
insight here, it feels... truly profound. The

00:14:50.039 --> 00:14:52.620
real competitive moat in AI isn't necessarily

00:14:52.620 --> 00:14:55.259
about owning the biggest base models or just

00:14:55.259 --> 00:14:57.639
having the most raw computing power. No, it's

00:14:57.639 --> 00:14:59.679
really about the relationship you manage to build

00:14:59.679 --> 00:15:01.879
with your users. Through continuous learning,

00:15:02.039 --> 00:15:04.440
that incredibly deep memory we talked about,

00:15:04.620 --> 00:15:07.340
and increasingly intimate interactions, especially

00:15:07.340 --> 00:15:09.980
using voice, AI has the potential to create what

00:15:09.980 --> 00:15:12.620
she calls an emotional operating system. This

00:15:12.620 --> 00:15:15.200
then creates an immense, almost unbreakable emotional

00:15:15.200 --> 00:15:17.759
switching cost. You're not just ditching a piece

00:15:17.759 --> 00:15:19.779
of software for a competitor. You're actually

00:15:19.779 --> 00:15:22.779
abandoning a co -created relationship, a digital

00:15:22.779 --> 00:15:24.879
partner that understands you uniquely. Yeah,

00:15:24.960 --> 00:15:27.620
imagine the depth of understanding and connection

00:15:27.620 --> 00:15:30.679
an AI could truly build, evolving with you over

00:15:30.679 --> 00:15:33.500
years, anticipating your needs, maybe even before

00:15:33.500 --> 00:15:36.259
you voice them yourself. That's a really powerful

00:15:36.259 --> 00:15:38.399
thought. This is the new standard she's predicting.

00:15:38.600 --> 00:15:41.440
We are truly at the beginning of a massive platform

00:15:41.440 --> 00:15:44.299
shift here. This messy creative stage she talks

00:15:44.299 --> 00:15:46.539
about. It's not just some phase we're passing

00:15:46.539 --> 00:15:49.340
through. It feels like the opportunity. Totally.

00:15:49.440 --> 00:15:52.399
And the question for you, the listener, is: are

00:15:52.399 --> 00:15:55.259
you bold enough, curious enough to help define

00:15:55.259 --> 00:15:57.960
what comes next? What kind of relationships will

00:15:57.960 --> 00:16:00.200
we form with these new intelligences? It's kind

00:16:00.200 --> 00:16:02.919
of up for grabs. So think about this. What relationship

00:16:02.919 --> 00:16:05.100
in your own life, maybe with your health, your

00:16:05.100 --> 00:16:07.379
finances, your learning, perhaps your creative

00:16:07.379 --> 00:16:09.860
process, what could be profoundly transformed

00:16:09.860 --> 00:16:12.399
by an AI that truly understood you, that built

00:16:12.399 --> 00:16:15.279
an emotional operating system around your unique

00:16:15.279 --> 00:16:18.120
needs and aspirations? Definitely something for

00:16:18.120 --> 00:16:20.480
you to mull over. That's our deep dive for today.

00:16:20.600 --> 00:16:22.419
Thank you so much for joining us. Until next

00:16:22.419 --> 00:16:25.179
time, keep exploring. [Outro music]
