WEBVTT

00:00:00.000 --> 00:00:04.059
There's this UK analyst claiming the AI financial

00:00:04.059 --> 00:00:07.540
boom right now is 17 times worse than the

00:00:07.540 --> 00:00:11.089
dot-com bubble. 17 times. That number, well,

00:00:11.189 --> 00:00:13.150
it really makes you stop and think, doesn't it?

00:00:13.390 --> 00:00:15.570
Deserves a proper look. It really does. Yeah.

00:00:15.630 --> 00:00:17.269
And thanks for sharing those sources with us.

00:00:17.329 --> 00:00:19.670
It's clear we're dealing with this massive contradiction,

00:00:19.949 --> 00:00:22.829
right? You've got these sky high AI valuations

00:00:22.829 --> 00:00:26.530
and then you look for the actual profitable uses

00:00:26.530 --> 00:00:28.530
right now. And it's well, it's a different picture.

00:00:28.710 --> 00:00:30.710
Exactly. So today, that's what we're going to

00:00:30.710 --> 00:00:32.909
do. We'll unpack that bubble claim, see where

00:00:32.909 --> 00:00:35.289
AI is actually delivering ROI and also touch

00:00:35.289 --> 00:00:38.049
on this really intense work culture popping up

00:00:38.049 --> 00:00:39.969
again in Silicon Valley to try and meet these

00:00:39.969 --> 00:00:42.590
huge expectations. OK, let's dig into that big

00:00:42.590 --> 00:00:45.609
number first. The 17 times bubble. This analyst,

00:00:45.770 --> 00:00:48.350
Julien Garran, he's not mincing words: 17 times

00:00:48.350 --> 00:00:51.890
the dot-com era. Four times the 2008 crash.

00:00:51.890 --> 00:00:55.000
That's a heavy statement. Mm-hmm. And it's crucial

00:00:55.000 --> 00:00:57.140
to get why he's saying that. The dot-com thing,

00:00:57.140 --> 00:01:00.020
that was built on maybe shaky revenue projections.

00:01:00.219 --> 00:01:04.060
Right. But this AI boom, he argues, is built

00:01:04.060 --> 00:01:06.879
more on just sheer market cap momentum. It's

00:01:06.879 --> 00:01:10.480
exploded way, way past any kind of near term

00:01:10.480 --> 00:01:13.739
profit reality. Yeah. You've got, what, 10 major

00:01:13.739 --> 00:01:15.799
AI startups, together they've added something

00:01:15.799 --> 00:01:18.560
like a trillion dollars in market value. A trillion.

00:01:18.819 --> 00:01:21.920
But you look closer and almost all of them are

00:01:21.920 --> 00:01:24.599
still, you know, deeply unprofitable. It's this

00:01:24.599 --> 00:01:26.920
profitability paradox. It really is. So you got

00:01:26.920 --> 00:01:29.879
to ask, who is making money here? And right now

00:01:29.879 --> 00:01:32.650
the answer seems pretty clear. It's mostly NVIDIA

00:01:32.650 --> 00:01:34.989
selling the hardware, the shovels for the gold

00:01:34.989 --> 00:01:36.829
rush. Basically, they're getting consistent returns.

00:01:37.109 --> 00:01:39.150
Pretty much everyone else developing the core

00:01:39.150 --> 00:01:41.530
AI models. They're basically just, well, bleeding

00:01:41.530 --> 00:01:43.950
cash, funding operations based on what they hope

00:01:43.950 --> 00:01:45.689
will happen down the line. And all that cash

00:01:45.689 --> 00:01:47.450
burn. Yeah. It's making the traditional money

00:01:47.450 --> 00:01:50.090
people, the VCs, a bit nervous. You're seeing

00:01:50.090 --> 00:01:52.090
them pull back now. They look at the valuations

00:01:52.090 --> 00:01:54.269
and just think they're, well, kind of absurd.

00:01:54.430 --> 00:01:56.530
Right. So the whole ecosystem is increasingly

00:01:56.530 --> 00:02:00.200
propped up by just a few huge players. The mega

00:02:00.200 --> 00:02:02.480
backers, as Garran calls them. Yeah, not your

00:02:02.480 --> 00:02:05.140
usual VCs. We're talking SoftBank, sovereign

00:02:05.140 --> 00:02:08.080
wealth funds. Even NVIDIA itself is investing

00:02:08.080 --> 00:02:10.219
heavily back into the ecosystem. They're the

00:02:10.219 --> 00:02:13.080
capital life support right now. And the bet seems

00:02:13.080 --> 00:02:15.840
to be if they just keep pouring money in, eventually

00:02:15.840 --> 00:02:17.699
one of these companies will crack artificial

00:02:17.699 --> 00:02:21.460
general intelligence, or AGI. So Garran lays out

00:02:21.460 --> 00:02:24.280
two possible futures, basically. Option one.

00:02:24.419 --> 00:02:27.020
Yeah. The money keeps flowing, but it never really

00:02:27.020 --> 00:02:29.199
delivers that massive value everyone expects.

00:02:29.719 --> 00:02:33.060
It's just this capital treadmill or option two.

00:02:33.539 --> 00:02:36.960
Someone actually does achieve true AGI, a genuine

00:02:36.960 --> 00:02:38.900
breakthrough. And he's betting on option one,

00:02:38.939 --> 00:02:40.819
isn't he? He seems to think the market hype has

00:02:40.819 --> 00:02:42.740
just gotten way out ahead of the actual tech

00:02:42.740 --> 00:02:44.800
capabilities right now. I've got to say, just

00:02:44.800 --> 00:02:46.500
trying to understand the tech itself can be tough.

00:02:46.599 --> 00:02:48.719
Honestly, sometimes I wrestle with prompt drift

00:02:48.719 --> 00:02:51.060
myself, you know, getting consistent results.

00:02:51.360 --> 00:02:53.219
So when you talk about these billion dollar market

00:02:53.219 --> 00:02:55.219
shifts. Yeah. Yeah. It's a lot to wrap your head

00:02:55.219 --> 00:02:57.520
around. So given all that cash burn right across

00:02:57.520 --> 00:03:00.509
the board. What's the single biggest factor keeping

00:03:00.509 --> 00:03:03.990
these valuations so incredibly high? It really

00:03:03.990 --> 00:03:06.550
seems to boil down to those mega backers. Them

00:03:06.550 --> 00:03:11.310
plus this persistent belief, this hope that true

00:03:11.310 --> 00:03:15.569
AGI is just around the corner. OK, let's shift

00:03:15.569 --> 00:03:17.669
gears a bit away from the financials and towards

00:03:17.669 --> 00:03:19.650
some more practical stuff. If you're trying to

00:03:19.650 --> 00:03:22.129
get a better handle on the tech itself, Stanford

00:03:22.129 --> 00:03:24.349
actually has this really good four part lecture

00:03:24.349 --> 00:03:26.069
series out. It's about five and a half hours

00:03:26.069 --> 00:03:28.169
total. Gives you a pretty solid kind of medium

00:03:28.169 --> 00:03:30.789
depth understanding of LLMs, you know, large

00:03:30.789 --> 00:03:32.650
language models, the AIs that generate human

00:03:32.650 --> 00:03:34.569
like text. And we are seeing some pretty cool

00:03:34.569 --> 00:03:36.710
jumps in what these things can do. There's a

00:03:36.710 --> 00:03:39.310
new tool, DeepSeek-OCR, for reading documents.

00:03:39.569 --> 00:03:41.490
Apparently it's really impressing people. It

00:03:41.490 --> 00:03:43.650
managed to accurately read some properly messy

00:03:43.650 --> 00:03:47.349
handwritten letters, which is not easy.

00:03:47.530 --> 00:03:50.150
Yeah, that's huge. Imagine for archives or just

00:03:50.150 --> 00:03:53.569
extracting data. But there's always this ethical

00:03:53.569 --> 00:03:55.449
tension simmering underneath, isn't there? Like

00:03:55.449 --> 00:03:58.449
Microsoft's AI chief coming out and publicly

00:03:58.449 --> 00:04:01.449
criticizing ChatGPT's ability to generate erotica.

00:04:01.569 --> 00:04:05.210
That feels kind of awkward given Microsoft put,

00:04:05.310 --> 00:04:09.069
what, $13 billion into OpenAI? It does. And

00:04:09.069 --> 00:04:10.610
you also mentioned the Australian government

00:04:11.050 --> 00:04:14.389
suing Microsoft over confusion about AI pricing

00:04:14.389 --> 00:04:16.670
changes. It just shows how regulators are scrambling

00:04:16.670 --> 00:04:18.610
to keep up, trying to figure out how to handle

00:04:18.610 --> 00:04:23.189
the ambiguity of these new AI systems. And here's

00:04:23.189 --> 00:04:26.149
something that feels genuinely concerning in

00:04:26.149 --> 00:04:28.350
the sources you shared. This trend of extreme

00:04:28.350 --> 00:04:30.430
overwork making a comeback in Silicon Valley.

00:04:30.839 --> 00:04:33.180
AI startups are apparently bringing back China's

00:04:33.180 --> 00:04:37.500
infamous 996 grind. That's 9 a.m. to 9 p.m.,

00:04:37.500 --> 00:04:39.800
six days a week. Wow. Yeah, and there's even

00:04:39.800 --> 00:04:41.480
that dark joke going around among researchers

00:04:41.480 --> 00:04:44.500
about the 002 schedule, zero sleep, zero days

00:04:44.500 --> 00:04:47.199
off, just two hours maybe. The pressure to deliver,

00:04:47.360 --> 00:04:50.079
to hit those short-term milestones, maybe justify

00:04:50.079 --> 00:04:52.300
those crazy valuations we talked about, it's

00:04:52.300 --> 00:04:54.699
pushing people really, really hard. It seems

00:04:54.699 --> 00:04:56.980
like a dangerous game. That kind of pressure

00:04:56.980 --> 00:04:58.839
cooker environment might get quick results, but...

00:04:59.420 --> 00:05:02.120
Burning out your best people. Yeah, and it introduces

00:05:02.120 --> 00:05:04.439
a huge risk of mistakes, right? Errors creeping

00:05:04.439 --> 00:05:06.620
into the very foundations of the models they're

00:05:06.620 --> 00:05:09.420
building. Not good long term. Meanwhile, the

00:05:09.420 --> 00:05:11.139
big players are still pushing into new areas.

00:05:11.680 --> 00:05:14.439
OpenAI has apparently got a secret project, building

00:05:14.439 --> 00:05:17.680
a generative music tool. Ah, okay. Like Suno AI

00:05:17.680 --> 00:05:21.199
and those existing tools. Turning text prompts

00:05:21.199 --> 00:05:23.980
or even audio snippets into full songs. Always

00:05:23.980 --> 00:05:26.459
looking for that next big creative market to

00:05:26.459 --> 00:05:30.290
disrupt. So if this 996 work culture, this intense

00:05:30.290 --> 00:05:33.490
grind, is becoming widespread again, how does

00:05:33.490 --> 00:05:35.709
that really impact AI innovation in the long

00:05:35.709 --> 00:05:37.689
run? Well, you might get some rapid gains, sure,

00:05:37.790 --> 00:05:40.589
short bursts of progress. But the risk of burnout,

00:05:40.790 --> 00:05:43.370
costly errors, and ultimately maybe less thoughtful,

00:05:43.509 --> 00:05:47.529
lower quality innovation seems pretty high. Okay,

00:05:47.569 --> 00:05:49.990
let's focus now on some specific tools and maybe

00:05:49.990 --> 00:05:51.910
more importantly, some really impactful applications.

00:05:52.329 --> 00:05:55.269
We saw Google talking about vibe coding in their

00:05:55.269 --> 00:05:57.670
AI Studio. Sounds like they're trying to streamline

00:05:57.670 --> 00:05:59.470
how engineers work, make development faster,

00:05:59.569 --> 00:06:01.970
more intuitive. But the applications that really

00:06:01.970 --> 00:06:04.189
jumped out, I think, were in biodefense and health.

00:06:04.389 --> 00:06:07.230
There's this company, Valthos, backed by OpenAI

00:06:07.230 --> 00:06:10.029
folks. They launched with $30 million specifically

00:06:10.029 --> 00:06:13.509
to build AI biodefense systems. Yes, systems

00:06:13.509 --> 00:06:15.509
designed to detect and figure out how to counter

00:06:15.509 --> 00:06:19.889
new pathogens. In just hours. Yeah. Hours. I

00:06:19.889 --> 00:06:21.850
mean, think about that. That's a massive leap

00:06:21.850 --> 00:06:25.370
in how quickly we could respond to, say, a new

00:06:25.370 --> 00:06:27.509
pandemic threat. It really is. And then there

00:06:27.509 --> 00:06:30.329
was that other study from Utah showing AI could

00:06:30.329 --> 00:06:32.930
spot parasites in human stool samples really

00:06:32.930 --> 00:06:37.699
quickly and accurately. Whoa. I mean,

00:06:37.699 --> 00:06:39.939
just imagine scaling that kind of detection speed,

00:06:40.139 --> 00:06:42.300
applying it globally to public health surveillance.

00:06:42.639 --> 00:06:45.000
That's tangible value. That justifies serious

00:06:45.000 --> 00:06:47.579
investment, you know? Absolutely. And the support

00:06:47.579 --> 00:06:49.319
system for all this is growing, too. Amazon,

00:06:50.000 --> 00:06:52.819
OpenAI, NVIDIA, they're teaming up to turn Cal

00:06:52.819 --> 00:06:55.279
State into a big AI training center, recognizing

00:06:55.279 --> 00:06:57.160
we need more people who actually know how to

00:06:57.160 --> 00:06:58.980
build and use this stuff. There's even a paper

00:06:58.980 --> 00:07:01.620
mapping out like 90 different AI coding tools

00:07:01.620 --> 00:07:03.779
available right now. So thinking about all those

00:07:03.779 --> 00:07:06.509
tools and applications. Beyond the big names,

00:07:06.649 --> 00:07:09.350
what was the most surprising practical use case

00:07:09.350 --> 00:07:11.930
we came across this time? For me, it was that

00:07:11.930 --> 00:07:15.410
health stuff. The AI detecting pathogens and

00:07:15.410 --> 00:07:18.410
parasites so quickly and accurately that speed

00:07:18.410 --> 00:07:20.829
just feels like a potential game changer for

00:07:20.829 --> 00:07:23.870
public health.

00:07:23.870 --> 00:07:25.689
All right. So we've talked about the

00:07:25.689 --> 00:07:28.689
bubble fears, the potential, the pressures. Let's

00:07:28.689 --> 00:07:30.850
look at some hard data now, specifically from

00:07:30.850 --> 00:07:33.430
industries already using this tech. We're pulling

00:07:33.430 --> 00:07:36.329
this from the Artificial Analysis 2025 State

00:07:36.329 --> 00:07:39.149
of Generative Media report, based on input from

00:07:39.149 --> 00:07:41.430
about 300 developers and creators using these

00:07:41.430 --> 00:07:44.170
tools right now. The data really highlights Google's

00:07:44.170 --> 00:07:46.790
position, especially in creative AI, for making

00:07:46.790 --> 00:07:50.089
images. Google Gemini is number one, 74% usage

00:07:50.089 --> 00:07:53.149
among those surveyed. OpenAI's model, GPT image,

00:07:53.370 --> 00:07:57.129
is next at 64%. So, strong competition there.

00:07:57.389 --> 00:07:59.610
Yeah, and that lead carries over into video generation

00:07:59.610 --> 00:08:03.089
too. Google Veo is leading the pack with 69% usage

00:08:03.089 --> 00:08:05.069
in this group, competition's clearly driving

00:08:05.069 --> 00:08:07.589
things forward. And what's driving the choice

00:08:07.589 --> 00:08:09.949
for these creators? What makes them pick one

00:08:09.949 --> 00:08:12.449
model over another? Overwhelmingly, it's quality.

00:08:12.649 --> 00:08:15.269
That's the number one factor, cited by like 76%

00:08:15.269 --> 00:08:18.209
to 82%, depending on the task. Makes sense.

00:08:18.410 --> 00:08:21.569
Cost comes second, yeah. But the message is clear.

00:08:22.439 --> 00:08:24.959
If the output isn't good enough, isn't high quality,

00:08:25.180 --> 00:08:27.680
then people just aren't going to use it regardless

00:08:27.680 --> 00:08:30.120
of price. OK, so let's connect this back to that

00:08:30.120 --> 00:08:32.600
bubble debate we started with. This feels really

00:08:32.600 --> 00:08:35.659
important. Sixty-five percent, almost two-thirds,

00:08:35.659 --> 00:08:38.519
of the organizations surveyed said they're already

00:08:38.519 --> 00:08:40.779
seeing a return on their investment in generative

00:08:40.779 --> 00:08:43.740
AI or they expect to see it within the next year.

00:08:44.120 --> 00:08:47.039
Right now. OK, that's likely self-reported data.

00:08:47.120 --> 00:08:49.279
You got to take it with maybe a small grain of

00:08:49.279 --> 00:08:52.610
salt. But still. 65%. That's pretty strong validation,

00:08:52.809 --> 00:08:55.190
isn't it? It suggests that investment in these

00:08:55.190 --> 00:08:58.330
specific high quality focused applications like

00:08:58.330 --> 00:09:01.169
generative media is paying off for many right

00:09:01.169 --> 00:09:03.210
now. Yeah, it kind of counterbalances the hype

00:09:03.210 --> 00:09:05.710
around the purely research focused arms that

00:09:05.710 --> 00:09:08.830
are still burning cash. There's real measurable

00:09:08.830 --> 00:09:12.379
value being created in certain sectors. If quality

00:09:12.379 --> 00:09:14.919
is the absolute top priority, does that automatically

00:09:14.919 --> 00:09:18.200
mean the big expensive proprietary models from,

00:09:18.279 --> 00:09:21.019
say, Google and OpenAI will just always dominate

00:09:21.019 --> 00:09:23.840
the open source alternatives? Well, quality is

00:09:23.840 --> 00:09:26.000
definitely driving adoption now. But the report

00:09:26.000 --> 00:09:28.220
suggests that the ability to fine tune models

00:09:28.220 --> 00:09:31.259
for specific needs and dealing with IP and licensing

00:09:31.259 --> 00:09:33.960
stuff, those are becoming really important secondary

00:09:33.960 --> 00:09:37.039
factors pretty quickly. So bringing it all together.

00:09:37.960 --> 00:09:39.960
What does this mean for you listening? We've

00:09:39.960 --> 00:09:41.620
really grappled with the central tension today,

00:09:41.679 --> 00:09:43.799
haven't we? On one hand, you've got this immense

00:09:43.799 --> 00:09:46.860
financial speculation, that potential 17x bubble

00:09:46.860 --> 00:09:49.620
Garran warned about. Huge risk. But on the other

00:09:49.620 --> 00:09:52.399
hand, you have clear, demonstrable ROI already

00:09:52.399 --> 00:09:54.820
happening in fields like generative media and

00:09:54.820 --> 00:09:56.899
these incredible potential breakthroughs in areas

00:09:56.899 --> 00:09:59.240
like biodefense. Yeah, it's a paradox, like we

00:09:59.240 --> 00:10:02.220
said. Massive financial risk running alongside

00:10:02.220 --> 00:10:05.299
absolutely essential functional progress. The

00:10:05.299 --> 00:10:07.759
debate isn't really if AI is valuable anymore,

00:10:07.899 --> 00:10:09.940
is it? It seems more about where the sustainable

00:10:09.940 --> 00:10:13.340
value is actually landing. Is it with the companies

00:10:13.340 --> 00:10:15.600
selling the infrastructure like NVIDIA and Google

00:10:15.600 --> 00:10:17.899
through its cloud? Or is it eventually going

00:10:17.899 --> 00:10:19.840
to come from those research arms chasing AGI

00:10:19.840 --> 00:10:22.519
even though they're burning cash now? And maybe

00:10:22.519 --> 00:10:25.600
here's a final thought to chew on. If 65% of

00:10:25.600 --> 00:10:27.600
organizations say they're already getting ROI

00:10:27.600 --> 00:10:31.610
from AI, what do the other 35% need to figure

00:10:31.610 --> 00:10:34.110
out? What needs to change in their strategy or

00:10:34.110 --> 00:10:36.350
maybe the specific way they're applying AI so

00:10:36.350 --> 00:10:38.789
they don't get left behind? That's the challenge

00:10:38.789 --> 00:10:40.570
of applying this knowledge in real time, isn't

00:10:40.570 --> 00:10:42.669
it? Absolutely. Well, thanks again for sharing

00:10:42.669 --> 00:10:44.570
the sources and letting us dive deep into all

00:10:44.570 --> 00:10:46.269
this with you today. Definitely check out those

00:10:46.269 --> 00:10:48.009
Stanford lectures if you're interested and maybe

00:10:48.009 --> 00:10:50.490
take a look at that generative media report data.

00:10:50.669 --> 00:10:53.169
Lots to think about.
