WEBVTT

00:00:00.000 --> 00:00:03.759
Imagine a billion dollars, not for the latest

00:00:03.759 --> 00:00:07.160
shiny AI chatbot or some big social network,

00:00:07.320 --> 00:00:11.060
but for, well, for organizing data. What's that

00:00:11.060 --> 00:00:12.900
about? Sounds kind of wild, right? Yeah. It's

00:00:12.900 --> 00:00:15.759
absolutely crucial. It's really the silent force

00:00:15.759 --> 00:00:20.359
behind how AI truly learns and gets better. Welcome

00:00:20.359 --> 00:00:22.899
to the Deep Dive. Today, we're going to follow

00:00:22.899 --> 00:00:25.789
the money, take a journey into the world of

00:00:25.789 --> 00:00:28.410
artificial intelligence. Just looking at July

00:00:28.410 --> 00:00:32.229
2025 alone, billions, literally billions of dollars

00:00:32.229 --> 00:00:35.189
flowed into companies that are shaping AI's future.

00:00:35.469 --> 00:00:36.689
Yeah. And we're going to unpack where all that

00:00:36.689 --> 00:00:38.990
financial energy is really going inside the AI

00:00:38.990 --> 00:00:41.469
ecosystem. Think of it like mapping out the nervous

00:00:41.469 --> 00:00:43.490
system of this whole thing as it grows so fast.

00:00:43.609 --> 00:00:45.909
We'll look at everything from the textbooks AI

00:00:45.909 --> 00:00:48.509
learns from, to its muscles, the actual computing

00:00:48.509 --> 00:00:52.009
power, the agents that do tasks for us, and even

00:00:52.009 --> 00:00:54.479
the stuff you don't see, the infrastructure making

00:00:54.479 --> 00:00:56.960
it all work. Yeah. Our mission for you, the listener,

00:00:57.100 --> 00:01:00.060
is to get past just the huge numbers. We want

00:01:00.060 --> 00:01:02.939
to understand the big ideas behind them. What's

00:01:02.939 --> 00:01:05.420
this flood of cash telling us about where AI

00:01:05.420 --> 00:01:08.859
is actually heading and what might it mean for

00:01:08.859 --> 00:01:11.540
us? Let's start exploring. Okay, so our first

00:01:11.540 --> 00:01:14.500
stop is right at the core of AI. When we think

00:01:14.500 --> 00:01:16.400
about artificial intelligence, yeah, our minds

00:01:16.400 --> 00:01:18.719
often jump straight to those huge language models.

00:01:19.019 --> 00:01:23.420
But what about the raw materials, the fundamental

00:01:23.420 --> 00:01:25.640
building blocks that actually make them smart?

00:01:25.980 --> 00:01:28.140
Exactly. We're seeing these massive investments

00:01:28.140 --> 00:01:30.579
in what we could call AI's textbooks, you know,

00:01:30.579 --> 00:01:32.519
the data and its muscles, the specialized computer

00:01:32.519 --> 00:01:35.359
power. It's like funding the library and

00:01:35.359 --> 00:01:37.640
the gym for this new intelligence. And here's

00:01:37.640 --> 00:01:39.140
where it gets really interesting, maybe even

00:01:39.140 --> 00:01:42.719
a bit counterintuitive. Surge AI landed a staggering

00:01:42.719 --> 00:01:45.379
billion dollars, a billion for data labeling.

00:01:45.480 --> 00:01:47.260
That feels like a huge amount for something that

00:01:47.260 --> 00:01:51.239
sounds so meticulous. Yeah. What exactly is data

00:01:51.239 --> 00:01:53.359
labeling and why is it worth so much right now?

00:01:53.640 --> 00:01:55.879
Right. So data labeling is basically this super

00:01:55.879 --> 00:01:58.500
careful process of organizing and tagging raw

00:01:58.500 --> 00:02:02.459
data. Like imagine feeding an AI thousands of

00:02:02.459 --> 00:02:04.879
pictures. And for every single one, a person

00:02:04.879 --> 00:02:07.500
marks it: "This is a cat," or "That's a stop sign."

00:02:07.959 --> 00:02:10.379
You're giving the AI organized examples so it

00:02:10.379 --> 00:02:13.060
can learn patterns. OK. It's not just about having

00:02:13.060 --> 00:02:15.560
lots of data. It's about having really good,

00:02:15.659 --> 00:02:18.659
high-quality data. This human-curated stuff is

00:02:18.659 --> 00:02:21.370
like the bedrock for AI models to get nuance,

00:02:21.669 --> 00:02:24.789
avoid bias, and actually work reliably out in

00:02:24.789 --> 00:02:27.270
the real world. A billion dollar bet here kind

00:02:27.270 --> 00:02:28.949
of tells you that getting that data quality right

00:02:28.949 --> 00:02:31.530
with humans involved is still a major bottleneck.

00:02:31.530 --> 00:02:33.129
It's a really high value piece of the puzzle,

00:02:33.210 --> 00:02:36.090
even as AI gets smarter. So it's the quiet, really

00:02:36.090 --> 00:02:38.490
precise work that truly feeds and refines an

00:02:38.490 --> 00:02:42.349
AI's intelligence. Fascinating. Then, switching

00:02:42.349 --> 00:02:45.370
gears a bit, you have Groq. They snagged $600 million

00:02:45.370 --> 00:02:48.409
for AI computer chips. But these aren't your

00:02:48.409 --> 00:02:49.849
standard chips. They're what we call inference

00:02:49.849 --> 00:02:52.090
chips. Inference chips. Okay. And for our listener,

00:02:52.150 --> 00:02:54.310
what does inference mean in this context? What

00:02:54.310 --> 00:02:57.240
are these chips doing? So inference is when an

00:02:57.240 --> 00:03:00.300
AI model, one that's already been trained, uses

00:03:00.300 --> 00:03:02.439
what it learned to actually make a prediction

00:03:02.439 --> 00:03:04.300
or a decision. And it needs to do that really

00:03:04.300 --> 00:03:06.560
fast. Inference chips are basically specialized

00:03:06.560 --> 00:03:09.280
computer hardware built just for running those

00:03:09.280 --> 00:03:11.219
AI programs quickly. It's different from the

00:03:11.219 --> 00:03:13.780
huge computer clusters used for training the

00:03:13.780 --> 00:03:16.460
AI, which is like the teaching phase. Inference

00:03:16.460 --> 00:03:19.319
chips are for the doing phase, running what the

00:03:19.319 --> 00:03:21.699
AI knows super fast and often using way less

00:03:21.699 --> 00:03:24.520
energy. Groq getting that much money shows there's

00:03:24.520 --> 00:03:26.780
huge demand for putting AI to work efficiently

00:03:26.780 --> 00:03:29.740
in real time, everywhere from your phone to massive

00:03:29.740 --> 00:03:32.520
data centers. It's all about speed and sustainability,

00:03:32.620 --> 00:03:35.219
too. These seem like quite different things,

00:03:35.379 --> 00:03:38.139
carefully labeling data versus building super

00:03:38.139 --> 00:03:40.300
fast chips. What's the common thread? What connects

00:03:40.300 --> 00:03:42.860
these foundational investments? They're both

00:03:42.860 --> 00:03:45.419
core infrastructure. Really, they represent the

00:03:45.419 --> 00:03:48.400
foundational layers that let AI grow and function

00:03:48.400 --> 00:03:51.060
effectively. That makes a lot of sense. OK, moving

00:03:51.060 --> 00:03:54.479
beyond that foundation, we're seeing AI shift

00:03:54.479 --> 00:03:57.159
from just being a tool to becoming more of a

00:03:57.159 --> 00:04:00.500
true agent, something capable of doing complex

00:04:00.500 --> 00:04:03.800
tasks on its own, sometimes with a surprising

00:04:03.800 --> 00:04:05.699
amount of autonomy. Yeah, totally. It's like

00:04:05.699 --> 00:04:08.590
upgrading from, you know, a basic calculator to

00:04:08.590 --> 00:04:10.849
having a digital assistant that doesn't just

00:04:10.849 --> 00:04:12.870
give suggestions, but actually does stuff for

00:04:12.870 --> 00:04:15.969
you. It anticipates things. So Kuvi.ai got $700

00:04:15.969 --> 00:04:18.110
million for something called agentic finance.

00:04:18.490 --> 00:04:20.329
What does that actually look like? How would

00:04:20.329 --> 00:04:23.709
that work for, say, a person or a business? Well,

00:04:23.730 --> 00:04:26.009
think of it less like your current financial

00:04:26.009 --> 00:04:28.610
advisor app that gives you tips and more like

00:04:28.610 --> 00:04:30.889
an AI financial manager. It could potentially

00:04:30.889 --> 00:04:33.490
execute trades or rebalance your portfolio or

00:04:33.490 --> 00:04:35.490
manage company expenses sort of automatically

00:04:35.490 --> 00:04:37.649
based on patterns it learns and goals you set.

00:04:37.769 --> 00:04:40.470
Agentic finance basically means AI making smart

00:04:40.470 --> 00:04:42.790
financial decisions independently. It signals

00:04:42.790 --> 00:04:45.300
a shift. You know, from AI just assisting us

00:04:45.300 --> 00:04:47.540
passively to becoming an active participant,

00:04:47.699 --> 00:04:49.779
especially in finance. That sounds incredibly

00:04:49.779 --> 00:04:52.660
powerful, definitely useful. But I can also imagine

00:04:52.660 --> 00:04:55.660
some businesses maybe feeling a bit nervous about

00:04:55.660 --> 00:04:59.439
privacy or maybe losing control if an AI is managing

00:04:59.439 --> 00:05:02.240
spending all by itself. How do companies like

00:05:02.240 --> 00:05:05.879
Ramp, which uses AI for spend management, how

00:05:05.879 --> 00:05:08.019
do they handle those worries? Yeah, that's a

00:05:08.019 --> 00:05:10.540
really important point. Companies like Ramp usually

00:05:10.540 --> 00:05:12.540
tackle this by focusing hard on transparency,

00:05:12.910 --> 00:05:15.709
and giving users lots of control. The AI isn't

00:05:15.709 --> 00:05:17.870
meant to be some rogue agent. It's more like

00:05:17.870 --> 00:05:20.250
an intelligent layer on top of existing processes.

00:05:20.589 --> 00:05:24.110
It spots patterns, flags things that look weird

00:05:24.110 --> 00:05:26.769
in spending, maybe auto-approves small things

00:05:26.769 --> 00:05:29.269
within rules you set up. But the human oversight

00:05:29.269 --> 00:05:31.430
is still there, crucially. The AI just takes

00:05:31.430 --> 00:05:33.350
away a ton of the manual grunt work, letting

00:05:33.350 --> 00:05:35.250
people focus on the bigger picture, the strategy,

00:05:35.410 --> 00:05:37.649
not just typing numbers. Right. That makes sense.

00:05:37.850 --> 00:05:39.790
And then in healthcare, you mentioned Ambience

00:05:39.790 --> 00:05:42.129
Healthcare, using AI to automatically write

00:05:42.120 --> 00:05:44.579
medical notes from doctor-patient talks. That

00:05:44.579 --> 00:05:46.680
sounds like it could be a massive help for doctors

00:05:46.680 --> 00:05:49.420
drowning in paperwork. Absolutely. It saves doctors

00:05:49.420 --> 00:05:51.420
so much time, lets them actually focus more on

00:05:51.420 --> 00:05:53.259
the patient in front of them instead of constantly

00:05:53.259 --> 00:05:55.420
thinking about the documentation. It's a really

00:05:55.420 --> 00:05:57.319
tangible quality of life improvement for them.

00:05:57.579 --> 00:06:00.639
And Motive, applying AI in tracking vehicles,

00:06:00.920 --> 00:06:04.019
managing driver safety, that seems critical for

00:06:04.019 --> 00:06:06.360
logistics. Definitely. And then there's n8n,

00:06:06.990 --> 00:06:08.829
which helps different software apps talk to each

00:06:08.829 --> 00:06:10.689
other and work together automatically. Think

00:06:10.689 --> 00:06:13.689
of it like connecting Lego blocks of data, but

00:06:13.689 --> 00:06:16.410
intelligently automating these really complex

00:06:16.410 --> 00:06:18.629
workflows between systems that normally don't

00:06:18.629 --> 00:06:20.990
cooperate. These examples really show AI moving

00:06:20.990 --> 00:06:24.290
into more proactive roles. But beyond just automating

00:06:24.290 --> 00:06:27.290
tasks, how do they fundamentally show AI taking

00:06:27.290 --> 00:06:30.819
initiative? They highlight AI managing complex,

00:06:30.860 --> 00:06:33.279
you know, multi -step tasks independently across

00:06:33.279 --> 00:06:35.180
different industries. Okay, let's shift focus

00:06:35.180 --> 00:06:37.980
again. How is AI directly touching our lives

00:06:37.980 --> 00:06:40.560
in more visible, maybe more personal ways? Thinking

00:06:40.560 --> 00:06:43.220
about things like creativity or even critical

00:06:43.220 --> 00:06:45.569
public services. Yeah, these are the applications

00:06:45.569 --> 00:06:47.649
where you really start to feel the immediate,

00:06:47.790 --> 00:06:50.589
tangible effects of this technology, sometimes

00:06:50.589 --> 00:06:53.509
in pretty profound ways. For example, Fal

00:06:53.509 --> 00:06:56.569
is working deep in generative AI for media. They

00:06:56.569 --> 00:06:58.610
use something called diffusion models. Diffusion

00:06:58.610 --> 00:07:00.129
models? Yeah, diffusion models are basically

00:07:00.129 --> 00:07:03.269
AI systems that can create totally new media,

00:07:03.329 --> 00:07:06.670
like images, video, even music, just from a simple

00:07:06.670 --> 00:07:08.879
text description you give them. It's radically

00:07:08.879 --> 00:07:11.160
changing creative fields, artists, designers.

00:07:11.379 --> 00:07:14.399
They can quickly try out ideas or generate completely

00:07:14.399 --> 00:07:17.019
new content at a scale that just wasn't possible

00:07:17.019 --> 00:07:19.920
before. And Carbyne, that's fascinating. Using

00:07:19.920 --> 00:07:23.300
AI for emergency response, helping 911 dispatchers

00:07:23.300 --> 00:07:25.759
find callers faster, maybe sharing live video

00:07:25.759 --> 00:07:27.839
from a scene. That sounds like it could literally

00:07:27.839 --> 00:07:30.560
save lives. It really underscores the potential

00:07:30.560 --> 00:07:32.660
scale and impact, doesn't it? I mean, just thinking

00:07:32.660 --> 00:07:35.579
about AI speeding up emergency response by even

00:07:35.579 --> 00:07:38.399
a few seconds, but across millions and millions

00:07:38.399 --> 00:07:40.699
of calls, you're talking about fundamentally

00:07:40.699 --> 00:07:44.540
saving lives. Whoa. Imagine scaling that to a

00:07:44.540 --> 00:07:47.180
billion queries or just the sheer number of lives

00:07:47.180 --> 00:07:50.079
potentially saved through faster response. That's

00:07:50.079 --> 00:07:52.819
kind of a moment of true wonder, right? A really

00:07:52.819 --> 00:07:55.259
profound example of AI's potential positive impact.

00:07:55.579 --> 00:07:58.439
Wow. Yeah. And then Slingshot AI is looking at

00:07:59.000 --> 00:08:01.560
using AI for mental health coaching, like a personal

00:08:01.560 --> 00:08:04.000
AI guide for well-being, accessible almost anywhere.

00:08:04.019 --> 00:08:06.000
That's another very personal application. For

00:08:06.000 --> 00:08:08.899
sure. And then Ultromics uses AI to analyze heart

00:08:08.899 --> 00:08:11.379
scans, looking for early signs of heart disease.

00:08:11.639 --> 00:08:14.120
That's incredibly vital. Catching things earlier

00:08:14.120 --> 00:08:16.120
means better outcomes for patients. These are

00:08:16.120 --> 00:08:19.339
intensely personal uses. How does AI really impact

00:08:19.339 --> 00:08:21.199
our daily lives through these kinds of innovations

00:08:21.199 --> 00:08:23.560
and creative work and critical services? From

00:08:23.560 --> 00:08:25.420
how we create art to how we manage our health,

00:08:25.540 --> 00:08:28.279
AI is deeply transforming personal experiences.

00:08:28.660 --> 00:08:31.420
Okay, finally, let's explore the areas that maybe

00:08:31.420 --> 00:08:34.840
aren't as flashy, but are just as crucial. The

00:08:34.840 --> 00:08:37.379
funding pouring into the behind the scenes infrastructure

00:08:37.379 --> 00:08:40.919
and the vital security that holds everything

00:08:40.919 --> 00:08:43.360
together. Yeah. This is the essential scaffolding,

00:08:43.379 --> 00:08:45.980
you know, the unseen guardians for the whole

00:08:45.980 --> 00:08:48.980
AI world. Without this stuff working reliably

00:08:48.980 --> 00:08:51.500
and securely, none of the cool applications would

00:08:51.500 --> 00:08:54.620
function at scale. Right. So Observe helps companies

00:08:54.620 --> 00:08:57.139
manage and actually understand the massive amounts

00:08:57.139 --> 00:08:59.769
of data they have stored in the cloud, making

00:08:59.769 --> 00:09:04.470
sense of all that digital noise. And 1KOMMA5°,

00:09:04.470 --> 00:09:06.389
that's a clean energy company using technology,

00:09:06.649 --> 00:09:09.730
often involving AI, to make homes smarter, more

00:09:09.730 --> 00:09:11.850
energy efficient. Kind of a quiet revolution,

00:09:11.889 --> 00:09:15.029
but really powerful for sustainability. Anaconda

00:09:15.029 --> 00:09:17.710
provides open source tools for building AI applications.

00:09:17.909 --> 00:09:19.870
That sounds like it makes complex AI development

00:09:19.870 --> 00:09:22.009
more accessible for everyone, right? Not just

00:09:22.009 --> 00:09:24.889
big companies. Exactly. It democratizes it a

00:09:24.889 --> 00:09:28.960
bit. Inheritives provides security for older software,

00:09:28.960 --> 00:09:31.120
legacy systems that maybe aren't supported anymore

00:09:31.120 --> 00:09:33.559
but are still critical. Keeps companies safe

00:09:33.559 --> 00:09:36.159
from hackers. It's like a digital bodyguard for

00:09:36.159 --> 00:09:38.519
aging tech infrastructure. And Oxide Computer

00:09:38.519 --> 00:09:40.500
Company, interesting, they actually build the

00:09:40.500 --> 00:09:43.340
servers for cloud computing from the ground up,

00:09:43.379 --> 00:09:45.600
not just assemble parts, designing the hardware

00:09:45.600 --> 00:09:48.039
itself. Then there's Noma Security, building

00:09:48.039 --> 00:09:50.299
what they call an autonomous security platform.

00:09:50.559 --> 00:09:53.279
And autonomous here means it works by itself.

00:09:53.360 --> 00:09:55.240
Pretty much, yeah. Autonomous security means

00:09:55.240 --> 00:09:57.179
AI protecting computer systems automatically,

00:09:57.519 --> 00:10:00.399
without needing constant human oversight. It

00:10:00.399 --> 00:10:02.960
learns what normal looks like for a system, spots

00:10:02.960 --> 00:10:05.580
weird stuff, anomalies, and can actually shut

00:10:05.580 --> 00:10:08.220
down threats before they cause damage. All on

00:10:08.220 --> 00:10:11.299
its own. That seems like a massive change

00:10:11.299 --> 00:10:13.960
in how we approach cybersecurity, moving from

00:10:13.960 --> 00:10:16.200
just reacting to threats to actually predicting

00:10:16.200 --> 00:10:19.519
and stopping them proactively. Huge shift. Absolutely.

00:10:19.799 --> 00:10:23.399
And then SAFE gives businesses sort of a cybersecurity

00:10:23.399 --> 00:10:26.580
credit score, helps them understand their specific

00:10:26.580 --> 00:10:30.269
risks in a clear, quantifiable way, looking at

00:10:30.269 --> 00:10:32.490
their vulnerabilities. I have to admit, I still

00:10:32.490 --> 00:10:35.250
wrestle with prompt drift myself sometimes, trying

00:10:35.250 --> 00:10:37.169
to get an AI to consistently do what I want.

00:10:37.250 --> 00:10:39.009
So understanding vulnerabilities and managing

00:10:39.009 --> 00:10:42.269
risk proactively is just key in this complex

00:10:42.269 --> 00:10:45.610
world. And Gupshup focuses on conversational AI,

00:10:45.850 --> 00:10:48.070
powering those chatbots we're all interacting

00:10:48.070 --> 00:10:50.009
with more and more for customer service and things

00:10:50.009 --> 00:10:52.409
like that. Right. And Ashby helps companies hire

00:10:52.409 --> 00:10:54.389
better with an all-in-one recruiting platform,

00:10:54.649 --> 00:10:57.830
often using AI, you know, to sift through applications

00:10:57.830 --> 00:11:00.490
and find the best potential candidates much faster.

00:11:00.649 --> 00:11:02.750
So we've got data management, clean energy tech,

00:11:03.240 --> 00:11:05.860
open source tools, security for old and new systems,

00:11:06.000 --> 00:11:08.480
even recruiting. What's the common thread tying

00:11:08.480 --> 00:11:10.379
these diverse companies together in this sort

00:11:10.379 --> 00:11:13.440
of unseen pillar category? They really represent

00:11:13.440 --> 00:11:16.600
the essential, often invisible systems that enable

00:11:16.600 --> 00:11:19.379
both AI itself and just modern business operations

00:11:19.379 --> 00:11:22.299
in general. So pulling this all together,

00:11:22.580 --> 00:11:26.840
what does it all mean? This deep dive into where

00:11:26.840 --> 00:11:29.299
the AI money's been flowing recently. It paints

00:11:29.299 --> 00:11:31.320
a pretty clear picture, doesn't it? The cash

00:11:31.320 --> 00:11:33.960
isn't just going into the brains, those big language

00:11:33.960 --> 00:11:35.799
models everyone talks about. Right, exactly.

00:11:35.840 --> 00:11:38.620
It's flowing into the entire ecosystem, like

00:11:38.620 --> 00:11:42.379
every single layer. From the textbooks that meticulously

00:11:42.379 --> 00:11:46.600
label data, teaching the AI, to the muscles, those

00:11:46.600 --> 00:11:50.450
super fast, efficient computer chips, to the agents

00:11:50.450 --> 00:11:52.669
that are starting to perform complex tasks for

00:11:52.669 --> 00:11:54.909
us kind of on their own. We've seen these huge

00:11:54.909 --> 00:11:57.370
investments covering everything from the absolute

00:11:57.370 --> 00:12:00.789
bedrock foundations of AI, like preparing data

00:12:00.789 --> 00:12:03.169
and building specialized hardware, all the way

00:12:03.169 --> 00:12:04.750
through to these really advanced applications

00:12:04.750 --> 00:12:08.110
that directly touch our daily health, our safety,

00:12:08.230 --> 00:12:10.210
even our creativity. It's a truly distributed

00:12:10.210 --> 00:12:12.730
effort. It feels like a systemic re-architecture

00:12:12.730 --> 00:12:14.490
happening across pretty much every industry.

00:12:14.730 --> 00:12:17.009
This isn't just some tech bubble, you know. It

00:12:17.009 --> 00:12:18.950
feels more like a fundamental rethinking of how

00:12:18.950 --> 00:12:20.990
businesses run and how we interact with the world.

00:12:21.049 --> 00:12:23.429
It's pretty profound when you look at it that

00:12:23.429 --> 00:12:25.919
way. This explosion of funding seems to tell

00:12:25.919 --> 00:12:29.080
us that AI isn't just a passing trend. It feels

00:12:29.080 --> 00:12:31.700
like a profound, undeniable shift that's already

00:12:31.700 --> 00:12:34.779
well underway. It truly is. We're basically watching

00:12:34.779 --> 00:12:38.019
the building blocks of an AI first future get

00:12:38.019 --> 00:12:41.179
put into place right now, sometimes quietly,

00:12:41.259 --> 00:12:43.580
but very, very rapidly. So for you, the listener,

00:12:43.679 --> 00:12:45.320
maybe here's a provocative thought to leave you

00:12:45.320 --> 00:12:48.240
with. As this huge amount of capital keeps pouring

00:12:48.240 --> 00:12:51.899
in and reshaping the landscape, how will this

00:12:51.899 --> 00:12:54.759
AI-driven future really change the fabric of

00:12:54.759 --> 00:12:57.159
our society over the next, say, five, ten years?

00:12:57.340 --> 00:13:00.179
What completely new applications, things we maybe

00:13:00.179 --> 00:13:02.759
can't even properly imagine yet, might emerge

00:13:02.759 --> 00:13:04.840
from these investments and fundamentally alter

00:13:04.840 --> 00:13:07.399
our daily lives? Keep your eyes open, because

00:13:07.399 --> 00:13:09.179
the implications are definitely far-reaching.

00:13:09.220 --> 00:13:11.340
And honestly, the journey feels like it's just

00:13:11.340 --> 00:13:13.440
getting started. Thank you for joining us on

00:13:13.440 --> 00:13:16.100
this deep dive into the latest AI funding landscape.

00:13:16.259 --> 00:13:18.039
We hope you feel a bit more informed and maybe

00:13:18.039 --> 00:13:20.720
just a little more curious about the world that's

00:13:20.720 --> 00:13:22.940
being built all around us. [Outro music]
