WEBVTT

00:00:00.000 --> 00:00:03.520
Imagine you're driving home from work. It's,

00:00:03.520 --> 00:00:06.740
um, it's dusk. Right. The sun is just starting

00:00:06.740 --> 00:00:09.400
to dip below the horizon. Exactly. Casting those

00:00:09.400 --> 00:00:12.240
really long shadows across the pavement. Now

00:00:12.240 --> 00:00:15.439
for you, as a human driver, this is a pretty

00:00:15.439 --> 00:00:17.640
standard, relatively safe time to be on the road.

00:00:17.800 --> 00:00:19.780
Yeah. You just flip your headlights on, adjust

00:00:19.780 --> 00:00:21.859
your visor, and you don't even think twice about

00:00:21.859 --> 00:00:24.239
it. You really don't. Yeah. But if you are a

00:00:24.239 --> 00:00:27.530
self-driving car, this exact time of day is

00:00:27.530 --> 00:00:30.149
actually the deadliest. Which is wild to think

00:00:30.149 --> 00:00:32.770
about. It is. Statistically, autonomous vehicles

00:00:32.770 --> 00:00:35.229
are like more than five times more likely to

00:00:35.229 --> 00:00:37.229
crash when the sun goes down compared to human

00:00:37.229 --> 00:00:40.170
drivers. It completely upends our intuitive understanding

00:00:40.170 --> 00:00:42.710
of safety, honestly. I mean, we tend to assume

00:00:42.710 --> 00:00:46.310
a machine equipped with infrared sensors, radar,

00:00:46.509 --> 00:00:49.229
and 360-degree cameras would practically own

00:00:49.229 --> 00:00:51.369
the night. Right. We assume the tech is infallible.

00:00:51.590 --> 00:00:53.950
Exactly. But the data tells a drastically different,

00:00:54.090 --> 00:00:57.130
much more complicated story. Welcome to today's

00:00:57.130 --> 00:01:01.289
Deep Dive. Today is Thursday, March 26, 2026,

00:01:01.289 --> 00:01:04.870
and we are staring down a massive, comprehensive

00:01:04.870 --> 00:01:08.769
Wikipedia repository that details the exact current

00:01:08.769 --> 00:01:11.530
state of self -driving cars. There is a lot of

00:01:11.530 --> 00:01:14.469
material here. So much. And our mission for you

00:01:14.469 --> 00:01:17.030
today is to cut through the sci-fi marketing

00:01:17.030 --> 00:01:20.010
hype, bypass those terrifying news headlines,

00:01:20.189 --> 00:01:23.310
and extract the absolute reality of autonomous

00:01:23.310 --> 00:01:25.340
vehicles right now. Yeah, we're going to get

00:01:25.340 --> 00:01:27.980
into how they actually work, why they still crash.

00:01:28.239 --> 00:01:31.280
The incredibly messy ethical dilemmas they create,

00:01:31.540 --> 00:01:34.120
and what this all means for you and your morning

00:01:34.120 --> 00:01:36.840
commute. It's a lot to navigate. I mean, we're

00:01:36.840 --> 00:01:39.939
looking at a web of rapidly evolving tech, deeply

00:01:39.939 --> 00:01:42.500
complex legal frameworks, and shifting social

00:01:42.500 --> 00:01:45.180
data. Right. But understanding the actual mechanics

00:01:45.180 --> 00:01:48.099
behind the headlines is really the only way to

00:01:48.099 --> 00:01:50.239
grasp where this is all heading. Which is exactly

00:01:50.239 --> 00:01:52.319
why you are the perfect guide to help us synthesize

00:01:52.319 --> 00:01:54.519
all of this today. Okay, let's unpack this. Let's

00:01:54.519 --> 00:01:56.280
do it. Before we can even judge if self-driving

00:01:56.280 --> 00:01:58.680
cars are successful, we have to define what self-

00:01:58.680 --> 00:02:01.680
driving actually means. And surprisingly, reading

00:02:01.680 --> 00:02:03.579
through these sources, the industry doesn't even

00:02:03.579 --> 00:02:05.900
have a universally agreed-upon standard right

00:02:05.900 --> 00:02:08.000
now. Yeah, that is the core of the problem right

00:02:08.000 --> 00:02:10.639
there. For years, the technical community relied

00:02:10.639 --> 00:02:14.360
on the SAE, the Society of Automotive Engineers.

00:02:14.860 --> 00:02:16.599
Right. They created that scale of automation,

00:02:16.840 --> 00:02:18.919
right? Level zero to level five. Exactly. But

00:02:18.919 --> 00:02:21.719
for anyone who follows technology, we know this

00:02:21.719 --> 00:02:24.500
scale has basically become this convoluted marketing

00:02:24.500 --> 00:02:26.919
tool. Oh, totally. The critical part that most

00:02:26.919 --> 00:02:29.500
consumers miss is that these levels aren't just

00:02:29.500 --> 00:02:32.560
some checklist of cool features. They are strictly

00:02:32.560 --> 00:02:35.560
based on the division of legal responsibility

00:02:35.560 --> 00:02:38.169
between the human and the machine. Right, so

00:02:38.169 --> 00:02:39.710
let's breeze through those real quick because

00:02:39.710 --> 00:02:42.270
they are everywhere in the marketing. Level zero

00:02:42.270 --> 00:02:46.229
through two. You are the driver. You are legally

00:02:46.229 --> 00:02:49.110
responsible, period. Even if the car is actively

00:02:49.110 --> 00:02:50.770
steering and braking to keep you in the lane,

00:02:51.050 --> 00:02:53.449
your eyes must be on the road. Exactly. And level

00:02:53.449 --> 00:02:55.949
three is where the car genuinely drives, but

00:02:55.949 --> 00:02:57.770
you have to be ready to intervene. Right, like

00:02:57.770 --> 00:02:59.969
if the car gets confused, it pings you. Yeah,

00:03:00.289 --> 00:03:02.889
and level four means the system drives and can

00:03:02.889 --> 00:03:05.330
safely pull itself over if you ignore that alarm.

00:03:05.710 --> 00:03:09.150
And finally, level five, which is just science

00:03:09.150 --> 00:03:11.150
fiction right now. That's a car that can drive

00:03:11.150 --> 00:03:14.449
anywhere, anytime, in a blizzard or a hurricane,

00:03:14.750 --> 00:03:17.189
without a steering wheel at all. But the confusion

00:03:17.189 --> 00:03:19.969
comes from how these levels interact with this

00:03:19.969 --> 00:03:22.789
concept called ODD. Operational Design Domain.

00:03:22.849 --> 00:03:25.770
Right. The ODD dictates the exact environmental

00:03:25.770 --> 00:03:28.590
boundaries where a car can operate safely. Yeah.

00:03:28.789 --> 00:03:31.349
Like, is it raining? How fast are we going? Are

00:03:31.349 --> 00:03:34.669
we on a pre-mapped highway or some random...

00:03:34.250 --> 00:03:37.150
dirt road. It's all about context. There's an

00:03:37.150 --> 00:03:39.189
analogy in the source material that perfectly

00:03:39.189 --> 00:03:42.129
nails this mechanism. Think of it like a human

00:03:42.129 --> 00:03:45.210
being asked to stand unassisted on one leg. Okay.

00:03:45.370 --> 00:03:48.090
That action is the dynamic driving requirement.

00:03:48.669 --> 00:03:50.830
Now, if you do that on solid ground, your designated

00:03:50.830 --> 00:03:53.030
operational design domain, you're totally fine.

00:03:53.069 --> 00:03:54.830
You can balance all day. Sure. But if you try

00:03:54.830 --> 00:03:56.770
to stand on one leg on a tightrope suspended

00:03:56.770 --> 00:03:59.129
in the air, suddenly your environment has changed.

00:03:59.330 --> 00:04:01.469
Your capability hasn't changed, right? No. But

00:04:01.469 --> 00:04:04.500
the ODD has, and you need support to keep from

00:04:04.500 --> 00:04:07.229
falling. What's fascinating here is how car makers

00:04:07.229 --> 00:04:10.710
exploit this exact confusion between a vehicle's

00:04:10.710 --> 00:04:13.409
capability and its operational domain. How do

00:04:13.409 --> 00:04:15.610
you mean? Well, a company might loudly advertise

00:04:15.610 --> 00:04:17.930
that they have a level four car. But what they

00:04:17.930 --> 00:04:20.170
actually mean is that the car has a level four

00:04:20.170 --> 00:04:23.209
feature, like, say, automated valet parking.

00:04:23.290 --> 00:04:26.870
Oh, right. But that feature only works inside

00:04:26.870 --> 00:04:30.910
a very specific geofenced, pre-mapped concrete

00:04:30.910 --> 00:04:34.069
garage. The moment you take that exact same car

00:04:34.069 --> 00:04:36.360
out of the garage, and onto the highway, the

00:04:36.360 --> 00:04:39.480
system downgrades to level two. Wow. Yeah, you

00:04:39.480 --> 00:04:41.480
have to keep your hands on the wheel. Which is

00:04:41.480 --> 00:04:44.019
incredibly deceptive. I mean, if I buy a smartphone,

00:04:44.379 --> 00:04:46.560
it's a smartphone, whether I'm in my living room

00:04:46.560 --> 00:04:49.379
or the grocery store, it doesn't downgrade into

00:04:49.379 --> 00:04:51.399
a rotary phone when I cross the street. No, it

00:04:51.399 --> 00:04:53.199
doesn't. It's no wonder consumers are confused,

00:04:53.199 --> 00:04:56.639
which is why the CEO of Mobileye proposed a totally

00:04:56.639 --> 00:04:58.279
different way to talk about this in the sources,

00:04:58.480 --> 00:05:01.000
moving away from those rigid SAE numbers. Yes,

00:05:01.279 --> 00:05:03.560
Mobileye suggested terms that actually describe

00:05:03.560 --> 00:05:05.779
human behavior, which is so much more intuitive.

00:05:06.040 --> 00:05:08.000
Like what? They use categories like eyes on,

00:05:08.000 --> 00:05:10.980
hands on, or eyes on, hands off. Then, eyes off,

00:05:11.040 --> 00:05:15.980
hands off, and finally, no driver. That clarifies

00:05:15.980 --> 00:05:18.620
the legal liability immediately. Like if you

00:05:18.620 --> 00:05:20.839
are in an eyes-on system, it doesn't matter

00:05:20.839 --> 00:05:23.300
what the steering wheel is doing. You are the

00:05:23.300 --> 00:05:25.540
driver. If you hit something, it is your fault.
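
NOTE
Editor's sketch (Python): the taxonomy above, in code. The level-to-category
mapping and the ODD gate mirror the conversation's valet-parking example (a
"Level 4" feature that downgrades to Level 2 outside its geofenced garage);
the category labels, function names, and thresholds are illustrative
assumptions, not the SAE's or any manufacturer's actual definitions.
# Illustrative mapping of SAE levels to Mobileye-style categories and to the
# party holding legal responsibility while the system is engaged.
SAE_LEVELS = {
    0: ("eyes on / hands on", "human"),
    1: ("eyes on / hands on", "human"),
    2: ("eyes on / hands on or hands off", "human"),
    3: ("eyes off / hands off, must respond to takeover requests", "shared"),
    4: ("eyes off / hands off inside the ODD", "system"),
    5: ("no driver", "system"),
}
# Hypothetical ODD gate: the same car reports Level 4 inside its pre-mapped
# garage and silently drops to Level 2 everywhere else.
def effective_level(feature_level, inside_geofenced_garage):
    if feature_level >= 4 and not inside_geofenced_garage:
        return 2  # outside the ODD: hands back on the wheel
    return feature_level
for in_garage in (True, False):
    level = effective_level(4, in_garage)
    category, responsible = SAE_LEVELS[level]
    print(f"in garage={in_garage}: Level {level} ({category}), {responsible} responsible")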

00:05:26.139 --> 00:05:28.639
Exactly. It removes the ambiguity. So now that

00:05:28.639 --> 00:05:30.339
we know the playing field and the terminology,

00:05:30.439 --> 00:05:32.920
how are these companies actually trying to build

00:05:32.920 --> 00:05:35.220
these machines? How do these cars practically,

00:05:35.220 --> 00:05:38.459
you know, see the road? Oh, this is a huge debate.

00:05:38.660 --> 00:05:40.800
Because going through the research, there's a

00:05:40.800 --> 00:05:43.680
massive philosophical civil war happening in

00:05:43.680 --> 00:05:45.889
the tech world regarding the sensors. There

00:05:45.889 --> 00:05:48.790
really is. It essentially boils down to two main

00:05:48.790 --> 00:05:51.209
sensory approaches. On one side, you have companies

00:05:51.209 --> 00:05:54.410
like Waymo. They use an incredibly hardware-heavy

00:05:54.410 --> 00:05:56.009
approach. Right, the spinning things on the roof.

00:05:56.110 --> 00:05:58.649
Yeah, they rely on LiDAR, which stands for light

00:05:58.649 --> 00:06:01.449
detection and ranging. LiDAR basically bounces

00:06:01.449 --> 00:06:04.149
millions of laser pulses off surrounding objects

00:06:04.149 --> 00:06:07.290
every single second to create this hyper-accurate

00:06:07.290 --> 00:06:10.149
3D point cloud of the environment. Sounds intense.
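
NOTE
Editor's sketch (Python): the ranging math behind that point cloud. Each
return's time of flight gives a distance (the light travels out and back),
and the beam's known azimuth and elevation turn that distance into a 3D
point. A toy calculation under those assumptions; real LiDAR pipelines add
motion compensation, intensity, and noise handling.
import math
C = 299_792_458.0  # speed of light in m/s
def lidar_return_to_point(time_of_flight_s, azimuth_rad, elevation_rad):
    # Divide by 2 because the pulse travels to the object and back.
    dist = C * time_of_flight_s / 2.0
    x = dist * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = dist * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = dist * math.sin(elevation_rad)
    return (x, y, z)
# Hypothetical return: a ~133 ns round trip is an object roughly 20 m away.
print(lidar_return_to_point(133e-9, math.radians(10), math.radians(-2)))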

00:06:10.389 --> 00:06:13.529
It is. But Waymo pairs that live LiDAR feed with

00:06:13.870 --> 00:06:16.970
incredibly detailed, pre-recorded maps. We're

00:06:16.970 --> 00:06:19.129
talking maps that know exactly where every lane

00:06:19.129 --> 00:06:22.129
line, stop sign, and curb is, down to the centimeter.

00:06:22.350 --> 00:06:24.990
Before the car even turns onto the street. Exactly.

00:06:25.490 --> 00:06:27.589
But the mechanism there seems totally flawed

00:06:27.589 --> 00:06:30.910
for scalability. I mean, those maps require massive,

00:06:31.110 --> 00:06:33.769
constant upkeep. If a construction crew moves

00:06:33.769 --> 00:06:36.250
a traffic cone overnight, or a storm knocks down

00:06:36.250 --> 00:06:38.649
a stop sign, the pre-recorded map is suddenly

00:06:38.649 --> 00:06:41.129
out of date. And the car's logic could just break.
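
NOTE
Editor's sketch (Python): the staleness problem just described, in miniature.
Compare the landmarks the live sensors report against what the pre-recorded
map expects for the street; anything missing or unexpected flags the map as
out of date. The street name, landmark labels, and fallback behavior are
invented for illustration.
hd_map = {"elm_st": {"stop_sign_ne_corner", "lane_line_center", "curb_east"}}
def map_disagreements(street, live_landmarks):
    expected = hd_map.get(street, set())
    missing = expected - live_landmarks   # mapped but not seen (sign knocked down?)
    surprise = live_landmarks - expected  # seen but not mapped (new traffic cone?)
    return missing, surprise
# Overnight, a storm took the stop sign and a crew dropped a cone:
missing, surprise = map_disagreements(
    "elm_st", {"lane_line_center", "curb_east", "traffic_cone"})
if missing or surprise:
    print("map stale, fall back to cautious behavior:", missing, surprise)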

00:06:41.230 --> 00:06:43.769
Yeah. Although I did read in the sources about

00:06:43.769 --> 00:06:46.879
systems like MIT's MapLite. They tried to fix

00:06:46.879 --> 00:06:50.199
this by using just basic 2D GPS maps paired with

00:06:50.199 --> 00:06:52.240
live sensors, so it doesn't need to be updated

00:06:52.240 --> 00:06:54.560
constantly by a whole fleet of mapping vans.

00:06:54.620 --> 00:06:57.019
That's true. But then on the totally opposite

00:06:57.019 --> 00:06:59.399
end of the spectrum, you have the vision-only

00:06:59.399 --> 00:07:01.620
approach. Championed most famously by Tesla,

00:07:01.800 --> 00:07:04.480
right? Yes. They have completely stripped out

00:07:04.480 --> 00:07:06.959
the expensive LiDAR arrays, and they've even

00:07:06.959 --> 00:07:09.899
removed traditional radar. They rely entirely

00:07:09.899 --> 00:07:12.439
on optical cameras feeding visual data into what's

00:07:12.439 --> 00:07:15.180
called an end-to-end neural network. OK, let's

00:07:15.180 --> 00:07:17.560
pause right there, because end-to-end neural

00:07:17.560 --> 00:07:21.459
network is one of those massive tech buzzwords

00:07:21.459 --> 00:07:23.759
that gets thrown around a lot. It really does.

00:07:24.060 --> 00:07:26.019
If I understand the source material correctly,

00:07:26.540 --> 00:07:29.980
this means the AI isn't given explicit hard-coded

00:07:29.980 --> 00:07:34.459
rules by a programmer. Right? Like, a human didn't

00:07:34.459 --> 00:07:36.740
write a line of code saying, if you see a red

00:07:36.740 --> 00:07:38.819
octagon with white letters, apply the brakes.

00:07:39.019 --> 00:07:41.500
No, not at all. Instead, they feed the computer

00:07:41.500 --> 00:07:45.019
millions of hours of video footage of real humans

00:07:45.019 --> 00:07:47.540
driving, and the computer essentially learns

00:07:47.540 --> 00:07:50.540
to mimic that behavior organically. That's right.

00:07:50.699 --> 00:07:52.420
It's like learning a language by living in a

00:07:52.420 --> 00:07:54.180
foreign country rather than reading a grammar

00:07:54.180 --> 00:07:56.500
textbook. That is a perfect way to describe it.
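
NOTE
Editor's sketch (Python): end-to-end learning in miniature. No hand-written
"if red octagon, brake" rule; a model is fit to (observation, human action)
pairs and imitates the demonstrations. A two-feature linear model and
synthetic data stand in for millions of hours of video and a deep network;
every number here is fabricated for illustration.
import numpy as np
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(1000, 2))  # observations: [distance_to_stop_line, speed]
# The "human demonstrations": drivers brake harder when close and fast.
y = 1.0 - X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.05, 1000)
# Least-squares fit of two weights plus a bias term.
w, *_ = np.linalg.lstsq(np.c_[X, np.ones(len(X))], y, rcond=None)
def policy(distance, speed):
    return w[0] * distance + w[1] * speed + w[2]  # predicted brake pressure
print("close and fast:", policy(0.1, 0.9))  # brakes hard, ~1.35
print("far and slow:", policy(0.9, 0.1))    # barely brakes, ~0.15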

00:07:56.569 --> 00:07:59.230
The system organically learns the concept of

00:07:59.230 --> 00:08:01.870
stopping at an intersection by processing how

00:08:01.870 --> 00:08:03.930
human drivers navigate thousands of different

00:08:03.930 --> 00:08:06.610
intersections. Oh, wow. And Tesla's rationale

00:08:06.610 --> 00:08:09.009
for this is simple. Human beings navigate the

00:08:09.009 --> 00:08:11.689
world using only our eyes and our brain. Therefore,

00:08:11.990 --> 00:08:14.189
a car should be able to do the exact same thing

00:08:14.189 --> 00:08:16.949
using cameras and a neural network. Plus optical

00:08:16.949 --> 00:08:19.550
cameras are vastly cheaper than LiDAR rigs. Oh,

00:08:19.930 --> 00:08:21.870
significantly cheaper. But if you're listening

00:08:21.870 --> 00:08:23.959
to this and thinking, wait a minute... I rely

00:08:23.959 --> 00:08:26.639
on my rear-view mirrors, my peripheral vision,

00:08:26.839 --> 00:08:29.860
and my hearing to merge onto a highway. I listen

00:08:29.860 --> 00:08:33.139
for sirens. I use intuition to guess that a parked

00:08:33.139 --> 00:08:36.159
car is about to pull out. Yeah. You aren't crazy.

00:08:36.379 --> 00:08:38.539
No, you're absolutely right. That is exactly

00:08:38.539 --> 00:08:41.059
why engineers are fighting over this. I have

00:08:41.059 --> 00:08:43.940
to push back on the vision-only logic. If a

00:08:43.940 --> 00:08:46.740
radar sensor can ping solid objects through a

00:08:46.740 --> 00:08:49.059
nighttime snowstorm that completely blinds an

00:08:49.059 --> 00:08:52.240
optical camera, why in the world would you limit

00:08:52.240 --> 00:08:55.779
a two-ton vehicle to just eyes? It is a highly

00:08:55.779 --> 00:08:58.379
debated engineering choice, but the technical

00:08:58.379 --> 00:09:00.620
justification for avoiding multiple types of

00:09:00.620 --> 00:09:02.419
sensors comes down to the processing burden,

00:09:02.659 --> 00:09:04.980
and something called sensor fusion. Sensor fusion.

00:09:05.039 --> 00:09:07.919
Yeah. When you have cameras, radar, and LiDAR

00:09:07.919 --> 00:09:10.639
all feeding gigabytes of information to the car's

00:09:10.639 --> 00:09:13.299
computer simultaneously, the computer has to

00:09:13.299 --> 00:09:16.519
fuse all that data together to paint one unified

00:09:16.519 --> 00:09:18.899
picture of reality. OK, that makes sense. But

00:09:18.899 --> 00:09:20.960
what happens when the optical camera says the

00:09:20.960 --> 00:09:23.279
road ahead is perfectly clear, but the radar

00:09:23.279 --> 00:09:25.100
bounces off a metallic sign overhead

00:09:25.100 --> 00:09:27.860
and insists there is a stationary object dead

00:09:27.860 --> 00:09:30.960
ahead. Oh, the sensors disagree with each other.

00:09:31.120 --> 00:09:34.519
Yes. So it's like a jury where the two star witnesses

00:09:34.519 --> 00:09:37.320
are giving entirely conflicting testimonies and

00:09:37.320 --> 00:09:39.379
the car's brain just freezes trying to figure

00:09:39.379 --> 00:09:42.210
out who was lying. Exactly. Synthesizing that

00:09:42.210 --> 00:09:44.909
conflicting data takes computational time. It

00:09:44.909 --> 00:09:47.590
can cause the AI to hesitate, brake suddenly,

00:09:47.789 --> 00:09:50.309
or make these dangerous jerky corrections.
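
NOTE
Editor's sketch (Python): the conflict just described. The camera reports
"clear" while the radar insists on a stationary object ahead; a fuser that
blends the two by confidence can also flag the disagreement instead of
silently averaging it away. Sensor names, confidences, and the 0.5 threshold
are illustrative assumptions.
def fuse_obstacle_belief(readings):
    # readings: list of (sensor_name, p_obstacle, confidence)
    total = sum(conf for _, _, conf in readings)
    belief = sum(p * conf for _, p, conf in readings) / total
    spread = max(p for _, p, _ in readings) - min(p for _, p, _ in readings)
    return belief, spread > 0.5  # large spread: the witnesses disagree
belief, conflicted = fuse_obstacle_belief(
    [("camera", 0.05, 0.9), ("radar", 0.95, 0.7)])
if conflicted:
    print(f"sensors disagree (fused belief={belief:.2f}): slow smoothly, do not jerk")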

00:09:50.690 --> 00:09:52.889
It's like a human trying to read a book, text

00:09:52.889 --> 00:09:55.149
a friend, and listen to a deep dive all at the

00:09:55.149 --> 00:09:58.230
exact same time. Eventually, the processing lag

00:09:58.230 --> 00:10:01.490
causes a mistake. So by using only vision, the

00:10:01.490 --> 00:10:04.190
system theoretically processes one type of data

00:10:04.190 --> 00:10:07.750
much faster. Right. But as you pointed out, it

00:10:07.750 --> 00:10:10.470
sacrifices the superhuman penetrating capability

00:10:10.379 --> 00:10:13.460
of radar. And ultimately, the hardest part of

00:10:13.460 --> 00:10:15.759
all this isn't just seeing the road, it's behavior

00:10:15.759 --> 00:10:18.159
prediction. Meaning, it's not enough to just

00:10:18.159 --> 00:10:20.500
identify a pedestrian standing on the sidewalk.

00:10:20.659 --> 00:10:22.860
Right. An autonomous vehicle has to calculate

00:10:22.860 --> 00:10:25.379
that pedestrian's future trajectory. Like, are

00:10:25.379 --> 00:10:26.779
they waiting for the light? Are they looking

00:10:26.779 --> 00:10:29.340
at their phone? Exactly. Are they going to suddenly

00:10:29.340 --> 00:10:31.759
dart into the street to chase a rolling ball?
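
NOTE
Editor's sketch (Python): the per-object prediction being described, at its
crudest. Extrapolate a tracked pedestrian's position from their current
velocity and check whether the predicted path enters the car's lane within
the horizon. Constant velocity is a deliberately simple stand-in for real
intent models; positions, speeds, and the lane half-width are hypothetical.
def predict_positions(pos, vel, horizon_s=2.0, dt=0.1):
    steps = int(horizon_s / dt)
    return [(pos[0] + vel[0] * k * dt, pos[1] + vel[1] * k * dt)
            for k in range(1, steps + 1)]
# Pedestrian 10 m ahead and 3 m to the side, stepping toward the lane at
# 1.5 m/s; the lane is |y| < 1.0 m. Re-run this many times per second.
path = predict_positions((10.0, 3.0), (0.0, -1.5))
print("enters lane within 2 s:", any(abs(y) < 1.0 for _, y in path))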

00:10:32.179 --> 00:10:34.600
The computer has to compute the position, speed,

00:10:34.700 --> 00:10:37.059
and likely intent of every single object around

00:10:37.059 --> 00:10:39.299
it multiple times a second. So we have these

00:10:39.299 --> 00:10:41.860
two camps, the competing philosophies: the

00:10:41.860 --> 00:10:44.200
heavy LiDAR mappers and the vision-only

00:10:44.200 --> 00:10:47.320
purists. But how are they actually performing

00:10:47.320 --> 00:10:49.500
out in the wild? That's the real question. Because

00:10:49.500 --> 00:10:51.960
when you read the news, it feels like autonomous

00:10:51.960 --> 00:10:54.340
vehicles are crashing into things every other

00:10:54.340 --> 00:10:57.120
day. But the statistics in the sources tell a

00:10:57.120 --> 00:11:00.360
much more nuanced story. They do. The gap between

00:11:00.360 --> 00:11:04.159
statistical safety and human trust is just staggering

00:11:04.159 --> 00:11:07.379
right now. Let's look at that massive 2024 study

00:11:07.379 --> 00:11:10.019
published in Nature Communications. OK. They

00:11:10.019 --> 00:11:12.100
compared the safety records of autonomous vehicles

00:11:12.100 --> 00:11:14.399
to human-driven vehicles across tens of thousands

00:11:14.399 --> 00:11:16.480
of incident reports. The good news is actually

00:11:16.480 --> 00:11:19.259
phenomenal. Yeah. Autonomous vehicles have far

00:11:19.259 --> 00:11:22.100
fewer crashes involving pedestrians. About three

00:11:22.100 --> 00:11:24.519
percent of crashes compared to 15 percent for

00:11:24.519 --> 00:11:27.730
human drivers. That's a huge drop. And the machines

00:11:27.730 --> 00:11:30.789
are significantly safer than humans in heavy

00:11:30.789 --> 00:11:33.429
rain or fog. I mean, they don't get distracted

00:11:33.429 --> 00:11:35.610
by their phones, they don't drive drunk, and

00:11:35.610 --> 00:11:38.159
they don't get tired after a 12-hour shift. But

00:11:38.159 --> 00:11:40.200
the bad news is deeply concerning, which brings

00:11:40.200 --> 00:11:42.539
us right back to your opening point. The study

00:11:42.539 --> 00:11:44.700
found that autonomous vehicles are more than

00:11:44.700 --> 00:11:47.379
five times more vulnerable to collisions at dawn

00:11:47.379 --> 00:11:49.879
and dusk. Because of the shadows. Yeah, the rapidly

00:11:49.879 --> 00:11:52.480
changing light, the long shadows, the sun glare.

00:11:52.539 --> 00:11:54.799
It directly messes with the contrast required

00:11:54.799 --> 00:11:57.740
by cameras and optical sensors. And when these

00:11:57.740 --> 00:12:00.679
systems do fail, they fail in ways that humans

00:12:00.679 --> 00:12:03.360
find entirely illogical, which just shatters

00:12:03.360 --> 00:12:06.019
public trust. Let's look at exactly how they

00:12:06.019 --> 00:12:08.870
fail. because the mechanism is fascinating. The

00:12:08.870 --> 00:12:11.929
sources detail a 2022 incident where a cruise

00:12:11.929 --> 00:12:14.889
robotaxi completely blocked a fire engine that

00:12:14.889 --> 00:12:17.570
was responding to an emergency. Why did it do

00:12:17.570 --> 00:12:19.549
that? Why did it just freeze in the middle of

00:12:19.549 --> 00:12:22.090
the road? Well, the AI is programmed to obey

00:12:22.090 --> 00:12:25.309
traffic laws flawlessly. When a fire engine turns

00:12:25.309 --> 00:12:27.389
its sirens on and drives on the wrong side of

00:12:27.389 --> 00:12:30.549
the road to bypass traffic, it breaks the AI's

00:12:30.549 --> 00:12:33.070
programmed logic tree. It literally doesn't know

00:12:33.070 --> 00:12:35.909
what to do. Right. The AI's hard-coded rules

00:12:35.909 --> 00:12:39.529
hit a conflict. Fire trucks are vehicles. Vehicles

00:12:39.529 --> 00:12:41.950
must stay in the right lane. This vehicle is

00:12:41.950 --> 00:12:44.370
in the wrong lane moving toward me. I cannot

00:12:44.370 --> 00:12:47.389
predict its next legal move. So the system defaults

00:12:47.389 --> 00:12:50.289
to the only safe action it knows. It just halts

00:12:50.289 --> 00:12:52.429
in confusion. If we connect this to the bigger

00:12:52.429 --> 00:12:55.490
picture, we have actually been through this exact

00:12:55.490 --> 00:12:58.149
psychological and technological barrier before.
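
NOTE
Editor's sketch (Python): the failure mode in the fire-engine story. A
hard-coded rule set has no branch for a legal-but-unmodeled situation, so the
planner falls through to its only safe default and halts. The rules, labels,
and fallback are invented for illustration; this is not Cruise's actual code.
def plan_action(obj):
    if obj["type"] == "vehicle" and obj["lane"] == "own_side":
        return "follow at a safe distance"
    if obj["type"] == "vehicle" and obj["lane"] == "oncoming_their_side":
        return "proceed, it will stay in its lane"
    # A fire engine driving against traffic matches neither rule; the planner
    # cannot predict a "legal" next move, so it freezes in place.
    return "halt in lane and wait"
fire_engine = {"type": "vehicle", "lane": "oncoming_wrong_side", "siren": True}
print(plan_action(fire_engine))  # prints: halt in lane and wait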

00:12:58.610 --> 00:13:00.830
Think about the automatic elevator. Elevators.

00:13:01.049 --> 00:13:04.230
Like in an office building. Yes. Automatic elevators

00:13:04.230 --> 00:13:07.509
were invented around 1900. But for decades, people

00:13:07.509 --> 00:13:10.730
were absolutely terrified to step into a metal

00:13:10.730 --> 00:13:13.929
box suspended in a dark shaft without a human

00:13:13.929 --> 00:13:16.309
operator standing there physically pushing the

00:13:16.309 --> 00:13:18.830
levers. I mean, I kind of get that. Right. They

00:13:18.830 --> 00:13:21.070
didn't trust the machine's logic to stop at the

00:13:21.070 --> 00:13:23.809
right floor. It took operator strikes, massive

00:13:23.809 --> 00:13:26.669
advertising campaigns and the invention of a

00:13:26.669 --> 00:13:29.830
giant reassuring red emergency stop button to

00:13:29.830 --> 00:13:32.190
finally build public trust. Oh that makes so

00:13:32.190 --> 00:13:33.870
much sense. We are currently in the elevator

00:13:33.870 --> 00:13:36.269
phase of self-driving cars. People want to know

00:13:36.269 --> 00:13:38.129
they can hit a button to stop the machine when

00:13:38.129 --> 00:13:40.490
its logic fails. Here's where it gets really

00:13:40.490 --> 00:13:43.070
interesting. Let's look at the data from the

00:13:43.070 --> 00:13:45.149
National Highway Traffic Safety Administration.

00:13:45.470 --> 00:13:49.649
Out of roughly 4,000 autonomous vehicle incidents

00:13:49.649 --> 00:13:54.110
reported between 2019 and 2024, nearly 54% of

00:13:54.110 --> 00:13:56.750
them involve Teslas. More than half. More than

00:13:56.750 --> 00:14:00.149
half. Yet Tesla aggressively markets its level

00:14:00.149 --> 00:14:02.769
two system under the brand name Full Self-Driving,

00:14:03.110 --> 00:14:05.289
which has led to multiple federal investigations

00:14:05.289 --> 00:14:08.990
for deceptive marketing. Because legally, under

00:14:08.990 --> 00:14:11.110
that Mobileye framework we talked about, you

00:14:11.110 --> 00:14:12.840
still have to have your eyes on the road. It

00:14:12.840 --> 00:14:15.139
perfectly illustrates how terminology shapes

00:14:15.139 --> 00:14:17.980
consumer behavior. If you name a software package

00:14:17.980 --> 00:14:20.179
full self -driving, people are naturally going

00:14:20.179 --> 00:14:22.399
to treat it like it's fully self -driving. Regardless

00:14:22.399 --> 00:14:24.299
of the fine print in the owner's manual telling

00:14:24.299 --> 00:14:26.320
them to keep their hands on the wheel. Exactly.

00:14:26.559 --> 00:14:29.220
And the sources bring up some really tragic incidents.

00:14:29.759 --> 00:14:32.639
Tesla's first fatal crash in 2016 happened because

00:14:32.639 --> 00:14:35.740
the car literally didn't distinguish a white

00:14:35.740 --> 00:14:38.200
tractor trailer crossing the highway against

00:14:38.200 --> 00:14:41.019
a brightly lit sky. It just didn't see it. Right.

00:14:41.279 --> 00:14:44.200
And there's that 2025 Cybertruck crash on full

00:14:44.200 --> 00:14:47.399
self-driving. Waymo hit a bus in 2016 trying

00:14:47.399 --> 00:14:49.940
to avoid sandbags in the road. And we have to

00:14:49.940 --> 00:14:52.600
mention the 2018 Uber Advanced Technologies Group

00:14:52.600 --> 00:14:55.659
fatality. That was a perfect storm of human and

00:14:55.659 --> 00:14:58.440
machine failure. The safety driver inside the

00:14:58.440 --> 00:15:00.080
car was looking down at their phone, watching

00:15:00.080 --> 00:15:02.879
a show, and the pedestrian victim crossed the

00:15:02.879 --> 00:15:05.440
street illegally with meth in her system. The

00:15:05.440 --> 00:15:07.720
car's sensors detected her, but the software

00:15:07.720 --> 00:15:09.720
couldn't classify what she was until it was too

00:15:09.720 --> 00:15:12.429
late. It's just awful. This brings us to the

00:15:12.429 --> 00:15:14.809
underlying decision making of the AI, moving

00:15:14.809 --> 00:15:17.070
beyond just the mechanics of safety into the

00:15:17.070 --> 00:15:19.830
ethics and the economics. It's not just about

00:15:19.830 --> 00:15:22.250
a car freezing in front of a fire truck. It's

00:15:22.250 --> 00:15:25.570
about how the AI decides who or what to prioritize

00:15:25.570 --> 00:15:28.519
when a crash is mathematically unavoidable. This

00:15:28.519 --> 00:15:31.379
is where the engineering hits hard societal reality.

00:15:31.779 --> 00:15:33.740
For example, a study from researchers at Georgia

00:15:33.740 --> 00:15:36.679
Tech revealed a deeply troubling flaw in AI detection

00:15:36.679 --> 00:15:39.399
models. I read this part. It's wild. They found

00:15:39.399 --> 00:15:41.600
that autonomous detection systems were about

00:15:41.600 --> 00:15:44.539
5% less effective at recognizing pedestrians

00:15:44.539 --> 00:15:46.799
with darker skin tones compared to lighter skin

00:15:46.799 --> 00:15:49.320
tones. Which is an astonishing engineering oversight.
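
NOTE
Editor's sketch (Python): a synthetic illustration of the mechanism behind
that finding, which the discussion unpacks next. When one subgroup is rare in
the training data, a detector tuned to the majority generalizes worse to it.
The data, the nearest-centroid "detector," and the size of the gap are all
fabricated; only the direction of the effect is the point.
import numpy as np
rng = np.random.default_rng(1)
def sample(n, shift):
    # Toy pedestrian feature vectors for one demographic subgroup.
    return rng.normal(loc=shift, scale=1.0, size=(n, 8))
train = np.vstack([sample(950, 0.0), sample(50, 0.8)])  # 95% / 5% imbalance
centroid = train.mean(axis=0)  # the "detector" template it learned
def detection_rate(samples, threshold=3.2):
    dists = np.linalg.norm(samples - centroid, axis=1)
    return float((dists < threshold).mean())  # fraction recognized
print("majority subgroup:", detection_rate(sample(2000, 0.0)))
print("minority subgroup:", detection_rate(sample(2000, 0.8)))  # noticeably lower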

00:15:49.360 --> 00:15:51.840
Yeah. But if we look at the mechanism of why

00:15:51.840 --> 00:15:53.980
that happens, it goes right back to those neural

00:15:53.980 --> 00:15:56.629
networks. Yeah. If an AI is trained primarily

00:15:56.629 --> 00:15:59.909
on data sets or video footage that lacks diverse

00:15:59.909 --> 00:16:03.129
demographic representation, the AI literally

00:16:03.129 --> 00:16:06.350
develops a blind spot. It only knows what it

00:16:06.350 --> 00:16:09.370
has been shown. Exactly. It forces us to confront

00:16:09.370 --> 00:16:12.490
the classic philosophical trolley problem, but

00:16:12.490 --> 00:16:14.870
programmed into real life. Oh, the trolley problem.

00:16:15.009 --> 00:16:17.409
Right. Imagine an autonomous vehicle is driving

00:16:17.409 --> 00:16:20.269
down a narrow street. A pedestrian suddenly darts

00:16:20.269 --> 00:16:22.950
out. The car computes that it cannot physically

00:16:22.950 --> 00:16:26.379
brake in time. It has two choices. Hit the pedestrian,

00:16:26.700 --> 00:16:29.100
or swerve into a solid brick wall, which will

00:16:29.100 --> 00:16:30.940
almost certainly kill the passenger inside the

00:16:30.940 --> 00:16:33.940
car. How do you program that decision? The sources

00:16:33.940 --> 00:16:36.399
cite extensive surveys on this exact dilemma,

00:16:36.500 --> 00:16:39.000
and the results are so revealing about human

00:16:39.000 --> 00:16:41.220
psychology. They really are. Broadly, people

00:16:41.220 --> 00:16:43.720
say they want autonomous cars programmed to minimize

00:16:43.720 --> 00:16:47.340
overall harm. So, logically, sacrifice the one

00:16:47.340 --> 00:16:49.539
passenger to save three pedestrians on the sidewalk.
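
NOTE
Editor's sketch (Python): the survey's utilitarian answer, literalized. A toy
chooser picks the maneuver with the lowest expected harm; swap the objective
to "harm to my passenger only" and you get the selfish algorithm the
conversation turns to next. Options and harm counts are invented; no real
vehicle is known to expose logic like this.
def least_harm(options):
    # options: list of (maneuver, expected_people_harmed)
    return min(options, key=lambda option: option[1])
options = [
    ("stay course, hit the pedestrians", 3.0),
    ("swerve into the wall, harm the passenger", 1.0),
]
print(least_harm(options))  # pure utilitarianism: sacrifice the one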

00:16:50.200 --> 00:16:52.860
It's pure utilitarianism. Unless the person taking

00:16:52.860 --> 00:16:55.379
the survey is the hypothetical passenger in the

00:16:55.379 --> 00:16:57.980
car, then suddenly they want the car programmed

00:16:57.980 --> 00:17:00.299
to protect them at all costs. Naturally. But

00:17:00.299 --> 00:17:02.879
I have to push back on that logic. If every single

00:17:02.879 --> 00:17:05.660
car on the road is programmed by its manufacturer

00:17:05.660 --> 00:17:08.900
to be entirely selfish, to prioritize its own

00:17:08.900 --> 00:17:11.859
passengers at the expense of everyone else, doesn't

00:17:11.859 --> 00:17:15.799
that make the roads vastly more chaotic and dangerous

00:17:15.799 --> 00:17:18.990
overall? It's like an arms race of selfish algorithms.

00:17:19.490 --> 00:17:22.109
This raises an important question. When an AI

00:17:22.109 --> 00:17:24.710
ultimately does make the wrong choice and someone

00:17:24.710 --> 00:17:27.130
gets hurt, who actually goes to jail? Yeah, who

00:17:27.130 --> 00:17:30.349
is at fault? Is it the human occupant who was

00:17:30.349 --> 00:17:33.150
legally eyes off and wasn't driving? Is it the

00:17:33.150 --> 00:17:35.630
corporate car manufacturer? Or is it the software

00:17:35.630 --> 00:17:37.809
developer who wrote the logic tree code five

00:17:37.809 --> 00:17:40.250
years ago? That's a legal nightmare. Our entire

00:17:40.250 --> 00:17:42.529
legal framework of liability is fundamentally

00:17:42.529 --> 00:17:45.230
built around human error, human negligence, and

00:17:45.230 --> 00:17:48.750
human intent. It is completely unprepared for

00:17:48.750 --> 00:17:51.059
machine liability. Which brings up a question

00:17:51.059 --> 00:17:53.420
for you listening right now. Would you put your

00:17:53.420 --> 00:17:55.799
family in a car that is legally programmed to

00:17:55.799 --> 00:17:58.059
sacrifice you in order to save three strangers

00:17:58.059 --> 00:18:00.079
on the street? It's chilling to think about.

00:18:00.319 --> 00:18:03.240
It really is. But beyond the ethics of the code,

00:18:03.799 --> 00:18:06.400
the economic reality of this technology is also

00:18:06.400 --> 00:18:09.170
hitting a massive wall. The sources note that

00:18:09.170 --> 00:18:11.789
Waymo robotaxis currently operate at a staggering

00:18:11.789 --> 00:18:15.369
loss. It costs between $7 and $9 per mile

00:18:15.369 --> 00:18:18.569
to run a robotaxi, compared to about $1 a mile

00:18:18.569 --> 00:18:21.390
for a standard personal car. And we have to look

00:18:21.390 --> 00:18:23.569
at why it's so expensive. It's not just the electricity

00:18:23.569 --> 00:18:25.910
to run the car. It's the fact that those LiDAR

00:18:25.910 --> 00:18:28.650
arrays cost thousands of dollars. It's the massive

00:18:28.650 --> 00:18:31.910
data storage required. And crucially, it's the

00:18:31.910 --> 00:18:35.049
remote human assistance centers. Whenever a robotaxi

00:18:35.049 --> 00:18:37.509
gets confused by a construction zone, a human

00:18:37.509 --> 00:18:40.529
sitting in a control room miles away has to digitally

00:18:40.529 --> 00:18:43.289
step in and guide it. McKinsey estimates it's

00:18:43.289 --> 00:18:46.509
going to take until 2035 at the earliest to scale

00:18:46.509 --> 00:18:48.250
the technology enough to get those operating

00:18:48.250 --> 00:18:50.839
costs under $2 a mile. And if they do manage

00:18:50.839 --> 00:18:53.180
to get those costs down and scale up, the labor

00:18:53.180 --> 00:18:55.420
displacement is going to be historic. The research

00:18:55.420 --> 00:18:58.240
points out there are 2.9 million driving jobs

00:18:58.240 --> 00:19:01.220
in the US alone. Long-haul truck drivers, taxi

00:19:01.220 --> 00:19:04.900
drivers, city bus operators, if those jobs are

00:19:04.900 --> 00:19:07.680
automated away by fleets of AI, the economic

00:19:07.680 --> 00:19:09.880
displacement would actually surpass the total

00:19:09.880 --> 00:19:13.619
job losses of the 2008 Great Recession. We're

00:19:13.619 --> 00:19:16.180
talking about restructuring the entire blue collar

00:19:16.180 --> 00:19:18.759
labor force. Furthermore, it's not just a domestic

00:19:18.759 --> 00:19:21.319
economic issue. What else is there? The sources

00:19:21.319 --> 00:19:24.220
highlight objective reports of severe geopolitical

00:19:24.220 --> 00:19:26.599
tensions, specifically between the United States

00:19:26.599 --> 00:19:29.789
and China, over autonomous vehicle data. There

00:19:29.789 --> 00:19:32.309
are very real concerns currently being debated

00:19:32.309 --> 00:19:34.549
regarding imported self-driving tech. Right,

00:19:34.549 --> 00:19:37.069
because if you think about how LiDAR works, bouncing

00:19:37.069 --> 00:19:39.569
millions of lasers to create millimeter-accurate

00:19:39.569 --> 00:19:42.210
3D maps of its surroundings, a self-driving

00:19:42.210 --> 00:19:44.650
car is basically a roving internet-connected

00:19:44.650 --> 00:19:46.990
supercomputer. Exactly. The geopolitical fear

00:19:46.990 --> 00:19:49.089
is that imported autonomous vehicles could be

00:19:49.089 --> 00:19:51.869
used to facilitate espionage by constantly mapping

00:19:51.869 --> 00:19:54.730
out critical infrastructure, power grids, and

00:19:54.730 --> 00:19:57.029
military bases in foreign countries and then

00:19:57.029 --> 00:19:59.440
just beaming that data back overseas. When a

00:19:59.440 --> 00:20:02.099
car is no longer just a mechanical vehicle, but

00:20:02.099 --> 00:20:04.859
a massive data harvesting node covered in high

00:20:04.859 --> 00:20:07.660
definition sensors, it naturally elevates from

00:20:07.660 --> 00:20:09.819
a transportation issue to a matter of national

00:20:09.819 --> 00:20:12.539
security. Okay, we have covered an incredible

00:20:12.539 --> 00:20:15.299
amount of ground today. We are clearly navigating

00:20:15.299 --> 00:20:19.140
a very messy transitional era. We have cars on

00:20:19.140 --> 00:20:21.799
the road right now that are statistically safer

00:20:21.799 --> 00:20:25.220
than you in a heavy rainstorm, but might be dangerously

00:20:25.220 --> 00:20:28.180
blind when the sun goes down. We have complex

00:20:28.180 --> 00:20:30.940
systems branded and sold to consumers as full

00:20:30.940 --> 00:20:33.980
self-driving that legally require you to babysit

00:20:33.980 --> 00:20:36.900
the steering wheel. And we have brilliant engineering

00:20:36.900 --> 00:20:39.480
that can predict complex traffic patterns, yet

00:20:39.480 --> 00:20:42.180
still struggles with diverse pedestrian recognition

00:20:42.180 --> 00:20:45.140
data and the basic legal frameworks of liability.

00:20:45.319 --> 00:20:47.250
So what does this all mean? It means what we

00:20:47.250 --> 00:20:49.470
always see in this show. Technology is moving

00:20:49.470 --> 00:20:52.170
exponentially faster than human trust and infinitely

00:20:52.170 --> 00:20:54.910
faster than our legal frameworks. We are building

00:20:54.910 --> 00:20:57.690
the airplane while we're flying it, or I guess

00:20:57.690 --> 00:21:00.250
programming the car while it's driving itself.

00:21:00.670 --> 00:21:02.690
And essentially, we are all the beta testers.

00:21:02.910 --> 00:21:04.910
That's a great way to put it. Before we go, I

00:21:04.910 --> 00:21:06.650
want to leave you with one final thought to chew

00:21:06.650 --> 00:21:09.609
on today. We've talked a lot about how self-driving

00:21:09.609 --> 00:21:13.390
cars struggle to predict what erratic, unpredictable

00:21:13.390 --> 00:21:15.799
human drivers are going to do next. Yeah, that's

00:21:15.799 --> 00:21:18.420
the hardest part for them. Well, if these autonomous

00:21:18.420 --> 00:21:21.160
vehicles ultimately prove to be drastically safer

00:21:21.160 --> 00:21:23.920
than humans, but only when they can perfectly

00:21:23.920 --> 00:21:25.900
communicate and predict the movements of other

00:21:25.900 --> 00:21:29.480
robot cars on a closed network, how long will

00:21:29.480 --> 00:21:32.460
it be until human driving is outright banned

00:21:32.460 --> 00:21:35.240
on public highways? Oh, wow. Will governments

00:21:35.240 --> 00:21:38.200
decide that human unpredictability is just too

00:21:38.200 --> 00:21:41.539
dangerous to the machines? Are we rapidly approaching

00:21:41.539 --> 00:21:44.500
a future where driving a car yourself is an illegal,

00:21:44.759 --> 00:21:47.079
obsolete hobby? Just something to think about

00:21:47.079 --> 00:21:49.019
next time you put your hands on the wheel. Thanks

00:21:49.019 --> 00:21:50.920
for joining us on this deep dive, and as always,

00:21:50.920 --> 00:21:52.640
we hope this keeps you the most well-informed

00:21:52.640 --> 00:21:53.440
person in the room.
