WEBVTT

00:00:00.000 --> 00:00:02.080
It creates this fascinating tension, doesn't

00:00:02.080 --> 00:00:05.700
it? On one side, you have this massive grassroots

00:00:05.700 --> 00:00:10.220
movement triggered by a $25 million donation.

00:00:10.880 --> 00:00:14.400
People are actually canceling subscriptions in

00:00:14.400 --> 00:00:17.670
protest, trying to stop the machine. And at the

00:00:17.670 --> 00:00:20.089
exact same time, at literally the same time,

00:00:20.149 --> 00:00:22.210
on the other side of the country, you have a

00:00:22.210 --> 00:00:25.329
$10 billion construction project breaking ground.

00:00:25.489 --> 00:00:27.530
Right. It's the ultimate contrast. You have the

00:00:27.530 --> 00:00:29.949
human layer trying to pull the emergency brake,

00:00:30.070 --> 00:00:32.630
and then you have the industrial layer hitting

00:00:32.630 --> 00:00:35.630
the gas harder than we have ever seen. Hello

00:00:35.630 --> 00:00:37.649
and welcome back to the Deep Dive. I've been

00:00:37.649 --> 00:00:39.850
sitting with this stack of sources you sent over

00:00:39.850 --> 00:00:42.590
for a few hours now, and I have to be honest,

00:00:42.750 --> 00:00:45.630
the signal-to-noise ratio in the AI space this

00:00:45.630 --> 00:00:48.770
week has been overwhelming. It feels that way.

00:00:48.950 --> 00:00:50.590
It feels like the industry is splitting in two

00:00:50.590 --> 00:00:52.909
directions at once. We have a cultural reckoning

00:00:52.909 --> 00:00:55.329
happening in the software and a physical explosion

00:00:55.329 --> 00:00:57.950
happening in the hardware. It really is an inflection

00:00:57.950 --> 00:01:00.329
point. I mean, usually when we sit down, we're

00:01:00.329 --> 00:01:02.409
obsessing over the models. Is it smarter? Is

00:01:02.409 --> 00:01:06.049
it faster? But today's sources paint a much,

00:01:06.090 --> 00:01:09.069
much messier picture. To really get what's happening

00:01:09.069 --> 00:01:11.349
this week, we have to look at three specific

00:01:11.349 --> 00:01:13.799
pillars. Okay, walk us through the roadmap. First,

00:01:13.879 --> 00:01:15.239
we have to talk about the cultural backlash.

00:01:15.560 --> 00:01:18.299
We're seeing the rise of this QuitGPT movement.

00:01:18.620 --> 00:01:21.859
This is activism finally colliding with algorithms.

00:01:22.040 --> 00:01:25.159
Second, the infrastructure explosion. We aren't

00:01:25.159 --> 00:01:27.459
talking about code here. We're talking concrete,

00:01:27.680 --> 00:01:30.959
chips, and power grids, the physical reality

00:01:30.959 --> 00:01:34.200
of the cloud. And third, despite all that noise,

00:01:34.299 --> 00:01:37.319
the capability leap is still happening. We have

00:01:37.319 --> 00:01:40.599
new tools from Alibaba and OpenAI that fundamentally

00:01:40.599 --> 00:01:43.140
change how we actually work. Let's start with

00:01:43.140 --> 00:01:45.859
the friction, the cultural backlash. For a long

00:01:45.859 --> 00:01:48.140
time, the public sentiment around AI was just

00:01:48.140 --> 00:01:51.519
pure awe. Look what this thing can do. But reading

00:01:51.519 --> 00:01:54.000
these notes on QuitGPT, this feels different.

00:01:54.099 --> 00:01:55.840
This isn't just people being afraid of the Terminator.

00:01:55.859 --> 00:01:58.739
This is organized economic pressure. What sparked

00:01:58.739 --> 00:02:01.480
this? It is different because it's political.

00:02:02.250 --> 00:02:04.790
The catalyst here wasn't a software bug or a

00:02:04.790 --> 00:02:08.490
bad answer. It was a $25 million donation to

00:02:08.490 --> 00:02:11.449
MAGA Inc. by Greg Brockman, the president of

00:02:11.449 --> 00:02:13.969
OpenAI, and his wife. Okay, let's pause on that.

00:02:14.009 --> 00:02:16.330
So we have a direct financial link from the leadership

00:02:16.330 --> 00:02:21.930
of the leading AI company to a very polarizing

00:02:21.930 --> 00:02:24.750
political campaign. Exactly. And that donation

00:02:24.750 --> 00:02:27.750
basically lit the match. But if you look at the

00:02:27.750 --> 00:02:29.610
source material, it's not just that one check.

00:02:29.689 --> 00:02:32.349
It's a compounding effect. You have the donation,

00:02:32.430 --> 00:02:35.330
but then you also have this new partnership between

00:02:36.090 --> 00:02:39.310
OpenAI and ICE, Immigration and Customs Enforcement.

00:02:39.889 --> 00:02:43.629
They're using ChatGPT for hiring and for administrative

00:02:43.629 --> 00:02:47.009
processing. Which for a significant portion of

00:02:47.009 --> 00:02:49.349
the tech workforce and that early adopter user

00:02:49.349 --> 00:02:52.750
base represents a massive red line. A huge red

00:02:52.750 --> 00:02:54.610
line. Remember, a lot of the people who pay for

00:02:54.610 --> 00:02:57.810
ChatGPT Plus are in that creative tech adjacent

00:02:57.810 --> 00:03:00.810
demographic. They feel a sense of ownership over

00:03:00.810 --> 00:03:03.669
the tool. So when they see the leadership funding

00:03:03.669 --> 00:03:07.159
one political side, and then the technology being

00:03:07.159 --> 00:03:09.719
deployed by ICE, it just feels like a betrayal

00:03:09.719 --> 00:03:12.680
of that original open and beneficial mission.

00:03:13.419 --> 00:03:15.699
And Scott Galloway is involved now. I saw his

00:03:15.699 --> 00:03:18.199
name in the notes. Yeah, the NYU professor. He's

00:03:18.199 --> 00:03:20.699
very good at capturing the zeitgeist of these

00:03:20.699 --> 00:03:24.039
things. He went viral urging people to resist

00:03:24.039 --> 00:03:27.199
and unsubscribe. Right. He's framing it not just

00:03:27.199 --> 00:03:30.080
as a boycott, but as a resistance against big

00:03:30.080 --> 00:03:33.039
tech tools fueling specific political agendas.

00:03:33.400 --> 00:03:35.900
And it's resonating. The sources say there are

00:03:35.900 --> 00:03:39.479
about 17,000 pledges on the QuitGPT site so

00:03:39.479 --> 00:03:41.620
far. Okay, 17,000. Let's look at the numbers.

00:03:41.699 --> 00:03:43.400
On a human level, that's... That's a stadium

00:03:43.400 --> 00:03:45.560
full of angry people. That's significant. It

00:03:45.560 --> 00:03:49.139
is. But ChatGPT has something like 900 million

00:03:49.139 --> 00:03:53.199
weekly users. Does 17,000 cancellations actually

00:03:53.199 --> 00:03:56.360
matter to a company like OpenAI? Or is this just

00:03:56.360 --> 00:03:59.159
screaming into the void? Financially, no. The

00:03:59.159 --> 00:04:01.379
math is brutal. Only about 5 or 6% of those

00:04:01.379 --> 00:04:03.340
900 million users are even paid subscribers.

00:04:03.699 --> 00:04:06.900
So even if they lost 100,000 subscribers, the

00:04:06.900 --> 00:04:09.879
revenue hit is just a rounding error. It won't

00:04:09.879 --> 00:04:12.340
stop the lights from coming on. OpenAI is burning

00:04:12.340 --> 00:04:15.479
cash on compute, not relying on $20 subscriptions

00:04:15.479 --> 00:04:18.180
to survive. So if it doesn't hurt the wallet,

00:04:18.279 --> 00:04:21.779
is it just performative? I wouldn't call it performative.

00:04:21.779 --> 00:04:24.389
I'd call it a signal. A cultural signal. OK.

00:04:24.490 --> 00:04:26.350
You have to remember, OpenAI isn't just fighting

00:04:26.350 --> 00:04:28.910
for customers. They are fighting for talent.

00:04:29.670 --> 00:04:32.509
The engineers who build these models are often

00:04:32.509 --> 00:04:34.910
very politically active, very ethics focused.

00:04:35.310 --> 00:04:38.660
If the brand becomes toxic, if working for OpenAI

00:04:38.660 --> 00:04:41.259
becomes synonymous with funding the bad guys

00:04:41.259 --> 00:04:44.180
in the eyes of their peers, that hurts them way

00:04:44.180 --> 00:04:47.079
more than losing some subscription revenue. That's

00:04:47.079 --> 00:04:49.019
a great point. The talent war is the real war.

00:04:49.180 --> 00:04:51.879
Exactly. If the smartest 25 year old researcher

00:04:51.879 --> 00:04:54.180
refuses to sign the offer letter because of the

00:04:54.180 --> 00:04:57.339
brand reputation, that's the actual cost. Precisely.

00:04:57.339 --> 00:04:59.060
And it's not just consumers versus companies

00:04:59.060 --> 00:05:01.180
anymore. The sources highlight something fascinating.

00:05:02.329 --> 00:05:04.050
The companies are starting to fight each other

00:05:04.050 --> 00:05:07.290
politically, too. It's a full-on war of the

00:05:07.290 --> 00:05:10.610
PACs. I saw that bit about Anthropic. They frame

00:05:10.610 --> 00:05:13.370
themselves as the safety-first, constitutional

00:05:13.370 --> 00:05:16.529
AI alternative. What are they doing? They just

00:05:16.529 --> 00:05:19.769
funded a $20 million super PAC. Twenty million?

00:05:20.189 --> 00:05:23.560
Specifically to push for AI regulation. That

00:05:23.560 --> 00:05:25.459
is not a small amount of change for a policy

00:05:25.459 --> 00:05:28.399
push. No. And it's directly opposing the political

00:05:28.399 --> 00:05:31.100
interests backed by OpenAI's fifty million

00:05:31.100 --> 00:05:35.399
dollar push. So you have these two massive entities

00:05:35.399 --> 00:05:38.560
who are competitors in code now becoming competitors

00:05:38.560 --> 00:05:41.120
in the political lobby. They're literally spending

00:05:41.120 --> 00:05:43.939
tens of millions of dollars to shape the laws

00:05:43.939 --> 00:05:46.360
that will govern the other. It really highlights

00:05:46.360 --> 00:05:48.879
that AI isn't just a tool anymore. It's a political

00:05:48.879 --> 00:05:51.259
battlefield. You can't just build it and they

00:05:51.259 --> 00:05:52.920
will come. You have to build it, you have to

00:05:52.920 --> 00:05:54.699
lobby for it, and you have to defend it from

00:05:54.699 --> 00:05:57.360
your own user base. That's the reality of 2026.

00:05:57.699 --> 00:06:00.180
So just to cap this segment off, does this consumer

00:06:00.180 --> 00:06:02.959
pressure actually work in an age of enterprise

00:06:02.959 --> 00:06:05.600
dominance? It creates a morale crisis, which

00:06:05.600 --> 00:06:07.680
creates a talent crisis, which eventually creates

00:06:07.680 --> 00:06:10.720
a product crisis. So, yes, it matters. Interesting.

00:06:10.759 --> 00:06:13.279
So while the culture is fighting over donations

00:06:13.279 --> 00:06:16.220
and hiring policies and the lobbyists are fighting

00:06:16.220 --> 00:06:19.720
over regulation, the actual physical machine

00:06:19.720 --> 00:06:22.819
is just getting bigger. Oh, yeah. I want to pivot

00:06:22.819 --> 00:06:25.379
to the infrastructure piece because the numbers

00:06:25.379 --> 00:06:29.360
here are staggering. The hardware reality. This

00:06:29.360 --> 00:06:31.439
is the stuff you can kick. I was looking at this

00:06:31.439 --> 00:06:35.790
note about Meta in Indiana. $10 billion for a

00:06:35.790 --> 00:06:37.829
data center. It's monumental. Meta just broke

00:06:37.829 --> 00:06:40.649
ground on this. And data center feels like too

00:06:40.649 --> 00:06:43.910
small a word. Right. It's a campus. 13 buildings,

00:06:44.290 --> 00:06:47.629
4,000 construction jobs. This is a physical

00:06:47.629 --> 00:06:50.589
manifestation of their next -gen ambitions. When

00:06:50.589 --> 00:06:52.370
Zuckerberg says they are building the future,

00:06:52.550 --> 00:06:54.889
they are literally pouring the concrete for it

00:06:54.889 --> 00:06:57.470
in Indiana. It changes how you visualize the

00:06:57.470 --> 00:06:59.290
cloud, doesn't it? We tend to think of the cloud

00:06:59.290 --> 00:07:01.410
as this ethereal thing floating above us. Right.

00:07:01.759 --> 00:07:04.920
But it's actually 13 massive buildings in Indiana

00:07:04.920 --> 00:07:07.240
sucking up electricity. And that electricity

00:07:07.240 --> 00:07:09.879
part is becoming the friction point. The power

00:07:09.879 --> 00:07:11.980
demands are so high that these companies are

00:07:11.980 --> 00:07:14.399
having to play good neighbor in a very, very

00:07:14.399 --> 00:07:18.180
expensive way. I saw that in the notes. Anthropic,

00:07:18.379 --> 00:07:22.879
Microsoft, and OpenAI are all promising to pay

00:07:22.879 --> 00:07:25.379
the extra power bills. They have to. Imagine

00:07:25.379 --> 00:07:27.819
you live in a town, a data center moves in, and

00:07:27.819 --> 00:07:30.120
suddenly your residential electricity rate goes

00:07:30.120 --> 00:07:33.459
up 20% because demand spiked. That's a political

00:07:33.459 --> 00:07:36.060
nightmare. It is. So they are preemptively saying,

00:07:36.279 --> 00:07:38.620
we will cover the difference. It shows you how

00:07:38.620 --> 00:07:40.399
desperate they are to keep these things running

00:07:40.399 --> 00:07:42.980
without local backlash. They're essentially paying

00:07:42.980 --> 00:07:45.399
rent on the grid itself. It's almost like a pollution

00:07:45.399 --> 00:07:48.439
tax, but for electron consumption. Exactly. And

00:07:48.439 --> 00:07:50.879
speaking of desperate need for speed, we have

00:07:50.879 --> 00:07:53.920
to talk about this OpenAI and Cerebras deal.

00:07:54.120 --> 00:07:56.699
This connects the hardware to the actual user

00:07:56.699 --> 00:07:59.800
experience. This is Codex Spark. Right. OpenAI

00:07:59.800 --> 00:08:02.399
launched Codex Spark. It's a lighter version

00:08:02.399 --> 00:08:05.800
of GPT-5.3 designed specifically for real-time

00:08:05.800 --> 00:08:08.560
coding. But the moment of wonder here isn't the

00:08:08.560 --> 00:08:11.459
software. It's the chip. It's the chip powering

00:08:11.459 --> 00:08:14.899
it. The Cerebras chip. I saw the specs: 4T transistors.

00:08:14.980 --> 00:08:18.040
Four trillion. Can we contextualize that? Because

00:08:18.040 --> 00:08:21.019
trillion is one of those numbers that human brains

00:08:21.019 --> 00:08:24.220
aren't good at visualizing. What does a four

00:08:24.220 --> 00:08:26.699
trillion transistor chip actually allow you to

00:08:26.699 --> 00:08:30.029
do that a normal chip doesn't? Speed. Specifically,

00:08:30.189 --> 00:08:32.490
inference speed. So training is teaching the

00:08:32.490 --> 00:08:35.009
AI. That takes months. Inference is when you

00:08:35.009 --> 00:08:37.289
ask it a question and it answers. Right. That's

00:08:37.289 --> 00:08:40.330
the daily usage. Usually there's a lag. You type,

00:08:40.350 --> 00:08:43.029
the bubbles spin, the text streams out. With

00:08:43.029 --> 00:08:45.769
a chip this size, the memory bandwidth is so

00:08:45.769 --> 00:08:48.649
high that the latency basically vanishes. So

00:08:48.649 --> 00:08:51.529
it moves from email correspondence to instant

00:08:51.529 --> 00:08:54.710
messaging. It moves to thought speed. If you're

00:08:54.710 --> 00:08:58.029
a coder using Codex Spark, the AI isn't pausing

00:08:58.029 --> 00:09:01.250
to think. It's typing with you. It predicts your

00:09:01.250 --> 00:09:03.690
next logic block before you even finish the syntax.

00:09:04.070 --> 00:09:06.429
Wow. It changes the psychological relationship

00:09:06.429 --> 00:09:08.730
with the tool. It feels less like a tool and

00:09:08.730 --> 00:09:10.830
more like an extension of your own fingers. That's

00:09:10.830 --> 00:09:12.549
incredible. And this ties into that valuation

00:09:12.549 --> 00:09:15.009
for Modal Labs, right? Exactly. Modal Labs is

00:09:15.009 --> 00:09:18.389
raising at a $2.5 billion valuation. They focus

00:09:18.389 --> 00:09:21.330
entirely on inference infrastructure, basically

00:09:21.330 --> 00:09:23.450
the plumbing that allows these massive models

00:09:23.450 --> 00:09:26.600
to run quickly and cheaply. The market is betting

00:09:26.600 --> 00:09:28.919
heavily that this is the year we stop just training

00:09:28.919 --> 00:09:30.940
models and start running them at massive scale.

00:09:31.120 --> 00:09:34.679
So are we underestimating the physical footprint,

00:09:34.980 --> 00:09:37.220
the concrete and electricity required for digital

00:09:37.220 --> 00:09:39.919
intelligence? Absolutely. Software is now driving

00:09:39.919 --> 00:09:42.659
the largest industrial construction boom in recent

00:09:42.659 --> 00:09:44.620
history. Which brings us to the tools themselves.

00:09:44.879 --> 00:09:47.139
We built the data centers. We fought the political

00:09:47.139 --> 00:09:50.090
battles. Yeah. What can we actually do with this

00:09:50.090 --> 00:09:52.490
stuff today that we couldn't do yesterday? This

00:09:52.490 --> 00:09:55.490
is the fun part. The capabilities are shifting

00:09:55.490 --> 00:09:59.789
from just chatting to doing. And the big story

00:09:59.789 --> 00:10:03.169
here is Alibaba. Qwen Image 2.0. Catchy name,

00:10:03.190 --> 00:10:05.649
right? Rolls right off the tongue. The note here

00:10:05.649 --> 00:10:07.870
says it's a PowerPoint native model. What does

00:10:07.870 --> 00:10:09.909
that mean exactly? Okay, think about how most

00:10:09.909 --> 00:10:12.389
image generators work right now. Midjourney,

00:10:12.389 --> 00:10:15.210
DALL-E. You ask for a picture, you get a picture.

00:10:15.330 --> 00:10:18.799
It's art. But if you try to put text on it, it

00:10:18.799 --> 00:10:20.419
looks like alien hieroglyphics. If you try to

00:10:20.419 --> 00:10:22.720
make it a specific resolution for a slide deck,

00:10:22.879 --> 00:10:25.659
you have to crop it. Right. It's creative. But

00:10:25.659 --> 00:10:28.320
it's not productive. Exactly. Qwen changes that.

00:10:28.419 --> 00:10:31.980
It merges generation and editing in one shot.

00:10:32.220 --> 00:10:36.000
It handles 2048 by 2048 resolution. So no more

00:10:36.000 --> 00:10:39.320
stitching tiles together. And crucially, it has

00:10:39.320 --> 00:10:41.559
high text fidelity. So you can actually read

00:10:41.559 --> 00:10:43.960
it. You can tell it. Make me a slide about Q3

00:10:43.960 --> 00:10:46.419
earnings with a bar graph and the title growth

00:10:46.419 --> 00:10:49.860
vector in Arial font. And it just does it. So

00:10:49.860 --> 00:10:52.580
it's an AI competitor to Canva, not just an AI

00:10:52.580 --> 00:10:55.039
competitor to an artist. That's the promise. It's

00:10:55.039 --> 00:10:57.500
moving the workflow. You aren't just generating

00:10:57.500 --> 00:10:59.399
assets that you then have to take into Photoshop

00:10:59.399 --> 00:11:02.720
to fix. You are producing a final asset right

00:11:02.720 --> 00:11:04.659
in the chat. That is a significant shift for

00:11:04.659 --> 00:11:06.720
enterprise users. I know I struggle with that,

00:11:06.799 --> 00:11:08.779
constantly getting the image right, but having

00:11:08.779 --> 00:11:12.019
the text look ridiculous. Right. And while Alibaba

00:11:12.019 --> 00:11:14.779
is fixing images, Anthropic is coming for your

00:11:14.779 --> 00:11:17.860
Office suite. They just unlocked file tools for

00:11:17.860 --> 00:11:21.179
free users. Excel, PowerPoint, Word, no sh**.

00:11:21.129 --> 00:11:23.409
Previously, you had to pay to have Claude analyze

00:11:23.409 --> 00:11:25.909
these. Now it's open to everyone. That is a big

00:11:25.909 --> 00:11:28.629
democratization move. Suddenly, having an AI

00:11:28.629 --> 00:11:30.570
analyst go through your spreadsheets isn't a

00:11:30.570 --> 00:11:32.610
premium feature. It's a baseline expectation.

00:11:32.830 --> 00:11:35.750
Right. Exactly. But, and there is always a but

00:11:35.750 --> 00:11:38.230
in this industry, with these new capabilities

00:11:38.230 --> 00:11:41.370
comes a darker side. We have to talk about what

00:11:41.370 --> 00:11:43.750
happened to Google's Gemini. The stealth attack.

00:11:44.460 --> 00:11:46.779
I read this twice because I wasn't sure I understood

00:11:46.779 --> 00:11:49.120
it. This wasn't a hack in the traditional sense,

00:11:49.299 --> 00:11:51.159
right? They didn't break a password. No, they

00:11:51.159 --> 00:11:52.879
didn't break in. They invited themselves in.

00:11:53.019 --> 00:11:55.779
It's called a model copying campaign or model

00:11:55.779 --> 00:11:57.799
distillation. Walk us through how that works.

00:11:58.240 --> 00:12:01.080
So Google's Gemini is a trillion dollar brain.

00:12:01.549 --> 00:12:05.250
It's huge, expensive, and smart. A competitor

00:12:05.250 --> 00:12:08.450
or a bad actor wants that intelligence but doesn't

00:12:08.450 --> 00:12:11.129
have the money to train it. Okay. So they hit

00:12:11.129 --> 00:12:14.669
Gemini with over 100,000 very specific complex

00:12:14.669 --> 00:12:18.509
prompts. Explain quantum physics. Write a legal

00:12:18.509 --> 00:12:21.480
brief. Debug this code. And they just record

00:12:21.480 --> 00:12:24.139
all the answers? Record the answers. Then they

00:12:24.139 --> 00:12:26.299
take those high -quality answers and feed them

00:12:26.299 --> 00:12:29.139
into their own much smaller, cheaper model. They

00:12:29.139 --> 00:12:31.399
are teaching their cheap students using Google's

00:12:31.399 --> 00:12:33.980
expensive textbook. Wow. Effectively, they are

00:12:33.980 --> 00:12:36.059
siphoning off the intelligence of the master

00:12:36.059 --> 00:12:39.799
model to create a clone. That sounds incredibly

00:12:39.799 --> 00:12:43.159
difficult to police. It is. Google is calling

00:12:43.159 --> 00:12:45.860
it theft. They are warning startups that if it

00:12:45.860 --> 00:12:47.639
can happen to Gemini, it can happen to anyone.

00:12:47.879 --> 00:12:50.440
But the legal framework here is nonexistent.

00:12:50.639 --> 00:12:54.000
Is asking a question theft? Is learning from

00:12:54.000 --> 00:12:57.279
an answer copyright infringement? It raises this

00:12:57.279 --> 00:13:00.059
massive question about what intellectual property

00:13:00.059 --> 00:13:02.299
even looks like when you can steal it just by

00:13:02.299 --> 00:13:05.419
asking questions. If I can distill your secret

00:13:05.419 --> 00:13:08.799
sauce just by talking to your bot, do you even

00:13:08.799 --> 00:13:11.000
have a moat? That is the billion -dollar question.

00:13:11.240 --> 00:13:13.240
And it connects back to that feeling of vulnerability.

00:13:13.559 --> 00:13:16.000
We are building these massive systems, but the

00:13:16.000 --> 00:13:18.460
edges are porous. Speaking of vulnerability,

00:13:18.840 --> 00:13:20.440
reading through these sources made me think about

00:13:20.440 --> 00:13:23.500
my own usage. I have to admit, I still wrestle

00:13:23.500 --> 00:13:26.840
with what they call confident guessing. Oh, hallucinations.

00:13:26.840 --> 00:13:29.080
Yeah, but specifically in high -stakes stuff,

00:13:29.139 --> 00:13:31.860
like I'll use AI for a contract review or a compliance

00:13:31.860 --> 00:13:34.879
check, and it sounds so sure of itself. But there's

00:13:34.879 --> 00:13:37.340
that nagging fear. Is it making this up? The

00:13:37.340 --> 00:13:39.000
source material mentioned a safety prompt rule

00:13:39.000 --> 00:13:41.740
to help with this. Yes, this is a crucial takeaway

00:13:41.740 --> 00:13:44.600
for you listening. The source mentioned a specific

00:13:44.600 --> 00:13:47.000
framework for stopping that confident guessing.

00:13:47.940 --> 00:13:50.279
Essentially, when you are doing something high

00:13:50.279 --> 00:13:53.000
stakes, checking a legal doc, looking for citations,

00:13:53.200 --> 00:13:56.259
you have to explicitly instruct the model on

00:13:56.259 --> 00:13:58.379
the negative constraint. Meaning what? You tell

00:13:58.379 --> 00:14:01.259
it: If you do not find the specific clause in

00:14:01.259 --> 00:14:03.679
the text provided, state that you cannot find

00:14:03.679 --> 00:14:06.919
it, do not infer, do not guess. It sounds so

00:14:06.919 --> 00:14:09.659
simple. But we forget to do it. Yeah. We trust

00:14:09.659 --> 00:14:12.019
the chat interface too much. We treat it like

00:14:12.019 --> 00:14:14.159
a conversation where it's rude to be that blunt.

00:14:14.320 --> 00:14:16.299
But you have to. We need to force it to prove

00:14:16.299 --> 00:14:18.320
its claim. Absolutely. We have to move from being

00:14:18.320 --> 00:14:21.340
passive users, hey, tell me about this, to active

00:14:21.340 --> 00:14:23.820
supervisors. Analyze this. Improve your work.

00:14:24.200 --> 00:14:26.460
So with things like Qwen merging editing and

00:14:26.460 --> 00:14:28.779
generation, what does this actually mean for

00:14:28.779 --> 00:14:31.519
creative workflows? We are moving from generating

00:14:31.519 --> 00:14:35.000
ideas to producing final assets without leaving

00:14:35.000 --> 00:14:37.539
the chat interface. It's a brave new world. It

00:14:37.539 --> 00:14:39.519
certainly is. We're going to take a very quick

00:14:39.519 --> 00:14:41.340
break to thank our partners who help keep this

00:14:41.340 --> 00:14:43.879
deep dive running. And when we come back, we're

00:14:43.879 --> 00:14:46.340
going to try to synthesize all this, the politics,

00:14:46.580 --> 00:14:50.000
the concrete, and the code into one big picture.

00:14:50.639 --> 00:14:54.299
Stay with us. Okay, we're back. We've covered

00:14:54.299 --> 00:14:57.179
a lot of ground today. From the cancellation

00:14:57.179 --> 00:14:59.659
parties of QuitGPT to the massive construction

00:14:59.659 --> 00:15:02.220
sites in Indiana to the model thieves stealing

00:15:02.220 --> 00:15:05.019
intelligence prompt by prompt. Yeah. When you

00:15:05.019 --> 00:15:08.120
look at this entire stack of stories, what is

00:15:08.120 --> 00:15:10.419
the through line for you? I think we are seeing

00:15:10.419 --> 00:15:13.039
a massive divergence. That's the word that keeps

00:15:13.039 --> 00:15:15.039
coming to mind. A divergence between what and what?

00:15:15.159 --> 00:15:17.179
Between the human layer and the machine layer.

00:15:17.320 --> 00:15:18.960
On one hand, you have the human layer pushing

00:15:18.960 --> 00:15:21.460
back. You have the QuitGPT movement. You have

00:15:21.460 --> 00:15:23.740
the political maneuvering, the ethical concerns

00:15:23.740 --> 00:15:27.120
about ICE and funding. It's messy. It's emotional.

00:15:27.840 --> 00:15:31.039
It's loud. Right. It's people saying, wait, stop.

00:15:31.200 --> 00:15:32.980
Does this align with our values? And on the other

00:15:32.980 --> 00:15:35.259
hand. On the other hand, the machine layer is

00:15:35.259 --> 00:15:37.460
just accelerating with total indifference to

00:15:37.460 --> 00:15:40.559
that noise. Ten billion dollar chips, massive

00:15:40.559 --> 00:15:43.299
data centers, models that can clone each other.

00:15:43.399 --> 00:15:45.580
Yeah. The technology is no longer just software

00:15:45.580 --> 00:15:48.240
on a screen. It is becoming a physical industrial

00:15:48.240 --> 00:15:51.200
force and a political lightning rod simultaneously.

00:15:51.720 --> 00:15:54.019
It's almost like the technology has gained enough

00:15:54.019 --> 00:15:57.909
momentum that it's decoupling from the

00:15:57.909 --> 00:16:00.370
public sentiment. The boycott is happening, but

00:16:00.370 --> 00:16:02.649
the cement trucks are still pouring. Exactly.

00:16:02.649 --> 00:16:05.350
The train has left the station, and now we are

00:16:05.350 --> 00:16:07.850
just arguing about who gets to sit in the conductor's

00:16:07.850 --> 00:16:11.169
chair. That is a sobering thought. The scale

00:16:11.169 --> 00:16:13.350
of the investment, the sheer physical weight

00:16:13.350 --> 00:16:16.210
of it, makes it very hard to stop or even steer.

00:16:16.490 --> 00:16:19.029
It does. And if I can leave you with one final

00:16:19.029 --> 00:16:21.269
thing that's been sticking in my brain, it's

00:16:21.269 --> 00:16:24.250
that story about the model copying. The Gemini

00:16:24.250 --> 00:16:27.340
theft. Yeah. Think about it. If a trillion -dollar

00:16:27.340 --> 00:16:30.059
company like Google, with all its defenses, can

00:16:30.059 --> 00:16:33.179
have its intelligence stolen via 100,000 text

00:16:33.179 --> 00:16:36.200
prompts, what does intellectual property even

00:16:36.200 --> 00:16:39.399
mean in the age of fluid intelligence? If knowledge

00:16:39.399 --> 00:16:41.779
can be decanted from one machine to another just

00:16:41.779 --> 00:16:44.019
by asking questions, the economics of this whole

00:16:44.019 --> 00:16:45.779
industry might be more fragile than we think.

00:16:46.320 --> 00:16:48.580
That is a question I think we're going to be

00:16:48.580 --> 00:16:50.500
wrestling with for the next decade. I think so,

00:16:50.580 --> 00:16:52.620
too. If you want to protect your own workflows,

00:16:53.650 --> 00:16:55.950
especially if you're using these tools for contracts

00:16:55.950 --> 00:16:59.009
or anything legal, I highly recommend you check

00:16:59.009 --> 00:17:02.029
out that safety prompt rule we mentioned. We'll

00:17:02.029 --> 00:17:04.549
have the details and the specific phrasing in

00:17:04.549 --> 00:17:08.250
the show notes. It's a small step, but it might

00:17:08.250 --> 00:17:11.190
save you from a very confident, very wrong answer.

00:17:11.349 --> 00:17:13.450
Worth a click. Thanks for diving deep with us

00:17:13.450 --> 00:17:16.490
today. It's a complex world out there, but hey,

00:17:16.609 --> 00:17:18.750
at least we're figuring it out together. See

00:17:18.750 --> 00:17:19.829
you next time. Take care, everyone.
