WEBVTT

00:00:00.000 --> 00:00:03.129
If you've ever gotten something from generative

00:00:03.129 --> 00:00:05.269
AI that looked OK on the surface, but just felt

00:00:05.269 --> 00:00:07.349
off, like it needed a ton of work, you're definitely

00:00:07.349 --> 00:00:09.650
not alone. We're talking about work slop. It's

00:00:09.650 --> 00:00:11.650
this AI generated stuff that kind of pretends

00:00:11.650 --> 00:00:14.550
to be productive work. But really, it just makes

00:00:14.550 --> 00:00:17.250
the next person down the line waste hours fixing

00:00:17.250 --> 00:00:19.269
it. It's like this invisible tax we're putting

00:00:19.269 --> 00:00:21.570
on ourselves inside companies. And it's a huge

00:00:21.570 --> 00:00:24.609
tax. I mean, one source we looked at put a number

00:00:24.609 --> 00:00:27.210
on it for, say, a 10,000-person company. We're

00:00:27.210 --> 00:00:29.609
talking about nine million dollars a year lost.

00:00:30.629 --> 00:00:34.130
That's just pure waste. Productivity down

00:00:34.130 --> 00:00:36.429
the drain, basically fueled by taking the easy

00:00:36.429 --> 00:00:39.310
digital way out. Welcome to the Deep Dive. Today,

00:00:39.429 --> 00:00:41.030
we're going to try and cut through some of that

00:00:41.030 --> 00:00:43.530
noise. We're using the sources you sent over

00:00:43.530 --> 00:00:47.289
to look at the hard reality of scaling AI, trying

00:00:47.289 --> 00:00:48.869
to figure out where it's actually delivering

00:00:48.869 --> 00:00:50.969
genius and where it's just making these expensive

00:00:50.969 --> 00:00:54.009
messes. Yeah, we've got a plan. First up, we'll

00:00:54.009 --> 00:00:56.369
dig deeper into this work slop thing, the actual

00:00:56.369 --> 00:00:59.200
cost, financially but also psychologically.

00:00:59.719 --> 00:01:02.299
That whole trust tax idea. Then second, we'll pivot.

00:01:02.420 --> 00:01:04.859
Look at how fast AI capabilities are actually

00:01:04.859 --> 00:01:08.319
jumping forward. I mean, from gaming to even

00:01:08.319 --> 00:01:12.299
warnings about the job market. And finally, we'll

00:01:12.299 --> 00:01:14.700
tackle the big one. This potential AI energy

00:01:14.700 --> 00:01:17.739
apocalypse. Is this massive build-out of power

00:01:17.739 --> 00:01:20.620
infrastructure really needed? Or is it maybe

00:01:20.620 --> 00:01:23.200
speculative hype? Right. The goal here is really

00:01:23.200 --> 00:01:26.480
separating that hype from the... the hard reality

00:01:26.480 --> 00:01:28.239
when you actually try to deploy this stuff at

00:01:28.239 --> 00:01:30.599
scale. Let's get into it. So the basic idea behind

00:01:30.599 --> 00:01:33.000
work slop, it's really just pushing the cost

00:01:33.000 --> 00:01:34.859
onto someone else, isn't it? Someone hits generate,

00:01:35.000 --> 00:01:36.900
maybe glances at it. Feels good because they

00:01:36.900 --> 00:01:38.920
sent something. Productive. Right. But then the

00:01:38.920 --> 00:01:41.379
person who gets it might spend, what, two hours

00:01:41.379 --> 00:01:43.079
cleaning it up? They end up eating the entire

00:01:43.079 --> 00:01:45.579
cost of that quick solution. Exactly. It's the

00:01:45.579 --> 00:01:47.840
hidden cost of their convenience. Passed straight

00:01:47.840 --> 00:01:50.540
to their colleague. And, you know, this isn't

00:01:50.540 --> 00:01:53.780
rare. The data we saw suggests 40% of employees

00:01:53.780 --> 00:01:56.519
got work slop just last month. And maybe 15%

00:01:56.519 --> 00:01:57.859
of all the stuff flying around the workplace,

00:01:58.159 --> 00:02:00.859
emails, reports, code, whatever, is basically

00:02:00.859 --> 00:02:04.180
slop now. Scale that up, right? The estimate

00:02:04.180 --> 00:02:07.840
was about $186 per incident per month. That's

00:02:07.840 --> 00:02:09.900
how you get to that $9 million figure pretty

00:02:09.900 --> 00:02:12.080
fast in a big company. This may be less like

00:02:12.080 --> 00:02:14.719
stacking Lego blocks badly and more like, hmm,

00:02:14.860 --> 00:02:17.360
like building software on a shaky foundation.

00:02:17.460 --> 00:02:20.240
The code looks okay, maybe compiles. But try

00:02:20.240 --> 00:02:22.669
adding a new feature. Everything breaks. You

00:02:22.669 --> 00:02:25.110
face huge delays, costly rewrites you have to

00:02:25.110 --> 00:02:27.509
do. It's unstable underneath. Okay. That analogy

00:02:27.509 --> 00:02:30.009
makes more sense for maybe our technical listeners.

00:02:30.189 --> 00:02:32.330
But I still have to push a little on that 9 million

00:02:32.330 --> 00:02:34.189
number. Is there a chance that's consultants

00:02:34.189 --> 00:02:36.310
kind of hyping the problem to sell solutions?

00:02:36.629 --> 00:02:39.030
Or is it a solid measured cost? That's a really

00:02:39.030 --> 00:02:40.930
important question. And yeah, consultants are

00:02:40.930 --> 00:02:42.810
always going to sell solutions, right? But this

00:02:42.810 --> 00:02:47.009
specific calculation seems based on... documented

00:02:47.009 --> 00:02:50.090
time sinks. Things like average hourly wage times

00:02:50.090 --> 00:02:52.689
the hours people report spending just verifying

00:02:52.689 --> 00:02:55.849
or rewriting this AI output. So you can quibble

00:02:55.849 --> 00:02:58.169
with the exact dollar amount maybe, but the cost

00:02:58.169 --> 00:03:00.069
is definitely real. And it's not just money.

00:03:00.169 --> 00:03:02.870
It's the human side. This trust tax idea. You

00:03:02.870 --> 00:03:05.689
keep sending out subpar AI stuff and people notice.

00:03:05.870 --> 00:03:08.789
Over half, 53%, they get annoyed. Okay, fine,

00:03:08.870 --> 00:03:11.830
but worse. 42% start seeing the sender as less

00:03:11.830 --> 00:03:15.150
trustworthy. And 37% literally think they're

00:03:15.150 --> 00:03:17.659
less intelligent. Ouch. Your credibility just

00:03:17.659 --> 00:03:19.919
tanks. That's the real danger, systemically.

00:03:20.240 --> 00:03:22.000
And that happens when companies just say use

00:03:22.000 --> 00:03:24.139
AI everywhere, right, without clear rules. So

00:03:24.139 --> 00:03:26.219
people become passengers just using it to dodge

00:03:26.219 --> 00:03:29.159
work instead of being pilots who use it to genuinely

00:03:29.159 --> 00:03:31.879
amplify their own skills, their creativity. I'll

00:03:31.879 --> 00:03:33.759
admit, I still wrestle with this temptation myself

00:03:33.759 --> 00:03:37.020
sometimes. You know, prompt drift, getting that

00:03:37.020 --> 00:03:39.180
easy answer that looks good initially, but, you

00:03:39.180 --> 00:03:41.099
know, deep down, it's going to need five more

00:03:41.099 --> 00:03:42.860
rounds of checks and edits before it's actually

00:03:42.860 --> 00:03:45.500
usable. Good point. Let's quickly define prompt

00:03:45.500 --> 00:03:47.639
drift for everyone because it's key to work slop.

00:03:48.020 --> 00:03:50.229
It's basically when the AI's output quality

00:03:50.229 --> 00:03:53.469
gets worse over a long chat or lots of edits.

00:03:53.750 --> 00:03:56.569
Like the model loses the plot, loses context,

00:03:56.629 --> 00:03:59.270
and just starts spitting out nonsense. So given

00:03:59.270 --> 00:04:01.509
this trust problem and people acting like passengers,

00:04:01.669 --> 00:04:03.870
what's the single most critical thing leaders

00:04:03.870 --> 00:04:06.590
need to do to shift people towards being pilots?

00:04:07.610 --> 00:04:09.750
Leaders need to prioritize critical thinking

00:04:09.750 --> 00:04:12.409
and verification over just speed. Okay, so that

00:04:12.409 --> 00:04:14.550
work slop issue, that's one side of the coin.

00:04:14.590 --> 00:04:16.610
It's misusing a powerful tool, basically. But

00:04:16.610 --> 00:04:18.189
the irony is while we're struggling with that,

00:04:18.310 --> 00:04:21.430
the tech itself is just leaping forward exponentially,

00:04:21.930 --> 00:04:24.009
achieving things that are genuinely, you know,

00:04:24.009 --> 00:04:26.649
genius level. Oh, the pace is wild. Look at connectivity.

00:04:27.709 --> 00:04:30.889
ChatGPT connectors, which are just... tools linking

00:04:30.889 --> 00:04:32.810
language models to other software for automation.

00:04:32.829 --> 00:04:35.310
They now work with over 500 apps. Automation

00:04:35.310 --> 00:04:36.889
isn't some future thing. It's right there. Plug

00:04:36.889 --> 00:04:38.850
and play, pretty much. And the models are getting

00:04:38.850 --> 00:04:41.930
smarter, deeper. Google's Gemini AI, for instance,

00:04:42.050 --> 00:04:44.509
it's showing how it can think through a search

00:04:44.509 --> 00:04:46.810
query, not just matching keywords, but producing

00:04:46.810 --> 00:04:49.329
these detailed summaries. Better reasoning. And

00:04:49.329 --> 00:04:54.250
then there's this. This moment that really makes

00:04:54.250 --> 00:04:57.449
you pause. GPT-5 Codex. It apparently one-shotted

00:04:57.449 --> 00:04:59.449
Minecraft. Just think about that. One prompt

00:04:59.449 --> 00:05:01.949
in plain English. And it builds a whole 3D world.

00:05:02.050 --> 00:05:04.529
Textures, water, physics, the works. Whoa. I

00:05:04.529 --> 00:05:07.329
mean, imagine scaling that ability. Solving incredibly

00:05:07.329 --> 00:05:09.689
complex real-world problems with that kind of

00:05:09.689 --> 00:05:12.529
power. That's huge potential. Kind of terrifying,

00:05:12.689 --> 00:05:15.509
too. Yeah, but that power surge has a flip side

00:05:15.509 --> 00:05:18.810
as it filters into daily life. You see meta using

00:05:18.810 --> 00:05:22.199
AI in Facebook Dating now, trying to optimize

00:05:22.199 --> 00:05:26.540
human connection. And in the really dark end,

00:05:26.660 --> 00:05:30.920
those hyper-realistic AI-generated murder videos

00:05:30.920 --> 00:05:33.720
made with Veo that showed up on YouTube only got

00:05:33.720 --> 00:05:35.560
taken down because a journalist flagged them.

00:05:35.699 --> 00:05:38.139
The tech moves faster than our ability to manage

00:05:38.139 --> 00:05:40.399
it or even moderate it sometimes. And that speed

00:05:40.399 --> 00:05:43.079
crashes right into the economy. Which brings

00:05:43.079 --> 00:05:45.480
us to that stark warning from the Wharton professor

00:05:45.480 --> 00:05:47.829
you mentioned. Their research suggests this kind

00:05:47.829 --> 00:05:49.730
of exponential leap, like the Minecraft thing

00:05:49.730 --> 00:05:52.230
shows is possible, could make a lot of human jobs,

00:05:52.310 --> 00:05:55.250
especially white-collar ones, obsolete and fast.

00:05:55.470 --> 00:05:57.329
The timelines seem to be shrinking way faster

00:05:57.329 --> 00:05:59.410
than people thought. Right, so we need to ask,

00:05:59.589 --> 00:06:01.529
does this sheer speed of deployment building

00:06:01.529 --> 00:06:04.050
a whole game world instantly mean job losses

00:06:04.050 --> 00:06:06.269
are coming much, much sooner than most predictions?

00:06:06.610 --> 00:06:09.110
Yeah, seems like it. The exponential leap demands

00:06:09.110 --> 00:06:12.870
we start preparing now for big job shifts. Okay,

00:06:12.949 --> 00:06:15.850
we've covered the software side. The slop and

00:06:15.850 --> 00:06:18.870
the genius. Let's switch gears to the hardware.

00:06:19.110 --> 00:06:22.149
The physical infrastructure. Specifically, the

00:06:22.149 --> 00:06:24.410
massive energy needed to actually run these things.

00:06:24.550 --> 00:06:27.470
Oh, it's staggering. One modern AI data rack.

00:06:27.670 --> 00:06:30.910
It pulls the same power as maybe 80 to 100 average

00:06:30.910 --> 00:06:33.509
homes. Compare that to a traditional server rack,

00:06:33.610 --> 00:06:36.139
maybe three homes worth of power. So this need

00:06:36.139 --> 00:06:39.180
for huge computation is driving these gigantic

00:06:39.180 --> 00:06:41.860
energy requests to utilities everywhere. We're

00:06:41.860 --> 00:06:43.959
talking gigawatts. But here's the tension, right?
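
NOTE
To make the rack comparison above concrete: assuming an average home draws roughly 1.2 kW of continuous power (our assumption; the episode only gives the homes comparison), the implied per-rack loads work out like this:

```python
# Back-of-the-envelope conversion of the homes comparison into kilowatts.
# AVG_HOME_KW is our assumption, not a figure from the episode.
AVG_HOME_KW = 1.2

ai_rack_low = 80 * AVG_HOME_KW    # modern AI rack, low end of the range
ai_rack_high = 100 * AVG_HOME_KW  # modern AI rack, high end of the range
legacy_rack = 3 * AVG_HOME_KW     # traditional server rack

print(f"AI rack: ~{ai_rack_low:.0f}-{ai_rack_high:.0f} kW")  # ~96-120 kW
print(f"Legacy rack: ~{legacy_rack:.1f} kW")                 # ~3.6 kW
```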

00:06:44.040 --> 00:06:47.500
This predicted AI energy apocalypse. Maybe it's

00:06:47.500 --> 00:06:49.319
not all based on solid confirmed need. Maybe

00:06:49.319 --> 00:06:52.300
there's a lot of speculative hype driving overbuilding.

00:06:52.439 --> 00:06:54.259
The signs of potential inflation are definitely

00:06:54.259 --> 00:06:56.800
there. Since this whole AI boom kicked off, proposals

00:06:56.800 --> 00:06:59.319
for new natural gas power plants are up 70%,

00:06:59.319 --> 00:07:01.860
mostly earmarked for future data centers. In

00:07:01.860 --> 00:07:04.120
the U.S. Southeast, the demand utilities are

00:07:04.120 --> 00:07:07.220
forecasting is four times higher than what independent

00:07:07.220 --> 00:07:09.600
researchers think the actual need will be. Four

00:07:09.600 --> 00:07:12.740
times. Four times. That's a massive difference.

00:07:12.879 --> 00:07:15.819
How does that even happen? Why are utilities

00:07:15.819 --> 00:07:18.939
seemingly accepting forecasts that might be wildly

00:07:18.939 --> 00:07:21.500
inflated? What's the mechanism? Well, it seems

00:07:21.500 --> 00:07:23.800
like a mix of things. System incentives, maybe

00:07:23.800 --> 00:07:26.360
some regulatory capture. Developers put in these

00:07:26.360 --> 00:07:28.279
huge requests often before they even have the

00:07:28.279 --> 00:07:31.199
money lined up or firm customers. Speculative

00:07:31.199 --> 00:07:33.939
requests. Utilities often accept them, perhaps

00:07:33.939 --> 00:07:35.860
to lock down the rights to build generation,

00:07:36.019 --> 00:07:38.360
secure land, get permits. They call it nameplate

00:07:38.360 --> 00:07:40.560
capacity. It's like a land grab for future power

00:07:40.560 --> 00:07:43.819
needs based on potential, not guaranteed use.

00:07:44.060 --> 00:07:46.420
And apparently even utilities admit privately

00:07:46.420 --> 00:07:48.459
the real usage might end up being, you know,

00:07:48.459 --> 00:07:50.420
three to five times smaller than what's on paper

00:07:50.420 --> 00:07:52.899
right now. OK, so if that hype bubble bursts

00:07:52.899 --> 00:07:55.639
or even just shrinks significantly, who pays

00:07:55.639 --> 00:07:58.360
for all that potentially unnecessary gas infrastructure

00:07:58.360 --> 00:08:00.540
they build? It's going to be the consumers, right?

00:08:00.660 --> 00:08:03.180
Through higher electricity bills, the costs get

00:08:03.180 --> 00:08:06.100
socialized. Exactly. Stranded assets paid for

00:08:06.100 --> 00:08:09.430
by ratepayers and the environmental cost. Also

00:08:09.430 --> 00:08:12.009
huge. One example cited was a Meta data center

00:08:12.009 --> 00:08:14.910
project down in Louisiana. If all that speculative

00:08:14.910 --> 00:08:17.350
demand actually happens, it could lock in over

00:08:17.350 --> 00:08:20.889
100 million tons of extra carbon emissions just

00:08:20.889 --> 00:08:23.029
from that one project's power needs over its

00:08:23.029 --> 00:08:25.490
life. So this rush to build feels like it creates

00:08:25.490 --> 00:08:28.259
a major policy clash. It lines up nicely with

00:08:28.259 --> 00:08:32.379
political agendas favoring fossil fuels, but

00:08:32.379 --> 00:08:35.000
it runs directly against stated goals like getting

00:08:35.000 --> 00:08:37.960
to a carbon-free grid by, say, 2035. It feels

00:08:37.960 --> 00:08:39.860
like we're planning a fossil fuel future based

00:08:39.860 --> 00:08:42.139
on maybe a ghost demand. Yeah, it's a real paradox.

00:08:42.220 --> 00:08:44.779
So if utilities know they might be over-forecasting

00:08:44.779 --> 00:08:47.980
demand by potentially 4X, what's the single biggest

00:08:47.980 --> 00:08:50.460
systemic risk here for energy policy for all

00:08:50.460 --> 00:08:52.899
of us consumers? Consumers risk paying for unnecessary

00:08:52.899 --> 00:08:56.000
fossil fuel plants based on inflated AI hype.

00:08:57.840 --> 00:09:03.440
Okay, let's try to pull the threads together

00:09:03.440 --> 00:09:05.899
from this deep dive. What we're seeing is this

00:09:05.899 --> 00:09:09.860
fundamental AI paradox, right? On one hand, it's

00:09:09.860 --> 00:09:12.820
creating huge new inefficiencies inside companies

00:09:12.820 --> 00:09:15.639
that work slop, costing millions. But on the

00:09:15.639 --> 00:09:17.600
other hand, it's achieving these incredible genius

00:09:17.600 --> 00:09:20.360
level things that could totally reshape the job

00:09:20.360 --> 00:09:22.639
market. Yeah, the connection seems to be this

00:09:22.639 --> 00:09:25.580
deep misalignment everywhere. Misalignment between

00:09:25.580 --> 00:09:28.120
the tool's capability and how it's actually being

00:09:28.120 --> 00:09:30.539
applied. That's the whole passengers versus pilots

00:09:30.539 --> 00:09:33.580
thing causing the slop. And then this huge misalignment

00:09:33.580 --> 00:09:36.360
between the real energy needs right now versus

00:09:36.360 --> 00:09:39.409
this potentially overblown, speculative infrastructure

00:09:39.409 --> 00:09:42.230
build-out based on hype. It feels like, based

00:09:42.230 --> 00:09:44.769
on the sources, we're building this massive energy

00:09:44.769 --> 00:09:47.110
future based on the promise of AI, what it might

00:09:47.110 --> 00:09:49.870
consume someday. But maybe not on its proven,

00:09:49.970 --> 00:09:52.799
confirmed necessity today. So here's a provocative

00:09:52.799 --> 00:09:55.000
thought for you to chew on this week. Thinking

00:09:55.000 --> 00:09:58.039
about that work slop, what existing human institution,

00:09:58.259 --> 00:10:00.460
maybe a big government agency, a traditional

00:10:00.460 --> 00:10:02.980
industry, maybe even a university, what would

00:10:02.980 --> 00:10:04.940
be the first one to really start crumbling under

00:10:04.940 --> 00:10:07.299
the weight of all that AI -generated inefficiency?

00:10:07.679 --> 00:10:10.539
That is a fascinating and slightly unsettling

00:10:10.539 --> 00:10:12.059
thought exercise. So your call to action this

00:10:12.059 --> 00:10:14.360
week, just be mindful. Pay attention to the AI

00:10:14.360 --> 00:10:16.720
-generated content you encounter. Ask yourself,

00:10:16.840 --> 00:10:19.929
is this saving time? Is it quality? Or is it

00:10:19.929 --> 00:10:22.029
slop? Are you seeing genius? Or just more digital

00:10:22.029 --> 00:10:24.509
noise? Thanks for joining us for this deep dive

00:10:24.509 --> 00:10:26.889
into AI's hidden costs and its undeniable power.

00:10:27.110 --> 00:10:28.649
Outro music fades in.
