WEBVTT

00:00:00.000 --> 00:00:02.580
We watched Meta start the year as the open source

00:00:02.580 --> 00:00:05.259
darling, sharing their Llama models everywhere,

00:00:05.480 --> 00:00:08.880
inviting the community in. Right. Now they are

00:00:08.880 --> 00:00:11.119
secretly building something called Avocado and

00:00:11.119 --> 00:00:13.560
they are betting the entire company's future

00:00:13.560 --> 00:00:17.239
on keeping it locked down. That pivot from, you

00:00:17.239 --> 00:00:19.920
know, communal sharing and transparency to suddenly

00:00:19.920 --> 00:00:22.870
locking down their biggest asset is more than

00:00:22.870 --> 00:00:25.629
just corporate strategy. It's an identity crisis

00:00:25.629 --> 00:00:29.109
for Meta, signaling a massive high-stakes reversal

00:00:29.109 --> 00:00:32.530
that affects the entire AI landscape. They feel

00:00:32.530 --> 00:00:34.630
they have to win at the very top tier, whatever

00:00:34.630 --> 00:00:37.250
the cost. Welcome to the Deep Dive. Our mission

00:00:37.250 --> 00:00:39.490
today is to unpack these fundamental shifts.

00:00:39.770 --> 00:00:41.909
Using the recent source material you shared with

00:00:41.909 --> 00:00:44.070
us, we have a lot of tension to work through.

00:00:44.189 --> 00:00:46.649
First, we're going to dig deep into Meta's dramatic

00:00:46.649 --> 00:00:49.509
internal culture shock and the mysterious delay

00:00:49.509 --> 00:00:52.329
of that closed-source Avocado model. Yeah, there's

00:00:52.329 --> 00:00:54.689
a lot there. Then we'll pivot to AI in the wild,

00:00:54.829 --> 00:00:57.670
looking at practical ChatGPT tips, the rapid

00:00:57.670 --> 00:00:59.649
commoditization happening in the voice market,

00:00:59.829 --> 00:01:03.189
and some of the unexpected ways AI is showing

00:01:03.189 --> 00:01:06.439
up, in ads and even late-night chats. Exactly.

00:01:06.659 --> 00:01:10.040
And finally, we'll analyze a truly significant

00:01:10.040 --> 00:01:13.260
scientific breakthrough, Microsoft's open-source

00:01:13.260 --> 00:01:15.879
Gigatime model, which is dramatically reducing

00:01:15.879 --> 00:01:18.840
the cost of cancer research. Down to about $10

00:01:18.840 --> 00:01:21.840
per analysis. It's incredible. It is. So we aren't

00:01:21.840 --> 00:01:23.980
just reading headlines today. We are looking

00:01:23.980 --> 00:01:27.540
for the why behind the biggest moves, connecting

00:01:27.540 --> 00:01:30.359
the dots between, you know, Silicon Valley drama

00:01:30.359 --> 00:01:32.939
and breakthroughs that fundamentally change science.

00:01:33.180 --> 00:01:35.780
So let's start with Meta. Just months ago, they

00:01:35.780 --> 00:01:38.239
were the hero of the open source community hyping

00:01:38.239 --> 00:01:41.459
Llama. But the sources show Llama is barely mentioned

00:01:41.459 --> 00:01:44.420
now. It's almost as if it's been quietly abandoned.

00:01:44.780 --> 00:01:47.519
What's fascinating here is how immediate and

00:01:47.519 --> 00:01:49.939
radical this shift to secrecy has been. And it's

00:01:49.939 --> 00:01:52.819
all about this new model, Avocado. The secretive

00:01:52.819 --> 00:01:55.859
next-gen model is Avocado. It was expected to

00:01:55.859 --> 00:01:58.400
launch by the end of 2025, but that timeline

00:01:58.400 --> 00:02:00.900
is already slipping. Pushed back. Pushed back to

00:02:00.900 --> 00:02:03.700
Q1 of 2026. And the biggest fear among researchers

00:02:03.700 --> 00:02:05.879
is that this will be a fully closed source model.

00:02:06.000 --> 00:02:08.020
That's a complete dramatic reversal for them.

00:02:08.180 --> 00:02:10.860
It is. I mean, they built their modern AI reputation

00:02:10.860 --> 00:02:14.000
on being the most generous sharer of foundation

00:02:14.000 --> 00:02:17.740
models. So why the sudden loss of faith in their

00:02:17.740 --> 00:02:20.680
own strategy? Well, the core hypothesis outlined

00:02:20.680 --> 00:02:23.419
in your sources is simple, if a little brutal.

00:02:23.520 --> 00:02:26.120
Okay. Meta concluded that open source simply

00:02:26.120 --> 00:02:29.139
didn't pay off. They poured billions in, but

00:02:29.139 --> 00:02:31.020
they didn't get the revenue or the market lead

00:02:31.020 --> 00:02:33.569
they wanted. So they feel they need to own the

00:02:33.569 --> 00:02:36.789
next GPT-level model to stay relevant. They feel

00:02:36.789 --> 00:02:39.030
they absolutely have to. That urgency is translating

00:02:39.030 --> 00:02:41.389
into some pretty brutal internal changes. We're

00:02:41.389 --> 00:02:43.509
hearing the culture has really fractured. It

00:02:43.509 --> 00:02:47.870
has. The new mantra inside key AI teams is demo,

00:02:47.969 --> 00:02:50.270
don't memo. No more lengthy workplace updates

00:02:50.270 --> 00:02:52.710
then. Exactly. They're forcing teams to operate

00:02:52.710 --> 00:02:55.669
like these intense, scrappy startups, pushing

00:02:55.669 --> 00:02:58.129
for immediate results. And that pressure cooker

00:02:58.129 --> 00:03:01.030
environment is leading to, what was it, 70 hours?

00:03:01.069 --> 00:03:03.229
Workweeks. Reported 70-hour workweeks. Yeah.

00:03:03.389 --> 00:03:05.650
Coupled with targeted layoffs that are specifically

00:03:05.650 --> 00:03:07.969
hitting the FAIR division. And FAIR is their

00:03:07.969 --> 00:03:10.409
fundamental AI research group. That's where all

00:03:10.409 --> 00:03:12.909
the academic, open-minded research used to happen.

00:03:13.090 --> 00:03:15.289
That was the heart of it. And that culture is

00:03:15.289 --> 00:03:18.270
being dismantled. It seems less about sustainability

00:03:18.270 --> 00:03:22.800
and more about desperation. Deliver or exit.

00:03:22.939 --> 00:03:24.960
And we're seeing that at the top, too. The sources

00:03:24.960 --> 00:03:28.060
highlight the replacement of longtime meta executives.

00:03:28.199 --> 00:03:30.780
Right. And the departure of chief scientist Yann

00:03:30.780 --> 00:03:34.020
LeCun back in October. When a pioneer like LeCun,

00:03:34.120 --> 00:03:35.960
who basically is the champion of open research,

00:03:36.219 --> 00:03:39.520
leaves. Or is pushed out. Or is pushed out. Yeah.

00:03:39.620 --> 00:03:41.780
It signals a fundamental housecleaning. It's

00:03:41.780 --> 00:03:44.120
all about securing results, period. And that

00:03:44.120 --> 00:03:46.240
pressure is also being intensified by some hard

00:03:46.240 --> 00:03:48.199
infrastructure challenges, right? Precisely.

00:03:49.179 --> 00:03:52.180
Meta's massive, what, $27 billion data center

00:03:52.180 --> 00:03:54.919
still isn't fully ready. So they have to rely

00:03:54.919 --> 00:03:57.060
on third-party clouds just to train their own

00:03:57.060 --> 00:03:59.560
models. Which drives up costs and adds latency.

00:03:59.819 --> 00:04:01.520
They're playing catch-up, and they're doing

00:04:01.520 --> 00:04:03.759
it by changing their entire identity. Okay, so

00:04:03.759 --> 00:04:06.280
let's unpack this. The main reason for Meta's

00:04:06.280 --> 00:04:09.039
dramatic cultural overhaul. They believe open

00:04:09.039 --> 00:04:12.659
source failed to pay off, forcing this high-stakes

00:04:12.659 --> 00:04:15.240
closed model push. So that's the heavy corporate

00:04:15.240 --> 00:04:18.689
drama. Let's pivot now. What ties all this together

00:04:18.689 --> 00:04:21.790
is how fast the technology itself is moving outside

00:04:21.790 --> 00:04:24.050
of these big labs. Yeah, let's look at what that

00:04:24.050 --> 00:04:26.089
means for you, the user. Okay. We can start with

00:04:26.089 --> 00:04:29.230
some practical nuggets. OpenAI recently dropped

00:04:29.230 --> 00:04:32.649
six official ChatGPT tips for getting better

00:04:32.649 --> 00:04:35.360
output. I saw that. The concepts aren't entirely

00:04:35.360 --> 00:04:38.180
new, but it's a useful summary. It is. Things

00:04:38.180 --> 00:04:40.500
like starting with a clear role for the model.

00:04:40.620 --> 00:04:43.560
And, you know, I'll admit I still wrestle with

00:04:43.560 --> 00:04:46.540
prompt drift myself. Oh, me too. It's that frustrating

00:04:46.540 --> 00:04:48.579
moment when the model just starts losing the

00:04:48.579 --> 00:04:50.800
original context of your conversation. Exactly.

00:04:51.310 --> 00:04:53.250
So having these codified tips is actually pretty

00:04:53.250 --> 00:04:55.930
helpful. And speaking of utility, the Adobe integration

00:04:55.930 --> 00:04:59.709
sounds like a real game changer. Oh, it is. Photoshop

00:04:59.709 --> 00:05:02.430
and Express are finally accessible just using

00:05:02.430 --> 00:05:04.550
natural language prompts. You can literally edit

00:05:04.550 --> 00:05:07.350
images just by talking. Which radically lowers

00:05:07.350 --> 00:05:10.350
the barrier to entry for creative work. No more

00:05:10.350 --> 00:05:13.189
navigating complex menus. Not every deployment

00:05:13.189 --> 00:05:15.410
goes that smoothly. We saw that with McDonald's.

00:05:15.490 --> 00:05:18.750
Right, their AI-generated Christmas ad. The

00:05:18.750 --> 00:05:21.509
sources detail this quick, intense backlash that

00:05:21.509 --> 00:05:23.589
caused them to pull it. People called it creepy

00:05:23.589 --> 00:05:26.050
and god-awful. Yeah. It's a great example of

00:05:26.050 --> 00:05:28.790
how quickly the public rejects AI when it just

00:05:28.790 --> 00:05:32.029
feels unsettling. It's a crucial lesson for marketers,

00:05:32.189 --> 00:05:34.810
for sure. And on the business front, Forbes released

00:05:34.810 --> 00:05:38.230
a surprisingly honest prediction piece for 2026.

00:05:38.430 --> 00:05:41.170
The honesty was the key part. Yes. They explicitly

00:05:41.170 --> 00:05:43.790
admit they got their previous AI timing estimates

00:05:43.790 --> 00:05:47.050
wrong. Now they are focusing on 10 specific bets

00:05:47.050 --> 00:05:49.930
for automation and the future of work. That kind

00:05:49.930 --> 00:05:52.689
of candor is so rare. And we're also seeing the

00:05:52.689 --> 00:05:55.110
competition just heat up globally. Google launched

00:05:55.110 --> 00:05:57.949
a massive move in India. Aggressive. They're

00:05:57.949 --> 00:06:00.449
offering their AI Plus plan with Gemini 3 Pro.

00:06:00.529 --> 00:06:02.649
It starts for new users at about $2 a month.

00:06:02.769 --> 00:06:04.769
Just $2. They are clearly going after market

00:06:04.769 --> 00:06:07.050
share where Meta's open model struggled to get

00:06:07.050 --> 00:06:09.290
a commercial foothold. And all this usage is

00:06:09.290 --> 00:06:12.449
revealing some fascinating things about us. Microsoft's

00:06:12.449 --> 00:06:16.430
end-of-year report analyzed 37.5 million chats.

00:06:16.829 --> 00:06:18.970
And they found some highly unexpected patterns.

00:06:19.230 --> 00:06:21.930
Things like 2 a.m. philosophy chats. Philosophy

00:06:21.930 --> 00:06:24.769
chats. That's amazing. Not productivity, but

00:06:24.769 --> 00:06:27.370
deep late night reflection. Exactly. And whoa,

00:06:27.709 --> 00:06:31.750
imagine scaling that analysis to a billion queries

00:06:31.750 --> 00:06:34.430
across different time zones. Right. It paints

00:06:34.430 --> 00:06:36.990
a picture of AI, not just as a tool for drafting

00:06:36.990 --> 00:06:40.160
email, but as this silent late-night sounding

00:06:40.160 --> 00:06:43.449
board for, you know, human existence. A nonjudgmental

00:06:43.449 --> 00:06:45.550
digital listener. I like that. And this pressure

00:06:45.550 --> 00:06:47.449
is hitting niche companies, too. ElevenLabs,

00:06:47.709 --> 00:06:49.889
the voice synthesis leader, they just raised

00:06:49.889 --> 00:06:53.709
$100 million. But their CEO thinks voice AI itself

00:06:53.709 --> 00:06:56.189
is going to be commoditized. He does. So their

00:06:56.189 --> 00:06:59.149
new bet is on full AI agents, music generation,

00:06:59.389 --> 00:07:02.089
and crucially, deepfake protection tools. So

00:07:02.089 --> 00:07:04.069
what does the ElevenLabs shift, moving away

00:07:04.069 --> 00:07:06.430
from just voice, reveal about the current AI

00:07:06.430 --> 00:07:08.889
market? Even successful niche tech will commoditize

00:07:08.889 --> 00:07:11.819
fast, pushing focus to full AI agents.

00:07:11.980 --> 00:07:13.920
That makes perfect sense. The value moves from

00:07:13.920 --> 00:07:16.399
the parts to the whole system. Okay, let's shift

00:07:16.399 --> 00:07:18.759
gears entirely now. We're moving from commercial

00:07:18.759 --> 00:07:21.660
tension to a highly impactful piece of scientific

00:07:21.660 --> 00:07:24.500
research. This is where that shared knowledge

00:07:24.500 --> 00:07:28.379
model really, really changes lives. We're talking

00:07:28.379 --> 00:07:30.459
about Microsoft's breakthrough with an open source

00:07:30.459 --> 00:07:32.990
model called Gigatime. And what's remarkable

00:07:32.990 --> 00:07:36.750
here is that it democratizes access to high-end

00:07:36.750 --> 00:07:39.170
medical diagnosis. It really does. So how does

00:07:39.170 --> 00:07:42.970
it do that? It takes a basic, low-cost, $10

00:07:42.970 --> 00:07:45.829
tissue slide, the kind any local hospital can

00:07:45.829 --> 00:07:50.269
produce, and transforms it into the rich, detailed

00:07:50.269 --> 00:07:52.930
immune system analysis that previously required

00:07:52.930 --> 00:07:56.089
specialized, expensive machines. And human specialists

00:07:56.089 --> 00:07:59.009
and days of work. Right. It essentially turns...

00:07:59.230 --> 00:08:02.529
cheap, widely available data into high-end diagnostic

00:08:02.529 --> 00:08:05.129
data. It maps the immune system and the tumor

00:08:05.129 --> 00:08:07.089
environment. Like creating these high-resolution

00:08:07.089 --> 00:08:10.389
cancer maps. Exactly. To do this, the model was

00:08:10.389 --> 00:08:13.290
trained on a massive data set, 40 million cell

00:08:13.290 --> 00:08:15.889
samples from Providence Health. 40 million! That's

00:08:15.889 --> 00:08:18.509
a huge data scale. It's like stacking Lego blocks

00:08:18.509 --> 00:08:21.370
of data, but for medicine. The depth is unparalleled.

00:08:21.470 --> 00:08:23.329
And this wasn't just a lab exercise. No, it was

00:08:23.329 --> 00:08:26.329
tested on real patients. Over 14,000 real patients

00:08:26.329 --> 00:08:28.769
across 24 different cancer types. And the outcome.

00:08:29.029 --> 00:08:31.730
It created a virtual tumor library of more than

00:08:31.730 --> 00:08:35.190
300,000 high-resolution images. And it surfaced

00:08:35.190 --> 00:08:39.549
over 1,200 entirely new immune patterns. Patterns

00:08:39.549 --> 00:08:41.929
linked directly to cancer stage and survival

00:08:41.929 --> 00:08:45.889
statistics. Yeah. That is profound. And the key

00:08:45.889 --> 00:08:48.269
strategic choice here, again, is that Microsoft

00:08:48.269 --> 00:08:50.820
open-sourced the model. That open source move

00:08:50.820 --> 00:08:53.679
is strategic genius. It accelerates global adoption

00:08:53.679 --> 00:08:57.360
immediately. Creating a massive real-world feedback

00:08:57.360 --> 00:09:00.899
loop. Yes. The playbook is clear. Cheaper tools

00:09:00.899 --> 00:09:03.120
mean more hospitals use them, which generates

00:09:03.120 --> 00:09:05.600
better local data, which builds better models

00:09:05.600 --> 00:09:08.820
globally faster. It's a virtuous cycle for science.

00:09:09.000 --> 00:09:10.659
So if we connect this to the bigger picture,

00:09:10.799 --> 00:09:13.519
what is the critical advantage of using Gigatime's

00:09:13.519 --> 00:09:16.330
open source strategy? It builds a massive feedback

00:09:16.330 --> 00:09:19.210
ecosystem, leading to better models faster through

00:09:19.210 --> 00:09:21.669
global use. So what does this all mean for us?

00:09:21.769 --> 00:09:24.350
The AI industry right now is defined by these

00:09:24.350 --> 00:09:26.830
incredibly dramatic internal shifts. The cultural

00:09:26.830 --> 00:09:29.350
demolition inside Meta over Avocado. Right. And

00:09:29.350 --> 00:09:32.409
at the same time, this phenomenal, undeniable

00:09:32.409 --> 00:09:35.190
real-world utility like Gigatime. We're watching

00:09:35.190 --> 00:09:37.169
two radically different philosophies collide.

00:09:37.470 --> 00:09:40.460
And the key takeaway is this. The race isn't

00:09:40.460 --> 00:09:42.320
simply about building the most powerful model,

00:09:42.440 --> 00:09:45.620
period. It's about choosing the strategy: closed

00:09:45.620 --> 00:09:48.500
and proprietary, which is Meta's new bet, or

00:09:48.500 --> 00:09:51.299
open and ecosystem-building, like Microsoft is

00:09:51.299 --> 00:09:53.539
doing in medicine. Which one wins the long -term

00:09:53.539 --> 00:09:56.840
innovation war? Exactly. And consider this as

00:09:56.840 --> 00:09:59.590
you go about your day. Does the future of critical

00:09:59.590 --> 00:10:02.789
innovation hinge on proprietary control, which

00:10:02.789 --> 00:10:05.289
Meta is betting billions on with a secretive

00:10:05.289 --> 00:10:07.889
model like Avocado? Or does the greatest human

00:10:07.889 --> 00:10:10.490
impact come from shared open knowledge demonstrated

00:10:10.490 --> 00:10:13.730
by life-saving, accessible models like Gigatime?

00:10:13.870 --> 00:10:17.370
Which path truly advances society faster? Thank

00:10:17.370 --> 00:10:19.009
you for sharing the sources for this deep dive.

00:10:19.169 --> 00:10:21.250
Keep exploring that intersection of technology

00:10:21.250 --> 00:10:23.250
and human impact because that's where the real

00:10:23.250 --> 00:10:23.789
story lives.
