WEBVTT

00:00:00.000 --> 00:00:03.279
So I spent, I think, an entire evening just recently

00:00:03.279 --> 00:00:05.759
trying to cross-reference these three different

00:00:05.759 --> 00:00:08.320
internal reports, just looking for one specific

00:00:08.320 --> 00:00:10.400
number. We all just pile up information, aren't

00:00:10.400 --> 00:00:13.339
we? PDFs, articles we save, transcripts from

00:00:13.339 --> 00:00:17.440
videos. Sometimes it feels less like a library

00:00:17.440 --> 00:00:19.719
and more like, I don't know, just drowning in

00:00:19.719 --> 00:00:22.140
data. Oh, yeah, it's the classic research problem,

00:00:22.239 --> 00:00:24.320
right? But it's amplified now with all these

00:00:24.320 --> 00:00:27.359
tools. We need info fast. But then you turn to

00:00:27.359 --> 00:00:30.120
a general AI, and sometimes it just starts, well,

00:00:30.480 --> 00:00:32.299
making things up because it's looking at the

00:00:32.299 --> 00:00:35.219
whole messy internet. Right. So imagine if you

00:00:35.219 --> 00:00:38.299
had this AI assistant, but it was totally personalized,

00:00:38.659 --> 00:00:42.119
a research guide that only and strictly uses

00:00:42.119 --> 00:00:44.479
your trusted information. It basically stops

00:00:44.479 --> 00:00:46.880
that whole hallucination problem because, well,

00:00:46.939 --> 00:00:48.479
it can't talk about something it hasn't read

00:00:48.479 --> 00:00:50.140
from you. And that's exactly what we're digging

00:00:50.140 --> 00:00:52.859
into today. We're exploring Notebook LM. It's

00:00:52.859 --> 00:00:55.420
this really interesting tool from Google. It

00:00:55.420 --> 00:00:58.060
fundamentally changes how you can organize and,

00:00:58.100 --> 00:01:00.380
more importantly, use all that knowledge you've

00:01:00.380 --> 00:01:02.979
gathered. OK, so today we'll break down the core

00:01:02.979 --> 00:01:04.980
tech behind it. We'll look at the workspace,

00:01:05.060 --> 00:01:07.500
which is actually pretty simple. Then we'll walk

00:01:07.500 --> 00:01:09.359
you through setting up your first notebook focused

00:01:09.359 --> 00:01:11.760
on your stuff. And then we'll get into the cool

00:01:11.760 --> 00:01:14.980
creative tools like making quizzes automatically

00:01:14.980 --> 00:01:18.079
or even presentation videos. This is really about

00:01:18.079 --> 00:01:21.250
giving you a way to master large chunks of information,

00:01:21.609 --> 00:01:23.510
but specifically targeted to the data you've

00:01:23.510 --> 00:01:27.030
already collected. Okay, let's get into it. So

00:01:27.030 --> 00:01:30.049
fundamentally, Notebook LM is built to be your

00:01:30.049 --> 00:01:32.670
personal research assistant, super focused. But

00:01:32.670 --> 00:01:35.670
what's the single biggest tech difference between

00:01:35.670 --> 00:01:38.620
this and just a standard chatbot? It really boils

00:01:38.620 --> 00:01:40.799
down to the constraint. It's a forced limitation.

00:01:41.359 --> 00:01:43.260
Standard AI looks everywhere, the whole web.

00:01:43.480 --> 00:01:45.819
Notebook LM is strictly limited. Only the documents

00:01:45.819 --> 00:01:48.659
you upload, your PDFs, your notes, your transcripts,

00:01:48.760 --> 00:01:51.019
nothing else. And that's what makes it genuinely

00:01:51.019 --> 00:01:53.879
personal and maybe most importantly, trustworthy.

00:01:54.260 --> 00:01:57.359
So if I upload, say, 15 really specific files

00:01:57.359 --> 00:02:01.359
about internal project metrics, the AI can only

00:02:01.359 --> 00:02:03.040
quote from those 15 files. It won't bring in

00:02:03.040 --> 00:02:05.730
anything else. Exactly. That's it. And the tech

00:02:05.730 --> 00:02:07.750
that makes this happen, that guarantees it, is

00:02:07.750 --> 00:02:10.949
called RAG, Retrieval-Augmented Generation.

00:02:11.430 --> 00:02:14.729
Basically, RAG just means, in simple terms: the

00:02:14.729 --> 00:02:17.729
AI first reads and kind of indexes your documents.

00:02:18.210 --> 00:02:21.050
Then it uses only that reading to generate answers.

00:02:21.330 --> 00:02:23.110
So the answers are accurate. You can check them.

00:02:23.169 --> 00:02:26.289
It's like a closed loop. Huh. So it's almost
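
NOTE
The closed loop described here is retrieval-augmented generation. A rough, hypothetical sketch in Python, with toy keyword matching standing in for real embedding-based retrieval; the names retrieve and answer are illustrative, not NotebookLM's internals.
```python
import re
# Toy sketch of the RAG loop: index the user's documents, then answer
# only from what was retrieved, with a citation number per excerpt.
def _words(text):
    """Lowercase word set, punctuation stripped."""
    return set(re.findall(r"[a-z]+", text.lower()))
def retrieve(question, documents):
    """Rank uploaded documents by keyword overlap with the question."""
    q = _words(question)
    scored = [(len(q & _words(d)), i) for i, d in enumerate(documents)]
    return [i for score, i in sorted(scored, reverse=True) if score > 0]
def answer(question, documents):
    """Build a grounded answer: cited excerpts only, never outside knowledge."""
    hits = retrieve(question, documents)
    if not hits:
        return "Not covered by your sources."  # no source, no claim
    return " ".join(f"{documents[i]} [{i + 1}]" for i in hits)
```
So answer("How can I reduce waste?", ["Composting reduces household waste.", "Solar panels save energy."]) returns only the cited first document, tagged [1], while an off-source question comes back "Not covered by your sources."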

00:02:26.289 --> 00:02:29.870
like a lawyer arguing a case. They can only use

00:02:29.870 --> 00:02:32.169
the specific evidence you handed them. They can't

00:02:32.169 --> 00:02:34.530
just speculate or pull something from across

00:02:34.530 --> 00:02:36.569
the street. That's a great analogy, yeah. Perfect.

00:02:36.750 --> 00:02:38.569
The output is always anchored in the evidence

00:02:38.569 --> 00:02:41.870
you provide. OK, so then if the AI is strictly

00:02:41.870 --> 00:02:45.590
limited like that, only using my sources, how

00:02:45.590 --> 00:02:48.030
does that really change the quality, the reliability

00:02:48.030 --> 00:02:50.060
of the answers I get? Well, the answers become

00:02:50.060 --> 00:02:52.340
really accurate. You can trust them. And crucially,

00:02:52.460 --> 00:02:54.520
they're always tied directly back to your original

00:02:54.520 --> 00:02:57.039
source material. So when you first open it up,

00:02:57.159 --> 00:02:58.900
the interface, it's really built for function.

00:02:59.000 --> 00:03:01.780
It avoids confusion. It's basically split into

00:03:01.780 --> 00:03:04.319
three main parts, which makes handling complex

00:03:04.319 --> 00:03:07.000
research, well, simpler. OK, so on the left,

00:03:07.080 --> 00:03:09.379
you've got your library. That's like your personal

00:03:09.379 --> 00:03:12.460
bookshelf, right? For all sorts of files, Google

00:03:12.460 --> 00:03:16.770
Docs, PDFs, website links, even YouTube transcripts,

00:03:17.030 --> 00:03:19.810
that's where your raw stuff lives. Then right

00:03:19.810 --> 00:03:21.669
in the middle, that's the conversation space,

00:03:21.849 --> 00:03:23.569
that's your main workspace, that's where you

00:03:23.569 --> 00:03:26.449
actually chat with the AI, ask it to summarize

00:03:26.449 --> 00:03:29.590
that long technical manual, or maybe compare

00:03:29.590 --> 00:03:31.729
ideas from two different files you uploaded.

00:03:31.969 --> 00:03:34.090
And the really key thing there, the quality check,

00:03:34.490 --> 00:03:37.169
is the citation. Every single thing it says, every

00:03:37.169 --> 00:03:39.509
fact, has this little number next to it, right?

00:03:39.870 --> 00:03:42.150
And it links directly back to the exact spot

00:03:42.150 --> 00:03:44.069
in your original document, so you can check it

00:03:44.069 --> 00:03:46.370
instantly. Super important. Yep, and then over

00:03:46.370 --> 00:03:49.270
on the right, there's the creative toolbox. They

00:03:49.270 --> 00:03:51.270
call it the studio panel. Think of that as the

00:03:51.270 --> 00:03:53.490
workshop. That's where the AI takes your simple

00:03:53.490 --> 00:03:56.090
data and turns it into useful things: quizzes,

00:03:56.090 --> 00:03:58.930
mind maps, flashcards, that kind of stuff. The flexibility

00:03:58.930 --> 00:04:01.710
seems like the big win here. Students could use

00:04:01.710 --> 00:04:04.770
it for prepping for complex exams, easily grabbing

00:04:04.770 --> 00:04:07.530
citations for essays. Writers could use it to

00:04:07.530 --> 00:04:10.469
organize tons of research from different articles

00:04:10.469 --> 00:04:12.689
for a book or something. Absolutely. And for professionals.

00:04:12.830 --> 00:04:15.530
Imagine you get that 50-page business report

00:04:15.530 --> 00:04:19.439
Monday morning. Dense stuff. Instead of spending

00:04:19.439 --> 00:04:22.379
hours reading it, you just feed it into the conversation

00:04:22.379 --> 00:04:24.819
space and ask, okay, what are the three main

00:04:24.819 --> 00:04:27.399
risks here and how do they suggest mitigating

00:04:27.399 --> 00:04:29.720
them? Boom, you're ready for the meeting. Right.

00:04:30.680 --> 00:04:33.240
So beyond just studying your reading summaries,

00:04:34.100 --> 00:04:36.899
what's maybe the fastest way a professional could

00:04:36.899 --> 00:04:39.699
use that citation feature? especially when they're

00:04:39.699 --> 00:04:41.980
on a tight deadline. Those quick citation checks

00:04:41.980 --> 00:04:44.560
just build credibility instantly. You know, I'll

00:04:44.560 --> 00:04:46.560
admit, I still wrestle with prompt drift myself

00:04:46.560 --> 00:04:49.459
sometimes. So being able to quickly verify the

00:04:49.459 --> 00:04:51.680
AI's claim against the source, that's essential.

00:04:52.079 --> 00:04:54.040
Okay, let's make this practical. Let's walk through

00:04:54.040 --> 00:04:56.220
setting up a notebook. Say we're researching...

00:04:56.660 --> 00:04:59.439
Sustainable living. Pretty common topic. Sure.

00:05:00.000 --> 00:05:02.959
So step one is easy. Sign in, click PUP plus

00:05:02.959 --> 00:05:05.620
new notebook. Step two is all about the sourcing.

00:05:05.660 --> 00:05:07.519
This is where you feed the AI your knowledge

00:05:07.519 --> 00:05:10.339
base. You can pull files from Google Drive, upload

00:05:10.339 --> 00:05:12.920
that important PDF guide you found, paste in

00:05:12.920 --> 00:05:15.379
a Wikipedia link. And this is cool. Paste a YouTube

00:05:15.379 --> 00:05:17.339
video link. It'll grab the whole transcript.

00:05:17.579 --> 00:05:19.899
Okay, so everything's loaded in. Now you start

00:05:19.899 --> 00:05:21.779
the conversation, right, in that chat window.

00:05:22.439 --> 00:05:25.300
But the key is getting specific with your prompts,

00:05:25.459 --> 00:05:27.480
isn't it? Not just general questions. Absolutely.

00:05:27.540 --> 00:05:29.399
You've got to move beyond what is sustainable

00:05:29.399 --> 00:05:32.800
living. Ask for something specific like, are

00:05:32.800 --> 00:05:34.779
there any tips about saving water mentioned in

00:05:34.779 --> 00:05:37.560
the PDF and the transcript? Or maybe a comparison

00:05:37.560 --> 00:05:40.259
prompt. Compare how the Wikipedia article defines

00:05:40.259 --> 00:05:42.500
sustainable living versus how the YouTube video

00:05:42.500 --> 00:05:45.910
explains it. Get specific. And again, every answer

00:05:45.910 --> 00:05:48.910
it gives is grounded by those little clickable

00:05:48.910 --> 00:05:51.449
citation numbers. It's like built-in fact-checking.

00:05:51.709 --> 00:05:53.889
Makes the whole thing reliable. Yeah, it's like

00:05:53.889 --> 00:05:56.129
having a super efficient librarian who points

00:05:56.129 --> 00:05:58.550
you to the exact sentence on the right page every

00:05:58.550 --> 00:06:00.750
single time. Cuts down massively on that time

00:06:00.750 --> 00:06:02.910
you'd normally spend checking footnotes. That

00:06:02.910 --> 00:06:05.870
sounds really useful. But what about, what if

00:06:05.870 --> 00:06:08.819
I feed it two sources that totally clash? Like,

00:06:09.060 --> 00:06:11.459
one article says plastic recycling is great and

00:06:11.459 --> 00:06:13.680
another says it's basically useless. Does the

00:06:13.680 --> 00:06:16.300
AI try to pick a side or smooth it over? Good

00:06:16.300 --> 00:06:19.019
question. No, it's designed to actually highlight

00:06:19.019 --> 00:06:21.740
the conflict. It'll usually say something like,

00:06:21.860 --> 00:06:25.720
source one says X, but source two says Y. It

00:06:25.720 --> 00:06:28.779
presents the differing views with the citations

00:06:28.779 --> 00:06:31.420
so you can see the conflict and decide based

00:06:31.420 --> 00:06:33.740
on the evidence. It won't create some kind of

00:06:33.740 --> 00:06:36.620
false consensus. Okay, that makes sense. So,

00:06:36.620 --> 00:06:38.360
say I've just started looking into this topic.

00:06:38.939 --> 00:06:41.639
How can I find more trustworthy info without

00:06:41.639 --> 00:06:43.959
having to leave the notebook interface? Ah, yeah.

00:06:43.959 --> 00:06:46.000
You can use the new Discover Sources feature.

00:06:46.319 --> 00:06:48.660
It suggests related, high-quality sources, often

00:06:48.660 --> 00:06:50.800
from places like universities or research groups

00:06:50.800 --> 00:06:52.819
based on what you've already uploaded. All right.

00:06:52.819 --> 00:06:54.360
So we've talked about pulling information out.

00:06:54.600 --> 00:06:57.120
Now let's shift to creating things with it. That's

00:06:57.120 --> 00:06:59.079
where the Studio Panel, the creative toolkit,

00:06:59.160 --> 00:07:02.160
comes in. This seems like where the AI goes beyond

00:07:02.160 --> 00:07:04.259
just summarizing. It actually starts generating

00:07:04.259 --> 00:07:07.180
useful, finished, knowledge products for you.

00:07:07.230 --> 00:07:09.689
Yeah, and the strategic value here is important.

00:07:09.750 --> 00:07:12.970
It's not just about, say, making flashcards.

00:07:13.050 --> 00:07:15.949
It's about making flashcards based only on the

00:07:15.949 --> 00:07:18.769
specific technical terms from your proprietary

00:07:18.769 --> 00:07:20.990
documents that you need to memorize. It's targeted.

00:07:21.509 --> 00:07:23.750
OK, like that discover sources feature we just

00:07:23.750 --> 00:07:26.029
mentioned. Yeah. That sounds like a real research

00:07:26.029 --> 00:07:28.850
booster. It looks at your stuff and then actively

00:07:28.850 --> 00:07:31.449
suggests related academic sources. And you can

00:07:31.449 --> 00:07:32.990
get really specific with the prompt there, too.

00:07:33.089 --> 00:07:34.550
Oh, yeah, definitely. You could ask something

00:07:34.550 --> 00:07:37.699
like, Based on my documents about recycling and

00:07:37.699 --> 00:07:40.620
composting, find me five recent academic papers

00:07:40.620 --> 00:07:44.420
on the circular economy. And also effective waste

00:07:44.420 --> 00:07:47.040
management policies, specifically in major port

00:07:47.040 --> 00:07:49.980
cities in Southeast Asia. I mean, that's a pretty

00:07:49.980 --> 00:07:52.839
sophisticated targeted research request in just

00:07:52.839 --> 00:07:56.079
one go. Then you got the study tools, like quizzes.

00:07:56.379 --> 00:07:58.079
It just makes them automatically to test your

00:07:58.079 --> 00:08:01.100
knowledge. Exactly. And you can demand really

00:08:01.100 --> 00:08:03.199
specific kinds of quizzes, too. You might tell

00:08:03.199 --> 00:08:06.459
it, create a quiz, make it 15 difficult questions,

00:08:06.660 --> 00:08:09.279
mix of multiple choice and true-false, and focus

00:08:09.279 --> 00:08:11.779
only on the stats and scientific terms from these

00:08:11.779 --> 00:08:14.819
specific science reports I uploaded. Ignore the

00:08:14.819 --> 00:08:17.100
general concepts. It tests exactly what you need

00:08:17.100 --> 00:08:20.759
it to test. And flashcards for just straightforward

00:08:20.759 --> 00:08:23.360
memorization. Like, make me 30 flashcards to

00:08:23.360 --> 00:08:26.319
memorize key definitions about sustainable farming

00:08:26.319 --> 00:08:30.139
techniques, on things like permaculture or hydroponics

00:08:30.139 --> 00:08:32.220
from my sources. Yeah. And personally, I really

00:08:32.220 --> 00:08:34.539
like the mind maps. For visual learners, seeing

00:08:34.539 --> 00:08:37.720
that big picture is so helpful. It lays it out

00:08:37.720 --> 00:08:39.860
visually. You know, like sustainable living in

00:08:39.860 --> 00:08:42.399
the middle, then main branches like reduce waste,

00:08:42.600 --> 00:08:44.519
save energy, and then smaller branches off those,

00:08:44.659 --> 00:08:46.679
like composting, recycling. It turns a dense

00:08:46.679 --> 00:08:48.759
topic into something you can see. These tools

00:08:48.759 --> 00:08:51.379
really do seem to move way beyond just summarizing.
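
NOTE
The branching layout just described can be pictured as a simple tree. A hypothetical sketch follows; the mind_map dict and outline helper are illustrative, not how NotebookLM actually renders its maps.
```python
# Hypothetical sketch: the mind-map layout from the episode (central topic,
# main branches, sub-branches), modeled as a nested dict and printed as
# an indented outline. Topics are the episode's examples.
mind_map = {
    "Sustainable living": {
        "Reduce waste": {"Composting": {}, "Recycling": {}},
        "Save energy": {},
    }
}
def outline(node, depth=0):
    """Flatten the nested topic dict into indented bullet lines."""
    lines = []
    for topic, children in node.items():
        lines.append("  " * depth + "- " + topic)
        lines.extend(outline(children, depth + 1))
    return lines
print("\n".join(outline(mind_map)))
```
Printing it yields the central topic at the left margin with each level of branches indented beneath it, which is essentially the big-picture view the hosts describe.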

00:08:51.700 --> 00:08:53.679
So how quickly can they actually take us from

00:08:53.679 --> 00:08:57.080
just raw data, I mean, messy notes, to a finished

00:08:57.230 --> 00:09:00.029
maybe even professional-level output? Whoa.

00:09:00.710 --> 00:09:03.309
I mean, imagine creating a whole structured presentation

00:09:03.309 --> 00:09:06.409
video, fully sourced from documents you trust, in

00:09:06.409 --> 00:09:09.730
just minutes. It drastically speeds up both the

00:09:09.730 --> 00:09:11.750
learning part and the actual production part.

00:09:12.370 --> 00:09:14.389
And that speed in production brings us to the

00:09:14.389 --> 00:09:17.549
really polished, media-ready outputs. First

00:09:17.549 --> 00:09:20.450
up, there's the audio overview. This thing turns

00:09:20.450 --> 00:09:22.870
your documents into a short, kind of flexible

00:09:22.870 --> 00:09:25.909
audio summary, like a custom mini podcast just

00:09:25.909 --> 00:09:28.549
for you, based on your stuff. That sounds great

00:09:28.549 --> 00:09:31.029
for absorbing complex info while you're, like,

00:09:31.129 --> 00:09:32.929
commuting or exercising. You could prompt it

00:09:32.929 --> 00:09:35.129
something like, make a seven-minute audio overview,

00:09:35.549 --> 00:09:37.730
use the style of a thoughtful documentary narrator,

00:09:38.269 --> 00:09:40.230
tell a story about plastic pollution's impact,

00:09:40.509 --> 00:09:42.110
and present the solutions found in my notes.

00:09:42.519 --> 00:09:45.259
Pretty cool. And then the video overview. Honestly,

00:09:45.399 --> 00:09:47.779
this one feels pretty groundbreaking. It automatically

00:09:47.779 --> 00:09:49.659
generates a three-to-four-minute presentation

00:09:49.659 --> 00:09:51.840
video. We're talking dynamic slides, a voiceover,

00:09:51.840 --> 00:09:54.399
all created automatically and based only on your

00:09:54.399 --> 00:09:57.559
source documents. Wow, so you could build a professional

00:09:57.559 --> 00:10:01.379
looking deck on, say, the impact of fast fashion.

00:10:01.529 --> 00:10:04.590
You'd tell it to structure the video into maybe

00:10:04.590 --> 00:10:07.470
environmental costs, social impacts, and sustainable

00:10:07.470 --> 00:10:09.929
alternatives, feed it the source articles and

00:10:09.929 --> 00:10:12.350
reports, and you've got a presentation ready

00:10:12.350 --> 00:10:14.370
in minutes. Pretty much. And it's not just for

00:10:14.370 --> 00:10:16.570
English speakers. You can generate the script

00:10:16.570 --> 00:10:18.649
and slides in other languages too, like Vietnamese,

00:10:18.730 --> 00:10:21.090
for example. Makes your research accessible much

00:10:21.090 --> 00:10:23.629
more broadly. That speed is definitely incredible.

00:10:23.850 --> 00:10:27.070
But it does make me wonder, if you generate a

00:10:27.070 --> 00:10:30.100
video presentation in, say, three minutes, instead

00:10:30.100 --> 00:10:33.460
of spending 20 hours crafting it, does that maybe

00:10:33.460 --> 00:10:36.860
bypass some of the critical thinking needed to

00:10:36.860 --> 00:10:39.320
really internalize the material? That's a really

00:10:39.320 --> 00:10:41.360
thoughtful point. I think the value shifts, you

00:10:41.360 --> 00:10:43.740
know? You save a ton of time on the design aspect,

00:10:44.200 --> 00:10:46.259
but you probably need to spend more focused time

00:10:46.259 --> 00:10:48.480
on the prompting, making sure the structure is

00:10:48.480 --> 00:10:50.980
exactly right. And then critically verifying

00:10:50.980 --> 00:10:53.779
the content using those citations. So the critical

00:10:53.779 --> 00:10:56.100
thought moves maybe from creation to validation

00:10:56.100 --> 00:10:58.080
and synthesis. It's a different kind of engagement.

00:10:58.940 --> 00:11:01.100
OK, so to make sure that synthesis is high quality,

00:11:01.539 --> 00:11:03.879
we have those three best practices. First one,

00:11:04.620 --> 00:11:07.720
quality over quantity. Garbage in, garbage out.

00:11:08.059 --> 00:11:10.480
Always holds true, right? Absolutely. Second,

00:11:10.879 --> 00:11:14.000
be specific. Don't just ask vaguely, tell me

00:11:14.000 --> 00:11:17.100
about recycling. Ask, list the step-by-step

00:11:17.100 --> 00:11:19.500
process for recycling plastic bottles correctly,

00:11:19.940 --> 00:11:23.320
based only on the guidelines in Document X. Specific

00:11:23.320 --> 00:11:25.919
prompts get accurate results. Makes sense. And

00:11:25.919 --> 00:11:28.279
third, break down big topics. If you're tackling

00:11:28.279 --> 00:11:30.000
something huge like, I don't know, the entire

00:11:30.000 --> 00:11:31.840
history of the space shuttle program, maybe split

00:11:31.840 --> 00:11:34.500
it up. Make smaller, focused notebooks, one for

00:11:34.500 --> 00:11:36.279
design, one for missions, one for the end of

00:11:36.279 --> 00:11:38.580
the program. Keep the AI focused on manageable

00:11:38.580 --> 00:11:41.659
chunks. Right. So given how fast these AI tools

00:11:41.659 --> 00:11:43.919
are changing all the time, what should listeners

00:11:43.919 --> 00:11:46.139
maybe keep an eye out for? What should they prioritize

00:11:46.139 --> 00:11:48.779
checking for next week with Notebook LM? Yeah,

00:11:48.860 --> 00:11:51.200
good question. I'd say just check back often.

00:11:51.340 --> 00:11:54.159
See what new creative tools pop up or what new

00:11:54.159 --> 00:11:56.740
kinds of sources you can upload. The pace of

00:11:56.740 --> 00:11:59.039
updates is pretty fast. And it's all aimed at

00:11:59.039 --> 00:12:01.100
helping you learn faster and get more inventive

00:12:01.100 --> 00:12:04.669
with your own knowledge. So when you step back,

00:12:04.889 --> 00:12:06.970
Notebook LM really feels like it changes how

00:12:06.970 --> 00:12:08.850
we interact with our own knowledge. It's not

00:12:08.850 --> 00:12:11.330
just another general search tool. It becomes

00:12:11.330 --> 00:12:14.269
this personalized, creative, and importantly,

00:12:14.809 --> 00:12:17.210
highly accurate assistant for managing knowledge.

00:12:17.470 --> 00:12:19.590
Yeah, the goal shifts, doesn't it? It's not just

00:12:19.590 --> 00:12:22.049
about collecting facts anymore. It's about actively

00:12:22.049 --> 00:12:24.830
using them, synthesizing them, creating new things

00:12:24.830 --> 00:12:27.750
based on them. Outputs you can actually trust

00:12:27.750 --> 00:12:29.970
because they're tied directly back to your sources.

00:12:30.200 --> 00:12:33.539
So the question for you listening is, what complex

00:12:33.539 --> 00:12:36.639
50-page report or maybe what huge online course

00:12:36.639 --> 00:12:38.519
or lecture series are you going to finally master

00:12:38.519 --> 00:12:40.960
using these kinds of features? It's time to stop

00:12:40.960 --> 00:12:43.600
feeling buried by all that data and actually

00:12:43.600 --> 00:12:45.559
start organizing that knowledge effectively.

00:12:45.779 --> 00:12:47.960
Yeah, just start building that first notebook.

00:12:48.720 --> 00:12:50.940
You might genuinely be surprised how fast you

00:12:50.940 --> 00:12:54.460
can take maybe years of saved articles and notes

00:12:54.460 --> 00:12:56.919
and turn them into a clear, verifiable summary.

00:12:57.360 --> 00:12:58.700
We'll catch you on the next Deep Dive.
