Claude Code + NotebookLM: How to Build a Free AI Research Stack
Combining Claude Code with Google's NotebookLM gives you a zero-cost research pipeline that scrapes YouTube, runs RAG-based analysis, and generates deliverables like infographics and slide decks — all without burning tokens on the heavy lifting. I've been using this setup for my own research workflows, and it replaces a stack that would otherwise cost hundreds of dollars a month to build and maintain. Here's exactly how to set it up in about five minutes.
Most people's version of "research with Claude Code" is telling it to use the web search tool and hoping whatever comes back is good enough. That's fine for quick lookups, but it's not a research system. What I'm about to show you is a proper pipeline: Claude Code handles the sourcing and orchestration, NotebookLM handles the analysis and deliverables, and Google pays for the compute. You pay virtually nothing.
What Does This Claude Code + NotebookLM Workflow Actually Do?
Here's the short version: with a single prompt, Claude Code scrapes YouTube for relevant videos, pushes those sources into NotebookLM, has NotebookLM analyze the content, and then pulls that analysis back — along with any deliverable you want (infographics, slide decks, podcasts, flashcards, mind maps, you name it).
Let me break down what happens step by step:
- A custom YouTube search skill finds the latest videos on whatever topic you specify
- Claude Code sends those video URLs to NotebookLM using an unofficial Python API
- NotebookLM ingests the video captions and builds a knowledge base (this is essentially a free RAG system)
- NotebookLM runs analysis on that corpus — identifying trends, ranking topics, summarizing insights
- Claude Code pulls the analysis back into your local environment
- You request a deliverable (infographic, quiz, slide deck) and it gets generated and saved to your project folder
The critical thing to understand: all the analysis and content generation happens on Google's servers through NotebookLM, not through Claude Code's token budget. Claude Code only spends a small number of tokens sending requests and receiving results. The thinking — the expensive part — is offloaded to Google. And NotebookLM is free.
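The steps above can be sketched as a small orchestration loop. The NotebookLM client is abstracted behind a protocol here because I'm not quoting notebooklm-py's actual API — treat `create_notebook`, `add_source`, and `ask` as illustrative placeholders and check the repo's README for the real method names:

```python
from typing import Protocol


class NotebookClient(Protocol):
    """Hypothetical client surface; notebooklm-py's real API may differ."""

    def create_notebook(self, title: str) -> str: ...
    def add_source(self, notebook_id: str, url: str) -> None: ...
    def ask(self, notebook_id: str, prompt: str) -> str: ...


def run_research_pipeline(client: NotebookClient, title: str,
                          video_urls: list[str], question: str) -> str:
    """Push sources into a notebook, then pull the analysis back.

    The expensive work (ingestion, RAG indexing, analysis) happens on the
    NotebookLM side; this loop only sends requests and reads results.
    """
    notebook_id = client.create_notebook(title)
    for url in video_urls:  # e.g. YouTube URLs from the search skill
        client.add_source(notebook_id, url)
    return client.ask(notebook_id, question)
```

In the real workflow, Claude Code plays the role of this loop: the skill file tells it which notebooklm-py calls to make at each step.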
Why Is NotebookLM the Most Underrated AI Tool Right Now?
NotebookLM is Google's AI research tool that analyzes up to 50 sources — PDFs, URLs, YouTube videos — and generates insights in multiple formats including audio overviews, mind maps, flashcards, quizzes, slide decks, and infographics. It's powered by Gemini under the hood and it's free on the standard tier.
If you tried to build what NotebookLM does from scratch, you'd need:
- A scraping system to pull content from various sources
- A vector database for the RAG pipeline
- An analysis layer to process and synthesize the content
- A deliverable generation system for different output formats
I've tried building similar setups using tools like n8n for the automation layer. It's not simple. It takes significant time to configure, it's brittle, and it costs money for the infrastructure and API calls. NotebookLM abstracts all of that away for free.
The 50-source limit per notebook on the free tier is generous for most research tasks. And everything NotebookLM can do manually through its web interface — chat with sources, generate audio overviews, create mind maps, export quizzes — you can now do programmatically through Claude Code.
How Do You Connect NotebookLM to Claude Code?
This is the part where people get stuck, because NotebookLM does not have a public API. But a developer named Teng Lin built an open-source solution: notebooklm-py, an unofficial Python API for Google NotebookLM that gives you full programmatic access.
Here's how to set it up:
Step 1: Install the notebooklm-py Package
Open a separate terminal (not Claude Code — a regular terminal window) and run the installation commands from the GitHub repo. It's a straightforward pip-based install.
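The exact commands live in the repo's README, but a pip-based install typically looks like this (the package name here is assumed from the repo title — verify it against the README before running):

```shell
# Assumed package name; check the notebooklm-py README for the exact command
pip install notebooklm-py
```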
Step 2: Authenticate with NotebookLM
In that same separate terminal, run the NotebookLM login command:
notebooklm-py login
This opens a Chrome window where you log in with your Google account. You only need to do this once.
Step 3: Install the Claude Code Skill
You can either run the skill installation command in your terminal or tell Claude Code to do it. The skill is just a markdown file — a set of plain-language instructions that teaches Claude Code how to interact with the notebooklm-py API. It covers how to create notebooks, add sources, request analysis, and generate deliverables.
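For reference, a Claude Code skill file is markdown with a short YAML frontmatter. A minimal sketch of what the NotebookLM skill might contain (the name, description, and instructions below are illustrative, not the actual skill's contents):

```markdown
---
name: notebooklm
description: Create NotebookLM notebooks, add sources, run analysis, and generate deliverables via notebooklm-py.
---

# NotebookLM skill

- To create a notebook and add sources, run the notebooklm-py commands described in the repo's README.
- To request analysis, send a chat prompt to the notebook and capture the response.
- To generate a deliverable (infographic, quiz, slide deck), request the matching artifact type and save the output to the project folder.
```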
That's it. Three steps, about five minutes total.
How Do You Set Up the YouTube Search Skill?
Before NotebookLM can analyze anything, you need sources. The YouTube search skill handles this part of the pipeline. It uses a Python script built on yt-dlp — a well-known open-source tool for extracting metadata from YouTube — to search YouTube and pull back titles, view counts, authors, durations, and URLs.
You have two options for getting this skill running:
- Build it yourself: Tell Claude Code you want a custom YouTube search skill that uses yt-dlp. Explain that you want it to search YouTube by query and return metadata. Claude Code will build the script and the skill file for you.
- Download the pre-built skill: I have the complete YouTube search skill setup file available in my free Skool community. Just download the markdown file and hand it to Claude Code.
Once installed, the skill shows up as a slash command: /yt-search with parameters for your query and the number of results you want. But you can also just use plain language — "find me the top 20 videos about Claude Code skills" works fine.
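Under the hood, the script behind a skill like this is a thin wrapper over yt-dlp's Python API. Here's a minimal sketch (the pre-built skill's script may differ; `ytsearchN:` is yt-dlp's built-in search prefix, and some fields can come back as None when yt-dlp skips a full page fetch):

```python
def search_youtube(query, limit=20):
    """Search YouTube via yt-dlp and return metadata rows (no downloads)."""
    import yt_dlp  # third-party: pip install yt-dlp

    opts = {"quiet": True, "extract_flat": True}  # metadata only
    with yt_dlp.YoutubeDL(opts) as ydl:
        info = ydl.extract_info(f"ytsearch{limit}:{query}", download=False)
    return [normalize(entry) for entry in info.get("entries", [])]


def normalize(entry):
    """Keep only the fields the skill reports back to Claude Code."""
    return {
        "title": entry.get("title"),
        "url": entry.get("url"),
        "channel": entry.get("channel") or entry.get("uploader"),
        "views": entry.get("view_count"),
        "duration_s": entry.get("duration"),
    }
```

The skill file then just tells Claude Code to run this script with the user's query and result count, which is why both the slash command and the plain-language phrasing work.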
What Does This Look Like in Practice?
Let me walk you through the demo I ran. I gave Claude Code a single prompt that told it to:
- Use the YouTube search skill to find the latest trending videos on Claude Code skills
- Send those URLs to NotebookLM
- Have NotebookLM analyze the videos and identify the top skills
- Generate an infographic in a handwritten blueprint style depicting the analysis
Here's what happened:
- Claude Code found and uploaded 20 YouTube sources into NotebookLM. For each video, I could see the title, creator, view count, duration, and date right in the terminal.
- NotebookLM analyzed all 20 videos and returned the top five Claude Code skills along with emerging trends for how they're used.
- An infographic was generated using Nano Banana Pro (Google's image generation model) in the handwritten blueprint style I requested. It automatically appeared in my project folder.
The key detail: all the content in that infographic was grounded in the actual videos. NotebookLM wasn't making things up — it was synthesizing information from 20 real sources with citations you can verify.
You can also break the process into steps if you want more control. Run the YouTube search first, review the sources, then push them to NotebookLM, then request specific analysis or deliverables one at a time. Sometimes that's smarter than doing everything in a single prompt.
What Deliverables Can You Generate with This Workflow?
Anything NotebookLM supports — and then some. Through the notebooklm-py API, you actually get more functionality than the web interface alone, including batch downloads and JSON exports.
Here's what you can request from Claude Code:
- Audio overviews (podcast-style discussions of your sources)
- Mind maps of the key concepts and relationships
- Flashcards for study or training purposes
- Quizzes exportable as JSON
- Slide decks with prompt-based revisions
- Infographics in whatever visual style you specify
- Text analysis — summaries, comparisons, trend identification
All of it stays grounded in your uploaded sources. And all of it is generated through a simple plain-language request to Claude Code.
Why Is This Combination Better Than Either Tool Alone?
There are two reasons this pairing is more powerful than the sum of its parts.
First, automation removes friction. Everything I described — the YouTube search, the source upload, the analysis, the deliverable generation — you could do manually inside NotebookLM's web interface. You could search YouTube yourself, copy-paste 20 URLs, click around the UI, and get similar results. But automating it through Claude Code means I can run this workflow in the background while I do other work. I can also chain it with other Claude Code tasks.
Second, bringing NotebookLM's analysis into Claude Code's ecosystem is where the real power lives. Once that analysis is in Claude Code, I can do anything with it — feed it into another workflow, use it to write content, build on it with additional research, generate multiple deliverables from the same source material. NotebookLM becomes a research backend that Claude Code orchestrates.
And the cost structure is wild. The entire research and analysis pipeline runs on Google's infrastructure for free. Claude Code only burns tokens on the orchestration layer — sending requests and receiving responses. The heavy compute (ingesting 20 videos, building the RAG index, running analysis, generating deliverables) all happens on NotebookLM's side.
What Are the Limits of This Setup?
Being honest about the constraints:
- NotebookLM's free tier caps you at 50 sources per notebook. For most research tasks, 20-30 sources is plenty, but if you need massive corpus analysis, you'll hit this wall. Paid tiers expand this up to 600 sources.
- notebooklm-py is an unofficial API. It works by reverse-engineering NotebookLM's internal protocols. Google could change things and break it. That said, the repo is actively maintained and has over 1,300 stars on GitHub.
- The YouTube search skill depends on yt-dlp, which occasionally needs updating as YouTube changes its platform. This is well-known in the yt-dlp community and updates come quickly.
None of these are deal-breakers. They're worth knowing so you're not surprised.
Frequently Asked Questions
Does NotebookLM have a public API?
No. As of now, Google has not released an official API for NotebookLM. The notebooklm-py package by Teng Lin is an unofficial Python API that reverse-engineers NotebookLM's internal protocols. It requires a one-time browser login for authentication and gives you full programmatic access to create notebooks, add sources, chat, and generate deliverables.
How much does this research workflow cost to run?
The NotebookLM side is completely free on the standard tier. The only cost is your Claude Code usage, and because the heavy analysis is offloaded to NotebookLM, Claude Code only spends a small number of tokens on orchestration — sending requests and receiving results. Compared to having Claude Code do all the research and analysis itself, the token savings are massive.
What kinds of sources can I push to NotebookLM besides YouTube videos?
NotebookLM accepts PDFs, Google Docs, URLs/websites, plain text, and YouTube videos. You can upload up to 50 sources per notebook on the free tier, with each source supporting up to 500,000 words. This means the YouTube workflow is just one use case — you could build similar pipelines around web articles, research papers, or internal documents.
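Because the free tier caps a notebook at 50 sources, a pipeline that gathers more than that has to split its URL list across multiple notebooks. A small helper for that is trivial to write (the 50 here is the documented free-tier cap; the function name is my own):

```python
FREE_TIER_SOURCE_CAP = 50  # NotebookLM free tier: max sources per notebook


def batch_sources(urls, cap=FREE_TIER_SOURCE_CAP):
    """Split a flat URL list into notebook-sized batches."""
    if cap < 1:
        raise ValueError("cap must be at least 1")
    return [urls[i:i + cap] for i in range(0, len(urls), cap)]
```

For example, 120 collected URLs would come back as three batches of 50, 50, and 20, each destined for its own notebook.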
Do I need to be technical to set this up?
You need basic comfort with a terminal — running pip install commands and navigating folders. You don't need to write code from scratch. The notebooklm-py package handles the API complexity, and Claude Code skills are just plain-language markdown files. If you can follow step-by-step instructions, you can get this running in five minutes.
Can I use this for something other than YouTube research?
Absolutely. The YouTube search skill is one input method. You could build skills that scrape websites, pull from RSS feeds, or ingest documents from a local folder. The NotebookLM side doesn't care where the sources come from — it just needs URLs, text, or files. The workflow pattern (source → NotebookLM → analysis → deliverable) works for any research domain.
If you want to go deeper into Claude Code workflows and AI research automation, join the free Chase AI community for templates, prompts, and live breakdowns. And if you're serious about building with AI, check out the paid community, Chase AI+, for hands-on guidance on how to make money with AI.