Using the NotebookLM Model Context Protocol (MCP) server in Codex lets a coding session ask grounded questions against a NotebookLM notebook instead of relying on general model recall. That is useful when a library, internal API, or project handbook changes faster than public documentation and the session needs source-backed answers before it writes code.

Current NotebookLM MCP upstream docs describe the Codex integration as a local stdio server added with codex mcp add notebooklm -- npx -y notebooklm-mcp@latest. Current local verification shows Codex storing that server as a named MCP entry, while the server itself exposes question, notebook, library, session, and authentication tools that bridge Codex to a NotebookLM browser session.

A successful run still depends on a valid Google sign-in and a notebook that NotebookLM can access. Current Google help says shared public notebooks need "Anyone with a link" access and that public sharing is limited to consumer accounts. Current upstream server docs default new installs to the full tool profile unless a smaller profile such as minimal is configured for query-only use.
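For reference, Codex persists saved MCP servers in its configuration file, typically ~/.codex/config.toml. The entry created by the add command corresponds roughly to the fragment below; this is an illustrative sketch, and exact key names can vary by Codex version:

```toml
# Sketch of the saved MCP entry in ~/.codex/config.toml
# (illustrative; key names may vary by Codex version)
[mcp_servers.notebooklm]
command = "npx"
args = ["-y", "notebooklm-mcp@latest"]
```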

Steps to use the NotebookLM MCP server in Codex:

  1. Confirm that Codex can see the saved NotebookLM MCP entry before starting a chat session.
    $ codex mcp get notebooklm
    notebooklm
      enabled: true
      transport: stdio
      command: npx
      args: -y notebooklm-mcp@latest
      cwd: -
      env: -
      remove: codex mcp remove notebooklm

    If the saved server name is different, list the saved servers with codex mcp list first and use the exact Name value from that output. If no entry exists yet, add it with codex mcp add notebooklm -- npx -y notebooklm-mcp@latest before continuing.
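The check-then-add sequence can be scripted so repeated runs are harmless. A minimal sketch, assuming the codex CLI is on PATH and the entry name notebooklm matches step 1:

```shell
#!/bin/sh
# Idempotent registration sketch for the NotebookLM MCP entry.
if ! command -v codex >/dev/null 2>&1; then
  # codex CLI missing: report it rather than failing mid-script
  echo "codex CLI not found; install it before registering the server"
elif codex mcp get notebooklm >/dev/null 2>&1; then
  # entry already saved; nothing to do
  echo "notebooklm entry already present"
else
  # register the server exactly as the upstream docs describe
  codex mcp add notebooklm -- npx -y notebooklm-mcp@latest
fi
```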

  2. Start an interactive Codex session in the project or working directory where the NotebookLM-backed answers will be used.
    $ codex

    Current upstream NotebookLM MCP quick-start guidance uses chat prompts inside an active Codex session, so interactive mode is the clearest path when follow-up questions or code changes depend on the same notebook context.
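For a single grounded question that needs no follow-up, Codex also offers a non-interactive mode. A hedged sketch, assuming codex exec is available in the installed Codex CLI version and the notebooklm entry from step 1 is registered; the prompt text is illustrative:

```shell
#!/bin/sh
# One-shot grounded question without entering the interactive UI (sketch).
PROMPT='Use the notebooklm tools to summarize the retry guidance in the linked notebook.'
if command -v codex >/dev/null 2>&1; then
  codex exec "$PROMPT"
else
  echo "codex CLI not found on PATH"
fi
```

Interactive mode remains the better fit when follow-up questions or code changes build on the same notebook context.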

  3. Ask Codex to start NotebookLM authentication when the server has no active Google session or reports that the saved session expired.
    Log me in to NotebookLM

    Current upstream server docs say this prompt opens a Chrome window for Google sign-in from the local machine running the server. Complete the sign-in there, then return to the Codex session.

  4. Give Codex the exact notebook it should use for the current task by pasting a shared notebook link into the chat.
    Use this NotebookLM notebook for grounded answers while you work on the SDK migration:
    https://notebooklm.google.com/notebook/18aeaa72-9a60-4b49-a9c4-ebb8e4afc559

    Current upstream server docs use this direct-link pattern for ad hoc research. Current Google help says the notebook share panel should be set to "Anyone with a link" before another account can open a public notebook link.

    Current Google help says public notebook sharing is not available for Workspace Enterprise or Education accounts. In those environments, use a notebook the authenticated Google account can open directly instead of expecting a public link to work.
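Before pasting a link into the chat, a quick local check of its shape can catch copy-paste mistakes. This heuristic only validates the URL pattern, not sharing settings or access; the example link is the one from step 4:

```shell
#!/bin/sh
# Sanity-check the shape of a NotebookLM share link before handing it to Codex.
# This is a local pattern match only; it does not verify access or permissions.
link="https://notebooklm.google.com/notebook/18aeaa72-9a60-4b49-a9c4-ebb8e4afc559"
case "$link" in
  https://notebooklm.google.com/notebook/*)
    echo "looks like a NotebookLM notebook link" ;;
  *)
    echo "not a NotebookLM notebook link" ;;
esac
```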

  5. Ask a concrete research question that forces Codex to use the notebook instead of answering from generic memory.
    Research the rate-limit behavior in that notebook before you change the API client, then summarize the supported retry pattern and cite the relevant source passages.

    The decisive success signal is a Codex reply grounded in that notebook, ideally with notebook-specific details or citations rather than a generic best-practice answer. If the server says the information is missing, switch notebooks or provide a different NotebookLM link instead of pushing Codex to guess.