Guide April 18, 2026 · 19 mins · The D23 Team

MCP for Notion: Wiki-Aware AI Assistants

Learn how to build MCP servers for Notion so Claude and AI assistants can answer questions grounded in company docs and wikis.

Understanding MCP and Why Notion Matters

The Model Context Protocol (MCP) is a standardized interface that lets AI assistants like Claude access external data sources—including your Notion workspace—in real-time. Think of it as a bridge between your AI tools and your company’s knowledge base. Instead of relying on stale training data or manual copy-paste workflows, an MCP-enabled AI can read from your Notion wiki, understand your documentation structure, and answer questions with current, accurate context.

Notion has become the de facto standard for team wikis, product specs, and internal documentation across engineering and data teams. But Notion data exists in silos—separate from your AI assistants, analytics platforms, and operational workflows. The Notion MCP addresses this directly by creating a standardized protocol for AI tools to interact with Notion workspaces without building custom integrations from scratch.

For data and engineering leaders, this has immediate implications. When your team asks an AI assistant a question about data schema, product requirements, or operational procedures, that assistant can now pull from your Notion wiki in real-time. No more outdated Slack threads or buried documentation. The AI becomes genuinely wiki-aware—grounded in your company’s actual knowledge base, not guessing based on general training.

This matters especially for teams using D23’s managed Apache Superset platform for embedded analytics and self-serve BI. When your analytics dashboards live alongside your documentation in Notion, and your AI assistants can query both, you create a unified knowledge layer. Your team can ask questions like “What does the revenue dashboard measure?” or “How is our churn metric defined?” and get answers backed by both your dashboard metadata and your Notion docs.

What Is the Model Context Protocol?

The Model Context Protocol is an open standard developed to solve a specific problem: AI assistants need structured, real-time access to external data and tools, but building custom integrations for every data source is expensive and fragile.

MCP works by defining a set of standardized “tools” that an AI can call. These tools map to real operations in your external system—in this case, Notion. When you build an MCP server for Notion, you’re essentially creating a translator that says: “Here are the specific operations Claude (or any MCP-compatible AI) can perform on our Notion workspace.”

The protocol has three core components:

Resources: These are the data sources the AI can read. In Notion, resources might be your wiki pages, database entries, or specific documentation blocks. The AI can request a resource by name or identifier and receive structured content back.

Tools: These are the actions the AI can take. For Notion, tools might include “search for a page,” “read page content,” “query a database,” or “create a new page.” Tools accept parameters (like a search query) and return results.

Prompts: These are pre-built instruction templates that guide the AI on how to use the available resources and tools effectively. A prompt might say, “When answering questions about product requirements, always search the Product Spec database first.”

Unlike older integration approaches (webhooks, API polling, custom scripts), MCP uses a simple request/response model: the AI initiates a request, the MCP server handles it and returns a response. This keeps latency low and makes the system predictable.
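Concretely, a tool invocation travels as a JSON-RPC 2.0 message. The sketch below shows the envelope of one `tools/call` exchange; only the envelope is protocol-defined, while the tool name `search_notion` is an example of a tool a server might expose:

```javascript
// Minimal sketch of one MCP tools/call exchange over JSON-RPC 2.0.
// Only the envelope (jsonrpc, id, method, params/result) is protocol-defined;
// the tool name and its arguments are whatever the server advertises.
const request = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: {
    name: "search_notion",
    arguments: { query: "churn metric definition" },
  },
};

// The server replies with the same id and a content array.
const response = {
  jsonrpc: "2.0",
  id: 1,
  result: {
    content: [{ type: "text", text: '[{"id":"abc123","title":"Churn"}]' }],
  },
};

console.log(request.method, response.result.content[0].type);
```

The MCP SDKs construct and parse these messages for you; you only implement the handlers.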

Why MCP for Notion Specifically?

Notion is an ideal candidate for MCP integration because it’s where teams actually store institutional knowledge. Your product roadmap lives in Notion. Your data dictionary lives in Notion. Your runbooks, postmortems, and architectural decisions live in Notion.

But Notion data is fundamentally disconnected from your AI tools. If you ask Claude about your company’s data model, Claude has no way to access your Notion Data Dictionary database. It can’t read your product spec. It can’t reference your architectural decision records. The AI is essentially flying blind, relying on whatever you manually paste into the conversation.

Notion’s MCP integration changes this by giving AI assistants direct, authenticated access to your workspace. The AI can search for relevant pages, read their content, and ground its responses in actual company documentation. This is especially powerful for:

  • Data teams: Your AI assistant can reference your data dictionary, metric definitions, and schema documentation when answering questions about analytics.
  • Engineering teams: When building features, engineers can ask an AI to pull context from your product spec, architecture docs, and previous decisions—all from Notion.
  • Operations and finance: Portfolio companies, fund managers, and finance teams can use Notion-aware AI to answer questions about KPI definitions, reporting standards, and performance metrics—the exact use case that D23 addresses for managed analytics.
  • Cross-functional teams: Product, design, and engineering can all reference the same source of truth without context switching.

The real power emerges when you combine Notion’s documentation with other systems. For instance, if your D23 analytics dashboards live alongside your Notion product specs, an AI can explain not just what a metric is (from Notion) but also show you the dashboard that measures it (from your BI platform). This unified context is something no single tool provides on its own.

Building Your First MCP Server for Notion

Building an MCP server for Notion doesn’t require deep infrastructure expertise, but it does require understanding a few foundational concepts. Let’s walk through a practical example: creating a server that lets Claude search and read from your Notion wiki.

Prerequisites and Setup

Before you start coding, you need:

  1. A Notion workspace with pages or databases you want to expose to AI.
  2. A Notion integration token. Go to Notion’s developer portal, create an internal integration, and copy the API key. This token is how your MCP server authenticates with Notion.
  3. A Notion database or page collection you want to expose. For this example, let’s assume you have a “Company Wiki” database with pages for different topics.
  4. A development environment with Node.js or Python installed. MCP servers can be written in any language, but Python and Node.js have the most mature libraries.
  5. The MCP SDK. If you’re using Node.js, install the @modelcontextprotocol/sdk package. For Python, install the mcp package.

Your Notion integration token should be treated like a password—store it in an environment variable, never commit it to version control. The token grants access to only the databases and pages you explicitly share with the integration in Notion.
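A fail-fast loader for that environment variable is a good habit; the sketch below assumes the variable name NOTION_API_KEY (any name works, as long as the value never lands in version control):

```javascript
// Read the Notion token from the environment and fail fast if it's missing.
// Passing `env` as a parameter makes the function easy to test.
function loadNotionToken(env = process.env) {
  const token = env.NOTION_API_KEY;
  if (!token) {
    throw new Error(
      "NOTION_API_KEY is not set. Export it in your shell or load it from a .env file; never commit it."
    );
  }
  return token;
}

// Example: pass a fake environment instead of mutating process.env
const token = loadNotionToken({ NOTION_API_KEY: "secret_example" });
console.log(token.length > 0);
```

Failing at startup beats discovering a missing token on the first live request from Claude.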

Defining Your Resources and Tools

The first design decision is: what should your AI assistant be able to do with Notion?

For a wiki-aware assistant, start simple:

  • Search for pages: The AI should be able to search your wiki by keyword or topic.
  • Read page content: Once a page is found, the AI should be able to read its full content.
  • Query databases: If you have structured data (like a data dictionary or product spec database), the AI should be able to query it.
  • Retrieve metadata: The AI might need to understand page hierarchy, last-updated timestamps, or ownership information.

Don’t start with write operations (creating or editing pages). Read-only access is safer, easier to debug, and covers 80% of use cases. Once you’re comfortable with the pattern, you can add writes.

Sample Implementation: Node.js MCP Server

Here’s a simplified example of an MCP server for Notion written in Node.js. This server exposes two tools: “search_notion” and “read_notion_page.”

const { Server } = require("@modelcontextprotocol/sdk/server/index.js");
const { StdioServerTransport } = require("@modelcontextprotocol/sdk/server/stdio.js");
const {
  CallToolRequestSchema,
  ListToolsRequestSchema,
} = require("@modelcontextprotocol/sdk/types.js");
const { Client } = require("@notionhq/client");

const notion = new Client({ auth: process.env.NOTION_API_KEY });

const server = new Server(
  { name: "notion-mcp-server", version: "1.0.0" },
  { capabilities: { tools: {} } }
);

// Advertise the tools so MCP clients can discover them
server.setRequestHandler(ListToolsRequestSchema, async () => ({
  tools: [
    {
      name: "search_notion",
      description: "Search the Notion workspace by keyword",
      inputSchema: {
        type: "object",
        properties: { query: { type: "string" } },
        required: ["query"],
      },
    },
    {
      name: "read_notion_page",
      description: "Read the full content of a Notion page by ID",
      inputSchema: {
        type: "object",
        properties: { page_id: { type: "string" } },
        required: ["page_id"],
      },
    },
  ],
}));

// Helper: the title property's name varies per database, so find it by type
const pageTitle = (properties = {}) =>
  Object.values(properties).find((p) => p.type === "title")?.title?.[0]
    ?.plain_text ?? "Untitled";

// Handle invocations of both tools
server.setRequestHandler(CallToolRequestSchema, async (request) => {
  const { name, arguments: args } = request.params;

  if (name === "search_notion") {
    // Search across everything shared with the integration
    const results = await notion.search({ query: args.query });
    return {
      content: [
        {
          type: "text",
          text: JSON.stringify(
            results.results.map((r) => ({
              id: r.id,
              title: pageTitle(r.properties),
              url: r.url,
            }))
          ),
        },
      ],
    };
  }

  if (name === "read_notion_page") {
    const pageId = args.page_id;
    // Page metadata and child blocks require separate API calls
    const page = await notion.pages.retrieve({ page_id: pageId });
    const blocks = await notion.blocks.children.list({ block_id: pageId });
    return {
      content: [
        {
          type: "text",
          text: JSON.stringify({
            title: pageTitle(page.properties),
            content: blocks.results,
          }),
        },
      ],
    };
  }

  return { content: [{ type: "text", text: "Tool not found" }], isError: true };
});

async function main() {
  const transport = new StdioServerTransport();
  await server.connect(transport);
}

main();

This is a bare-bones example, but it demonstrates the pattern: define a tool, handle requests for that tool, call the Notion API, and return structured results.

In production, you’d want to add:

  • Error handling: What happens if the Notion API is down or the token expires?
  • Rate limiting: Notion has API rate limits; you need to respect them.
  • Caching: Don’t re-fetch the same page every time. Cache frequently accessed content.
  • Pagination: Notion returns results in pages; handle pagination properly.
  • Authentication: Ensure only authorized AI instances can call your server.
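Two of those concerns, caching and pagination, can be sketched independently of the SDK. In the sketch below, fetchPage and listChildren are injected stand-ins for the real Notion SDK calls, which keeps the logic testable:

```javascript
// TTL cache: avoid re-fetching the same Notion page on every question.
// `fetchPage` and `now` are injected so the cache can be tested without the API.
function makeCachedFetcher(fetchPage, ttlMs = 60_000, now = Date.now) {
  const cache = new Map();
  return async (pageId) => {
    const hit = cache.get(pageId);
    if (hit && now() - hit.at < ttlMs) return hit.value;
    const value = await fetchPage(pageId);
    cache.set(pageId, { value, at: now() });
    return value;
  };
}

// Pagination: Notion list endpoints return a batch of results plus a cursor;
// keep requesting until has_more is false.
async function fetchAllBlocks(listChildren, blockId) {
  const results = [];
  let cursor = undefined;
  do {
    const page = await listChildren({ block_id: blockId, start_cursor: cursor });
    results.push(...page.results);
    cursor = page.has_more ? page.next_cursor : undefined;
  } while (cursor);
  return results;
}
```

With the real SDK you would pass something like (args) => notion.blocks.children.list(args) as listChildren.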

Configuring Claude to Use Your MCP Server

Once your MCP server is running, you need to tell Claude (or another AI assistant) how to reach it. This depends on how you’re running Claude:

If you’re using Claude Desktop: Edit your claude_desktop_config.json file (typically ~/Library/Application Support/Claude/claude_desktop_config.json on macOS or %APPDATA%\Claude\claude_desktop_config.json on Windows) to include your MCP server:

{
  "mcpServers": {
    "notion": {
      "command": "node",
      "args": ["/path/to/notion-mcp-server.js"]
    }
  }
}

If you’re using Claude via API: You’ll need to run your MCP server as a separate service and configure the Claude API client to connect to it. The exact setup depends on your deployment model.

If you’re using Claude in a web application: You might use a hosted MCP server. Notion’s hosted MCP server is available for teams that don’t want to self-host.

Once configured, Claude will automatically discover the tools your MCP server exposes and use them when answering questions. You can test this by asking Claude a question about your Notion wiki—it should search Notion, find relevant pages, and ground its answer in the actual content.

Advanced Patterns: Multi-Database Queries and Context Enrichment

Once you have basic read access working, the next level is building more sophisticated queries. Real-world Notion workspaces have multiple databases—a data dictionary, a product spec, a roadmap, a decision log. A truly wiki-aware AI should be able to search across these intelligently and correlate information.

Cross-Database Queries

Imagine your Notion workspace has:

  • A “Data Dictionary” database with entries for each metric, dimension, and fact table.
  • A “Product Specs” database documenting features and requirements.
  • A “Dashboards” database (synced from your D23 analytics platform) documenting which dashboards measure which metrics.

When an engineer asks “What does the revenue dashboard measure?”, a sophisticated MCP server should:

  1. Search the Dashboards database for “revenue dashboard.”
  2. Extract the metric names from the dashboard entry.
  3. Search the Data Dictionary for definitions of those metrics.
  4. Return a comprehensive answer that includes the dashboard, its metrics, and their definitions.

This requires building a tool that understands relationships between databases and can chain queries. In code, this might look like:

if (name === "explain_dashboard") {
  const dashboardName = args.dashboard_name;

  // Step 1: Find the dashboard ("Name" is the database's title property,
  // so the filter condition is "title", not the deprecated "text")
  const dashboardResults = await notion.databases.query({
    database_id: DASHBOARDS_DB_ID,
    filter: {
      property: "Name",
      title: { contains: dashboardName },
    },
  });

  if (dashboardResults.results.length === 0) {
    return {
      content: [{ type: "text", text: "Dashboard not found" }],
      isError: true,
    };
  }

  const dashboard = dashboardResults.results[0];
  const metrics = dashboard.properties.Metrics.multi_select.map(m => m.name);

  // Step 2: Look up each metric in the Data Dictionary
  const definitions = await Promise.all(
    metrics.map(metricName =>
      notion.databases.query({
        database_id: DATA_DICTIONARY_DB_ID,
        filter: {
          property: "Metric Name",
          title: { equals: metricName },
        },
      })
    )
  );

  return {
    content: [
      {
        type: "text",
        text: JSON.stringify({
          dashboard: dashboard.properties.Name.title[0].plain_text,
          metrics: definitions
            .filter(d => d.results.length > 0)
            .map(d => ({
              name: d.results[0].properties["Metric Name"].title[0].plain_text,
              definition:
                d.results[0].properties.Definition.rich_text[0]?.plain_text ?? "",
            })),
        }),
      },
    ],
  };
}

This pattern—searching one database, extracting identifiers, then searching related databases—is powerful because it mirrors how humans navigate documentation. It gives your AI assistant the ability to “follow the thread” through your knowledge base.

Embedding Context in Prompts

Another advanced pattern is using MCP prompts to guide the AI’s behavior. Instead of leaving it to the AI to figure out when to search Notion, you can explicitly tell it: “When answering questions about metrics, always search the Data Dictionary first.”

In your MCP server, you can define prompts like:

const {
  ListPromptsRequestSchema,
  GetPromptRequestSchema,
} = require("@modelcontextprotocol/sdk/types.js");

// Advertise the prompt so clients can discover it
server.setRequestHandler(ListPromptsRequestSchema, async () => ({
  prompts: [
    {
      name: "metric_expert",
      description: "Answer metric questions grounded in the Data Dictionary",
      arguments: [
        {
          name: "metric_name",
          description: "The name of the metric to explain",
          required: true,
        },
      ],
    },
  ],
}));

// The actual instruction text is returned when the prompt is requested
server.setRequestHandler(GetPromptRequestSchema, async (request) => ({
  messages: [
    {
      role: "user",
      content: {
        type: "text",
        text: `You are a metric definition expert. Search the Data Dictionary for "${request.params.arguments.metric_name}" and ground your answer in the actual definition.`,
      },
    },
  ],
}));

When Claude is invoked with the “metric_expert” prompt, it knows to prioritize Notion searches for metric definitions. This is a lightweight way to embed domain knowledge into your AI workflow without hardcoding logic.

Integrating Notion MCP with Analytics and BI Platforms

For data and analytics teams, the real power of Notion MCP emerges when you connect it to your BI platform. If you’re using D23’s managed Apache Superset, you can create a unified knowledge layer where your dashboards, metrics, and documentation are all accessible to AI.

Here’s a practical workflow:

  1. Your data team builds dashboards in D23 that measure key metrics (revenue, churn, LTV, etc.). Each dashboard is documented with metadata: the underlying tables, the calculation logic, the business context.

  2. Your team documents these metrics in Notion with formal definitions, calculation formulas, and any caveats or limitations.

  3. You build an MCP server that can search both your D23 dashboards (via the D23 API) and your Notion documentation.

  4. When a stakeholder asks “What’s our monthly churn rate?”, Claude can:

    • Search Notion for the churn metric definition.
    • Query D23 for the latest churn dashboard.
    • Provide both the definition and the current metric value in a single response.

This transforms your analytics from a pull model (stakeholders hunt for dashboards) to a push model (stakeholders ask questions and get answers with full context).

For teams managing multiple portfolio companies (a common use case for PE and VC firms), this becomes even more valuable. You can have a master Notion workspace with standardized KPI definitions, and each portfolio company’s dashboards in D23. An AI assistant can then answer questions like “How does Company A’s churn compare to Company B’s?” by pulling from both sources.
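The merge step in that workflow can be sketched independently of either API. In the sketch below, fetchDefinition and fetchLatestValue are hypothetical stand-ins for the Notion query and the BI-platform call:

```javascript
// Combine a metric's definition (from the wiki) with its latest value (from BI).
// Both fetchers are hypothetical stand-ins for real API calls.
async function answerMetricQuestion(metricName, fetchDefinition, fetchLatestValue) {
  // Run both lookups concurrently; neither depends on the other
  const [definition, latest] = await Promise.all([
    fetchDefinition(metricName),
    fetchLatestValue(metricName),
  ]);
  return {
    metric: metricName,
    definition,                // e.g. "Churn = cancelled accounts / active accounts"
    latestValue: latest.value, // e.g. 0.031
    asOf: latest.asOf,
  };
}
```

The AI gets one structured answer carrying both the definition and the number, instead of two disconnected lookups.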

Security and Permissions in MCP Notion Servers

When you expose Notion data to AI assistants via MCP, you’re creating a new access path. You need to think carefully about permissions and security.

Notion integration tokens are powerful: A single token grants access to all databases and pages shared with it. If you’re building an MCP server that runs in production, you need to:

  1. Use a dedicated integration: Create a Notion integration specifically for your MCP server, not a personal integration. This makes it easier to audit and revoke access.

  2. Limit the scope: In Notion, explicitly grant your integration access only to the databases and pages it needs. Don’t give it workspace-wide access.

  3. Rotate tokens regularly: Treat your Notion API key like a database password. Rotate it every 90 days or if you suspect compromise.

  4. Audit access: Log all requests your MCP server makes to Notion. If something goes wrong, you need a paper trail.

  5. Implement rate limiting: Notion has strict API rate limits. If your MCP server gets compromised and starts making requests rapidly, Notion will block you. Implement local rate limiting to prevent this.

  6. Authenticate the AI: If your MCP server is exposed over a network, require authentication. Don’t let any AI instance call your server; only your authorized Claude instances should have access.

For teams using D23’s managed analytics platform, the same principles apply if you’re building an MCP server that queries D23 dashboards. Use API keys with minimal necessary permissions, rotate them regularly, and log all access.

Troubleshooting and Common Pitfalls

Building MCP servers for Notion is relatively straightforward, but there are common mistakes:

Mistake 1: Forgetting to share databases with the integration

Your MCP server’s Notion token can only access databases and pages you explicitly shared with it in Notion. If you build a tool that queries a database but forgot to share it with the integration, you’ll get permission errors. Always double-check the integration settings in Notion.

Mistake 2: Parsing Notion’s block structure incorrectly

Notion pages are composed of blocks (paragraphs, headings, code blocks, etc.). Each block has a different structure. When you fetch a page’s blocks, you need to handle each type correctly. A paragraph block looks different from a database block. Write a helper function that converts Notion blocks into plain text or markdown for the AI to understand.
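A sketch of such a helper, handling three common block types (the shapes follow the current Notion block API, where text lives under rich_text; extend the switch for the block types your wiki actually uses):

```javascript
// Flatten a Notion rich_text array into a plain string.
function richTextToPlain(richText = []) {
  return richText.map((t) => t.plain_text).join("");
}

// Convert a few common Notion block types to markdown; unknown types are skipped.
function blocksToMarkdown(blocks) {
  return blocks
    .map((block) => {
      switch (block.type) {
        case "paragraph":
          return richTextToPlain(block.paragraph.rich_text);
        case "heading_1":
          return "# " + richTextToPlain(block.heading_1.rich_text);
        case "code": {
          const fence = "`".repeat(3);
          return fence + (block.code.language || "") + "\n" +
            richTextToPlain(block.code.rich_text) + "\n" + fence;
        }
        default:
          return null; // extend with bulleted_list_item, toggle, etc. as needed
      }
    })
    .filter((line) => line !== null)
    .join("\n\n");
}
```

Feeding markdown to the AI instead of raw block JSON makes answers noticeably better and cuts token usage.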

Mistake 3: Hitting rate limits

Notion’s rate limit averages three requests per second per integration. If your MCP server is popular and lots of AI instances are querying it simultaneously, you’ll hit the rate limit. Implement caching (Redis, in-memory, or simple file-based) so frequently accessed pages don’t require a Notion API call every time.
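Alongside caching, a local token bucket keeps you under the limit even during bursts. This is a minimal sketch; the clock is injected so the refill logic can be tested without real time passing:

```javascript
// Token bucket: allow bursts up to `capacity`, refill at `ratePerSec`.
// The default rate matches Notion's documented three-requests-per-second average.
function makeRateLimiter(ratePerSec = 3, capacity = 3, now = () => Date.now()) {
  let tokens = capacity;
  let last = now();
  return function tryAcquire() {
    const t = now();
    // Refill proportionally to elapsed time, capped at capacity
    tokens = Math.min(capacity, tokens + ((t - last) / 1000) * ratePerSec);
    last = t;
    if (tokens >= 1) {
      tokens -= 1;
      return true;  // caller may proceed with the Notion API call
    }
    return false;   // caller should wait and retry
  };
}
```

In the MCP server you would check tryAcquire() before each Notion call and queue or retry when it returns false.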

Mistake 4: Exposing sensitive information

Your Notion workspace might contain sensitive data—financial information, customer details, security vulnerabilities. Be intentional about what you expose via MCP. Consider creating a separate “AI-safe” Notion workspace with sanitized documentation, or use database filters to exclude sensitive pages.

Mistake 5: Not testing with Claude before deploying

Build your MCP server, configure it locally, and test it thoroughly with Claude before deploying to production. Ask Claude questions that should trigger your tools and verify it’s getting the right results. Check that error handling works (what happens if a page doesn’t exist?). Test with different types of content—plain text, tables, code blocks.

Real-World Use Cases

Let’s walk through a few concrete scenarios where Notion MCP delivers value:

Data Team Onboarding

A new data analyst joins your company. Instead of spending a week reading documentation, they can ask Claude: “What tables are in our data warehouse?” Claude searches your Notion Data Dictionary and returns a structured list. “What’s the difference between users and accounts?” Claude pulls the definitions from Notion. “Show me a dashboard that uses the accounts table.” Claude queries your D23 dashboards and returns examples. Onboarding time drops from a week to a day.

Cross-Functional Decision Making

Your product team is debating whether to change how you calculate a key metric. Instead of hunting through Slack and old emails, they ask Claude: “What’s the current definition of monthly active users? Who owns it? When was it last changed?” Claude searches your Notion decision log and metric definitions, pulls the owner from the database, and provides full context. The decision is made faster and with better information.

Portfolio Company Standardization

You’re a PE firm with 15 portfolio companies. Each has its own analytics setup, but you want to standardize KPI definitions and reporting. You create a master Notion workspace with standardized metric definitions, and each portfolio company has dashboards in D23. Your MCP server can answer questions like “How does Company A’s gross margin compare to the portfolio average?” by pulling from both sources. Reporting becomes consistent and automated.

AI-Assisted Analytics

Your team is exploring text-to-SQL and AI-assisted query generation (similar to how D23 integrates AI analytics into its platform). Instead of the AI generating queries in a vacuum, it can ask Notion: “What’s the schema of the revenue table? What are the date ranges in this data?” Claude has context and generates better, more accurate queries.

Choosing Between Self-Hosted and Hosted MCP Servers

You have two main options for running your Notion MCP server:

Self-hosted: You run the server yourself, either locally, on a VPS, or in your cloud infrastructure (AWS, GCP, Azure). This gives you full control but requires you to manage deployment, monitoring, and scaling.

Hosted: Services like Notion’s official MCP server or third-party providers host the server for you. You authenticate with your Notion token, and the service handles the rest. This is simpler but requires trusting a third party with your Notion credentials.

For most teams, start with a hosted solution to reduce operational overhead. If you have specific customization needs (like cross-database queries or integration with proprietary systems), self-host.

If you’re building a product that embeds AI analytics (like D23’s embedded BI capabilities), and you want to give your users wiki-aware AI, you might build your own MCP server that integrates with your platform’s data model.

Extending MCP Beyond Notion

Once you understand MCP for Notion, the pattern extends to other systems. You can build MCP servers for:

  • Confluence or other wikis: Same concept as Notion—make your documentation AI-accessible.
  • Jira or issue trackers: Let AI understand your project context and roadmap.
  • Your BI platform: Build an MCP server that lets AI query your dashboards and metrics. D23’s API-first design makes this straightforward.
  • GitHub or GitLab: Let AI understand your codebase and commit history.
  • Slack: Let AI search your chat history and institutional knowledge stored in threads.

The MCP pattern is: identify a data source your team uses, build a standardized interface (tools and resources) for accessing it, and suddenly your AI assistants have real context. This is how you build truly intelligent, company-aware AI systems.

Getting Started Today

If you want to build an MCP server for Notion right now, here’s your roadmap:

  1. Read the docs: Start with Notion’s official MCP documentation and the Notion MCP help center to understand the protocol and Notion’s API.

  2. Create a test Notion workspace: Don’t use your production workspace. Create a test workspace with a few sample databases and pages.

  3. Choose your language: Pick Python or Node.js. Both have good MCP libraries. Node.js examples are more abundant.

  4. Start with search and read: Don’t try to build everything at once. Get search and read working first, then add more complex queries.

  5. Test locally with Claude Desktop: Download Claude Desktop, configure your MCP server, and test it.

  6. Iterate and expand: Once basic functionality works, add caching, error handling, and more sophisticated queries.

  7. Deploy and monitor: When you’re ready, deploy to production and set up logging and alerting.

For teams already using D23 for managed Apache Superset and embedded analytics, you can extend this pattern by building an MCP server that queries your D23 dashboards and metrics alongside Notion documentation. This creates a truly unified knowledge layer where your AI assistants understand both your data and your business context.

Conclusion

MCP for Notion is a powerful pattern for making your company’s knowledge base accessible to AI. By building a bridge between your Notion workspace and AI assistants like Claude, you create wiki-aware AI that answers questions grounded in your actual documentation, not guesses based on general training.

The implementation is straightforward—define tools, call the Notion API, return results. But the impact is significant: faster onboarding, better decision-making, and AI assistants that understand your company’s context.

Start small, test locally, and iterate. Once you have Notion MCP working, you’ll see opportunities to extend the pattern to other systems—your analytics platform, your issue tracker, your code repository. The future of AI in enterprise is context-aware AI, and MCP is how you build it.

For analytics and data teams specifically, combining Notion MCP with a platform like D23’s managed Apache Superset creates a powerful combination: AI assistants that understand both your metrics and your data, dashboards that are documented and discoverable, and a unified knowledge layer that serves your entire organization. That’s the kind of AI-powered analytics infrastructure that drives real competitive advantage.