Guide April 18, 2026 · 23 mins · The D23 Team

MCP Server Authentication: OAuth, API Keys, and mTLS

Learn OAuth, API keys, and mTLS authentication for production MCP servers. Secure analytics APIs with real-world patterns for Superset integration.


Understanding MCP Server Authentication in Production

When you’re building analytics infrastructure that exposes data through Model Context Protocol (MCP) servers, authentication isn’t a checkbox—it’s the foundation of your security posture. Whether you’re embedding self-serve BI dashboards, exposing text-to-SQL capabilities, or connecting AI agents to analytics tools, the authentication mechanism you choose determines who accesses your data, how they prove their identity, and whether you can audit and revoke access in real time.

This is especially critical when using managed Apache Superset platforms that expose MCP servers for AI-powered analytics. Your MCP server becomes a gateway to sensitive business data, query logs, and dashboard definitions. A weak authentication implementation can leak customer data, allow unauthorized query execution, or expose your entire analytics stack to compromise.

The good news: there are three mature, battle-tested authentication patterns for MCP servers in production. Each has distinct strengths, weaknesses, and use cases. This guide walks you through all three—from basic API keys to enterprise-grade OAuth 2.1 with PKCE, and mutual TLS (mTLS) for zero-trust environments—with concrete examples, decision frameworks, and real-world guidance for data and engineering leaders.

API Keys: Simple, Stateless, and Limited

API keys are the simplest authentication mechanism. A client includes a secret string (the key) in a request header or query parameter, and your MCP server validates it against a stored list of valid keys. No cryptographic handshakes, no token exchanges, no authorization servers.

Here’s what a basic API key flow looks like:

Client sends request:

GET /api/dashboards
Authorization: Bearer sk_live_abc123xyz789

Server validates:

if api_key in VALID_KEYS:
    return dashboards
else:
    return 401 Unauthorized

When API Keys Work Well

API keys are pragmatic for:

  • Development and prototyping: You need authentication fast, without infrastructure overhead. A developer can generate a key in seconds and start testing.
  • Service-to-service communication with a small, known set of clients. If you have three internal microservices that need to call your MCP server, and they never change, API keys are simple and sufficient.
  • Minimal validation logic: No authorization server, no token exchange, no signature verification. The server simply checks the presented key against its stored set of valid keys.
  • Low-risk data: If your MCP server exposes non-sensitive aggregations or public dashboards, the security bar is lower.

According to MCP Server Authentication Methods: A Practical Guide, API keys remain a valid choice for prototype environments and internal tooling, though they require careful key rotation and storage practices.

The Critical Weaknesses of API Keys

API keys have fundamental limitations that make them unsuitable for production analytics at scale:

No granular permissions: An API key either grants access or it doesn’t. You can’t scope a key to “read-only dashboards” or “execute queries only on the sales dataset.” If the key is compromised, the attacker has full access.

No audit trail of who did what: If a key is leaked and used maliciously, you can’t tell which specific client or user made the request. You only know the key was used. This is a compliance nightmare for regulated industries (healthcare, finance, etc.).

Difficult revocation and rotation: Revoking an API key means updating every client that uses it. If you have dozens of embedded analytics instances across customer deployments, rotating keys becomes a coordination nightmare. You either have downtime or you maintain two keys per client during rotation, which doubles your storage and validation overhead.
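If you must stay on API keys for a while, the standard workaround is exactly that dual-key overlap: accept a set of keys per client during rotation, then retire the old one. A sketch with invented client and key names:

```python
# Hypothetical rotation scheme: each client maps to a set of currently
# accepted keys. During rotation both old and new keys are listed, so the
# client can switch over without downtime; afterwards the old key is dropped.
ACTIVE_KEYS = {
    "client_a": {"sk_old_111", "sk_new_222"},  # mid-rotation: both valid
    "client_b": {"sk_live_333"},
}

def authenticate(client_id: str, api_key: str) -> bool:
    """Accept any key currently active for this client."""
    return api_key in ACTIVE_KEYS.get(client_id, set())

def finish_rotation(client_id: str, retired_key: str) -> None:
    """Drop the retired key once the client has switched to the new one."""
    ACTIVE_KEYS.get(client_id, set()).discard(retired_key)
```

This keeps rotation zero-downtime, but it is exactly the doubled storage and validation overhead described above.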

No expiration or time-limited access: Keys are typically static and permanent. A compromised key remains valid forever unless you explicitly revoke it. You have no way to issue short-lived credentials that automatically expire.

Transport and storage risks: Keys are often stored in environment variables, config files, or CI/CD systems. They’re passed in HTTP headers, which means they’re visible in logs, browser history, and proxy caches unless you’re using HTTPS with careful header filtering.
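Two of these storage risks can at least be softened without abandoning API keys: store only hashes of the keys, and compare digests in constant time. A minimal sketch (the key value and store are illustrative):

```python
import hashlib
import hmac

# Hypothetical key store: keep only SHA-256 digests of issued keys, so a
# leaked config dump or log line does not reveal the keys themselves.
VALID_KEY_HASHES = {
    hashlib.sha256(b"sk_live_abc123xyz789").hexdigest(),
}

def validate_api_key(presented_key: str) -> bool:
    """Hash the presented key and compare digests in constant time."""
    digest = hashlib.sha256(presented_key.encode()).hexdigest()
    # hmac.compare_digest avoids leaking the match position via timing
    return any(hmac.compare_digest(digest, h) for h in VALID_KEY_HASHES)
```

This hardens storage and comparison, but it does nothing for the deeper problems of scoping, expiry, and auditability that the next section addresses.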

For teams embedding analytics in products or managing multi-tenant deployments, API keys create unacceptable security and operational risks.

OAuth 2.1: Industry-Standard Delegated Authorization

OAuth 2.1 is the modern standard for delegated authorization. Instead of sharing a secret key, clients request temporary access tokens from an authorization server. Tokens are short-lived, scoped to specific permissions, and can be revoked instantly.

Here’s the OAuth 2.1 flow for MCP servers:

Step 1: Client authenticates to authorization server

POST /oauth/token
Content-Type: application/x-www-form-urlencoded

grant_type=client_credentials
&client_id=my_app_id
&client_secret=my_app_secret
&scope=analytics:read dashboards:read

Step 2: Authorization server issues access token

{
  "access_token": "eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCJ9...",
  "token_type": "Bearer",
  "expires_in": 3600,
  "scope": "analytics:read dashboards:read"
}

Step 3: Client uses token to call MCP server

GET /api/dashboards
Authorization: Bearer eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCJ9...

Step 4: MCP server validates token and grants access

token = extract_bearer_token(request)
if validate_token_signature(token) and token.scope.includes('dashboards:read'):
    return dashboards
else:
    return 401 Unauthorized

Why OAuth 2.1 Is the Right Choice for Production Analytics

OAuth 2.1 solves the core problems with API keys:

Granular, delegated permissions: Tokens include a scope field that specifies exactly what the client can do. You can issue a token scoped to dashboards:read (read-only access to dashboards) separate from queries:execute (permission to run ad-hoc queries). If a token is compromised, the blast radius is limited to its declared scope.

Short-lived credentials: Tokens typically expire in 15 minutes to 1 hour. Even if a token is stolen, it’s only valid for a short window. After expiration, the client must re-authenticate to the authorization server to get a new token.

Instant revocation: You can revoke a token immediately at the authorization server without updating every client. The next time that token is used, validation fails. This is critical for responding to security incidents.

Audit trail: Each token is tied to a specific client (via client_id). When a request is made with a token, your audit logs show exactly which client made the request and what permissions it had. This satisfies compliance requirements and enables forensic analysis.

Decoupled client and server: The client doesn’t need to know about your MCP server’s internal authentication logic. It just requests a token from a standard authorization server. Your authorization server can be a third-party provider (Auth0, Okta, etc.) or an internal service.

According to MCP OAuth: How OAuth 2.1 Works in the Model Context Protocol, OAuth 2.1 with PKCE (Proof Key for Code Exchange) is the recommended pattern for multi-client scenarios, especially when MCP servers are exposed to untrusted networks or embedded in third-party applications.

OAuth 2.1 Implementation Patterns for MCP Servers

Client Credentials Grant (M2M authentication): When your MCP server is called by another service or application, use the client credentials grant. The client authenticates with its client_id and client_secret, and receives a token. This is appropriate for service-to-service communication, embedded analytics in your own products, and internal tooling.

Authorization Code Grant with PKCE (User-delegated access): When a user is involved—for example, a user logging into your embedded analytics dashboard—use the authorization code grant with PKCE. The user authenticates to the authorization server, grants the application permission to access their data, and the application receives a token on their behalf. PKCE (Proof Key for Code Exchange) prevents token interception attacks and is mandatory in OAuth 2.1.
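The PKCE half of that flow is small enough to show in full: the client derives a one-time code_challenge from a random code_verifier using the S256 method defined in RFC 7636, sends the challenge with the authorization request, and later proves possession by sending the verifier with the token request:

```python
import base64
import hashlib
import secrets

def make_pkce_pair():
    """Generate an RFC 7636 code_verifier and its S256 code_challenge."""
    # 32 random bytes -> 43-character base64url verifier (padding stripped)
    code_verifier = base64.urlsafe_b64encode(secrets.token_bytes(32)).rstrip(b"=").decode()
    # challenge = BASE64URL(SHA256(verifier)), also without padding
    digest = hashlib.sha256(code_verifier.encode()).digest()
    code_challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode()
    return code_verifier, code_challenge
```

An attacker who intercepts the authorization code cannot redeem it without the verifier, which never leaves the client until the token request.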

Token Validation at the MCP Server: Your MCP server doesn’t validate tokens by calling the authorization server on every request (that’s slow and creates a runtime dependency). Instead, tokens are JWTs (JSON Web Tokens) signed by the authorization server. Your server holds the authorization server’s public key, verifies the signature locally, and checks the expiration and scope claims. This is stateless and fast.

According to Understanding Authorization in MCP, implementing OAuth 2.1 authorization for MCP servers involves protecting resources with access tokens and validating token signatures at the server level.

Real-World OAuth 2.1 Example: Securing a Superset Analytics MCP Server

Imagine you’re running a managed Apache Superset instance and exposing an MCP server that allows AI agents and internal tools to query dashboards and execute text-to-SQL queries. Here’s how you’d secure it with OAuth 2.1:

Authorization Server Setup (using a service like Auth0 or an internal OAuth provider):

  • Create an OAuth application for your MCP server.
  • Define scopes: dashboards:read, queries:read, queries:execute, exports:create.
  • Generate a signing key pair (public key for token validation, private key for signing).

Client Registration:

  • Register each client that will call your MCP server (your AI agent, your embedded analytics frontend, your internal reporting tool).
  • Assign each client a client_id and client_secret.
  • Specify which scopes each client is allowed to request (least privilege).

Token Issuance: When a client needs to call your MCP server, it requests a token:

curl -X POST https://auth.example.com/oauth/token \
  -H "Content-Type: application/x-www-form-urlencoded" \
  -d "grant_type=client_credentials" \
  -d "client_id=ai_agent_123" \
  -d "client_secret=secret_xyz" \
  -d "scope=dashboards:read queries:execute"

The authorization server returns:

{
  "access_token": "eyJhbGciOiJSUzI1NiIsInR5cCI6IkpXVCJ9.eyJzdWIiOiJhaV9hZ2VudF8xMjMiLCJzY29wZSI6ImRhc2hib2FyZHM6cmVhZCBxdWVyaWVzOmV4ZWN1dGUiLCJleHAiOjE3MDAwMDAwMDB9.signature",
  "token_type": "Bearer",
  "expires_in": 3600
}

Token Validation at MCP Server: Your MCP server receives the token and validates it:

import jwt
from flask import Flask, request
from functools import wraps

app = Flask(__name__)

# Public key from authorization server
PUBLIC_KEY = "-----BEGIN PUBLIC KEY-----...-----END PUBLIC KEY-----"

def extract_bearer_token(request):
    """Return the token from an "Authorization: Bearer ..." header."""
    auth_header = request.headers.get("Authorization", "")
    return auth_header[7:] if auth_header.startswith("Bearer ") else ""

def require_oauth_scope(*required_scopes):
    def decorator(f):
        @wraps(f)
        def decorated_function(*args, **kwargs):
            token = extract_bearer_token(request)
            try:
                payload = jwt.decode(token, PUBLIC_KEY, algorithms=["RS256"])
            except jwt.InvalidTokenError:
                return {"error": "Invalid token"}, 401
            
            token_scopes = set(payload.get("scope", "").split())
            if not any(scope in token_scopes for scope in required_scopes):
                return {"error": "Insufficient permissions"}, 403
            
            request.client_id = payload["sub"]
            request.token_scopes = token_scopes
            return f(*args, **kwargs)
        return decorated_function
    return decorator

@app.route("/api/dashboards")
@require_oauth_scope("dashboards:read")
def get_dashboards():
    # Log the client_id for audit
    log_access(client_id=request.client_id, action="list_dashboards")
    return {"dashboards": [...]}

@app.route("/api/queries", methods=["POST"])
@require_oauth_scope("queries:execute")
def execute_query():
    log_access(client_id=request.client_id, action="execute_query")
    return execute_superset_query(request.json)

Audit and Revocation: Every request is logged with the client_id, timestamp, action, and scopes. If a client is compromised, you revoke its credentials at the authorization server. Existing tokens remain valid until they expire (typically within an hour), but the client can’t issue new tokens.
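Revocation itself is a single call to the authorization server. RFC 7009 defines the request shape (a form-encoded POST with the token, an optional token_type_hint, and client credentials); the endpoint path below mirrors the earlier examples but varies by provider:

```python
def build_revocation_request(auth_base_url: str, token: str,
                             client_id: str, client_secret: str):
    """Assemble an RFC 7009 revocation request.

    The /oauth/revoke path is illustrative; check your authorization
    server's documentation for its actual revocation endpoint.
    """
    url = f"{auth_base_url}/oauth/revoke"
    form = {
        "token": token,
        "token_type_hint": "access_token",  # or "refresh_token"
        "client_id": client_id,
        "client_secret": client_secret,
    }
    return url, form
```

You would POST `form` to `url` with your HTTP client; per RFC 7009 the server responds 200 even if the token was already invalid, so revocation is safely idempotent.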

Mutual TLS (mTLS): Zero-Trust Network Authentication

While OAuth 2.1 handles application-level authentication (“who is this client?”), mTLS handles network-level authentication (“is this connection trusted?”). mTLS requires both the client and server to present digital certificates to prove their identity before any data is exchanged.

In mTLS, both sides perform verification:

Client → Server: “Here’s my certificate. Is it signed by a CA you trust?”
Server → Client: “Here’s my certificate. Is it signed by a CA you trust?”

Only if both certificates are valid does the TLS handshake complete and data flow.

How mTLS Works for MCP Servers

Step 1: Certificate Generation: You create a Certificate Authority (CA) and issue certificates to each client and your MCP server:

  • Server certificate: Identifies your MCP server to clients.
  • Client certificate: Identifies each client to your server.
  • CA certificate: Used by both sides to verify the other’s certificate.

Step 2: Server Configuration: Your MCP server is configured to require client certificates and verify them against your CA:

import ssl
from flask import Flask

app = Flask(__name__)

# Create SSL context
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain(
    certfile="/etc/certs/server.crt",
    keyfile="/etc/certs/server.key"
)
context.load_verify_locations("/etc/certs/ca.crt")
context.verify_mode = ssl.CERT_REQUIRED  # Require client cert

app.run(ssl_context=context, host="0.0.0.0", port=443)

Step 3: Client Configuration: Clients present their certificate when connecting:

curl --cert /etc/certs/client.crt \
     --key /etc/certs/client.key \
     --cacert /etc/certs/ca.crt \
     https://mcp-server.example.com/api/dashboards
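The same three curl flags map one-to-one onto Python's standard-library ssl module, if you'd rather build the client in code (the certificate paths are placeholders):

```python
import ssl

def make_mtls_client_context(ca_file=None, cert_file=None, key_file=None):
    """Client-side SSL context mirroring the curl flags above.

    ca_file corresponds to --cacert, cert_file/key_file to --cert/--key.
    """
    # Verify the server certificate against our CA
    ctx = ssl.create_default_context(ssl.Purpose.SERVER_AUTH, cafile=ca_file)
    # Present our client certificate during the handshake
    if cert_file and key_file:
        ctx.load_cert_chain(certfile=cert_file, keyfile=key_file)
    return ctx
```

Pass the resulting context to `http.client.HTTPSConnection(..., context=ctx)` or to your HTTP library of choice.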

Step 4: Server Validation: The server extracts the client certificate’s subject (e.g., CN=client-ai-agent-1) and uses it for authorization:

@app.route("/api/dashboards")
def get_dashboards():
    # Extract client identity from certificate. SSL_CLIENT_CERT is
    # typically injected by a TLS-terminating reverse proxy (e.g., nginx);
    # a bare Flask server would read the peer certificate from the socket.
    client_cert = request.environ.get("SSL_CLIENT_CERT")
    client_cn = extract_common_name(client_cert)  # e.g., "client-ai-agent-1"
    
    # Check if this client is authorized
    if not is_authorized(client_cn):
        return {"error": "Unauthorized"}, 403
    
    log_access(client_id=client_cn, method="mTLS")
    return {"dashboards": [...]}

When to Use mTLS

mTLS is ideal for:

  • Zero-trust networks: Every connection must be cryptographically verified, regardless of network location. This is the standard in regulated industries (finance, healthcare) and for sensitive data.
  • Service mesh environments: If you’re running Kubernetes with Istio or Linkerd, mTLS is automatically enforced between services. Your MCP server benefits from this without additional configuration.
  • Internal, high-security deployments: When your MCP server is accessed only by trusted internal services (not by external customers or third-party integrations), mTLS provides strong guarantees.
  • Compliance and audit: mTLS creates a cryptographic proof of identity for every connection. This satisfies strict compliance requirements (SOC 2, ISO 27001, etc.).

According to Solution: mTLS Client Authentication for OAuth 2.0 Flows, mTLS can be combined with OAuth 2.0 for defense-in-depth: mTLS authenticates the network connection, and OAuth 2.0 authenticates the application request.

The Operational Overhead of mTLS

mTLS is secure but operationally expensive:

Certificate Management: You must generate, distribute, rotate, and revoke certificates for every client. This requires a certificate management system (often a PKI infrastructure). Certificate rotation is mandatory before expiration; expired certificates break connections.

Debugging Complexity: When a connection fails, it could be a certificate issue, a network issue, or an authorization issue. Debugging requires certificate inspection, TLS handshake logs, and deep networking knowledge.

Limited Granularity: mTLS authenticates the client but doesn’t provide fine-grained permissions like OAuth 2.1 scopes. You can’t easily say “this client can read dashboards but not execute queries.” You’d need to add application-level authorization on top of mTLS.
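That application-level layer can be as simple as a table keyed by the certificate's common name. A hedged sketch, with client names and actions invented for illustration:

```python
# Hypothetical permission map layered on top of mTLS: the certificate's
# common name identifies the client, and an application-level table says
# what each client may do.
CLIENT_PERMISSIONS = {
    "client-ai-agent-1": {"dashboards:read", "queries:execute"},
    "client-frontend-1": {"dashboards:read"},
}

def is_allowed(client_cn: str, action: str) -> bool:
    """Check whether a certificate-identified client may perform an action."""
    return action in CLIENT_PERMISSIONS.get(client_cn, set())
```

This restores per-action granularity, but notice you are now maintaining a second authorization system by hand, which is essentially what OAuth scopes give you for free.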

Scaling Challenges: For every new client, you must issue a certificate and distribute it securely. For hundreds or thousands of clients (like embedded analytics in customer deployments), this becomes a bottleneck.

Combining OAuth 2.1 and mTLS: Defense in Depth

For maximum security, combine OAuth 2.1 and mTLS. mTLS handles network-level authentication (proving the client is who it claims to be at the network layer), and OAuth 2.1 handles application-level authorization (proving the client has permission to perform the requested action).

This pattern is especially useful for:

  • Private analytics platforms: Your MCP server is accessed only by internal services (mTLS ensures network trust), but you still want fine-grained permissions (OAuth 2.1 scopes).
  • Regulated industries: mTLS satisfies network-level compliance requirements, and OAuth 2.1 provides audit trails and granular access control.
  • Multi-tenant SaaS: mTLS authenticates each tenant’s connection, and OAuth 2.1 ensures one tenant can’t access another’s data.

According to OAuth Configuration - Secure MCP Gateway, combining mTLS and OAuth 2.0/2.1 with proper token management creates a robust security posture for MCP servers.

Implementation Example:

import ssl
import jwt
from flask import Flask, request
from functools import wraps

app = Flask(__name__)

# Public key from the authorization server, used to verify JWT signatures
PUBLIC_KEY = "-----BEGIN PUBLIC KEY-----...-----END PUBLIC KEY-----"

# extract_bearer_token, extract_common_name, and log_access are the
# helper functions from the earlier examples

# mTLS configuration
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain("server.crt", "server.key")
context.load_verify_locations("ca.crt")
context.verify_mode = ssl.CERT_REQUIRED

def require_mtls_and_oauth(*required_scopes):
    def decorator(f):
        @wraps(f)
        def decorated_function(*args, **kwargs):
            # 1. mTLS: Extract and verify client certificate
            client_cert = request.environ.get("SSL_CLIENT_CERT")
            if not client_cert:
                return {"error": "Client certificate required"}, 401
            
            client_cn = extract_common_name(client_cert)
            log_access(client_id=client_cn, auth_method="mTLS", action="cert_verified")
            
            # 2. OAuth 2.1: Extract and validate access token
            token = extract_bearer_token(request)
            try:
                payload = jwt.decode(token, PUBLIC_KEY, algorithms=["RS256"])
            except jwt.InvalidTokenError:
                return {"error": "Invalid or expired token"}, 401
            
            # 3. Check scopes
            token_scopes = set(payload.get("scope", "").split())
            if not any(scope in token_scopes for scope in required_scopes):
                return {"error": "Insufficient permissions"}, 403
            
            # Both mTLS and OAuth validated; proceed
            request.client_id = client_cn
            request.token_scopes = token_scopes
            return f(*args, **kwargs)
        return decorated_function
    return decorator

@app.route("/api/dashboards")
@require_mtls_and_oauth("dashboards:read")
def get_dashboards():
    return {"dashboards": [...]}

if __name__ == "__main__":
    app.run(ssl_context=context, host="0.0.0.0", port=443)

Migrating from API Keys to OAuth 2.1

If you’re currently using API keys to secure your MCP server, migrating to OAuth 2.1 is a critical step for production systems. Here’s a practical migration path:

Phase 1: Parallel Operation (2-4 weeks): Deploy your authorization server and update your MCP server to accept both API keys and OAuth 2.1 tokens. Existing clients continue using API keys; new clients use OAuth 2.1. Your MCP server validates both:

def authenticate_request(request):
    auth_header = request.headers.get("Authorization", "")
    
    if auth_header.startswith("Bearer eyJ"):  # JWT token
        return validate_oauth_token(auth_header[7:])
    elif auth_header.startswith("Bearer sk_"):  # API key
        return validate_api_key(auth_header[7:])
    else:
        return None, 401

Phase 2: Client Migration (4-8 weeks): Work with each client to migrate from API keys to OAuth 2.1. Provide documentation, SDKs, and support. Most clients can migrate in a day or two once they understand the flow.

Phase 3: API Key Deprecation (2-4 weeks): Announce a deprecation date for API key support. Set a timeline (e.g., 90 days) for clients to complete migration. Provide clear communication and support.

Phase 4: API Key Removal (on deprecation date): Remove API key validation from your MCP server. All clients now use OAuth 2.1.

According to Migrating from API Keys to OAuth for MCP Servers, this step-by-step migration approach minimizes disruption and allows time for clients to adjust. The key is maintaining backward compatibility during the transition phase.

Authentication in the Context of Embedded Analytics

When you’re embedding analytics (dashboards, queries, AI-powered insights) into your product, authentication becomes more nuanced. You’re not just securing an API; you’re securing access to customer data across multiple tenants.

Consider a scenario where you’re using D23’s managed Apache Superset platform to power embedded analytics for your SaaS product. Your customers log into your application, and you embed Superset dashboards directly in their interface. You need to:

  1. Authenticate the customer to your application (handled by your app’s auth system).
  2. Authenticate your application to the Superset MCP server (handled by OAuth 2.1 or mTLS).
  3. Authorize the customer to see only their own data (handled by Superset row-level security or your application logic).
  4. Audit every access for compliance and security.

Your MCP server authentication handles step 2. Your application’s auth system and Superset’s row-level security handle steps 1, 3, and 4.

For embedded analytics, OAuth 2.1 with PKCE is typically the best choice. Your application authenticates to the Superset MCP server using client credentials, receives a token, and uses that token to fetch dashboards and execute queries on behalf of the logged-in customer. The customer’s identity is passed in the request body or headers (not in the authentication itself), and Superset’s authorization layer ensures they only see their data.
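In code, that split looks roughly like the snippet below. The X-End-User-Id header name is our own placeholder for illustration, not a Superset convention; the point is only that the service token and the end-user identity travel separately:

```python
def build_embedded_request_headers(service_token: str, end_user_id: str) -> dict:
    """Sketch of the embedded-analytics pattern: the service-level OAuth
    token authenticates the application, while the end user's identity
    rides in a separate header for the authorization layer to act on."""
    return {
        "Authorization": f"Bearer {service_token}",  # authenticates the app
        "X-End-User-Id": end_user_id,                # scopes data per user
    }
```

The MCP server trusts the user identity only because the OAuth token proves the request came from your application, which is itself responsible for authenticating that user.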

Real-World MCP Server Authentication in Production

Let’s walk through a complete, production-ready example: securing an MCP server that exposes D23’s analytics capabilities to internal AI agents and embedded analytics frontends.

Architecture:

  • Authorization Server: Auth0 or internal OAuth provider.
  • MCP Server: Python Flask app exposing Superset dashboards and queries.
  • Clients: AI agent (Python), embedded analytics frontend (React), internal reporting tool (Node.js).

Step 1: Authorization Server Setup:

# Create OAuth application in Auth0
# Name: superset-mcp-server
# Grant types: Client Credentials
# Scopes:
#   - dashboards:read (list and view dashboards)
#   - dashboards:write (create and edit dashboards)
#   - queries:read (view saved queries)
#   - queries:execute (run ad-hoc queries)
#   - exports:create (export data)

# Create machine-to-machine applications
# AI Agent: client_id=ai_agent_prod, scopes=[dashboards:read, queries:execute]
# Frontend: client_id=analytics_frontend, scopes=[dashboards:read]
# Reporting Tool: client_id=reporting_tool, scopes=[dashboards:read, queries:read, exports:create]

Step 2: MCP Server Implementation:

from flask import Flask, request, jsonify
from functools import wraps
import jwt
import requests
from datetime import datetime
import logging

app = Flask(__name__)
logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)

# Configuration
AUTH0_DOMAIN = "your-auth0-domain.auth0.com"
AUTH0_AUDIENCE = "https://superset-mcp-server"
PUBLIC_KEY_URL = f"https://{AUTH0_DOMAIN}/.well-known/jwks.json"

# Cache for public keys (refresh every hour)
public_keys = {}
last_key_fetch = 0

def get_public_keys():
    global public_keys, last_key_fetch
    now = datetime.now().timestamp()
    
    if now - last_key_fetch > 3600:  # Refresh every hour
        response = requests.get(PUBLIC_KEY_URL)
        response.raise_for_status()
        keys = response.json()["keys"]
        public_keys = {key["kid"]: jwt.algorithms.RSAAlgorithm.from_jwk(key) for key in keys}
        last_key_fetch = now
    
    return public_keys

def verify_oauth_token(token):
    """Verify OAuth 2.1 token and return payload."""
    try:
        # The key ID (kid) lives in the JWT header, not the payload
        kid = jwt.get_unverified_header(token).get("kid")
        
        # Get public keys
        public_keys = get_public_keys()
        if kid not in public_keys:
            return None, "Invalid key ID"
        
        # Verify with the correct public key
        payload = jwt.decode(
            token,
            public_keys[kid],
            algorithms=["RS256"],
            audience=AUTH0_AUDIENCE,
            options={"verify_exp": True}
        )
        return payload, None
    except jwt.ExpiredSignatureError:
        return None, "Token expired"
    except jwt.InvalidTokenError as e:
        return None, f"Invalid token: {str(e)}"

def require_oauth_scope(*required_scopes):
    """Decorator to require OAuth token with specific scopes."""
    def decorator(f):
        @wraps(f)
        def decorated_function(*args, **kwargs):
            # Extract token from Authorization header
            auth_header = request.headers.get("Authorization", "")
            if not auth_header.startswith("Bearer "):
                logger.warning("Missing or invalid Authorization header")
                return {"error": "Missing or invalid Authorization header"}, 401
            
            token = auth_header[7:]
            
            # Verify token
            payload, error = verify_oauth_token(token)
            if error:
                logger.warning(f"Token verification failed: {error}")
                return {"error": f"Unauthorized: {error}"}, 401
            
            # Check scopes
            token_scopes = set(payload.get("scope", "").split())
            if not any(scope in token_scopes for scope in required_scopes):
                logger.warning(f"Insufficient scopes. Required: {required_scopes}, Got: {token_scopes}")
                return {"error": "Insufficient permissions"}, 403
            
            # Store client info in request context for logging
            request.client_id = payload.get("sub")
            request.token_scopes = token_scopes
            request.issued_at = payload.get("iat")
            
            return f(*args, **kwargs)
        return decorated_function
    return decorator

def log_access(action, resource, status):
    """Log access for audit trail."""
    logger.info(f"""
        client_id={request.client_id}
        action={action}
        resource={resource}
        status={status}
        timestamp={datetime.utcnow().isoformat()}
        scopes={','.join(request.token_scopes)}
    """)

@app.route("/health", methods=["GET"])
def health():
    """Health check endpoint (no auth required)."""
    return {"status": "healthy"}, 200

@app.route("/api/dashboards", methods=["GET"])
@require_oauth_scope("dashboards:read")
def list_dashboards():
    """List all dashboards accessible to the client."""
    try:
        # Query Superset API for dashboards
        dashboards = [
            {"id": 1, "title": "Sales Overview", "owner": "analytics"},
            {"id": 2, "title": "Customer Metrics", "owner": "analytics"},
        ]
        
        log_access(action="list_dashboards", resource="dashboards", status="success")
        return {"dashboards": dashboards}, 200
    except Exception as e:
        logger.error(f"Error listing dashboards: {str(e)}")
        log_access(action="list_dashboards", resource="dashboards", status="error")
        return {"error": "Internal server error"}, 500

@app.route("/api/dashboards/<int:dashboard_id>", methods=["GET"])
@require_oauth_scope("dashboards:read")
def get_dashboard(dashboard_id):
    """Get a specific dashboard."""
    try:
        # Query Superset API for dashboard
        dashboard = {
            "id": dashboard_id,
            "title": "Sales Overview",
            "owner": "analytics",
            "charts": [{"id": 1, "title": "Revenue by Region"}]
        }
        
        log_access(action="get_dashboard", resource=f"dashboard:{dashboard_id}", status="success")
        return {"dashboard": dashboard}, 200
    except Exception as e:
        logger.error(f"Error getting dashboard: {str(e)}")
        log_access(action="get_dashboard", resource=f"dashboard:{dashboard_id}", status="error")
        return {"error": "Internal server error"}, 500

@app.route("/api/queries", methods=["POST"])
@require_oauth_scope("queries:execute")
def execute_query():
    """Execute an ad-hoc SQL query against Superset."""
    try:
        query = request.json.get("sql")
        if not query:
            return {"error": "Missing 'sql' field"}, 400
        
        # Execute query through Superset API
        # (In production, validate the query, enforce limits, etc.)
        result = {
            "query": query,
            "rows": [{"id": 1, "name": "ACME Corp", "revenue": 50000}],
            "execution_time_ms": 145
        }
        
        log_access(action="execute_query", resource="queries", status="success")
        return {"result": result}, 200
    except Exception as e:
        logger.error(f"Error executing query: {str(e)}")
        log_access(action="execute_query", resource="queries", status="error")
        return {"error": "Internal server error"}, 500

@app.route("/api/exports", methods=["POST"])
@require_oauth_scope("exports:create")
def create_export():
    """Export dashboard or query result to CSV/PDF."""
    try:
        resource_type = request.json.get("type")  # dashboard or query
        resource_id = request.json.get("id")
        format_type = request.json.get("format", "csv")  # csv or pdf
        
        # Create export through Superset API
        export = {
            "id": "exp_123",
            "resource_type": resource_type,
            "resource_id": resource_id,
            "format": format_type,
            "url": f"https://storage.example.com/exports/exp_123.{format_type}",
            "expires_at": "2024-01-15T12:00:00Z"
        }
        
        log_access(action="create_export", resource=f"{resource_type}:{resource_id}", status="success")
        return {"export": export}, 201
    except Exception as e:
        logger.error(f"Error creating export: {str(e)}")
        log_access(action="create_export", resource="exports", status="error")
        return {"error": "Internal server error"}, 500

@app.errorhandler(404)
def not_found(e):
    return {"error": "Not found"}, 404

@app.errorhandler(500)
def internal_error(e):
    logger.error(f"Internal server error: {str(e)}")
    return {"error": "Internal server error"}, 500

if __name__ == "__main__":
    # In production, use a production WSGI server (gunicorn, uWSGI, etc.)
    # and enable HTTPS with a valid certificate
    app.run(host="0.0.0.0", port=5000, debug=False)

Step 3: Client Implementation (Python AI Agent):

import requests
import os
from datetime import datetime, timedelta

class SupersetMCPClient:
    def __init__(self, client_id, client_secret, auth0_domain, mcp_server_url):
        self.client_id = client_id
        self.client_secret = client_secret
        self.auth0_domain = auth0_domain
        self.mcp_server_url = mcp_server_url
        self.access_token = None
        self.token_expires_at = None
    
    def get_access_token(self):
        """Get OAuth 2.1 access token from Auth0."""
        # Check if cached token is still valid
        if self.access_token and datetime.now() < self.token_expires_at:
            return self.access_token
        
        # Request new token
        token_url = f"https://{self.auth0_domain}/oauth/token"
        payload = {
            "grant_type": "client_credentials",
            "client_id": self.client_id,
            "client_secret": self.client_secret,
            "audience": "https://superset-mcp-server",
            "scope": "dashboards:read queries:execute"
        }
        
        response = requests.post(token_url, json=payload, timeout=10)
        response.raise_for_status()
        
        data = response.json()
        self.access_token = data["access_token"]
        self.token_expires_at = datetime.now() + timedelta(seconds=data["expires_in"] - 60)
        
        return self.access_token
    
    def list_dashboards(self):
        """List available dashboards."""
        token = self.get_access_token()
        headers = {"Authorization": f"Bearer {token}"}
        
        response = requests.get(
            f"{self.mcp_server_url}/api/dashboards",
            headers=headers,
            timeout=10
        )
        response.raise_for_status()
        
        return response.json()["dashboards"]
    
    def execute_query(self, sql):
        """Execute a query."""
        token = self.get_access_token()
        headers = {"Authorization": f"Bearer {token}"}
        
        response = requests.post(
            f"{self.mcp_server_url}/api/queries",
            headers=headers,
            json={"sql": sql},
            timeout=30
        )
        response.raise_for_status()
        
        return response.json()["result"]

# Usage
client = SupersetMCPClient(
    client_id=os.getenv("SUPERSET_CLIENT_ID"),
    client_secret=os.getenv("SUPERSET_CLIENT_SECRET"),
    auth0_domain="your-auth0-domain.auth0.com",
    mcp_server_url="https://superset-mcp-server.example.com"
)

dashboards = client.list_dashboards()
print(f"Available dashboards: {dashboards}")

result = client.execute_query("SELECT COUNT(*) as total_customers FROM customers")
print(f"Query result: {result}")

Step 4: Deployment and Monitoring:

Deploy your MCP server behind a load balancer with HTTPS enabled. Monitor token validation latency, failed authentication attempts, and access patterns.

# Example: Deploy with gunicorn and nginx
gunicorn --bind 0.0.0.0:5000 --workers 4 app:app

# nginx reverse proxy terminating HTTPS in front of the MCP server
server {
    listen 443 ssl;
    server_name superset-mcp-server.example.com;
    
    ssl_certificate /etc/certs/server.crt;
    ssl_certificate_key /etc/certs/server.key;
    
    location / {
        proxy_pass http://localhost:5000;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        proxy_set_header X-Forwarded-Proto $scheme;
    }
}
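The monitoring signals mentioned above can be sketched with a minimal in-process counter. This is illustrative only (the `record_auth_attempt` helper and the 100 ms latency threshold are assumptions, not part of any framework); in production you would export these metrics to Prometheus, Datadog, or your APM tool instead:

```python
import logging
from collections import Counter

logger = logging.getLogger("mcp.auth")
auth_metrics = Counter()

def record_auth_attempt(client_id, success, latency_ms):
    """Track failed authentication attempts and token validation latency."""
    auth_metrics["auth_total"] += 1
    if not success:
        auth_metrics["auth_failed"] += 1
        logger.warning("auth failure client=%s latency_ms=%.1f", client_id, latency_ms)
    if latency_ms > 100:  # hypothetical alert threshold
        logger.warning("slow token validation client=%s latency_ms=%.1f", client_id, latency_ms)
```

A sudden spike in `auth_failed` often indicates a leaked or rotated credential still in use; alerting on the ratio of failures to total attempts catches this early.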

Key Takeaways and Decision Framework

Choosing the right authentication method for your MCP server depends on your use case, security requirements, and operational capacity:

Use API Keys if:

  • You’re in early-stage development or prototyping.
  • Your MCP server is accessed by a small, static set of internal services.
  • Your data is non-sensitive (public aggregations, non-confidential metrics).
  • You have minimal compliance requirements.

Use OAuth 2.1 if:

  • You’re in production and need audit trails, granular permissions, and token revocation.
  • You have multiple clients with different permission levels.
  • You’re building embedded analytics or multi-tenant systems.
  • You want short-lived credentials and the ability to instantly revoke access.
  • Your clients span internal services, third-party integrations, and customer-facing applications.

Use mTLS if:

  • You’re operating in a zero-trust network (e.g., Kubernetes with a service mesh).
  • You have strict compliance requirements that mandate cryptographic network authentication.
  • Your MCP server is accessed only by trusted internal services (not external clients).
  • You can absorb the operational overhead of certificate management.
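On the client side, mTLS comes down to presenting a client certificate and verifying the server against your internal CA. A minimal sketch using Python's standard `ssl` module (the certificate paths are hypothetical placeholders for files issued by your CA):

```python
import ssl

# Hypothetical paths; in practice these are issued by your internal CA.
CLIENT_CERT = "/etc/certs/client.crt"
CLIENT_KEY = "/etc/certs/client.key"
CA_BUNDLE = "/etc/certs/internal-ca.pem"

def build_mtls_context(ca_bundle=None, cert=None, key=None):
    """Build a client-side TLS context that presents a client certificate
    and verifies the server against a private CA bundle."""
    context = ssl.create_default_context(ssl.Purpose.SERVER_AUTH)
    if ca_bundle:
        # Trust only the internal CA instead of the system store
        context.load_verify_locations(cafile=ca_bundle)
    if cert and key:
        # Present our client certificate during the handshake
        context.load_cert_chain(certfile=cert, keyfile=key)
    return context
```

With the `requests` library, the equivalent is passing `cert=(CLIENT_CERT, CLIENT_KEY)` and `verify=CA_BUNDLE` to each call. In a service mesh (e.g. Istio or Linkerd), the sidecar proxies handle this handshake transparently and your application code stays unchanged.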

Use OAuth 2.1 + mTLS if:

  • You need both network-level and application-level authentication.
  • You’re in a highly regulated industry (finance, healthcare) with strict audit requirements.
  • You want defense in depth: mTLS proves the client’s network identity, and OAuth 2.1 proves its application-level permissions.
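Layering the two is straightforward on the client: the TLS layer carries the client certificate while the HTTP layer carries the bearer token. A small sketch of how the pieces combine into `requests` keyword arguments (the helper name and paths are illustrative):

```python
def mtls_oauth_request_kwargs(token, client_cert, client_key, ca_bundle):
    """Combine both layers: mTLS proves the client's network identity,
    while the Bearer token carries application-level scopes."""
    return {
        "headers": {"Authorization": f"Bearer {token}"},
        "cert": (client_cert, client_key),  # client certificate for mTLS
        "verify": ca_bundle,                # pin the server to the private CA
    }
```

The server then enforces both checks independently: a valid certificate with a missing or expired token is rejected, and vice versa, so compromising one layer is not enough to reach the data.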

According to “MCP Authentication in Cursor: OAuth, API Keys, and Secure Configuration,” the choice depends on the deployment context: development environments can use simpler methods, while production systems should prioritize OAuth 2.1 or mTLS.

Implementing Authentication for D23’s Managed Superset

When using D23’s managed Apache Superset platform, you’re leveraging a production-grade analytics infrastructure designed for scale. D23’s MCP server integration supports both OAuth 2.1 and mTLS out of the box, allowing you to secure your analytics APIs with enterprise-grade authentication.

D23 handles the operational complexity of Superset hosting, scaling, and updates, while you focus on implementing the right authentication strategy for your use case. Whether you’re embedding analytics in your product, exposing dashboards to internal teams, or integrating AI agents with text-to-SQL capabilities, D23’s API-first architecture and managed infrastructure ensure your authentication layer is secure, performant, and auditable.

For teams evaluating managed open-source BI as an alternative to Looker, Tableau, or Power BI, D23’s approach to authentication and security is a key differentiator. You get the flexibility and cost-efficiency of open-source (Apache Superset) with the security and compliance guarantees of a managed platform.

Review D23’s Terms of Service and Privacy Policy to understand how authentication, data access, and compliance are handled.

Conclusion

Authentication for MCP servers is not a one-size-fits-all problem. API keys are simple but insecure for production. OAuth 2.1 is the industry standard for delegated authorization and should be your default choice for production systems. mTLS adds network-level security for zero-trust environments. The best choice depends on your use case, compliance requirements, and operational capacity.

Start with OAuth 2.1 for production systems. It provides granular permissions, short-lived credentials, instant revocation, and audit trails—all critical for analytics platforms that expose sensitive business data. Add mTLS if you’re operating in a zero-trust network or have strict compliance requirements.

Migrate from API keys to OAuth 2.1 incrementally, maintaining backward compatibility during the transition. Test thoroughly in staging before rolling out to production. Monitor token validation latency and failed authentication attempts to catch issues early.
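One hedged sketch of that backward-compatible transition: a single authentication entry point that prefers OAuth bearer tokens but still honors legacy API keys, flagging the legacy path so adoption can be tracked and the keys eventually retired. The helper names, header choices, and the read-only legacy scope are assumptions for illustration, not a prescribed design:

```python
def authenticate(headers, valid_api_keys, validate_bearer):
    """Accept either auth scheme during migration. Bearer tokens are
    preferred; legacy API keys still work but are flagged for tracking."""
    auth = headers.get("Authorization", "")
    if auth.startswith("Bearer "):
        # New path: delegate to the OAuth token validator
        return validate_bearer(auth[len("Bearer "):])
    api_key = headers.get("X-API-Key")
    if api_key in valid_api_keys:
        # Legacy path: coarse-grained, read-only scope (hypothetical)
        return {"subject": f"api-key:{api_key[:4]}",
                "scopes": ["dashboards:read"],
                "legacy": True}
    return None  # neither scheme succeeded
```

Logging every request that authenticates with `legacy: True` gives you a burn-down list of clients still on API keys; once it reaches zero, the legacy branch can be deleted.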

When building analytics infrastructure with D23’s managed Superset platform, leverage these authentication patterns to ensure your data is secure, your access is auditable, and your system is resilient to security incidents.