Artur
Founder

n8n MCP Server: How to Expose Your Workflows to AI Agents

March 20, 2026

n8n · mcp · ai-agents · model-context-protocol


MCP (Model Context Protocol) lets AI agents call external tools through a standardized interface. n8n has native MCP support, which means you can expose any n8n workflow as a tool that Claude Desktop, Cursor, Windsurf, or any MCP-compatible client can discover and run.

The practical result: you tell Claude "send a Slack message to #dev-updates" and it calls your n8n workflow that actually sends the message. Or you ask Cursor to "check the latest orders" and it triggers a workflow that queries your database and returns the data. Your n8n workflows become the hands that your AI agents use to interact with the real world.

This guide covers setting up n8n as an MCP server, configuring authentication, exposing workflows as tools, and connecting clients.

Two ways to use MCP in n8n

n8n supports MCP in both directions.

As an MCP server, n8n exposes your workflows as tools that external AI agents can call. You set up an MCP Server Trigger node, attach tool nodes to it, and share the endpoint URL with your AI client.

As an MCP client, n8n connects to external MCP servers and uses their tools inside your workflows. You add an MCP Client Tool node, point it at an external MCP endpoint, and your n8n AI agent can call those tools.

This guide focuses on the server side - making your n8n workflows available to AI agents. The client side is useful when you want n8n's built-in AI agent to access tools from other MCP servers, but the server setup is what most people are looking for first.

Setting up the MCP Server Trigger node

Start with a new workflow in n8n. Add an MCP Server Trigger node. This is the entry point that MCP clients will connect to.

The node generates two URLs: a test URL for development and a production URL for live use. The test URL only works while you have the workflow open in the editor and are listening for a test event. The production URL works once the workflow is saved and activated.

Copy the production URL. This is what you'll give to your MCP client.

Attaching tools

The MCP Server Trigger doesn't pass data to the next node in a chain like a normal trigger. Instead, it connects directly to tool nodes. Each tool node you attach becomes a callable function that MCP clients can discover.
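Under the hood, discovery is plain JSON-RPC 2.0: the client sends a `tools/list` request and the server answers with each attached tool's name and input schema. A minimal sketch of what that request looks like on the wire (the endpoint and transport details are handled by your MCP client; this only illustrates the message shape):

```python
import json

def build_tools_list_request(request_id: int = 1) -> str:
    """Return the JSON-RPC message an MCP client sends to discover tools."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/list",
    })

# Each tool node attached to the MCP Server Trigger shows up
# as one entry in the server's tools/list response.
print(build_tools_list_request())
```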

You can attach n8n's built-in tool nodes (Google Calendar, Slack, HTTP Request, databases) or use the Custom n8n Workflow Tool node to expose an entire subworkflow as a single tool.

The Custom n8n Workflow Tool approach is the most flexible. You can build a complex workflow with multiple steps - query a database, process the results, format a response - and expose the whole thing as one tool that an AI agent calls with simple parameters.

Authentication

By default, the MCP Server Trigger has no authentication. This means anyone with the URL can call your tools. Do not run this in production without auth.

The node supports Bearer Token and Header Auth. For Bearer Token, create a credential in n8n with your token value. The MCP client will need to include this token in its connection configuration.

For production setups, Bearer Token auth is the minimum. If your n8n instance is exposed to the internet, also consider putting it behind a reverse proxy with IP restrictions.
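If Nginx is your reverse proxy, an IP allow-list for the MCP path can be added in a few lines. This is a sketch, not a drop-in config: the location path, CIDR range, and upstream port are placeholders for your own values (5678 is n8n's default port).

```nginx
# Restrict the MCP endpoint to a known office/VPN range.
location /webhook/mcp/ {
    allow 203.0.113.0/24;              # replace with your trusted range
    deny  all;
    proxy_pass http://127.0.0.1:5678;  # default n8n port
}
```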

Instance-level MCP access

n8n also supports instance-level MCP, which is separate from the per-workflow MCP Server Trigger approach.

With instance-level MCP, you enable MCP access in your n8n settings (Settings > Instance-level MCP), then toggle individual workflows to be accessible. Any connected MCP client can then search across all enabled workflows and run them.

The difference: the MCP Server Trigger gives you fine-grained control over exactly which tools are exposed and how they're configured per workflow. Instance-level MCP is broader - it exposes entire workflows to all connected clients.

For most setups, start with the MCP Server Trigger approach. It's more explicit about what you're exposing.

Connecting Claude Desktop

Claude Desktop supports MCP natively. To connect it to your n8n MCP server, add the following to your Claude Desktop configuration file:

{
  "mcpServers": {
    "n8n": {
      "command": "npx",
      "args": [
        "mcp-remote",
        "https://your-n8n-instance.com/webhook/mcp/your-path",
        "--header",
        "Authorization: Bearer your-token-here"
      ]
    }
  }
}

Replace the URL with your MCP Server Trigger's production URL and the token with your Bearer Token value.

On macOS, the config file is at ~/Library/Application Support/Claude/claude_desktop_config.json. On Windows, it's at %APPDATA%\Claude\claude_desktop_config.json.

Restart Claude Desktop after editing. If the connection is successful, you'll see your n8n tools listed when you click the tool icon in Claude's chat.

Connecting Cursor and other IDE clients

For Cursor, add the MCP server in your project's .cursor/mcp.json:

{
  "mcpServers": {
    "n8n": {
      "url": "https://your-n8n-instance.com/webhook/mcp/your-path",
      "headers": {
        "Authorization": "Bearer your-token-here"
      }
    }
  }
}

Windsurf and VS Code with Copilot follow similar patterns. The key is always the same: provide the MCP endpoint URL and authentication credentials.
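As one example, VS Code reads MCP servers from a `.vscode/mcp.json` file keyed by `servers` rather than `mcpServers`. The exact schema varies by client and version, so treat this as a sketch and check your client's MCP documentation:

```json
{
  "servers": {
    "n8n": {
      "type": "sse",
      "url": "https://your-n8n-instance.com/webhook/mcp/your-path",
      "headers": {
        "Authorization": "Bearer your-token-here"
      }
    }
  }
}
```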

Practical example: exposing a Slack + database workflow

Here's a concrete setup that's useful for development teams.

Create a workflow with the MCP Server Trigger. Attach three tool nodes:

  1. Slack tool - configured to send messages to channels in your workspace
  2. PostgreSQL tool - configured to run read-only queries against your database
  3. Custom n8n Workflow Tool - pointing to a subworkflow that creates a GitHub issue from a title and description

Activate the workflow. Connect Claude Desktop to the MCP endpoint.

Now you can ask Claude: "Check how many users signed up today, then post the count to #daily-metrics on Slack." Claude discovers the PostgreSQL and Slack tools, runs the query, gets the count, and sends the message. Two tool calls, zero manual work.

The subworkflow approach for the GitHub tool is worth highlighting. Instead of exposing the raw GitHub API node (which would require Claude to know the exact API parameters), you build a simple subworkflow that accepts title and description as inputs and handles all the GitHub API details internally. The AI agent gets a clean, simple tool to call.
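In MCP terms, invoking that subworkflow tool reduces to a single `tools/call` request with two arguments. A sketch of the message shape (the tool name `create_github_issue` is a placeholder for whatever you name the tool in n8n):

```python
import json

def build_tool_call(name: str, arguments: dict, request_id: int = 2) -> str:
    """Return the JSON-RPC tools/call message an MCP client sends."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    })

# The agent only sees the clean interface the subworkflow exposes:
print(build_tool_call("create_github_issue", {
    "title": "Signup count mismatch",
    "description": "Daily metrics query disagrees with the dashboard.",
}))
```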

Troubleshooting

MCP client can't connect: Check that your workflow is active (published), not just saved. The production URL only works when the workflow is toggled on. Also verify your n8n instance is reachable from the client - if self-hosted, your firewall and reverse proxy need to allow the connection.

Tools not showing up: The MCP Server Trigger only exposes tool nodes that are directly connected to it. If you've connected a regular node (like a Function node) instead of a tool node, it won't appear to MCP clients.

Authentication failures: Make sure your Bearer Token credential in n8n matches exactly what you've configured in the client. Watch for trailing spaces or newlines in the token value.

SSE connection drops: The MCP Server Trigger uses Server-Sent Events (SSE) for persistent connections. If you're behind a reverse proxy like Nginx, make sure it's configured to support SSE - specifically, disable proxy buffering for the MCP endpoint path.
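For Nginx, the usual fix is to disable buffering on the MCP location. A sketch, with the path and upstream adjusted to your setup:

```nginx
location /webhook/mcp/ {
    proxy_pass http://127.0.0.1:5678;
    proxy_http_version 1.1;
    proxy_set_header Connection '';  # keep the SSE stream open
    proxy_buffering off;             # flush events immediately
    proxy_cache off;
    proxy_read_timeout 24h;          # don't kill long-lived streams
}
```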

When MCP makes sense vs webhooks

MCP adds value when your AI agent needs to discover and choose between multiple tools dynamically. If you have a fixed workflow that always does the same thing when triggered, a simple webhook is simpler and more reliable.

Use MCP when: your AI agent needs to pick the right tool based on context, you want to expose a suite of capabilities to IDE assistants like Cursor, or you're building an agentic system where the AI decides what actions to take.

Use webhooks when: you have a specific trigger (form submission, scheduled event, external API call) that always runs the same workflow. No AI decision-making involved.

Most production setups use both. MCP for the AI-facing interface, webhooks for the deterministic automation.


FAQ

What is MCP in n8n?

MCP (Model Context Protocol) is a standard that lets AI agents discover and call external tools. n8n supports MCP natively with two nodes: the MCP Server Trigger (which exposes n8n tools to AI clients) and the MCP Client Tool (which lets n8n call tools from external MCP servers). No extra packages or installations needed.

How do I connect Claude Desktop to n8n via MCP?

Add your n8n MCP Server Trigger's production URL to Claude Desktop's configuration file with a Bearer Token for authentication. Use the mcp-remote npx package as the command. Restart Claude Desktop and your n8n tools will appear in the tools menu.

Does n8n MCP require self-hosting?

No. Both n8n Cloud and self-hosted n8n support MCP. The MCP Server Trigger node is available in both environments. Self-hosting gives you more control over the endpoint URL and network configuration, but it's not required.

Can I use n8n MCP with Cursor or VS Code?

Yes. Any MCP-compatible client can connect to an n8n MCP server. Cursor, VS Code with Copilot, Windsurf, and Claude Code all support MCP connections. Configure the endpoint URL and authentication in the client's MCP settings.

What's the difference between the MCP Server Trigger and instance-level MCP?

The MCP Server Trigger is configured per workflow and gives you precise control over which tools are exposed. Instance-level MCP (in n8n settings) lets you toggle entire workflows on/off for MCP access across all connected clients. The per-workflow approach is more granular; the instance-level approach is faster to set up when you want broad access.

