MCP Client Setup

Data Workers is built on the Model Context Protocol (MCP). Every agent is a standard MCP server — the same server binary works with Claude Code, Cursor, GitHub Copilot, Cline, Continue, OpenClaw, and any other MCP-compatible client. You do not need different packages for different editors. Pick your client, point it at npx dw-claw, and every agent tool is available immediately.

Universal Quick Start

The fastest way to connect Data Workers to any MCP client is a single command:

npx dw-claw

This launches the full Data Workers MCP server with all 15 agents and 155+ tools. It works with every client listed on this page. No installation step is required — npx downloads and runs the latest version automatically. The per-client sections below show exactly where to paste this command in each client's configuration.

Important: InMemory Stubs on First Run

When you first run `npx dw-claw` without any infrastructure credentials, the server starts with InMemory stub data. This is expected behavior, not an error. The stubs include realistic sample tables, pipelines, quality metrics, and lineage data so you can explore every tool immediately.

You will see responses referencing sample datasets such as analytics.orders, staging.customers, and raw.events. This confirms the server is working correctly. To connect to your real infrastructure, set the appropriate environment variables (for example, SNOWFLAKE_ACCOUNT, SNOWFLAKE_USER, SNOWFLAKE_PASSWORD) before starting the server. The server auto-detects real credentials and switches from stubs to live connections — no code changes needed.
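As a sketch, assuming Snowflake is your backing warehouse, you would export the credentials in your shell before launching your editor (all values below are placeholders):

```shell
# Placeholder values -- replace with your real Snowflake credentials.
export SNOWFLAKE_ACCOUNT="myorg-myaccount"
export SNOWFLAKE_USER="dw_service"
export SNOWFLAKE_PASSWORD="change-me"

# Launch your MCP client from this same shell so the server inherits them.
```

Because MCP servers are spawned by your editor, the editor itself must be started from an environment where these variables are visible.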

Verify It Works

After configuring any client below, verify the connection by asking your AI assistant:

List all available Data Workers tools and show me the health of my data pipelines.

You should see a list of tools organized by agent domain (incident, quality, schema, pipeline, catalog, governance, and more). If you are running with InMemory stubs, the pipeline health check will return sample pipeline statuses — this confirms everything is connected and working.

Dedicated Platform Guides

For in-depth onboarding, first workflows, and platform-specific troubleshooting, see the dedicated guides:

  • Claude Code Setup — Deep-dive installation, verification, team config, and troubleshooting
  • Cursor Setup — Full onboarding guide including UI configuration, project vs global setup, and team sharing
  • GitHub Copilot Setup — VS Code configuration, @workspace activation, and version requirements
  • Microsoft Copilot Setup — Current status and compatibility notes for Copilot Studio
  • OpenClaw, Cline & Continue Setup — Configuration for OpenClaw, Cline, Continue, and any MCP-compatible client

Claude Code

Claude Code has first-class MCP support and is the most mature integration. Data Workers was developed and tested primarily in Claude Code.

Option 1 — One command (recommended):

claude mcp add data-workers -- npx dw-claw

This registers the server with Claude Code. It persists across sessions — you do not need to run it again.

Option 2 — Manual configuration:

Edit .mcp.json in your project root and add the server under mcpServers:

{
  "mcpServers": {
    "data-workers": {
      "command": "npx",
      "args": ["dw-claw"]
    }
  }
}

After adding, Claude Code auto-discovers all agent tools. Run claude mcp list to verify the server appears with a connected status.

Tips:

  • Set environment variables (e.g., SNOWFLAKE_ACCOUNT, SNOWFLAKE_USER) before launching Claude Code so agents can reach your infrastructure.
  • You can run claude mcp remove data-workers to cleanly remove the server.
  • Claude Code supports running multiple MCP servers simultaneously. You can add Data Workers alongside other MCP servers without conflict.
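If you prefer to keep non-secret settings in the project config rather than your shell, recent Claude Code versions also accept a per-server env map in .mcp.json. A sketch with placeholder values:

```json
{
  "mcpServers": {
    "data-workers": {
      "command": "npx",
      "args": ["dw-claw"],
      "env": {
        "SNOWFLAKE_ACCOUNT": "myorg-myaccount",
        "SNOWFLAKE_USER": "dw_service"
      }
    }
  }
}
```

Avoid committing real secrets this way — anything in a project-level .mcp.json is visible to everyone with repository access, so keep passwords in shell environment variables instead.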

Cursor

Cursor has well-supported MCP integration available in both project-scoped and global configurations.

Project-scoped configuration (recommended for teams):

Create or edit .cursor/mcp.json in your project root:

{
  "mcpServers": {
    "data-workers": {
      "command": "npx",
      "args": ["dw-claw"]
    }
  }
}

Global configuration (applies to all projects):

Open Cursor Settings, navigate to MCP Servers, and add a new server with the command npx dw-claw.

Tips:

  • Cursor uses the mcpServers key in its configuration.
  • Restart Cursor after adding or modifying the MCP configuration.
  • The .cursor/mcp.json file is project-scoped. If you want Data Workers available across all projects, add it through Cursor Settings instead.

GitHub Copilot (VS Code)

GitHub Copilot added MCP server support in recent VS Code releases. This integration is real and functional, but it is newer than Claude Code and Cursor support — expect the experience to improve as GitHub iterates on their MCP implementation.

Add the following to your VS Code settings.json:

"mcp.servers": {
  "data-workers": {
    "command": "npx",
    "args": ["dw-claw"]
  }
}
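Note that the snippet above is a fragment: it belongs inside the top-level object of settings.json, alongside whatever settings you already have. A minimal complete file might look like this (the editor.fontSize entry is just an illustrative existing setting):

```json
{
  "editor.fontSize": 14,
  "mcp.servers": {
    "data-workers": {
      "command": "npx",
      "args": ["dw-claw"]
    }
  }
}
```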

After adding, open the Copilot chat panel and verify tools are discovered. If tools do not appear, ensure you are running VS Code 1.99 or later and that the GitHub Copilot extension is updated to the latest version.

Tips:

  • GitHub Copilot uses the mcp.servers key in VS Code settings, which differs from Cursor's mcpServers format. Do not mix them up.
  • Use @workspace in Copilot Chat to activate agent mode, which enables MCP tool calls.
  • GitHub Copilot's MCP support is relatively new. Check the GitHub Copilot documentation for the latest configuration options.

Microsoft Copilot (Copilot Studio)

Status: Evolving. Microsoft has announced MCP support for Copilot Studio as part of their enterprise agent platform. The integration is enterprise-oriented and aimed at organizations building custom agents within the Microsoft 365 ecosystem.

Microsoft's MCP implementation is actively changing. Rather than provide speculative configuration steps that may become outdated, we recommend the following:

  • Monitor the official Microsoft Copilot Studio documentation for MCP server configuration guidance.
  • The underlying Data Workers MCP server (npx dw-claw) is fully compatible — the connection method is the only variable.
  • Join our Discord (discord.com/invite/b8DR5J53) for community-reported configuration steps as the integration stabilizes.

We will update this section with verified configuration steps once Microsoft's MCP support reaches general availability.

OpenClaw, Cline, and Continue

OpenClaw, Cline, and Continue all support MCP servers via the standard stdio transport. Because npx dw-claw is a standard MCP stdio server, the same configuration pattern works across all three clients.

In each client's MCP configuration file, add:

{
  "mcpServers": {
    "data-workers": {
      "command": "npx",
      "args": ["dw-claw"]
    }
  }
}

OpenClaw: Add the server in your OpenClaw MCP settings. OpenClaw discovers tools automatically on connection.

Cline: Edit the Cline MCP configuration in VS Code settings or the Cline config file. Cline will list discovered tools in the chat sidebar.

Continue: Add the server in your Continue configuration file (~/.continue/config.json or project-level). Continue shows available tools in its tool panel.

Any MCP client that supports stdio transport can connect to Data Workers using this same pattern. If your client is not listed here, consult its documentation for how to add an MCP server with a command and args configuration.

Configuration Reference

All clients use the same underlying MCP server. The differences are only in where the configuration lives:

  • Claude Code: .mcp.json in your project root — key: mcpServers
  • Cursor: .cursor/mcp.json (project) or Cursor Settings (global) — key: mcpServers
  • GitHub Copilot: VS Code settings.json — key: mcp.servers
  • OpenClaw / Cline / Continue: Each client's respective MCP config file — key: mcpServers

The server command is always npx dw-claw. Environment variables for infrastructure credentials should be set in your shell profile or .env file before launching your editor.

Adding Individual Agents

By default, npx dw-claw starts all 15 agents. If you prefer to run a single agent, you can specify it directly:

  • npx dw-claw --agent incident — Incident Debugging Agent
  • npx dw-claw --agent quality — Quality Monitoring Agent
  • npx dw-claw --agent schema — Schema Evolution Agent
  • npx dw-claw --agent pipeline — Pipeline Building Agent
  • npx dw-claw --agent context — Data Context and Catalog Agent
  • npx dw-claw --agent governance — Governance and Security Agent

This is useful for resource-constrained environments or when you want to limit tool discovery to a single domain. See the Agent Reference for the full list of agent names.
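The same --agent flag can be baked into any client's MCP configuration by appending it to args. As a sketch using the standard mcpServers shape (the server name dw-incident is an arbitrary label you choose):

```json
{
  "mcpServers": {
    "dw-incident": {
      "command": "npx",
      "args": ["dw-claw", "--agent", "incident"]
    }
  }
}
```

You can register several such entries side by side, one per agent, if you want tools grouped under separate server names in your client.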

Troubleshooting

"npx dw-claw" hangs or does not start — Ensure Node.js 20 or later is installed. Run node --version to check. If you are behind a corporate proxy, ensure npm is configured with your proxy settings.
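For corporate proxies, npm reads the proxy and https-proxy config keys. A sketch with a placeholder proxy URL:

```shell
# Placeholder proxy URL -- substitute your organization's actual proxy.
npm config set proxy http://proxy.example.com:8080
npm config set https-proxy http://proxy.example.com:8080
```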

Tools do not appear in my client — The MCP server must be running before the client can discover tools. Restart your editor after adding the configuration. In Claude Code, run claude mcp list to verify the server status.

"Connection refused" or "Transport error" — This usually means the npx process exited. Run npx dw-claw manually in a terminal to see the full output. Common causes: missing Node.js, network issues preventing package download, or a port conflict.

Tools appear but return errors — If you see errors referencing missing credentials (for example, SNOWFLAKE_ACCOUNT not set), the agent is trying to reach real infrastructure but credentials are not configured. Either set the required environment variables or continue using the InMemory stubs for exploration.

Different config key names across clients — GitHub Copilot uses mcp.servers while Cursor, Claude Code, and most other clients use mcpServers. Using the wrong key name is a common source of silent misconfiguration.

Still stuck? Join our Discord community at discord.com/invite/b8DR5J53 or open an issue on GitHub at github.com/DataWorkersProject/dataworkers-claw-community.

What You Can Do Next

With the Community tier, all 15 agents operate in read-only mode — diagnostics, analysis, discovery, and recommendations. This is a fully functional experience for understanding your data infrastructure.

The Pro tier unlocks write tools that let agents take action. Here is a concrete example: with Community, the Incident Agent can analyze a failing pipeline, identify the root cause, and recommend a fix. With Pro, it can execute the remediation, restart the pipeline, and verify the fix — without manual intervention.

  • Pipeline Agent (Pro): Generate new pipelines from natural language, deploy fixes, manage DAG modifications
  • Schema Agent (Pro): Execute migration scripts, apply schema changes across environments
  • Quality Agent (Pro): Auto-generate and enforce quality rules, remediate data issues
  • Governance Agent (Pro): Apply access policies, execute PII remediation, enforce retention rules
  • Incident Agent (Pro): Execute automated remediation runbooks, restart failed tasks, apply hotfixes

Explore the read-only tools first. When you are ready for agents that do not just diagnose but fix, visit the Pricing page to see Pro and Enterprise options.