OpenClaw, Cline & Continue Setup
This guide covers setup for OpenClaw, Cline, Continue, and any other MCP-compatible client. For a quick reference across all platforms, see the MCP Client Setup page.
Prerequisites
- Your MCP client — OpenClaw, Cline, Continue, or any client that supports the MCP stdio transport.
- Node.js 20+ — Required to run MCP servers. Check with `node --version`.
- npm — Bundled with Node.js.
Universal Configuration
All these clients support MCP servers via the standard stdio transport. The configuration pattern is the same:
```json
{
  "mcpServers": {
    "data-workers": {
      "command": "npx",
      "args": ["dw-claw"]
    }
  }
}
```

Add this to your client's MCP configuration file. The location varies by client — see the sections below.
OpenClaw
Add the server in your OpenClaw MCP settings. OpenClaw discovers tools automatically on connection.
OpenClaw also supports YAML configuration:
```yaml
mcp_servers:
  - name: data-workers
    transport: stdio
    command: npx
    args: ["dw-claw"]
```

Cline
Edit the Cline MCP configuration in VS Code settings or the Cline config file. Add the data-workers server entry with npx dw-claw as the command.
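As an illustration, the entry in Cline's MCP settings file follows the universal pattern shown above (the file name and location depend on your Cline version; check Cline's documentation for the exact path):

```json
{
  "mcpServers": {
    "data-workers": {
      "command": "npx",
      "args": ["dw-claw"]
    }
  }
}
```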
After configuration, Cline will list discovered tools in the chat sidebar. You can call Data Workers tools directly from the Cline chat interface.
Continue
Add the server in your Continue configuration file. Continue supports both project-level and global configuration:
Global: Edit ~/.continue/config.json and add the data-workers server entry.
Project-level: Add a .continue/config.json in your project root.
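As a sketch only (Continue's configuration schema has changed across versions, so treat the key names below as assumptions and verify them against the Continue docs), an MCP server entry in `config.json` might look like:

```json
{
  "experimental": {
    "modelContextProtocolServers": [
      {
        "transport": {
          "type": "stdio",
          "command": "npx",
          "args": ["dw-claw"]
        }
      }
    ]
  }
}
```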
Continue shows available tools in its tool panel after configuration.
Any Other MCP Client
Data Workers agents are standard MCP servers using stdio transport. If your client supports MCP and is not listed here, consult its documentation for how to add an MCP server with a command and args configuration. The command is always npx and the args are ["dw-claw"].
Verify Connection
After configuration, ask your AI assistant:
List all available Data Workers tools and show me the health of my data pipelines.
You should see tools organized by agent domain.
InMemory Stubs on First Run
Without infrastructure credentials, the server starts with InMemory stub data. This is expected. Set environment variables to connect to real infrastructure.
First 5 Workflows to Try
- Incident diagnosis: "Why did my nightly ETL pipeline fail? Check logs and provide root cause analysis."
- Data quality check: "Run a quality assessment on analytics.orders and flag anomalies."
- Catalog search: "Search the catalog for tables related to customer revenue."
- Schema analysis: "Analyze the schema of staging.events and suggest improvements."
- Pipeline health: "Show pipeline health across all orchestrators."
Environment Variables
- `SNOWFLAKE_ACCOUNT`, `SNOWFLAKE_USER`, `SNOWFLAKE_PASSWORD` — Snowflake
- `GOOGLE_CLOUD_PROJECT`, `GOOGLE_APPLICATION_CREDENTIALS` — BigQuery / Dataplex
- `DATABRICKS_HOST`, `DATABRICKS_TOKEN` — Databricks
- `DBT_API_TOKEN`, `DBT_ACCOUNT_ID` — dbt Cloud
- `AIRFLOW_HOST`, `AIRFLOW_USER`, `AIRFLOW_PASSWORD` — Apache Airflow
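For example, to connect the Snowflake integration you would export the relevant credentials before launching your MCP client, so the server process inherits them (the values below are placeholders, not real defaults):

```shell
# Placeholder values; replace with your own Snowflake credentials
export SNOWFLAKE_ACCOUNT="my-account"
export SNOWFLAKE_USER="etl_bot"
export SNOWFLAKE_PASSWORD="********"

# Restart your MCP client (or run the server directly) so it picks up the variables
npx dw-claw
```

Variables can also be set in the client's MCP config if it supports an `env` field for server entries; check your client's documentation.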
Troubleshooting
Tools not appearing — Restart your editor/client after adding the configuration. Most MCP clients load server configurations at startup.
Server not starting — Run `npx dw-claw` manually in a terminal to see error output.
Config key — Most clients use `mcpServers` (camelCase). If tools are not discovered, check your client's documentation for the exact key name.
Still stuck? Join our Discord at discord.com/invite/b8DR5J53 or open an issue on GitHub.