Running Docker MCP Gateway in a Docker container
The MCP Gateway is Docker's solution for securely orchestrating and managing Model Context Protocol (MCP) servers, both locally and in production, including enterprise environments.

You're a developer trying to set up a simple AI application that needs file access, database connectivity, and web search capabilities. Here's what your terminal situation looks like:
# Terminal 1: File system server
npx @modelcontextprotocol/server-filesystem /project/files
# Terminal 2: Database server
uvx mcp-server-postgresql --connection="postgresql://..."
# Terminal 3: Web search server
docker run -p 3001:3000 mcp/web-search
# Terminal 4: Custom API server
node custom-mcp-server.js
Four different terminals. Four different startup commands. Four different ways things can break.
And that's just the beginning. Each AI client - VS Code, Cursor, Claude Desktop - wants its own special configuration format:
// VS Code wants this
{
  "mcp": {
    "servers": {
      "filesystem": {
        "command": "npx",
        "args": ["@modelcontextprotocol/server-filesystem", "."]
      }
    }
  }
}
// But Cursor wants this
{
  "mcpServers": {
    "filesystem": {
      "command": "npx",
      "args": ["@modelcontextprotocol/server-filesystem", "."]
    }
  }
}
It's like speaking three different languages for the same conversation. Exhausting.
The Security Nightmare
But here's where it gets really scary. Traditional MCP setups force you to expose secrets everywhere:
export DATABASE_URL="postgresql://user:secret123@db:5432/mydb"
export GITHUB_TOKEN="ghp_extremely_secret_token_here"
export OPENAI_API_KEY="sk-very_secret_openai_key"
Run ps aux | grep mcp and boom: your database password is right there for anyone to see. Commit a Dockerfile with these exports? Congratulations, your secrets are now permanently in Git history.
This isn't just inconvenient - it's a security disaster waiting to happen.
Enter Docker MCP Gateway: Your AI Command Center
Docker MCP Gateway solves all of this with three core principles:
🔒 Secure by Default
Every MCP server runs in its own isolated container with minimal privileges. Even if a tool gets compromised, it can't touch your host system or access unauthorized resources.
🎯 Unified Management
One gateway endpoint aggregates multiple MCP servers. One configuration. One place to manage credentials. One interface for all your AI tools.
🏗️ Enterprise Control
Comprehensive logging, monitoring, and filtering give you complete visibility into what your AI tools are doing.
Let's Get Our Hands Dirty: 6 Ways to Run Docker MCP Gateway
Demo 1: The Minimalist Approach
Want to get started in under 2 minutes? Here's the simplest possible setup:
services:
  gateway:
    image: docker/mcp-gateway
    command:
      - --servers=duckduckgo
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    ports:
      - "8811:8811"
That's it. Run docker compose up and you've got a working MCP gateway with DuckDuckGo search capabilities. No secrets to manage, no complex configuration - just working AI tools.
Output:
gateway-1 | - Using images:
gateway-1 | - mcp/duckduckgo@sha256:68eb20db6109f5c312a695fc5ec3386ad15d93ffb765a0b4eb1baf4328dec14f
gateway-1 | > Images pulled in 20.635834ms
gateway-1 | - Verifying images [mcp/duckduckgo]
gateway-1 | > Images verified in 4.319907252s
gateway-1 | - Listing MCP tools...
gateway-1 | - Running mcp/duckduckgo with [run --rm -i --init --security-opt no-new-privileges --cpus 1 --memory 2Gb --pull never -l docker-mcp=true -l docker-mcp-tool-type=mcp -l docker-mcp-name=duckduckgo -l docker-mcp-transport=stdio --network mcp-gate_default]
gateway-1 | - duckduckgo: [07/12/25 07:32:58] INFO Processing request of type server.py:523
gateway-1 | - duckduckgo: ListToolsRequest
gateway-1 | - duckduckgo: INFO Processing request of type server.py:523
gateway-1 | - duckduckgo: ListPromptsRequest
gateway-1 | - duckduckgo: INFO Processing request of type server.py:523
gateway-1 | - duckduckgo: ListResourcesRequest
gateway-1 | - duckduckgo: INFO Processing request of type server.py:523
gateway-1 | - duckduckgo: ListResourceTemplatesRequest
gateway-1 | > duckduckgo: (2 tools)
gateway-1 | > 2 tools listed in 976.765458ms
gateway-1 | > Initialized in 5.660334003s
gateway-1 | > Start stdio over TCP server on port 8811
That output is exactly what we want to see. Let me break down what just happened in those logs:
⚡ Lightning Fast Setup: Images pulled in just 20ms - this is the "2-minute setup" promise delivered.
🔐 Security First: Notice the --security-opt no-new-privileges --cpus 1 --memory 2Gb flags? That's the gateway automatically applying security constraints to protect your system.
🔍 Image Verification: 4.3 seconds spent cryptographically verifying the MCP server image - no compromised containers making it into your environment.
🛠️ Tool Discovery: The gateway found 2 tools from DuckDuckGo (search and web content fetching) and is now serving them through a unified interface.
🚀 Ready to Rock: Your gateway is now listening on port 8811, ready to handle requests from any MCP client.
Switching the MCP Gateway to Server-Sent Events Mode for HTTP Testing
Want to test it programmatically? Switch the transport to SSE and hit it from the command line:
services:
  gateway:
    image: docker/mcp-gateway
    command:
      - --servers=duckduckgo
      - --transport=sse
      - --port=8811
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    ports:
      - "8811:8811"
Restart the stack to pick up the new flags:
docker compose down
docker compose up
Testing with netcat
# Test basic connectivity
echo '{"jsonrpc":"2.0","id":1,"method":"tools/list","params":{}}' | nc localhost 8811
Result:
{"jsonrpc":"2.0","id":1,"result":{"tools":[{"annotations":{},"description":"\n Fetch and parse content from a webpage URL.\n\n Args:\n url: The webpage URL to fetch content from\n ctx: MCP context for logging\n ","inputSchema":{"properties":{"url":{"title":"Url","type":"string"}},"required":["url"],"type":"object"},"name":"fetch_content"},{"annotations":{},"description":"\n Search DuckDuckGo and return formatted results.\n\n Args:\n query: The search query string\n max_results: Maximum number of results to return (default: 10)\n ctx: MCP context for logging\n ","inputSchema":{"properties":{"max_results":{"default":10,"title":"Max Results","type":"integer"},"query":{"title":"Query","type":"string"}},"required":["query"],"type":"object"},"name":"search"}]}}
That response shows exactly what we want to see.
Tools Available:
- search: DuckDuckGo search with a configurable number of results (default: 10)
- fetch_content: fetch and parse content from any webpage URL
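The one-liners piped through nc are plain JSON-RPC 2.0 messages, so you can generate them from any language. Here is a minimal sketch in Python (the request envelope follows the JSON-RPC 2.0 spec; nothing in it is specific to the gateway):

```python
import json

def jsonrpc_request(method, params, req_id):
    """Build a JSON-RPC 2.0 request line like the ones piped to nc above."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": method,
        "params": params,
    })

# Reproduce the tools/list call from the netcat test above:
print(jsonrpc_request("tools/list", {}, 1))
```

Pipe the output straight into nc localhost 8811 and you get the same tool listing as before.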
Let's Test the Tools in Action!
Now let's actually use these tools:
Test 1: DuckDuckGo Search
echo '{"jsonrpc":"2.0","id":2,"method":"tools/call","params":{"name":"search","arguments":{"query":"Docker MCP Gateway","max_results":3}}}' | nc localhost 8811
{"jsonrpc":"2.0","id":2,"result":{"content":[{"type":"text","text":"Found 3 search results:\n\n1. MCP Gateway | Docker Docs\n URL: https://docs.docker.com/ai/mcp-gateway/\n Summary: TheMCPGatewayisDocker'sopen-source enterprise-ready solution for orchestrating and managing Model Context Protocol (MCP) servers securely across development and production environments. It is designed to help organizations connectMCPservers from theDockerMCPCatalog toMCPClients without compromising security, visibility, or control.\n\n2. GitHub - docker/mcp-gateway: docker mcp CLI plugin / MCP Gateway\n URL: https://github.com/docker/mcp-gateway\n Summary: DockerMCPPlugin andDockerMCPGatewayTheMCPToolkit, inDockerDesktop, allows developers to configure and consumeMCPservers from theDockerMCPCatalog. Underneath, the Toolkit is powered by adockerCLI plugin:docker-mcp. This repository is the code of this CLI plugin. It can work inDockerDesktop or independently.\n\n3. Docker MCP Gateway: Unified, Secure Infrastructure for Agentic AI ...\n URL: https://www.docker.com/blog/docker-mcp-gateway-secure-infrastructure-for-agentic-ai/\n Summary: Easily and securely connect trustedMCPservers to AI agents usingDocker'sopen-sourceMCPGateway. Discover how it simplifies agentic AI development.\n"}]}}
Test 2: Fetch Web Content
echo '{"jsonrpc":"2.0","id":3,"method":"tools/call","params":{"name":"fetch_content","arguments":{"url":"https://www.docker.com"}}}' | nc localhost 8811
{"jsonrpc":"2.0","id":3,"result":{"content":[{"type":"text","text":"Docker: Accelerated Container Application Development Develop faster Your foundation for secure, intelligent development Download Docker Desktop Download for Mac ...ainer images, ensuring best practices and peace of mind. Discover Docker Hub Run Develop secure, modern applications with Docker Desktop Docker Desktop simplifies and accelerates the development of secure, containerized applications. Gain s thatβs right for you Find your perfect balance of collaboration, security, and support with a Docker subscription. Find pricing"}]}}
What This Proves:
- ✅ The gateway is running and responding correctly
- ✅ Tool discovery works: both DuckDuckGo tools were found
- ✅ Security is working: containers are isolated
- ✅ The JSON-RPC protocol is functioning properly
One caveat: the docker mcp CLI commands work with the MCP Toolkit's internal gateway, not your compose-based gateway.
If You Want Both Approaches
You can keep your compose gateway running on port 8811 and also run the toolkit gateway on a different port:
# Your compose gateway on 8811 (keep running)
# Toolkit gateway on different port
docker mcp gateway run --port 8812 --servers duckduckgo
Demo 2: Adding Secure Secret Management
Ready for production? Let's add proper credential handling:
services:
  gateway:
    image: docker/mcp-gateway
    command:
      - --servers=github-official
      - --secrets=docker-desktop:/run/secrets/mcp_secret
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    secrets:
      - mcp_secret
secrets:
  mcp_secret:
    file: .env
Create a .env file with your GitHub token:
GITHUB_PERSONAL_ACCESS_TOKEN=ghp_your_github_token_here
Now your secrets are properly managed through Docker's secret system instead of floating around in environment variables.
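Inside the container, that token surfaces as a file under /run/secrets rather than as a process environment variable, which is exactly what keeps it out of ps output. Here is a small illustrative sketch of reading such a file in Python (the gateway's own secret handling is internal to it; the path mirrors the compose file above, and the parsing logic is an assumption about the .env format, not the gateway's implementation):

```python
from pathlib import Path

def load_env_secrets(path):
    """Parse KEY=VALUE lines from a Docker secret file into a dict,
    skipping blank lines and comments."""
    secrets = {}
    for line in Path(path).read_text().splitlines():
        line = line.strip()
        if line and not line.startswith("#"):
            key, _, value = line.partition("=")
            secrets[key] = value
    return secrets

# e.g. load_env_secrets("/run/secrets/mcp_secret")
```

The secret lives only on an in-container tmpfs mount, so nothing leaks through the process table or Git history.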
Demo 3: Full Database Integration
This example demonstrates how to run the Docker MCP Gateway with PostgreSQL database integration using Docker Compose. It showcases database connectivity, secure secret management, and practical AI agent interactions with your PostgreSQL data.
This configuration provides:
- Complete Database Stack: PostgreSQL database with sample data
- MCP Gateway Integration: Secure proxy to PostgreSQL MCP servers
- Secret Management: Environment-based database credentials
- Client Examples: Ready-to-use AI agent connections
- Development Ready: Local development with persistent data
This Docker Compose configuration demonstrates a complete MCP Gateway stack with PostgreSQL integration using streaming transport protocol.
The setup consists of three interconnected services:
- a PostgreSQL database (pg),
- the MCP Gateway (gateway), and
- a custom client application (client).
The PostgreSQL service is configured with basic credentials and includes a health check that ensures the database is ready before dependent services start, using pg_isready to verify connectivity every second with a 3-second timeout and up to 10 retries.
Getting Started
git clone https://github.com/docker/mcp-gateway
cd mcp-gateway/examples/postgresql
services:
  client:
    build: .
    environment:
      - MCP_HOST=http://gateway:8811/mcp
    depends_on:
      - gateway
  gateway:
    image: docker/mcp-gateway
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
    command:
      - --transport=streaming
      - --servers=postgres
      - --tools=query
      - --verbose=false
      - --secrets=/run/secrets/database_url
    secrets:
      - database_url
    depends_on:
      pg:
        condition: service_healthy
  pg:
    image: postgres
    environment:
      POSTGRES_USER: user
      POSTGRES_PASSWORD: password
      POSTGRES_DB: database
    healthcheck:
      test: ["CMD-SHELL", "pg_isready -U user -d database"]
      interval: 1s
      timeout: 3s
      retries: 10
secrets:
  database_url:
    file: ./postgres_url
Start the services
docker compose up -d --build
The logs show three distinct phases of operation that confirm the entire stack is working correctly.
- The PostgreSQL 17.5 database successfully initializes with the configured credentials (user/password/database), completes its startup sequence, and becomes ready to accept connections on port 5432.
- The gateway successfully reads its configuration from Docker's MCP catalog, pulls and verifies the mcp/postgres image, and discovers 1 available tool (the query tool). It then starts the streaming server on port 8811 and handles an incoming request by executing the SQL query SELECT datname FROM pg_database; against the PostgreSQL database.
- The client successfully connects to the gateway's streaming endpoint, sends a database query request, and receives the expected JSON response listing all databases in the PostgreSQL instance (postgres, database, template1, template0). The clean exit with code 0 confirms the entire workflow - from client request through the MCP Gateway to the PostgreSQL MCP server and back - operates seamlessly, validating that the streaming transport protocol effectively bridges AI applications with database operations.
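That round trip boils down to a single HTTP POST against the gateway's streaming endpoint. Here is a hedged sketch of what such a request could look like (the /mcp path matches the MCP_HOST value in the compose file; the "sql" argument name for the query tool is an assumption, and the repo's example client may frame things differently):

```python
import json
import urllib.request

def query_payload(sql, req_id=1):
    """JSON-RPC body calling the postgres 'query' tool through the gateway.
    The argument key 'sql' is assumed, not confirmed from the tool's schema."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": req_id,
        "method": "tools/call",
        "params": {"name": "query", "arguments": {"sql": sql}},
    }).encode()

def call_gateway(url, body):
    """POST a JSON-RPC message and return the raw response body."""
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return resp.read().decode()

if __name__ == "__main__":
    body = query_payload("SELECT datname FROM pg_database;")
    # "gateway" resolves only inside the compose network, as in the client service
    print(call_gateway("http://gateway:8811/mcp", body))
```

Run from a container on the same compose network, this reproduces the database listing seen in the logs.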
Demo 4: Building Custom Clients
This example demonstrates how to build and deploy a custom client application that connects to the Docker MCP Gateway using the HTTP streaming protocol. The client leverages DuckDuckGo search capabilities through the gateway, showcasing how AI applications can be containerized and integrated with MCP tools.
This configuration provides:
- Custom Client Application: Containerized client built from local source code
- HTTP Streaming Protocol: Efficient HTTP-based communication with the gateway
- DuckDuckGo Integration: Web search capabilities through MCP
- Simplified Setup: No secrets required for this public search service
Getting Started
git clone https://github.com/docker/mcp-gateway
cd mcp-gateway/examples/client
This Docker Compose configuration establishes a client-server architecture where a custom-built client application communicates with the MCP Gateway through HTTP streaming protocol.
The client service is built from the local directory using build: ., meaning it will use a Dockerfile in the current directory to create a containerized application.
The client connects to the gateway at http://gateway:9011/mcp through the MCP_HOST environment variable, and the depends_on: gateway ensures the gateway starts first before the client attempts to connect.
This setup allows developers to create custom applications that can leverage MCP tools through a standardized HTTP interface.
services:
  client:
    build: .
    environment:
      - MCP_HOST=http://gateway:9011/mcp
    depends_on:
      - gateway
  gateway:
    image: docker/mcp-gateway
    command:
      - --transport=streaming
      - --servers=duckduckgo
      - --port=9011
    volumes:
      - /var/run/docker.sock:/var/run/docker.sock
Start the services
docker compose up -d --build
This log output demonstrates the complete lifecycle of the MCP Gateway client example, showing both the gateway initialization and successful client interaction. The gateway startup process begins by reading its configuration from Docker's hosted MCP catalog, discovering and enabling the DuckDuckGo server, then pulling and verifying the mcp/duckduckgo image with SHA verification for security. The gateway discovers 2 available tools from the DuckDuckGo server during initialization, which takes about 4 seconds total, and then starts its HTTP streaming server on port 9011, ready to accept client requests.
The client interaction phase shows a successful search operation where the client sends a search request for "Docker" to the gateway. The gateway processes this request by scanning the arguments for sensitive information (finding none), then dynamically spawning a DuckDuckGo container with security restrictions including --security-opt no-new-privileges, CPU limits of 1 core, and memory limits of 2GB. The search completes in approximately 1.3 seconds, the response is scanned for secrets (none found), and the client successfully receives 10 search results. This demonstrates the full end-to-end workflow of HTTP streaming communication between a custom client application and the MCP Gateway, showcasing how containerized MCP tools can be securely orchestrated and accessed through a unified interface.
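For context, a well-behaved MCP client doesn't fire a tools/call cold: it negotiates capabilities first. A sketch of that message sequence (the shapes follow the MCP specification; the protocol version string and client name here are illustrative):

```python
import json

def handshake_messages(client_name="example-client"):
    """The messages an MCP client sends before its first tool call:
    initialize, the initialized notification, then tools/call."""
    return [
        {"jsonrpc": "2.0", "id": 1, "method": "initialize",
         "params": {"protocolVersion": "2025-03-26",
                    "capabilities": {},
                    "clientInfo": {"name": client_name, "version": "0.1"}}},
        {"jsonrpc": "2.0", "method": "notifications/initialized"},
        {"jsonrpc": "2.0", "id": 2, "method": "tools/call",
         "params": {"name": "search",
                    "arguments": {"query": "Docker", "max_results": 10}}},
    ]

for msg in handshake_messages():
    print(json.dumps(msg))
```

The gateway handles this handshake per client while multiplexing the actual tool execution across its managed containers.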
Demo 5: MCP Toolkit Integration
For the full Docker Desktop experience with a visual UI:
Getting Started
git clone https://github.com/docker/mcp-gateway
cd mcp-gateway/examples/mcp_toolkit
services:
  gateway:
    image: docker/mcp-gateway
    ports:
      - "8080:8080"
    volumes:
      - "/var/run/docker.sock:/var/run/docker.sock"
      - "~/.docker/mcp:/mcp"
    command:
      - --catalog=/mcp/catalogs/docker-mcp.yaml
      - --config=/mcp/config.yaml
      - --registry=/mcp/registry.yaml
      - --secrets=docker-desktop
      - --watch=true
      - --transport=sse
      - --port=8080
Start the services
docker compose up -d --build
This log output demonstrates the comprehensive startup sequence of a production-configured MCP Gateway with multiple servers and enterprise-grade features.
The gateway begins by reading its multi-file configuration setup, loading the registry definitions from /mcp/registry.yaml, custom catalog from /mcp/catalogs/docker-mcp.yaml, and main configuration from /mcp/config.yaml, then securely accessing stored credentials including pat_token, firecrawl.api_key, and github.personal_access_token.
The configuration process immediately establishes file watchers on the registry and config files, enabling live configuration reloading without service restarts when these files are modified.
The server orchestration phase reveals five enabled MCP servers (dockerhub, duckduckgo, firecrawl, github-official, sequentialthinking) with their corresponding container images pulled and cryptographically verified using SHA256 hashes for security.
Each server container is launched with robust security constraints including --security-opt no-new-privileges to prevent privilege escalation, CPU limits of 1 core, memory limits of 2GB, and proper Docker labeling for management.
The containers are injected with service-specific environment variables (like FIRECRAWL_API_KEY for Firecrawl and GITHUB_PERSONAL_ACCESS_TOKEN for GitHub) while maintaining network isolation within the mcp_toolkit_default network.
This sophisticated setup demonstrates how the MCP Gateway can manage a heterogeneous collection of AI tools - from web search (DuckDuckGo) and web scraping (Firecrawl) to repository management (GitHub) and AI reasoning (Sequential Thinking) - all orchestrated through a single, secure, and dynamically configurable gateway interface accessible via SSE transport on port 8080.
Demo 6: Docker-in-Docker for CI/CD
Need complete isolation for CI/CD pipelines? Docker-in-Docker has you covered.
This example demonstrates running the Docker MCP Gateway in a Docker-in-Docker (DinD) configuration, enabling containerized environments to manage their own Docker containers. This pattern is essential for CI/CD pipelines, development environments, and scenarios where the gateway needs complete container isolation while retaining Docker management capabilities.
Getting Started
git clone https://github.com/docker/mcp-gateway
cd mcp-gateway/examples/docker-in-docker
services:
  gateway:
    image: docker/mcp-gateway:dind
    privileged: true
    ports:
      - "8080:8080"
    command:
      - --transport=sse
      - --servers=fetch
      - --memory=512Mb
Start the services
docker compose up -d --build
This log output demonstrates the Docker-in-Docker container initialization process for the MCP Gateway, showing the startup sequence of the internal Docker daemon that runs within the privileged container. The process begins with containerd (Docker's container runtime) starting up and systematically loading various plugins and storage drivers including event handling, snapshotters (overlayfs, native), content management, and runtime components. Some plugins like aufs, zfs, and devmapper are intentionally skipped because they're either not supported in the container environment or not configured, which is normal behavior for a containerized Docker daemon.
This nested containerization setup creates a completely isolated Docker environment within the DinD container, allowing the MCP Gateway to spawn and manage its own MCP server containers without affecting the host system's Docker daemon. The containerd initialization establishes the foundation for the gateway to dynamically pull, run, and orchestrate MCP servers (like the fetch server specified in the configuration) while maintaining strict resource boundaries through the 512MB memory limit. This approach is particularly valuable for development environments, CI/CD pipelines, and multi-tenant scenarios where you need reproducible, isolated container management capabilities with the full power of Docker available to the MCP Gateway for managing its containerized tools and services.
Perfect for containerized build environments where you need the gateway to manage its own Docker containers.
The Developer Experience Revolution
Here's what changes when you switch to Docker MCP Gateway:
Before: "I need to remember which terminal is running the GitHub server, and why is my database password showing up in ps aux again?"
After: "All my AI tools are running behind one endpoint, secrets are properly managed, and I can see exactly what's happening through centralized logs."
Before: Three different configuration files for three different AI clients.
After: Configure once, use everywhere.
Before: Manual process management and hoping nothing crashes.
After: Docker handles restarts, health checks, and resource limits automatically.
Why This Matters for Your Organization
If you're a developer, the gateway eliminates infrastructure complexity so you can focus on building AI applications, not managing servers.
If you're on a security team, you get comprehensive visibility and control over AI tool usage with enterprise-grade isolation.
If you're in operations, you get a centralized management platform that scales from development to production with minimal overhead.
Getting Started Today
Ready to transform your AI tool management? Here's how to get started:
- Clone the examples:
git clone https://github.com/docker/mcp-gateway
- Pick your demo: Start with Demo 1 for simplicity or Demo 5 for the full experience
- Run it:
docker compose up
- Connect your AI client: Point Claude Desktop, VS Code, or Cursor to localhost:8811
That's it. You're now running a production-ready AI tool orchestration platform.
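Before pointing a client at the gateway, it's worth confirming something is actually listening. A tiny check in Python (host and port match Demo 1; adjust for the other demos):

```python
import socket

def gateway_is_up(host="localhost", port=8811, timeout=2.0):
    """Return True if something accepts TCP connections on host:port."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    print("gateway reachable:", gateway_is_up())
```

If this prints False, check docker compose ps and the gateway logs before debugging your client configuration.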
Wrapping Up
Docker MCP Gateway isn't just solving today's problems - it's laying the foundation for the AI infrastructure of tomorrow. As AI applications become more sophisticated and require access to more tools and data sources, you need a platform that can scale, secure, and manage that complexity.