Collabnix AI Weekly - Edition 1

Your weekly digest of Cloud-Native AI and Model Context Protocol innovations.

Stay ahead with curated insights on Docker, Kubernetes, IoT, and emerging AI technologies delivered straight to your inbox

Have MCP news to share? Tag us on social media or submit via our Slack community channels!

🎯 This Week's Spotlight

A curated list of new MCP servers added to the Docker MCP Toolkit:

  • AWS Diagram - Seamlessly create diagrams using the Python diagrams package DSL.
  • AWS Documentation - Tools to access AWS documentation, search for content, and get recommendations.
  • AWS CDK - AWS Cloud Development Kit (CDK) best practices, infrastructure as code patterns, and security compliance with CDK Nag.
  • AWS Core MCP Server - Starting point for using the awslabs MCP servers.
  • AWS Terraform - Terraform on AWS best practices, infrastructure as code patterns, and security compliance with Checkov.
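The AWS Diagram server drives the `diagrams` package's Python DSL on your behalf. Below is a minimal sketch of what that DSL looks like; the three-tier architecture is invented for illustration, and since actually rendering a diagram requires the `diagrams` package plus Graphviz, the snippet only syntax-checks the generated code rather than executing it.

```python
# Hypothetical example of the Python `diagrams` DSL that the AWS Diagram
# MCP server generates. Rendering requires the `diagrams` package and
# Graphviz installed, so here we only verify the DSL is well-formed Python.
dsl = """
from diagrams import Diagram
from diagrams.aws.compute import EC2
from diagrams.aws.database import RDS
from diagrams.aws.network import ELB

with Diagram("Web Service", show=False):
    ELB("lb") >> EC2("web") >> RDS("db")
"""

# compile() parses the code without running it -- a cheap validity check.
code = compile(dsl, "<aws-diagram-dsl>", "exec")
print(code is not None)  # → True
```

With the dependencies installed, executing the same DSL would emit a `web_service.png` alongside the script.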

🐳 Docker Desktop 4.42 Release Brings Enhanced AI Features - Docker released Docker Desktop 4.42 on June 4, 2025, featuring IPv6 networking capabilities, numerous Docker Model Runner updates, and enhanced AI development tools. This release builds on Docker's commitment to making AI development more accessible with local LLM capabilities and improved MCP support.

  • The Docker MCP Toolkit is now natively integrated into Docker Desktop.
  • Docker Model Runner is now available for Windows systems running on Qualcomm/ARM GPUs.
  • Added a Logs tab to the Models view so you can see the inference engine output in real time.
  • Gordon now integrates the MCP Toolkit, providing access to 100+ MCP servers.

🙋 Q&A Highlights

  • Question: How do I get started with Docker's MCP servers?
  • Answer: Check out the Docker MCP Catalog and follow the official documentation to set up containerized MCP servers with one-click deployment.
  • Question: What's the difference between Docker Model Runner and traditional LLM hosting?
  • Answer: With Docker Model Runner, Docker brings inference capabilities directly to developers' machines. It provides a Docker-native experience for running Large Language Models (LLMs) locally, integrating seamlessly with existing container tooling and workflows.
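To make the one-click setup concrete: once servers are enabled in the MCP Toolkit, a desktop client such as Claude Desktop can reach all of them through the single Docker MCP gateway. A client configuration along these lines is what gets wired up (the exact schema and entry name vary by client and Toolkit version, so treat this as a sketch):

```json
{
  "mcpServers": {
    "MCP_DOCKER": {
      "command": "docker",
      "args": ["mcp", "gateway", "run"]
    }
  }
}
```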

Technical Note: With Docker Model Runner, the AI model DOES NOT run in a container. Instead, Docker Model Runner uses a host-installed inference server (currently llama.cpp) that runs natively on the host rather than containerizing the model. Docker plans to support additional inference engines (e.g., MLX, vLLM) in future releases.
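Model Runner exposes an OpenAI-compatible API, so talking to a local model from Python needs only the standard library. The base URL, port (12434 is a common default when host-side TCP access is enabled), and model name `ai/smollm2` below are assumptions drawn from typical setups; adjust them to your environment.

```python
import json
import urllib.request

# Assumed default endpoint when host-side TCP access is enabled in
# Docker Desktop; check your settings if this differs.
BASE_URL = "http://localhost:12434/engines/v1"

def build_chat_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completion request for Model Runner."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_chat_request("ai/smollm2", "Say hello in five words.")
print(req.full_url)  # → http://localhost:12434/engines/v1/chat/completions

# Sending it requires Model Runner to be running with the model pulled:
# with urllib.request.urlopen(req) as resp:
#     print(json.load(resp)["choices"][0]["message"]["content"])
```

Because the API is OpenAI-compatible, existing OpenAI SDK clients can also point at the same base URL instead of hand-rolling requests.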

🌐 Upcoming Events

  • Cloud-Native AI and MCP Day: The Developer Experience Revolution

Transform Your Development Workflow with Context-Aware Infrastructure

For the first time in years, developers are saying work feels fun again. Join us for a groundbreaking event where Model Context Protocol (MCP) meets cloud-native infrastructure to revolutionize how we build, deploy, and manage applications.

Register for this event

📬 Newsletter Info

Collabnix AI Weekly is curated by the Collabnix AI Community, bringing you the latest developments in Model Context Protocol and AI agent technologies.

🔔 Coming Next Week

We'll be covering:

  • New MCP Servers for DevOps
  • Tutorial on Claude Desktop + GitHub MCP Server
  • Enterprise case studies from early adopters
  • New security best practices and guidelines
  • Community spotlight on innovative MCP projects
