Chat with your Kubernetes cluster using natural language!
This project connects OpenWebUI to Kubernetes, letting you manage your cluster through conversational AI. Ask questions like "What pods are running?" or "Scale my deployment to 5 replicas" and get instant results.
- TL;DR
- Configure Open-WebUI
- Troubleshooting Configuration
- Architecture Overview
- Repository Structure
- Components
- Usage
One-command setup:
./install.sh
Components installed:
- Kind (Kubernetes in Docker)
- Open Web UI
- Ollama
- MCP bridge
After successful installation, you'll see: Docker Compose startup complete!
Services available at:
- Open WebUI: http://localhost:3000
- MCP Bridge: http://localhost:9000
- Ollama API: http://localhost:11434
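To confirm the stack is up before configuring anything, you can probe each service from the host. This is a minimal smoke test assuming the default ports listed above; the /health path on the bridge is the same endpoint used later in Troubleshooting.

```bash
# Quick smoke test against the default ports (adjust if you changed docker-compose.yml)
curl -s http://localhost:11434            # Ollama replies "Ollama is running"
curl -s http://localhost:9000/health      # MCP Bridge health endpoint
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:3000   # Open WebUI should return 200
```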
Configure Open WebUI to use the MCP tools
After running the installation script, you need to configure Open-WebUI to use the Kubernetes tools:
- Open your browser and go to http://localhost:3000
- Create an admin account on first visit
- Sign in with your new account
- Click on your profile icon (top right)
- Go to Settings
- Navigate to Admin Panel → Tools
- Click "+ Add Tool Server"
- Enter the following details:
  - Name: Kubernetes Tools
  - URL: http://mcpo:9000
  - Description: Kubernetes management via kubectl and helm
- Click "Add"
Configure your AI model connections - use the local Ollama or add remote connections:
- In Settings, go to Admin Panel → Connections
- For Local Ollama (recommended):
  - The local Ollama server should be automatically detected
  - Verify the connection to http://ollama:11434 (a quick host-side check is sketched below)
- For Remote Connections (optional):
  - Add OpenAI, Anthropic, or other model providers
  - Configure API keys and endpoints as needed
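If the local connection does not show up automatically, check Ollama from the host and make sure a model is present. The /api/tags endpoint is part of Ollama's standard API; the container name `ollama` is taken from the network diagram below and may differ in your setup.

```bash
# List the models Ollama currently has available
curl -s http://localhost:11434/api/tags
# Pull the model used by this stack if it is missing
# (container name "ollama" assumed from docker-compose)
docker exec ollama ollama pull llama3.2
```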
- Go back to the chat interface
- Look for the tools icon in the chat input area
- You should see available tools like:
  - kubectl_get - List Kubernetes resources
  - kubectl_apply - Apply manifests
  - kubectl_describe - Describe resources
  - helm_install - Install Helm charts
  - And more...
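You can also call a tool endpoint on the bridge directly to confirm it reaches the cluster, bypassing the chat UI. The endpoint path and JSON body below are illustrative assumptions; the exact shape depends on how the bridge exposes each MCP tool, so consult the generated OpenAPI spec for the real schema.

```bash
# Illustrative direct call to a bridge tool endpoint; the path and payload
# are assumptions -- check http://localhost:9000/openapi.json for the exact
# schema generated for kubectl_get.
curl -s -X POST http://localhost:9000/kubectl_get \
  -H "Content-Type: application/json" \
  -d '{"resource": "pods", "namespace": "kube-system"}'
```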
Try asking these questions to verify everything works (copy and paste them into the chat):
What pods are running in the kube-system namespace?
Show me all services in the default namespace
List all deployments across all namespaces
Create a new namespace called production
Install Jenkins using Helm in the jenkins namespace
If tools don't appear:
- Check that the MCP Bridge is running: curl http://localhost:9000/health
- Verify the tool server URL is exactly: http://mcpo:9000
- Restart Open-WebUI: docker-compose restart open-webui
If you get connection errors:
- Ensure all containers are running: docker-compose ps
- Check container logs: docker-compose logs mcpo
- Verify network connectivity between containers (see the check below)
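For the last point, a quick way to test connectivity is to curl one service from inside another. The container names come from the network diagram (open-webui, mcpo, ollama), and the check assumes curl is available inside the images; adjust as needed.

```bash
# Test container-to-container reachability on the shared Docker network
# (container names taken from the diagram; assumes curl exists in the images)
docker exec open-webui curl -s http://mcpo:9000/health
docker exec open-webui curl -s http://ollama:11434
```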
Then open http://localhost:3000 and start chatting with your cluster!
Example queries:
- "What pods are running in kube-system?"
- "Show me all services in default namespace"
- "Create a new namespace called production"
- "Install Jenkins using Helm in the jenkins namespace"
This project integrates Open WebUI with Kubernetes management capabilities through a bridge architecture that connects multiple components to provide AI-powered Kubernetes operations.
graph TB
subgraph "External AI Services"
Gemini[Google Gemini<br/>External API]
end
subgraph "User Interface"
User[User]
end
subgraph "UI Layer"
OpenWebUI[Open WebUI<br/>Port: 3000]
Ollama[Ollama<br/>LLM Server<br/>Port: 11434]
end
subgraph "Bridge Layer"
MCP-Bridge[MCP-Bridge<br/>OpenAPI Bridge<br/>Port: 9000]
end
subgraph "Target Infrastructure"
K8s[Kubernetes Cluster<br/>kubectl, helm, istioctl]
end
subgraph "Configuration"
KubeConfig[kubeconfig<br/>./kube/config]
end
%% User interactions
User -->|Chat & Commands| OpenWebUI
%% AI Layer connections
OpenWebUI -->|LLM Requests| Ollama
OpenWebUI -.->|External API Calls| Gemini
OpenWebUI -->|Tool Calls<br/>OpenAPI/REST| MCP-Bridge
%% Bridge Layer to Kubernetes
MCP-Bridge -->|kubectl commands| K8s
MCP-Bridge -->|helm operations| K8s
MCP-Bridge -->|istioctl commands| K8s
%% Configuration
KubeConfig -.->|Mounted Volume| MCP-Bridge
%% Styling
classDef userLayer fill:#e1f5fe
classDef aiLayer fill:#f3e5f5
classDef externalLayer fill:#fff8e1
classDef bridgeLayer fill:#fff3e0
classDef mcpLayer fill:#e8f5e8
classDef k8sLayer fill:#fce4ec
classDef configLayer fill:#f1f8e9
class User userLayer
class OpenWebUI,Ollama aiLayer
    class Gemini externalLayer
class MCP-Bridge bridgeLayer
class K8s k8sLayer
class KubeConfig configLayer
- User Query: User asks a Kubernetes-related question in Open WebUI
- AI Processing: Ollama processes the query and determines if tools are needed
- Tool Selection: Open WebUI identifies the appropriate tool (e.g., kubectl_get)
- API Call: Open WebUI makes REST API call to MCPO bridge
- Protocol Translation: MCPO translates REST call to MCP protocol
- Response Chain: Results flow back through the same chain to the user
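As a concrete illustration of steps 3-5, a question like "What pods are running in the kube-system namespace?" ends up as an ordinary kubectl invocation on the other side of the bridge; roughly:

```bash
# The tool call Open WebUI sends (endpoint shape is an assumption):
#   POST http://mcpo:9000/kubectl_get  {"resource": "pods", "namespace": "kube-system"}
# ...is translated by the bridge into something equivalent to:
kubectl get pods -n kube-system
```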
graph LR
subgraph "Docker Network: mcp-lab-network"
subgraph "External Access"
Host[Host Machine<br/>localhost:3000<br/>localhost:9000]
end
subgraph "Internal Services"
OW[open-webui:8080]
MC[mcpo:9000]
K8S[Kubernetes:6443]
OL[ollama:11434]
end
end
Host -->|Port 3000| OW
Host -->|Port 9000| MC
OW --> OL
OW --> MC
MC --> K8S
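To verify that all services really share that network, you can inspect it from the host. The network name mcp-lab-network is taken from the diagram and may differ in your docker-compose.yml.

```bash
# List the containers attached to the shared network
# (network name taken from the diagram; check docker-compose.yml if it differs)
docker network inspect mcp-lab-network \
  --format '{{range .Containers}}{{.Name}} {{end}}'
```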
openwebui-k8s-bridge/
├── mcp-bridge/          # Bridge service code
├── scripts/             # Setup and utility scripts
├── tests/               # Test scripts and examples
├── docs/                # Documentation
├── kube/                # Kubernetes configuration
├── docker-compose.yml   # Complete stack setup
└── README.md            # Main documentation
Open WebUI
- Purpose: Web-based chat interface for AI interactions
- Port: 3000
- Role: Provides the user interface and orchestrates AI conversations with tool calling capabilities
- Key Features:
- Chat interface for natural language Kubernetes queries
- Tool server integration for external API calls
- Model management and conversation history
Ollama
- Purpose: Local LLM server
- Port: 11434
- Role: Provides the AI language model (llama3.2:latest) for understanding user queries and generating responses
- Key Features:
- Local model hosting
- Function calling capabilities
- Integration with Open WebUI
MCP Bridge (mcpo)
- Purpose: Protocol bridge between OpenAPI and MCP
- Port: 9000
- Role: Translates REST API calls from Open WebUI into MCP protocol calls
- Key Features:
- OpenAPI specification generation for Open WebUI
- REST to MCP protocol translation
- Tool parameter validation and formatting
Kubernetes Cluster
- Purpose: Target infrastructure for management operations
- Role: The actual Kubernetes cluster being managed
- Access: Through kubeconfig mounted as volume
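A quick way to confirm the mounted kubeconfig actually reaches the cluster is to use it from the host, or from inside the bridge container (the container name `mcpo` and the presence of kubectl inside it are assumptions based on the diagrams above).

```bash
# From the host, using the same kubeconfig that gets mounted into the bridge
kubectl --kubeconfig ./kube/config get nodes
# From inside the bridge container (assumes the container is named "mcpo"
# and ships with kubectl)
docker exec mcpo kubectl get nodes
```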
Once configured, you can ask natural language questions about your Kubernetes cluster:
- "What pods are running in the kube-system namespace?"
- "Show me all services in the default namespace"
- "List all deployments across all namespaces"
- "Get the logs from the nginx pod"
- "Install ArgoCD using Helm from the repo https://argoproj.github.io/argo-helm"
---
The AI will automatically use the appropriate Kubernetes tools to execute commands and provide formatted responses.
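For example, the ArgoCD request above would roughly translate into the following Helm commands (the release and namespace names here are illustrative; the assistant may choose different ones):

```bash
# Approximate Helm commands behind "Install ArgoCD using Helm from repo ..."
# (release name and namespace are illustrative)
helm repo add argo https://argoproj.github.io/argo-helm
helm repo update
helm install argocd argo/argo-cd --namespace argocd --create-namespace
```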
You can directly open this repository in Google Cloud Shell to start exploring the examples:
For questions or feedback, feel free to reach out:
- Email: [email protected]
- GitHub: https://github.com/elevy99927




