Kai - Kubernetes MCP Server

A Model Context Protocol (MCP) server for managing Kubernetes clusters through LLM clients such as Claude and Ollama.

Overview

The Kai MCP server provides a bridge between large language model (LLM) clients and your Kubernetes clusters, enabling you to interact with Kubernetes resources through natural language. The server exposes a comprehensive set of tools for managing clusters, namespaces, pods, deployments, services, and other resources.
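To use Kai from an MCP client, you register it in the client's MCP server configuration. The exact command and arguments depend on how Kai is installed; the snippet below is only a sketch using the `mcpServers` format that clients such as Claude Desktop read, with the binary path as a placeholder assumption:

```json
{
  "mcpServers": {
    "kai": {
      "command": "/path/to/kai",
      "args": []
    }
  }
}
```

Once registered, the client starts the server and discovers its Kubernetes tools automatically.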

Features

  • Pods: Create, list, get, delete, and stream logs
  • Deployments: Create, list, describe, and update
  • Services: Create, get, list, and delete
  • Cluster Management: Connect, list, switch, and monitor
  • Namespaces: Create, list, update, and delete
  • Ingress: HTTP/HTTPS routing and TLS configuration
  • ConfigMaps & Secrets: Configuration and secret management
  • Jobs & CronJobs: Batch workload orchestration
  • Nodes: Node monitoring, cordoning, and draining
  • Utilities: Port forwarding, events, and API exploration
  • Persistent Volumes: Storage management and claims
  • RBAC: Role-based access control
  • Custom Resources: CRD and custom resource operations
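Under MCP, each capability above is exposed as a tool whose inputs are described by a JSON Schema, which the LLM uses to decide when and how to call it. As an illustration only (the tool name and fields below are assumptions, not Kai's actual definitions), a pod-listing tool might be declared like this:

```python
# Hypothetical MCP tool definition for listing pods.
# The name and schema are illustrative assumptions, not Kai's real API.
list_pods_tool = {
    "name": "list_pods",
    "description": "List pods in a namespace of the connected cluster.",
    "inputSchema": {
        "type": "object",
        "properties": {
            "namespace": {
                "type": "string",
                "description": "Namespace to list pods from.",
                "default": "default",
            },
        },
        "required": [],
    },
}

# An LLM client that decides to call this tool sends arguments matching
# the schema, e.g. {"namespace": "kube-system"}.
```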

Requirements

By default, the server connects to the current context in your kubeconfig. Make sure kubectl is configured with access to a Kubernetes cluster before starting the server.
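kubectl resolves its configuration from the `KUBECONFIG` environment variable, falling back to `~/.kube/config`. A minimal sketch of that resolution (single-path case only; colon-separated `KUBECONFIG` lists are not handled here):

```python
import os


def kubeconfig_path() -> str:
    """Return the kubeconfig file kubectl would use by default.

    Sketch only: honors a single-path KUBECONFIG, else ~/.kube/config.
    """
    return os.environ.get("KUBECONFIG") or os.path.join(
        os.path.expanduser("~"), ".kube", "config"
    )
```

If this file is missing or points at an unreachable cluster, the server's cluster tools will fail in the same way plain kubectl commands would.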