watsonx-mcp-server
This document is a comprehensive guide to building a Watsonx.ai chatbot server with the Model Context Protocol (MCP) in Python.
The tutorial walks through creating a professional, production-ready chatbot server powered by IBM Watsonx.ai and exposed via the MCP Python SDK. The server is reusable and can be invoked by any MCP-compatible client, such as Claude Desktop or a custom Python client. The guide covers setting up the environment, installing dependencies, writing clean Python code, exposing Watsonx.ai inference as an MCP tool, and running and testing the server; it also offers tips for extending and hardening the service. Combining IBM Watsonx.ai with MCP brings modularity, reusability, and rapid iteration, making it well suited to scalable, adaptable chatbot solutions.
Features
- Modularity: Decouple chatbot logic from client implementations.
- Reusability: Any MCP-compatible client can call the same `chat` tool.
- Rapid iteration: Built-in development inspector with live reloading.
- Secure credentials management using environment variables.
- Integration with IBM Watsonx.ai for LLM inference.
MCP Tools
- chat: Generates a chatbot response via Watsonx.ai.
MCP Resources
- greeting://patient/{name}: Returns a personalized greeting for the given patient name.
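The tutorial's `server.py` itself is not reproduced on this page. Below is a minimal sketch of what it might look like, assuming the FastMCP API from the MCP Python SDK and the `ibm-watsonx-ai` package; the model ID, environment variable names, and prompt wording are illustrative assumptions, not the tutorial's exact code. The sketch implements the `chat` tool and `greeting://patient/{name}` resource listed above, plus the `assess_symptoms` prompt that the Flask example below retrieves via `get_prompt`.

```python
# server.py (illustrative sketch; model ID, env var names, and prompt
# wording are assumptions, not the tutorial's exact code)
import os

from dotenv import load_dotenv
from ibm_watsonx_ai import Credentials
from ibm_watsonx_ai.foundation_models import ModelInference
from mcp.server.fastmcp import FastMCP

load_dotenv()  # read WATSONX_* values from a local .env file

# Watsonx.ai inference client
model = ModelInference(
    model_id=os.environ.get("WATSONX_MODEL_ID", "ibm/granite-13b-instruct-v2"),
    credentials=Credentials(
        api_key=os.environ["WATSONX_APIKEY"],
        url=os.environ.get("WATSONX_URL", "https://us-south.ml.cloud.ibm.com"),
    ),
    project_id=os.environ["WATSONX_PROJECT_ID"],
)

mcp = FastMCP("Watsonx Chatbot")

@mcp.tool()
def chat(query: str) -> str:
    """Generate a chatbot response via Watsonx.ai."""
    return model.generate_text(prompt=query)

@mcp.resource("greeting://patient/{name}")
def greeting(name: str) -> str:
    """Return a personalized greeting for the given patient name."""
    return f"Hello {name}, welcome! Please describe your symptoms."

@mcp.prompt()
def assess_symptoms(symptoms: str) -> str:
    """Build a diagnosis prompt from the reported symptoms."""
    return (
        f"A patient reports the following symptoms: {symptoms}. "
        "Suggest possible causes and sensible next steps."
    )

if __name__ == "__main__":
    mcp.run()  # defaults to the stdio transport
```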
Usage with Different Platforms
MCP Dev
```bash
mcp dev server.py
```
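The `mcp dev` command launches the server under the MCP Inspector for interactive testing with live reload. Before running it, install the dependencies; the package list below is a reasonable assumption for this tutorial (the MCP SDK with its CLI extra, the Watsonx.ai SDK, python-dotenv, and Flask for the web client), not a pinned requirements file.

```bash
pip install "mcp[cli]" ibm-watsonx-ai python-dotenv flask
```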
Python Client
```python
# client.py
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main():
    # Launch server.py as a subprocess and talk to it over stdio
    server_params = StdioServerParameters(command="python", args=["server.py"])
    async with stdio_client(server_params) as (reader, writer):
        async with ClientSession(reader, writer) as session:
            await session.initialize()
            # Call the chat tool
            user_msg = "Hello, how are you today?"
            response = await session.call_tool("chat", arguments={"query": user_msg})
            print("Bot:", response)

if __name__ == "__main__":
    asyncio.run(main())
```
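Note that `call_tool` returns a structured result rather than a bare string; as the Flask example below shows, the generated text lives in the `.text` attribute of the items in the result's `content` list.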
Flask App
```python
# chatbot.py
import os
import atexit
import asyncio

from flask import Flask, render_template, request, redirect, url_for, session
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Flask app setup
app = Flask(__name__)
app.secret_key = os.environ.get("SECRET_KEY", os.urandom(24))

# MCP server parameters
SERVER_PARAMS = StdioServerParameters(command="python", args=["server.py"], env=None)

# Dedicated asyncio loop for MCP
loop = asyncio.new_event_loop()
asyncio.set_event_loop(loop)

# Globals for client and session contexts
_stdio_ctx = None
_session_ctx = None
SESSION = None

async def _init_session():
    global _stdio_ctx, _session_ctx, SESSION
    _stdio_ctx = stdio_client(SERVER_PARAMS)
    _reader, _writer = await _stdio_ctx.__aenter__()
    _session_ctx = ClientSession(_reader, _writer)
    SESSION = await _session_ctx.__aenter__()
    await SESSION.initialize()

# Initialize once at import
loop.run_until_complete(_init_session())
app.logger.info("MCP client session initialized once.")

async def _close_session():
    if _session_ctx:
        await _session_ctx.__aexit__(None, None, None)
    if _stdio_ctx:
        await _stdio_ctx.__aexit__(None, None, None)

atexit.register(lambda: loop.run_until_complete(_close_session()))

# Helper: fetch greeting text
def fetch_greeting(name: str) -> str:
    resp = loop.run_until_complete(SESSION.read_resource(f"greeting://patient/{name}"))
    contents = getattr(resp, 'contents', None)
    if isinstance(contents, list):
        return "\n".join(c.text for c in contents).strip()
    return str(resp)

# Helper: assess symptoms via chat tool
def assess_symptoms(symptoms: str) -> str:
    prompt_resp = loop.run_until_complete(
        SESSION.get_prompt("assess_symptoms", arguments={"symptoms": symptoms})
    )
    # Extract clean text from prompt_resp.messages
    msgs = getattr(prompt_resp, 'messages', None)
    if msgs:
        lines = []
        for m in msgs:
            txt = m.content.text if hasattr(m.content, 'text') else str(m.content)
            if txt.startswith("<module"):
                txt = txt.split("\n", 1)[1]
            lines.append(txt)
        diagnosis_prompt = "\n".join(lines).strip()
    else:
        diagnosis_prompt = str(prompt_resp)
    tool_resp = loop.run_until_complete(
        SESSION.call_tool("chat", arguments={"query": diagnosis_prompt})
    )
    cont = getattr(tool_resp, 'content', None)
    if isinstance(cont, list):
        return "\n".join(c.text for c in cont).strip()
    return str(cont).strip()

# Flask routes
@app.route("/", methods=["GET", "POST"])
def home():
    if request.method == "POST":
        session['name'] = request.form['name']
        return redirect(url_for('symptoms'))
    return render_template("home.html")

@app.route("/symptoms", methods=["GET", "POST"])
def symptoms():
    name = session.get('name')
    if not name:
        return redirect(url_for('home'))
    if request.method == "POST":
        diag = assess_symptoms(request.form['symptoms'])
        return render_template("diagnosis.html", diagnosis=diag)
    greet = fetch_greeting(name)
    return render_template("symptoms.html", greeting=greet)

if __name__ == "__main__":
    app.run(debug=True)
```
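To try the app, run `python chatbot.py`; the Flask process spawns `server.py` itself over stdio, so only one command is needed. With Flask's defaults the app is served at http://127.0.0.1:5000. The `home.html`, `symptoms.html`, and `diagnosis.html` templates are assumed to live in a `templates/` directory next to `chatbot.py`.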
Frequently Asked Questions
What is the Model Context Protocol (MCP)?
MCP is a protocol that standardizes how applications expose tools, resources, and prompts to LLM clients, enabling modularity and reusability.
How do I secure my credentials?
Store your credentials in a `.env` file and use the `python-dotenv` package to load them securely. Never commit this file to source control.
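For illustration, a `.env` file for this server might look like the following; the variable names match those assumed in the `server.py` sketch above and may differ from the actual tutorial's.

```bash
# .env — keep out of source control (add to .gitignore)
WATSONX_APIKEY=your-ibm-cloud-api-key
WATSONX_URL=https://us-south.ml.cloud.ibm.com
WATSONX_PROJECT_ID=your-watsonx-project-id
WATSONX_MODEL_ID=ibm/granite-13b-instruct-v2
```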
What are the benefits of using Watsonx.ai with MCP?
Combining Watsonx.ai with MCP provides modularity, reusability, and rapid iteration capabilities, making it suitable for scalable and adaptable chatbot solutions.
Related MCP Servers
- Sequential Thinking (by modelcontextprotocol): An MCP server implementation that provides a tool for dynamic and reflective problem-solving through a structured thinking process.
- exa-mcp-server (by exa-labs): A Model Context Protocol (MCP) server that allows AI assistants to use the Exa AI Search API for real-time web searches in a secure manner.
- repomix (by yamadashy): Repomix is a tool that packages your entire codebase into a single, AI-friendly file, making it easier to use with AI tools like LLMs.
- claude-task-master (by eyaltoledano): Task Master is a task management system for AI-driven development with Claude, designed to work seamlessly with Cursor AI.
- blender-mcp (by ahujasid): BlenderMCP connects Blender to Claude AI through the Model Context Protocol (MCP), enabling prompt-assisted 3D modeling, scene creation, and manipulation.
- mcp-server-calculator (by githejie): A Model Context Protocol server for calculating. This server enables LLMs to use a calculator for precise numerical calculations.
- Cua Agent (by trycua): cua-mcp-server is a Model Context Protocol (MCP) server for the Computer-Use Agent (CUA), enabling integration with Claude Desktop and other MCP clients.