MCP TypeScript Server Starter
A starter project for building Model Context Protocol (MCP) servers in TypeScript. This project provides a simple echo server implementation that demonstrates the core features of MCP.
Quick Start Checklists
📥 Installation
- Clone the repository:
git clone https://github.com/ralf-boltshauser/mcp-typescript-server-starter.git
cd mcp-typescript-server-starter
- Install dependencies:
pnpm install
🛠️ Local Development
- Start the development server:
pnpm dev
- Access the inspector at http://localhost:6274
- Test your MCP server:
- Click on "Connect" in the inspector
- Navigate to "Tools" section
- Click "List Tools"
- Select "echo" tool
- Write a test message
- Click "Submit"
- Open src/index.ts to add your own:
  - Tools (functions your AI can call)
  - Resources (data your AI can access)
  - Prompts (templates for AI interactions)
- Update src/index.html with your server's description and documentation
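After clicking "Submit" in the inspector, you should see the echo tool's result. Based on the echo implementation in this starter, a test message of "hello" produces a result of this shape:

```json
{
  "content": [
    { "type": "text", "text": "Tool echo: hello" }
  ]
}
```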
🚀 Deployment (Coolify Example)
- Set up on Coolify:
- Connect your repository
- In advanced settings:
- Disable GZIP compression (required for SSE)
- Configure domain:
  - Add your domain as: https://subdomain.yourdomain.com:3001
  - The :3001 suffix is crucial: it tells Traefik to bind to your internal port
- Verify deployment:
  - Visit subdomain.yourdomain.com to see your index.html
  - Test the SSE connection at https://subdomain.yourdomain.com/sse
🔌 Connecting to Your Deployed Server
Use this command to connect to your server:
npx -y mcp-remote https://subdomain.yourdomain.com/sse
Example configuration for Cursor/Claude Desktop:
{
  "mcpServers": {
    "your-server-name": {
      "command": "npx",
      "args": ["-y", "mcp-remote", "https://subdomain.yourdomain.com/sse"]
    }
  }
}
Features
- Simple echo server implementation
- Support for tools, resources, and prompts
- TypeScript support
- Development server with hot reloading
- Built-in inspector for testing and debugging
- Support for both STDIO and SSE communication modes
Prerequisites
- Node.js (v16 or later)
- pnpm (recommended) or npm
Usage Modes
This server supports two main communication modes:
- STDIO Mode
  - Ideal for local development and basic testing
  - Direct process communication
  - Used by most MCP clients by default
  - Perfect for running servers locally
  - Simple to set up and use
- SSE Mode
  - Better for production deployments
  - HTTP/SSE communication
  - Can be converted to STDIO using npm packages (covered later)
  - Enables remote access to your server
  - More scalable and production-ready
Choose STDIO for local development and SSE when you need to deploy your server for remote access.
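Under the hood, both modes carry the same JSON-RPC 2.0 messages; only the transport framing differs. The SDK's transports handle this for you, but a minimal sketch (independent of the MCP SDK, mirroring a tools/call request to this starter's echo tool) makes the difference concrete:

```typescript
// Both modes exchange identical JSON-RPC 2.0 messages; only the framing differs.
interface JsonRpcRequest {
  jsonrpc: "2.0";
  id: number;
  method: string;
  params?: unknown;
}

// A request to invoke the starter's echo tool.
const callEcho: JsonRpcRequest = {
  jsonrpc: "2.0",
  id: 1,
  method: "tools/call",
  params: { name: "echo", arguments: { message: "hello" } },
};

// STDIO mode: the client writes one JSON message per line to the server's stdin.
const stdioFrame = JSON.stringify(callEcho) + "\n";

// SSE mode: the server streams messages back as Server-Sent Events frames.
const sseFrame = `data: ${JSON.stringify(callEcho)}\n\n`;

console.log(stdioFrame);
console.log(sseFrame);
```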
STDIO Mode (Direct Process Communication)
This mode is ideal for direct integration with tools like Cursor or Claude Desktop.
- Configure the Server
  - In src/index.ts:
    - Comment out the Express/SSE code (at the bottom)
    - Uncomment the STDIO code (above it)
- Build and Run
  pnpm build
  node dist/index.cjs
or
pnpm dev # starts the server and the inspector
- Integration with Claude Desktop
  pnpm add-claude
  ⚠️ Note: This will overwrite your existing Claude Desktop configuration.
  This way of configuring Claude Desktop is standard; the generated JSON can also be used in Cursor and other MCP clients.
- Manual Integration
  For other tools, use the command:
  node /path/to/your/project/dist/index.cjs
or
pnpm cmd # prints the full node .../dist/index.cjs command using your current working directory
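For reference, both the pnpm add-claude script and the manual command boil down to a STDIO entry of roughly this shape in the client's MCP configuration (the server name and path here are illustrative):

```json
{
  "mcpServers": {
    "mcp-typescript-server-starter": {
      "command": "node",
      "args": ["/path/to/your/project/dist/index.cjs"]
    }
  }
}
```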
SSE Mode (HTTP/SSE Communication)
This mode is ideal for web-based tools and remote deployments.
- Configure the Server
  - In src/index.ts:
    - Keep the Express/SSE code enabled (at the bottom)
    - Comment out the STDIO code (above it)
- Local Development
  pnpm dev
The server will be available at:
- Main endpoint: http://127.0.0.1:3001
- SSE endpoint: http://127.0.0.1:3001/sse
- Test endpoint: http://127.0.0.1:3001/test
- Inspector: http://127.0.0.1:6274
- Local Docker Testing
  The docker-compose override file is needed to actually expose the ports. When deploying to a platform like Coolify, you don't want it, because Traefik handles the routing there.
  docker compose -f docker-compose.yaml -f docker-compose.local.yaml up
- Production Deployment (e.g., Coolify)
  - Ask your IDE to update src/index.html to match your server's description
- Deploy the server to your preferred platform
- Important: In Coolify's advanced settings:
- Disable GZIP compression (this kills the SSE stream)
- Ensure port 3001 is properly exposed: when setting a domain, use https://your-domain.com:3001 so Traefik binds to port 3001
- Configure the server to listen on all interfaces (0.0.0.0) (already done)
- Using the Remote Server
  Once deployed, you can connect to the server using:
  npx -y mcp-remote https://your-domain.com/sse
  Paste this in as the command, replacing the "node .../dist/index.cjs" command from the STDIO setup.
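The "listen on all interfaces" requirement above can be illustrated with Node's built-in http module (the starter itself uses Express, but the binding rule is the same; port 0 is used here only so the sketch never collides with a running server):

```typescript
import { createServer } from "node:http";

// Sketch: why the server must bind 0.0.0.0 behind a reverse proxy.
const server = createServer((_req, res) => {
  res.end("ok");
});

// "0.0.0.0" accepts connections on every network interface, which a
// containerized deployment behind Traefik requires; "127.0.0.1" would only
// be reachable from inside the container itself. The starter listens on
// port 3001; port 0 here lets the OS pick a free port for this demo.
server.listen(0, "0.0.0.0", () => {
  console.log("listening on", server.address());
  server.close(); // demo only: shut down immediately after binding
});
```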
Project Structure
- src/index.ts - Main server implementation
- src/low-level-index.ts - Alternative implementation using the low-level API
- dist/ - Compiled output directory
Server Features
Echo Tool
A simple tool that echoes back the input message:
server.tool("echo", { message: z.string() }, async ({ message }) => ({
content: [{ type: "text", text: `Tool echo: ${message}` }],
}));
Echo Resource
A resource that can be accessed via URI:
server.resource(
"echo",
new ResourceTemplate("echo://{message}", { list: undefined }),
async (uri, { message }) => ({
contents: [
{
uri: uri.href,
text: `Resource echo: ${message}`,
},
],
})
);
Echo Prompt
A prompt template for processing messages:
server.prompt("echo", { message: z.string() }, ({ message }) => ({
messages: [
{
role: "user",
content: {
type: "text",
text: `Please process this message: ${message}`,
},
},
],
}));
Implementation Recommendations
Debug Messages
Debug messages can be sent using the server.server.sendLoggingMessage
method to provide visibility into server operations.
Basic Usage
server.server.sendLoggingMessage({
level: "info",
data: "Starting server...",
});
This allows you to:
- Track server operations in real-time
- Debug issues during development
- Monitor server state in production
You can see them in the inspector on the bottom right!
Environment Variables
For server-side environment variables (developer-provided, not user-specific):
- Using Docker Compose
  # docker-compose.yaml
  services:
    mcp-server:
      environment:
        - API_KEY=${API_KEY}
        - DATABASE_URL=${DATABASE_URL}
  This allows you to:
  - Set variables in your shell: export API_KEY=your-key
  - Use a .env file that Docker Compose will automatically load
- Accessing in Code
  const apiKey = process.env.API_KEY;
  const dbUrl = process.env.DATABASE_URL;
- Local Development
  - Create a .env file in your project root: API_KEY=sk-123
  - Add .env to .gitignore to keep secrets secure
  - Run the development server with environment variables: pnpm dev
- Production Deployment
  - Set environment variables in your deployment platform (e.g., Coolify)
  - Never commit sensitive values to version control
Best Practices
- Error Handling
  - Always implement proper error handling for environment variables
  - Provide meaningful error messages for missing required variables
- Type Safety
  - Use TypeScript to define environment variable types
  - Consider using a validation library like zod for runtime checks
- Security
  - Never expose sensitive environment variables to the client
  - Use different sets of variables for development and production
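The error-handling and type-safety advice above can be combined into a small fail-fast helper. This is a sketch using only Node built-ins; requireEnv and the fallback value are illustrative, not part of the starter:

```typescript
// Sketch of fail-fast environment variable handling.
// requireEnv and the fallback below are illustrative, not part of the starter.
function requireEnv(name: string, fallback?: string): string {
  const value = process.env[name] ?? fallback;
  if (value === undefined) {
    // Fail fast with a meaningful message instead of passing undefined around.
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

process.env.API_KEY = "sk-123"; // simulate a value loaded from .env
const apiKey = requireEnv("API_KEY");
const dbUrl = requireEnv("DATABASE_URL", "postgres://localhost:5432/dev");
console.log(apiKey, dbUrl);
```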
License