ragdoll-mcp-server

Version 3.3

A Model Context Protocol (MCP) server for Ragdoll AI knowledge base queries.

The Ragdoll AI MCP Server provides a streamlined interface for querying Ragdoll AI knowledge bases over the Model Context Protocol, making it easy to integrate with LLM clients such as Cursor, Windsurf, and Cline. It requires the Bun runtime, a Ragdoll AI API key, and a knowledge base ID. To get started, clone the repository, install its dependencies, and configure the environment variables; the server can then be run locally or invoked via NPX. Queries accept parameters such as 'query', 'topK', and 'rerank' to customize search results.
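For local runs, the two required settings can be supplied as environment variables. A minimal sketch as a `.env` file (the file name and the placeholder values are illustrative; substitute your own credentials):

```
# Ragdoll AI credentials — replace the placeholders with real values
RAGDOLL_API_KEY=your-ragdoll-api-key
RAGDOLL_KNOWLEDGE_BASE_ID=your-knowledge-base-id
```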

Features

  • Seamless integration with LLM clients like Cursor, Windsurf, and Cline.
  • Supports querying with customizable parameters such as 'query', 'topK', and 'rerank'.
  • Easy setup and configuration with environment variables.
  • Can be run locally or accessed via NPX for flexibility.
  • Utilizes the Model Context Protocol for efficient knowledge base queries.
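As a sketch of what a query looks like on the wire, an MCP client sends a JSON-RPC `tools/call` request to the server. The tool name (`query` below) and the exact argument shape are assumptions, but the parameters match those listed above:

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "query",
    "arguments": {
      "query": "How do I rotate my API key?",
      "topK": 5,
      "rerank": true
    }
  }
}
```

Here 'topK' bounds the number of results returned and 'rerank' asks the server to re-order them by relevance before responding.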

Usage with Different Platforms

Cursor


{
  "mcpServers": {
    "ragdoll-mcp-server": {
      "command": "npx",
      "args": ["-y", "ragdoll-mcp-server"],
      "env": {
        "RAGDOLL_API_KEY": "your-ragdoll-api-key",
        "RAGDOLL_KNOWLEDGE_BASE_ID": "your-knowledge-base-id"
      }
    }
  }
}

Windsurf


{
  "mcpServers": {
    "ragdoll-mcp-server": {
      "command": "npx",
      "args": ["-y", "ragdoll-mcp-server"],
      "env": {
        "RAGDOLL_API_KEY": "your-ragdoll-api-key",
        "RAGDOLL_KNOWLEDGE_BASE_ID": "your-knowledge-base-id"
      }
    }
  }
}

Cline


{
  "mcpServers": {
    "ragdoll-mcp-server": {
      "command": "npx",
      "args": ["-y", "ragdoll-mcp-server"],
      "env": {
        "RAGDOLL_API_KEY": "your-ragdoll-api-key",
        "RAGDOLL_KNOWLEDGE_BASE_ID": "your-knowledge-base-id"
      }
    }
  }
}