SushiMCP
SushiMCP is a model context protocol server designed to help developers deliver context to their AI IDEs. It is simple to use and significantly improves the code-generation performance of both base and premium LLMs. The easiest way to get started is to register SushiMCP with your client using the default configuration:
Registering SushiMCP with an MCP Client
{
  "sushimcp": {
    "command": "npx",
    "args": [
      "-y",
      "@chriswhiterocks/sushimcp@latest",
      "--llms-txt-source",
      "cool_project:https://coolproject.dev/llms-full.txt",
      "--openapi-spec-source",
      "local_api:http://localhost:8787/api/v1/openapi.json"
    ]
  }
}
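If you want to confirm the server starts before wiring it into your IDE, the command the client runs can also be invoked directly from a terminal. This is only a sketch of what the configuration above executes: the source names (cool_project, local_api) and URLs are the placeholders from the example, and the server typically waits for an MCP client over stdio rather than printing output on its own.

# Manual equivalent of the client configuration above (sketch):
npx -y @chriswhiterocks/sushimcp@latest \
  --llms-txt-source cool_project:https://coolproject.dev/llms-full.txt \
  --openapi-spec-source local_api:http://localhost:8787/api/v1/openapi.json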
Advanced Configuration & Deeper Learning
Visit the SushiMCP Docs for advanced configuration options and a deeper look at how SushiMCP works.
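For illustration only, a multi-source setup might look like the sketch below. It assumes the --llms-txt-source flag can be repeated, one name:url pair per flag, and the second source is hypothetical; check the docs above for the syntax SushiMCP actually supports.

# Hypothetical multi-source invocation (assumes repeated flags are supported):
npx -y @chriswhiterocks/sushimcp@latest \
  --llms-txt-source cool_project:https://coolproject.dev/llms-full.txt \
  --llms-txt-source another_project:https://anotherproject.dev/llms-full.txt \
  --openapi-spec-source local_api:http://localhost:8787/api/v1/openapi.json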
Author
Chris White: GitHub | Discord | Personal Site | X | LinkedIn | Five9 Cyber
License
This project is licensed under the AGPL-3.0-or-later. See the license.txt
file for details.