mcp-llama-github-integration
This project integrates a Model Context Protocol server with a locally running Llama model and GitHub, letting users forward queries to the model and list repository files. It is built with FastAPI and ships with a Python client for demonstration.
Overview
This project is a Model Context Protocol (MCP) server that integrates a locally running Llama model with GitHub repository file listing capabilities. It consists of:
- MCP Server: A FastAPI-based server that implements the Model Context Protocol, forwards queries to a local Llama model, and lists files from GitHub repositories.
- Python Client: A sample client application to interact with the MCP server.
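A minimal sketch of what such a client might look like. The endpoint paths (`/query`, `/list-files`), the payload fields, and the default model name are assumptions for illustration; check the server code for the actual routes and schema.

```python
# Sketch of a Python client for the MCP server. Endpoint paths and
# payload fields below are hypothetical, not confirmed by the repo.
import json
import urllib.request

SERVER_URL = "http://localhost:8000"  # default MCP server address


def build_query_payload(prompt: str, model: str = "llama3") -> dict:
    """Build the JSON body for a context query (hypothetical schema)."""
    return {"model": model, "prompt": prompt}


def post_json(path: str, payload: dict) -> dict:
    """POST `payload` as JSON to the MCP server and decode the reply."""
    req = urllib.request.Request(
        SERVER_URL + path,
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Forward a query to the local Llama model through the server.
    print(post_json("/query", build_query_payload("Summarize this repo")))
    # List files in a GitHub repository (owner/repo are placeholders).
    print(post_json("/list-files", {"owner": "octocat", "repo": "hello-world"}))
```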
Prerequisites
- Python 3.7 or higher
- A running Llama model server at http://localhost:11434/ (the default Ollama address)
- Internet connection for GitHub API access
- Git and GitHub account
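Before starting the server, it can help to confirm the two external dependencies are reachable. The following preflight check is a small sketch (not part of the project) using only the standard library:

```python
# Preflight check (a sketch): verify that the local Llama server and the
# GitHub API answer HTTP requests before starting the MCP server.
import urllib.error
import urllib.request


def is_reachable(url: str, timeout: float = 3.0) -> bool:
    """Return True if an HTTP request to `url` gets any response at all."""
    try:
        urllib.request.urlopen(url, timeout=timeout)
        return True
    except urllib.error.HTTPError:
        return True  # the server answered, even if with an error status
    except (urllib.error.URLError, OSError):
        return False  # connection refused, DNS failure, timeout, etc.


if __name__ == "__main__":
    for name, url in [
        ("Llama server", "http://localhost:11434/"),
        ("GitHub API", "https://api.github.com/"),
    ]:
        status = "ok" if is_reachable(url) else "NOT reachable"
        print(f"{name}: {status}")
```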
Features
- Context queries via local Llama model
- GitHub repository file listings
- Customizable Llama model settings
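For the customizable model settings, a request might carry per-query generation options. The sketch below uses option names (`temperature`, `num_predict`) that mirror common Ollama parameters; the exact fields this server accepts may differ.

```python
# Sketch: attaching Llama generation settings to a query payload.
# Field names are illustrative and borrowed from common Ollama options.
def query_with_options(prompt: str, model: str = "llama3",
                       temperature: float = 0.7,
                       num_predict: int = 256) -> dict:
    """Return a query body carrying per-request model settings."""
    return {
        "model": model,
        "prompt": prompt,
        "options": {
            "temperature": temperature,  # sampling randomness
            "num_predict": num_predict,  # max tokens to generate
        },
    }
```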
Running the MCP Server
Start the server; it listens on http://localhost:8000. Then use the Python client to interact with it.
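A possible way to launch the server, assuming the FastAPI app object is named `app` in `main.py` and the sample client lives in `client.py` (both names are assumptions; adjust to the repository's actual layout):

```shell
# Install the server dependencies and start it with uvicorn.
# Module and file names below are assumptions, not confirmed by the repo.
pip install fastapi uvicorn
uvicorn main:app --host 127.0.0.1 --port 8000

# In another terminal, exercise the server with the sample client:
python client.py
```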