mcp-server-demo

Version 3.2

This document provides a comprehensive guide on setting up and using a Model Context Protocol (MCP) server with LibreChat and Ollama.

The Model Context Protocol (MCP) server is designed to facilitate seamless communication between different AI models and applications. By integrating with LibreChat, users can leverage the power of various AI models to perform tasks such as fetching IP addresses. The setup involves configuring an IP server, a local MongoDB server, and the LibreChat application. Additionally, the Ollama model provider is used to enhance the capabilities of the chat application. The configuration is done through a YAML file, which specifies the server details, endpoints, and models to be used. Once set up, users can interact with the LibreChat UI to create agents, add tools, and execute queries to retrieve IP information.

Features

  • Seamless integration with LibreChat for enhanced AI model interaction.
  • Support for multiple AI models through the Ollama model provider.
  • Customizable server and endpoint configurations via YAML.
  • Tool addition for specific tasks like fetching IP addresses.
  • User-friendly interface for creating and managing chat agents.

MCP Tools

  • get-external-ip: Fetches the external IP address of the user.
  • get-local-ip-v6: Retrieves the local IPv6 address.
  • get-external-ip-v6: Fetches the external IPv6 address.
  • get-local-ip: Retrieves the local IP address.
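As a rough illustration of what a tool like get-local-ip might do under the hood, the sketch below determines the machine's outbound IPv4 address using only the Python standard library. The function name and fallback behavior are assumptions for illustration, not the server's actual implementation.

```python
import socket

def get_local_ip() -> str:
    """Best-effort local IPv4 address; falls back to loopback."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        # Connecting a UDP socket selects the outbound interface without
        # sending any packets; 192.0.2.1 is a reserved TEST-NET address.
        s.connect(("192.0.2.1", 80))
        return s.getsockname()[0]
    except OSError:
        # No route available (e.g. offline); report loopback instead.
        return "127.0.0.1"
    finally:
        s.close()

print(get_local_ip())
```

The external-IP tools would instead need to query an outside service, since the external address is only visible from beyond the local network.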

Usage with Different Platforms

LibreChat

```yaml
mcpServers:
  ipServer:
    url: http://localhost:3000/sse
    timeout: 60000
endpoints:
  custom:
    - name: "Ollama"
      apiKey: "ollama"
      baseURL: "http://localhost:11434/v1/chat/completions"
      models:
        default:
          [
            "qwen2.5:3b-instruct-q4_K_M",
            "mistral:7b-instruct-q4_K_M",
            "gemma:7b-instruct-q4_K_M",
          ]
        fetch: true
      titleConvo: true
      titleModel: "current_model"
      summarize: false
      summaryModel: "current_model"
      forcePrompt: false
      modelDisplayLabel: "Ollama"
```
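The baseURL above points at Ollama's OpenAI-compatible chat-completions endpoint. For reference, the sketch below builds the kind of JSON request body such an endpoint accepts, using a model name from the `default` list; the prompt text is hypothetical.

```python
import json

# Illustrative request body for an OpenAI-compatible chat-completions
# endpoint such as the configured Ollama baseURL. The model name is taken
# from the `default` list in the YAML config above.
payload = {
    "model": "qwen2.5:3b-instruct-q4_K_M",
    "messages": [{"role": "user", "content": "What is my external IP?"}],
    "stream": False,
}
print(json.dumps(payload, indent=2))
```

In practice LibreChat constructs and sends this request itself; the fragment only shows the shape of the payload for debugging with tools like curl.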