
mcp-memgraph

Memgraph MCP Server - includes a tool to run a query against Memgraph and a schema resource.

Publisher: mcp-memgraph
Submitted date: 4/13/2025

🧠 Unleash the Power of Graph Data with Memgraph MCP Server πŸš€

A streamlined, open-source implementation of the Model Context Protocol (MCP) designed to seamlessly bridge the gap between Memgraph and Large Language Models (LLMs). Empower your AI applications with the rich contextual understanding that only graph data can provide.

Memgraph MCP Server Architecture

⚑️ Get Started in Minutes

πŸ“Ή Watch the Quick Start Video: Memgraph MCP Server Quick Start

1. Launch the Memgraph MCP Server

  1. Install uv: A fast and modern Python package installer. Follow the installation guide: uv Installation
  2. Create and Activate a Virtual Environment:
    uv venv
    .venv\Scripts\activate      # Windows
    source .venv/bin/activate   # macOS/Linux
  3. Install Dependencies:
    uv add "mcp[cli]" httpx
  4. Start the Server:
    uv run server.py

2. Configure Your MCP Client (e.g., Claude)

  1. Install Claude for Desktop: Download from Claude.ai.

  2. Modify the Claude Configuration: Add the Memgraph server details to the claude_desktop_config.json file.

    Locate the Configuration File:

    • macOS: ~/Library/Application\ Support/Claude/claude_desktop_config.json
    • Windows: $env:AppData\Claude\claude_desktop_config.json

    Example Configuration:

    {
      "mcpServers": {
        "mpc-memgraph": {
          "command": "/Users/katelatte/.local/bin/uv",
          "args": [
            "--directory",
            "/Users/katelatte/projects/mcp-memgraph",
            "run",
            "server.py"
          ]
        }
      }
    }

    Important Notes:

    • Ensure you provide the absolute path to the uv executable in the command field. Use which uv (macOS/Linux) or where uv (Windows) to find the correct path.
    • The --directory argument should point to the directory containing your server.py file.
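If you prefer to generate the configuration entry programmatically, here is a minimal sketch using only the standard library. The uv path and project directory are the same placeholders as in the example above; substitute the output of which uv and your own clone location.

```python
import json

# Sketch: emit a claude_desktop_config.json entry for this server.
# Both paths below are placeholders from the example configuration --
# replace them with your own uv path and project directory.
config = {
    "mcpServers": {
        "mpc-memgraph": {
            "command": "/Users/katelatte/.local/bin/uv",
            "args": [
                "--directory",
                "/Users/katelatte/projects/mcp-memgraph",
                "run",
                "server.py",
            ],
        }
    }
}

print(json.dumps(config, indent=2))
```

Merge the printed object into your existing claude_desktop_config.json rather than overwriting it, in case other MCP servers are already configured.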

3. Interact with Memgraph Through Your LLM

  1. Start Memgraph with MAGE: Enable schema information for optimal LLM integration.
    docker run -p 7687:7687 memgraph/memgraph-mage --schema-info-enabled=True

    The --schema-info-enabled=True flag is crucial. It allows the LLM to execute SHOW SCHEMA INFO queries, providing valuable context about your graph database.

  2. Open Claude Desktop: You should now see Memgraph tools and resources listed.
  3. Start Chatting! Load sample data from Memgraph Lab Datasets to begin exploring the power of graph-enhanced AI.
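Under the hood, the MCP client invokes the server's tools as JSON-RPC 2.0 tools/call requests over stdio, per the MCP specification. A sketch of such a request as it might look for run_query; the id and the Cypher query are illustrative, not captured traffic:

```python
import json

# Illustrative JSON-RPC 2.0 request an MCP client sends over stdio
# to invoke the run_query tool (id and query values are examples).
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "run_query",
        "arguments": {"query": "MATCH (n) RETURN count(n) AS nodes"},
    },
}

print(json.dumps(request))
```

You never write these messages by hand; Claude Desktop constructs them for you, but the shape is useful when debugging with server logs.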

πŸ› οΈ Available Tools

run_query()

  • Description: Executes a Cypher query against your Memgraph database.
  • Use Case: Retrieve specific data, perform graph algorithms, and extract insights based on your LLM's instructions.
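Cypher labels cannot be bound as query parameters, so any code that assembles queries for a tool like run_query has to validate identifiers before interpolating them. A hypothetical helper illustrating the pattern; it is not part of the actual server:

```python
def build_match_query(label: str, limit: int = 10) -> str:
    """Build a read-only Cypher query (hypothetical helper).

    Cypher labels can't be passed as bound parameters, so validate
    the identifier before string interpolation to avoid injection.
    """
    if not label.isidentifier():
        raise ValueError(f"unsafe label: {label!r}")
    return f"MATCH (n:{label}) RETURN n LIMIT {int(limit)}"

print(build_match_query("Person", 5))  # MATCH (n:Person) RETURN n LIMIT 5
```

Values (as opposed to labels) should still be passed as proper Cypher parameters rather than interpolated into the query string.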

πŸ—„οΈ Available Resources

get_schema()

  • Description: Retrieves schema information from Memgraph.
  • Prerequisite: Memgraph must be started with the --schema-info-enabled=True flag.
  • Use Case: Provides the LLM with a clear understanding of your graph structure, enabling it to formulate more accurate and relevant Cypher queries.

πŸ—ΊοΈ Future Directions

The Memgraph MCP Server is rapidly evolving. Our roadmap includes:

  • TypeScript Version: A TypeScript implementation to better support JavaScript-based environments and facilitate integration with web applications.
  • Integration with AI Toolkit: Moving the project into the central Memgraph AI Toolkit repository. This will consolidate our efforts and provide a unified platform for graph-powered AI development, including integrations with LangChain, LlamaIndex, and other essential tools.

Our vision is to empower developers with a comprehensive, open-source toolkit that makes it effortless to build intelligent agents and graph-powered applications with Memgraph at the core.
