
mattermost-mcp-host

An MCP server and MCP host that provide access to Mattermost teams, channels, and messages. The MCP host runs as a bot inside Mattermost, with access to whichever MCP servers you configure.

Publisher: mattermost-mcp-host
Submitted: 4/13/2025

Unleashing AI Agents in Mattermost: A Deep Dive into the Model Context Protocol (MCP) Integration

Harness the power of Large Language Models (LLMs) within your Mattermost workspace with this robust integration, seamlessly connecting to external data sources and tools via the Model Context Protocol (MCP). This solution empowers you to build AI-driven workflows, enhance team collaboration, and automate complex tasks directly within your existing communication platform.


Interactive Demonstrations

Witness the integration in action:

1. Intelligent GitHub Issue Management in Support Channels

This demo showcases how the integration can intelligently analyze user requests in a support channel, search existing GitHub issues and pull requests, and automatically create a new issue if a relevant one doesn't already exist.

GitHub Agent Demo

2. Web Search and Channel Posting via Mattermost-MCP-Server

This demonstration illustrates the ability to leverage web search capabilities and post relevant information directly to a Mattermost channel, all orchestrated through the MCP server.

Web Search Demo

Comprehensive Demo on YouTube

For a more in-depth walkthrough, explore the full demo on YouTube (link below).

Core Capabilities

This Mattermost-MCP integration offers a comprehensive suite of features designed to maximize the utility of LLMs within your workflow:

  • LangGraph Agent Orchestration: At its heart lies a sophisticated LangGraph agent, responsible for interpreting user requests, planning actions, and orchestrating responses with precision. This ensures a coherent and efficient interaction flow.
  • Seamless MCP Server Connectivity: The integration effortlessly connects to multiple MCP servers, defined in the mcp-servers.json configuration file. This allows you to tap into a diverse ecosystem of external data sources and tools.
  • Dynamic Tool Discovery and Integration: The integration automatically discovers available tools from connected MCP servers, dynamically converting them into Langchain structured tools. This eliminates the need for manual configuration and ensures that the agent always has access to the latest capabilities.
  • Context-Aware Conversations: The integration intelligently maintains conversational context within Mattermost threads, enabling coherent and natural interactions over time. This is crucial for complex tasks that require multiple turns of conversation.
  • Intelligent Tool Selection and Chaining: The AI agent possesses the ability to strategically select and chain multiple tools to fulfill complex user requests. This allows for sophisticated workflows that go beyond simple single-tool interactions.
  • MCP Capability Exploration: Users can easily discover available servers, tools, resources, and prompts through intuitive direct commands, empowering them to understand and leverage the full potential of the integration.
  • Direct Command Interface: A command prefix (default: #) provides a direct interface for interacting with MCP servers, enabling advanced users to execute specific tools and access resources with precision.
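As a rough illustration of the dynamic tool discovery described above, a discovered tool's input schema can be wrapped as a plain Python callable that validates arguments before dispatching the call. This is a minimal sketch with hypothetical names; the actual integration converts discovered tools into Langchain structured tools.

```python
# Illustrative sketch only: wrap a discovered MCP tool as a callable.
# The real integration produces Langchain structured tools instead.

def make_tool(server_name, tool_name, input_schema, call_fn):
    """Return a callable that validates kwargs against the tool's JSON schema."""
    required = set(input_schema.get("required", []))
    properties = input_schema.get("properties", {})

    def tool(**kwargs):
        missing = required - kwargs.keys()
        if missing:
            raise ValueError(f"{tool_name}: missing arguments {sorted(missing)}")
        unknown = kwargs.keys() - properties.keys()
        if unknown:
            raise ValueError(f"{tool_name}: unknown arguments {sorted(unknown)}")
        # Delegate to the function that actually talks to the MCP server.
        return call_fn(server_name, tool_name, kwargs)

    tool.__name__ = f"{server_name}__{tool_name}"
    return tool

# Example: an "echo" tool exposed by a hypothetical server.
schema = {"properties": {"message": {"type": "string"}}, "required": ["message"]}
echo = make_tool("my-server", "echo", schema, lambda s, t, args: args["message"])
print(echo(message="Hello MCP!"))  # -> Hello MCP!
```

The validation step matters because the agent chooses tool arguments itself; rejecting malformed calls early gives the LLM a clear error message to retry with.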

Architectural Overview

The integration operates through a well-defined process:

  1. Mattermost Connection (mattermost_client.py): Establishes a connection to the Mattermost server via the API and WebSocket, enabling real-time message monitoring within a designated channel.
  2. MCP Connection Management (mcp_client.py): Creates and manages connections (primarily stdio) to each MCP server defined in src/mattermost_mcp_host/mcp-servers.json. It also handles the discovery of available tools on each server.
  3. Agent Initialization (agent/llm_agent.py): Initializes a LangGraphAgent, configuring it with the chosen LLM provider and the dynamically loaded tools from all connected MCP servers. This agent serves as the brain of the integration.
  4. Message Handling (main.py):
    • Messages prefixed with the command prefix (#) are parsed as direct commands, allowing users to list servers/tools or invoke specific tools via the corresponding MCPClient.
    • All other messages, along with their thread history, are passed to the LangGraphAgent for processing.
  5. Agent Execution: The agent analyzes the request, potentially invoking one or more MCP tools via the MCPClient instances, and formulates a response.
  6. Response Delivery: The final response, whether generated by the agent or resulting from a direct command, is posted back to the appropriate Mattermost channel or thread.
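The routing decision in step 4 can be sketched as follows. The names here are hypothetical (the real logic lives in main.py): messages starting with the command prefix go to the direct-command path, everything else to the LangGraph agent.

```python
# Minimal sketch of the message-routing step (hypothetical names;
# the actual implementation is in main.py).

COMMAND_PREFIX = "#"

def route_message(text, handle_command, handle_agent):
    """Dispatch an incoming Mattermost message to the right handler."""
    text = text.strip()
    if text.startswith(COMMAND_PREFIX):
        # e.g. "#servers" or "#my-server call echo {...}"
        return handle_command(text[len(COMMAND_PREFIX):].strip())
    # Everything else (plus thread history, in the real system) goes to the agent.
    return handle_agent(text)

print(route_message("#servers", lambda c: f"command:{c}", lambda m: f"agent:{m}"))
# -> command:servers
```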

Installation and Configuration

Follow these steps to set up the Mattermost-MCP integration:

  1. Clone the Repository:

    git clone <repository-url>
    cd mattermost-mcp-host
  2. Install Dependencies:

    • Using uv (recommended):

      # Install uv if you don't have it yet
      # curl -LsSf https://astral.sh/uv/install.sh | sh

      # Activate venv
      source .venv/bin/activate

      # Install the package with uv
      uv sync

      # To install dev dependencies
      uv sync --dev --all-extras
  3. Configure Environment (.env file):

    Copy the .env.example file and populate it with your specific configuration values, or create a .env file in the project root (or set environment variables):

    # Mattermost Details
    MATTERMOST_URL=http://your-mattermost-url
    MATTERMOST_TOKEN=your-bot-token  # Needs permissions to post, read channel, etc.
    MATTERMOST_TEAM_NAME=your-team-name
    MATTERMOST_CHANNEL_NAME=your-channel-name  # Channel for the bot to listen in
    # MATTERMOST_CHANNEL_ID=  # Optional: Auto-detected if name is provided

    # LLM Configuration (Azure OpenAI is default)
    DEFAULT_PROVIDER=azure
    AZURE_OPENAI_ENDPOINT=your-azure-endpoint
    AZURE_OPENAI_API_KEY=your-azure-api-key
    AZURE_OPENAI_DEPLOYMENT=your-deployment-name  # e.g., gpt-4o
    # AZURE_OPENAI_API_VERSION=  # Optional, defaults provided

    # Optional: Other providers (install with `[all]` extra)
    # OPENAI_API_KEY=...
    # ANTHROPIC_API_KEY=...
    # GOOGLE_API_KEY=...

    # Command Prefix
    COMMAND_PREFIX=#

    Refer to .env.example for a comprehensive list of configuration options.

  4. Configure MCP Servers:

    Modify the src/mattermost_mcp_host/mcp-servers.json file to define the MCP servers you wish to connect to. Consult src/mattermost_mcp_host/mcp-servers-example.json for a sample configuration. Ensure that any necessary dependencies (e.g., npx, uvx, docker) are installed and accessible in your system's PATH.

  5. Start the Integration:

    mattermost-mcp-host
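For step 4, an entry in mcp-servers.json might look like the sketch below, which follows the common MCP server-configuration shape. The server name, package, and token are placeholders; consult src/mattermost_mcp_host/mcp-servers-example.json for the exact schema this project expects.

```json
{
  "mcpServers": {
    "github": {
      "command": "npx",
      "args": ["-y", "@modelcontextprotocol/server-github"],
      "env": {
        "GITHUB_PERSONAL_ACCESS_TOKEN": "your-github-token"
      }
    }
  }
}
```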

System Requirements

  • Python 3.13.1+
  • uv package manager
  • Mattermost server instance
  • Mattermost Bot Account with API token
  • Access to an LLM API (Azure OpenAI is the default)

Optional Components

  • One or more MCP servers configured in mcp-servers.json
  • Tavily web search requires a TAVILY_API_KEY in the .env file

Interacting with the Integration

Once the integration is running and connected to your Mattermost instance:

  1. Direct Chat: Engage in natural conversations within the configured channel or directly with the bot. The AI agent will intelligently respond, leveraging available tools as needed. Context is maintained within message threads for seamless interactions.
  2. Direct Commands: Utilize the command prefix (default #) to execute specific actions:
    • #help: Display help information.
    • #servers: List configured and connected MCP servers.
    • #<server_name> tools: List available tools for <server_name>.
    • #<server_name> call <tool_name> <json_arguments>: Invoke <tool_name> on <server_name> with arguments provided as a JSON string.
      • Example: #my-server call echo '{"message": "Hello MCP!"}'
    • #<server_name> resources: List available resources for <server_name>.
    • #<server_name> prompts: List available prompts for <server_name>.
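A `call` command like the example above can be parsed with a few string splits plus the standard library's JSON decoder. This is an illustrative sketch with a hypothetical helper name; the actual parsing lives in main.py.

```python
# Rough sketch of parsing a direct "call" command (hypothetical helper;
# the real parser is part of main.py).
import json

def parse_command(line, prefix="#"):
    """Split '#<server> call <tool> <json_arguments>' into its parts."""
    body = line[len(prefix):].strip()
    server, _, rest = body.partition(" ")
    action, _, payload = rest.partition(" ")
    if action == "call":
        tool, _, raw_args = payload.partition(" ")
        return {"server": server, "tool": tool,
                "args": json.loads(raw_args) if raw_args else {}}
    # Bare commands like "#servers" or "#<server> tools" need separate handling.
    return {"server": server, "action": action or None}

cmd = parse_command('#my-server call echo {"message": "Hello MCP!"}')
print(cmd["args"]["message"])  # -> Hello MCP!
```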

Future Enhancements

  • ⚙️ Configurable LLM Backend: Expand support for additional AI providers (Azure OpenAI default, OpenAI, Anthropic Claude, Google Gemini) via environment variables.

Mattermost Bot Setup

  1. Create a Bot Account:

    • Navigate to Integrations > Bot Accounts > Add Bot Account.
    • Provide a name and description for the bot.
    • Store the generated access token in the .env file.
  2. Grant Required Bot Permissions:

    • post_all
    • create_post
    • read_channel
    • create_direct_channel
    • read_user
  3. Add Bot to Team/Channel:

    • Invite the bot to your team.
    • Add the bot to the desired channels where it will operate.
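To sanity-check the token from step 1 before starting the integration, you can query Mattermost's REST API endpoint GET /api/v4/users/me, which returns the account the token belongs to. The URL and token below are placeholders; this snippet uses only the standard library.

```python
# Illustrative token check against the Mattermost REST API.
import json
import urllib.request

def auth_request(base_url, token):
    """Build an authenticated request for GET /api/v4/users/me."""
    return urllib.request.Request(
        f"{base_url}/api/v4/users/me",
        headers={"Authorization": f"Bearer {token}"},
    )

def whoami(base_url, token):
    """Return the user object the token authenticates as."""
    with urllib.request.urlopen(auth_request(base_url, token)) as resp:
        return json.load(resp)

# Example (requires a reachable Mattermost server):
# info = whoami("http://your-mattermost-url", "your-bot-token")
# print(info["username"])
```

If the call returns the bot's username, the token is valid; a 401 response points to a bad or revoked token.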

Troubleshooting Guide

  1. Connection Issues:

    • Verify that the Mattermost server is running correctly.
    • Double-check the bot token permissions.
    • Ensure that the team and channel names are configured correctly.
  2. AI Provider Issues:

    • Validate the API keys for your chosen AI provider.
    • Check API quotas and limits to ensure you are not exceeding them.
    • Verify network access to the API endpoints.
  3. MCP Server Issues:

    • Examine the server logs for any errors or warnings.
    • Verify the server configurations to ensure they are correct.
    • Ensure that all required dependencies are installed and environment variables are defined.


Contributing

Contributions are welcome! Please feel free to submit pull requests.

License

This project is licensed under the MIT License - see the LICENSE file for details.
