An MCP server and MCP host that provide access to Mattermost teams, channels, and messages. The MCP host is integrated into Mattermost as a bot with access to configurable MCP servers.
Harness the power of Large Language Models (LLMs) within your Mattermost workspace with this robust integration, seamlessly connecting to external data sources and tools via the Model Context Protocol (MCP). This solution empowers you to build AI-driven workflows, enhance team collaboration, and automate complex tasks directly within your existing communication platform.
Witness the integration in action:
This demo showcases how the integration can intelligently analyze user requests in a support channel, search existing GitHub issues and pull requests, and automatically create a new issue if a relevant one doesn't already exist.
This demonstration illustrates the ability to leverage web search capabilities and post relevant information directly to a Mattermost channel, all orchestrated through the MCP server.
For a more in-depth walkthrough, explore the full demo on YouTube (link below).
This Mattermost-MCP integration offers a comprehensive suite of features designed to maximize the utility of LLMs within your workflow:
- Connects to multiple MCP servers defined in the `mcp-servers.json` configuration file, allowing you to tap into a diverse ecosystem of external data sources and tools.
- A command interface (prefixed with `#`) provides a direct way to interact with MCP servers, enabling advanced users to execute specific tools and access resources with precision.

The integration operates through a well-defined process:
- Mattermost Client (`mattermost_client.py`): Establishes a connection to the Mattermost server via the API and WebSocket, enabling real-time message monitoring within a designated channel.
- MCP Clients (`mcp_client.py`): Create and manage connections (primarily `stdio`) to each MCP server defined in `src/mattermost_mcp_host/mcp-servers.json`, and handle the discovery of available tools on each server.
- LLM Agent (`agent/llm_agent.py`): Initializes a `LangGraphAgent`, configuring it with the chosen LLM provider and the dynamically loaded tools from all connected MCP servers. This agent serves as the brain of the integration.
- Message Handling (`main.py`): Messages starting with the command prefix (`#`) are parsed as direct commands, allowing users to list servers/tools or invoke specific tools via the corresponding `MCPClient`. All other messages are passed to the `LangGraphAgent` for processing; the agent decides which tools to use, invokes them through the `MCPClient` instances, and formulates a response.

Follow these steps to set up the Mattermost-MCP integration:
Clone the Repository:

```shell
git clone <repository-url>
cd mattermost-mcp-host
```
Install Dependencies:

Using uv (recommended):

```shell
# Install uv if you don't have it yet
# curl -LsSf https://astral.sh/uv/install.sh | sh

# Activate venv
source .venv/bin/activate

# Install the package with uv
uv sync

# To install dev dependencies
uv sync --dev --all-extras
```
Configure Environment (`.env` file):

Copy the `.env.example` file and populate it with your specific configuration values, or create a `.env` file in the project root (or set environment variables):

```
# Mattermost Details
MATTERMOST_URL=http://your-mattermost-url
MATTERMOST_TOKEN=your-bot-token # Needs permissions to post, read channel, etc.
MATTERMOST_TEAM_NAME=your-team-name
MATTERMOST_CHANNEL_NAME=your-channel-name # Channel for the bot to listen in
# MATTERMOST_CHANNEL_ID= # Optional: Auto-detected if name is provided

# LLM Configuration (Azure OpenAI is default)
DEFAULT_PROVIDER=azure
AZURE_OPENAI_ENDPOINT=your-azure-endpoint
AZURE_OPENAI_API_KEY=your-azure-api-key
AZURE_OPENAI_DEPLOYMENT=your-deployment-name # e.g., gpt-4o
# AZURE_OPENAI_API_VERSION= # Optional, defaults provided

# Optional: Other providers (install with `[all]` extra)
# OPENAI_API_KEY=...
# ANTHROPIC_API_KEY=...
# GOOGLE_API_KEY=...

# Command Prefix
COMMAND_PREFIX=#
```
Refer to `.env.example` for a comprehensive list of configuration options.
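As a rough illustration of how these variables can be consumed (this is a sketch, not the project's actual code; only the variable names come from the `.env` example above, and the defaults below are assumptions):

```python
import os


def load_settings() -> dict:
    """Collect the integration's settings from environment variables.

    MATTERMOST_URL and MATTERMOST_TOKEN are treated as required here;
    the remaining keys fall back to illustrative defaults.
    """
    settings = {
        "mattermost_url": os.getenv("MATTERMOST_URL"),
        "mattermost_token": os.getenv("MATTERMOST_TOKEN"),
        "team_name": os.getenv("MATTERMOST_TEAM_NAME"),
        "channel_name": os.getenv("MATTERMOST_CHANNEL_NAME"),
        "provider": os.getenv("DEFAULT_PROVIDER", "azure"),
        "command_prefix": os.getenv("COMMAND_PREFIX", "#"),
    }
    missing = [key for key in ("mattermost_url", "mattermost_token")
               if settings[key] is None]
    if missing:
        raise RuntimeError(f"Missing required settings: {missing}")
    return settings
```

A library such as python-dotenv can load the `.env` file into the environment before this runs.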
Configure MCP Servers:

Modify the `src/mattermost_mcp_host/mcp-servers.json` file to define the MCP servers you wish to connect to. Consult `src/mattermost_mcp_host/mcp-servers-example.json` for a sample configuration. Ensure that any necessary dependencies (e.g., `npx`, `uvx`, `docker`) are installed and accessible in your system's PATH.
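For illustration only (the authoritative schema is the project's `mcp-servers-example.json`; the `fetch` server name and `uvx` launch command below are assumptions), a stdio server entry might look like the dictionary here, validated with the standard `json` module:

```python
import json

# Hypothetical mcp-servers.json content: each top-level key names a
# server, and each entry describes the command used to launch it
# over stdio.
SAMPLE_CONFIG = {
    "fetch": {
        "command": "uvx",
        "args": ["mcp-server-fetch"],
        "env": {}
    }
}


def check_config(text: str) -> list:
    """Parse a config string and return the names of servers that
    define a launch command."""
    config = json.loads(text)
    return [name for name, entry in config.items() if "command" in entry]


if __name__ == "__main__":
    print(check_config(json.dumps(SAMPLE_CONFIG)))  # ['fetch']
```

A quick sanity check like this catches malformed JSON before the integration tries to spawn the configured processes.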
Start the Integration:

```shell
mattermost-mcp-host
```

Note: the default `mcp-servers.json` includes a server that requires a `TAVILY_API_KEY` in the `.env` file.

Once the integration is running and connected to your Mattermost instance:
Use the command prefix (default `#`) to execute specific actions:

- `#help`: Display help information.
- `#servers`: List configured and connected MCP servers.
- `#<server_name> tools`: List available tools for `<server_name>`.
- `#<server_name> call <tool_name> <json_arguments>`: Invoke `<tool_name>` on `<server_name>` with arguments provided as a JSON string. Example: `#my-server call echo '{"message": "Hello MCP!"}'`
- `#<server_name> resources`: List available resources for `<server_name>`.
- `#<server_name> prompts`: List available prompts for `<server_name>`.
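A minimal sketch of how such prefixed messages could be split into their parts (a hypothetical helper, not the project's actual parser; the `#` prefix is the assumed default `COMMAND_PREFIX`):

```python
import json

PREFIX = "#"  # assumed default COMMAND_PREFIX


def parse_command(text: str):
    """Split '#<server> <action> [<tool> <json_args>]' messages into parts.

    Returns None for plain messages (which would go to the LLM agent),
    otherwise a (server, action, tool, args) tuple.
    """
    if not text.startswith(PREFIX):
        return None
    parts = text[len(PREFIX):].split(maxsplit=3)
    if not parts:
        return None  # bare prefix with no command
    server = parts[0]
    action = parts[1] if len(parts) > 1 else None
    tool = parts[2] if len(parts) > 2 else None
    args = {}
    if len(parts) > 3:
        # Tolerate shell-style single quotes around the JSON arguments.
        args = json.loads(parts[3].strip().strip("'"))
    return server, action, tool, args
```

For example, `parse_command('#servers')` yields `("servers", None, None, {})`, while a plain question falls through to the agent.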
Create a Bot Account:

Create a bot account in Mattermost and add its access token to the `.env` file.

Grant Required Bot Permissions:

- `post_all`
- `create_post`
- `read_channel`
- `create_direct_channel`
- `read_user`

Add Bot to Team/Channel:

Invite the bot to your team and to the channel it should listen in.
Troubleshooting:

- Connection Issues
- AI Provider Issues
- MCP Server Issues
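For connection issues, a first check is whether the Mattermost REST API answers at all. A minimal probe against the standard `/api/v4/system/ping` health endpoint, using only the standard library (the URL in the usage note is a placeholder):

```python
from urllib.request import urlopen


def ping_url(base_url: str) -> str:
    """Build the Mattermost v4 health-check URL from the server base URL."""
    return base_url.rstrip("/") + "/api/v4/system/ping"


def check_server(base_url: str, timeout: float = 5.0) -> bool:
    """Return True if the Mattermost server answers the ping endpoint."""
    try:
        with urlopen(ping_url(base_url), timeout=timeout) as resp:
            return resp.status == 200
    except OSError:  # covers URLError, connection refused, timeouts
        return False
```

Usage: `check_server("http://your-mattermost-url")` should return `True` before you debug tokens or channel names.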
Contributions are welcome! Please feel free to submit pull requests.
This project is licensed under the MIT License - see the LICENSE file for details.
- 📇 VOYP (Voice Over Your Phone) MCP Server for making calls.
- 📇 An MCP server with OpenAPI specs for the unofficial WhatsApp API (https://waha.devlike.pro/, also open source: https://github.com/devlikeapro/waha).
- 🐍 🏠 JMeter MCP Server for performance testing.
- 📇 ☁️ Slack workspace integration for channel management and messaging.