Rust Docs MCP Server

Provides up-to-date documentation context for a specific Rust crate to LLMs via an MCP tool, using semantic search (embeddings) and LLM summarization.

A server that fetches and processes Rust crate documentation, enabling AI coding assistants to provide accurate, up-to-date answers about specific crates via semantic search and LLM summarization.
Installation:

- Download Pre-Compiled Binary: place the downloaded binary somewhere on your PATH.
- Build from Source (Optional): requires the Rust toolchain (`rustup`); build with `cargo build --release`.

Configuration: the server needs an OpenAI API key, e.g. `export OPENAI_API_KEY="sk-..."`.

Usage: run the server for a specific crate, given as a package spec (e.g. `serde@^1.0`, `tokio`), optionally with crate features enabled:

```sh
rustdocs_mcp_server "crate_name@version" --features "feat1,feat2"
```
The server exposes an MCP tool (`query_rust_docs`) for AI assistants to query crate documentation:

```json
{
  "method": "callTool",
  "params": {
    "tool_name": "query_rust_docs",
    "arguments": {
      "question": "How to make a GET request with reqwest?"
    }
  }
}
```
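For illustration, the same request payload could be built programmatically. The sketch below is not part of this project's API; it only assumes the `serde_json` crate and mirrors the JSON shown above.

```rust
// Illustrative sketch: build the callTool payload shown above with serde_json.
// Assumes serde_json = "1" as a dependency; not an API provided by rustdocs_mcp_server.
use serde_json::json;

fn main() {
    let request = json!({
        "method": "callTool",
        "params": {
            "tool_name": "query_rust_docs",
            "arguments": {
                "question": "How to make a GET request with reqwest?"
            }
        }
    });

    // An MCP client would send this JSON to the server over its transport (typically stdio).
    println!("{}", serde_json::to_string_pretty(&request).unwrap());
}
```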
Under the hood, the crate's documentation is embedded for semantic search and the relevant sections are summarized with an OpenAI model (`gpt-4o-mini`). Generated embeddings are cached locally (e.g. `~/.local/share/rustdocs-mcp-server/`) to avoid repeated embedding costs. Point it at whichever crate you are working with (e.g. `serde`, `tokio`, or `reqwest`).
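As a rough sketch of the retrieval step described above (hypothetical code, not the project's actual implementation; names such as `DocChunk` and `best_chunks` are invented), documentation chunks with pre-computed embeddings can be ranked against the question embedding by cosine similarity, and the top matches handed to the LLM for summarization:

```rust
// Hypothetical sketch of embedding-based retrieval; not this project's actual code.
// Assumes chunk embeddings (e.g. from OpenAI's embeddings API) are already computed and cached.

struct DocChunk {
    text: String,
    embedding: Vec<f32>, // pre-computed embedding for this documentation chunk
}

// Cosine similarity between two embedding vectors.
fn cosine_similarity(a: &[f32], b: &[f32]) -> f32 {
    let dot: f32 = a.iter().zip(b).map(|(x, y)| x * y).sum();
    let norm = |v: &[f32]| v.iter().map(|x| x * x).sum::<f32>().sqrt();
    dot / (norm(a) * norm(b) + f32::EPSILON)
}

// Rank cached chunks by similarity to the question embedding and keep the top `k`.
// The selected chunks would then be passed to the LLM (e.g. gpt-4o-mini) for summarization.
fn best_chunks<'a>(question: &[f32], chunks: &'a [DocChunk], k: usize) -> Vec<&'a DocChunk> {
    let mut scored: Vec<(f32, &DocChunk)> = chunks
        .iter()
        .map(|c| (cosine_similarity(question, &c.embedding), c))
        .collect();
    scored.sort_by(|a, b| b.0.partial_cmp(&a.0).unwrap_or(std::cmp::Ordering::Equal));
    scored.into_iter().take(k).map(|(_, c)| c).collect()
}

fn main() {
    // Toy data standing in for real embeddings of documentation chunks.
    let chunks = vec![
        DocChunk { text: "reqwest::get example".into(), embedding: vec![0.9, 0.1] },
        DocChunk { text: "serde derive example".into(), embedding: vec![0.1, 0.9] },
    ];
    let question_embedding = vec![1.0, 0.0]; // stand-in for a real question embedding
    for c in best_chunks(&question_embedding, &chunks, 1) {
        println!("selected chunk: {}", c.text);
    }
}
```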
License: MIT
GitHub: https://github.com/Govcraft/rust-docs-mcp-server
Related MCP projects:

- An OpenAI middleware proxy for using MCP in any existing OpenAI-compatible client.
- A framework for building vertical AI agents.
- Use MCP-provided tools in LangChain.js.
- A CLI host application that enables Large Language Models (LLMs) to interact with external tools through the Model Context Protocol (MCP).