# mcp-confluent

Confluent integration to interact with Confluent Kafka and Confluent Cloud REST APIs.

Harness the power of Large Language Models (LLMs) to revolutionize your interaction with Confluent Kafka and Confluent Cloud! `mcp-confluent` is a cutting-edge Model Context Protocol (MCP) server implementation that empowers AI assistants to seamlessly manage your Kafka infrastructure through natural language. Imagine effortlessly creating topics, managing connectors, and executing Flink SQL statements, all with simple, intuitive commands.

This guide provides everything you need to get started, from initial setup to advanced development.
1. **Craft Your `.env` Configuration:** Begin by creating a `.env` file in your project's root directory. Populate it with the necessary configuration details, drawing inspiration from the example structure provided below.
2. **Configure Your Environment:** Tailor the `.env` file with your Confluent Cloud environment settings. Refer to the Configuration section for detailed explanations of each variable.
3. **Node.js Installation:** Ensure Node.js is installed on your system. We highly recommend using NVM (Node Version Manager) for streamlined Node.js version management.
```shell
nvm install 22
nvm use 22
```
Create a `.env` file in the root directory of your project with the following configuration:
```properties
# .env file
BOOTSTRAP_SERVERS="pkc-v12gj.us-east4.gcp.confluent.cloud:9092"
KAFKA_API_KEY="..."
KAFKA_API_SECRET="..."
KAFKA_REST_ENDPOINT="https://pkc-v12gj.us-east4.gcp.confluent.cloud:443"
KAFKA_CLUSTER_ID=""
KAFKA_ENV_ID="env-..."
FLINK_ENV_ID="env-..."
FLINK_ORG_ID=""
FLINK_REST_ENDPOINT="https://flink.us-east4.gcp.confluent.cloud"
FLINK_ENV_NAME=""
FLINK_DATABASE_NAME=""
FLINK_API_KEY=""
FLINK_API_SECRET=""
FLINK_COMPUTE_POOL_ID="lfcp-..."
CONFLUENT_CLOUD_API_KEY=""
CONFLUENT_CLOUD_API_SECRET=""
CONFLUENT_CLOUD_REST_ENDPOINT="https://api.confluent.cloud"
SCHEMA_REGISTRY_API_KEY="..."
SCHEMA_REGISTRY_API_SECRET="..."
SCHEMA_REGISTRY_ENDPOINT="https://psrc-zv01y.northamerica-northeast2.gcp.confluent.cloud"
```
| Variable | Description | Default Value | Required |
|---|---|---|---|
| BOOTSTRAP_SERVERS | List of Kafka broker addresses in the format host1:port1,host2:port2 used to establish initial connection to the Kafka cluster (string) | | Yes |
| CONFIG_PATH | File system path to store and retrieve conversation-based configurations for session persistence (Future Implementation) (string) | | Yes |
| CONFLUENT_CLOUD_API_KEY | Master API key for Confluent Cloud platform administration, enabling management of resources across your organization (string, min: 1) | | Yes |
| CONFLUENT_CLOUD_API_SECRET | Master API secret paired with CONFLUENT_CLOUD_API_KEY for comprehensive Confluent Cloud platform administration (string, min: 1) | | Yes |
| FLINK_API_KEY | Authentication key for accessing Confluent Cloud's Flink services, including compute pools and SQL statement management (string, min: 1) | | Yes |
| FLINK_API_SECRET | Secret token paired with FLINK_API_KEY for authenticated access to Confluent Cloud's Flink services (string, min: 1) | | Yes |
| KAFKA_API_KEY | Authentication credential (username) required to establish secure connection with the Kafka cluster (string, min: 1) | | Yes |
| KAFKA_API_SECRET | Authentication credential (password) paired with KAFKA_API_KEY for secure Kafka cluster access (string, min: 1) | | Yes |
| SCHEMA_REGISTRY_API_KEY | Authentication key for accessing Schema Registry services to manage and validate data schemas (string, min: 1) | | Yes |
| SCHEMA_REGISTRY_API_SECRET | Authentication secret paired with SCHEMA_REGISTRY_API_KEY for secure Schema Registry access (string, min: 1) | | Yes |
| CONFLUENT_CLOUD_REST_ENDPOINT | Base URL for Confluent Cloud's REST API services (string) | https://api.confluent.cloud | No |
| FLINK_COMPUTE_POOL_ID | Unique identifier for the Flink compute pool, must start with 'lfcp-' prefix (string) | | No |
| FLINK_DATABASE_NAME | Name of the associated Kafka cluster used as a database reference in Flink SQL operations (string, min: 1) | | No |
| FLINK_ENV_ID | Unique identifier for the Flink environment, must start with 'env-' prefix (string) | | No |
| FLINK_ENV_NAME | Human-readable name for the Flink environment used for identification and display purposes (string, min: 1) | | No |
| FLINK_ORG_ID | Organization identifier within Confluent Cloud for Flink resource management (string, min: 1) | | No |
| FLINK_REST_ENDPOINT | Base URL for Confluent Cloud's Flink REST API endpoints used for SQL statement and compute pool management (string) | | No |
| KAFKA_CLUSTER_ID | Unique identifier for the Kafka cluster within the Confluent Cloud ecosystem (string, min: 1) | | No |
| KAFKA_ENV_ID | Environment identifier for the Kafka cluster, must start with 'env-' prefix (string) | | No |
| KAFKA_REST_ENDPOINT | REST API endpoint for Kafka cluster management and administration (string) | | No |
| SCHEMA_REGISTRY_ENDPOINT | URL endpoint for accessing Schema Registry services to manage data schemas (string) | | No |
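Because the server fails in confusing ways when a required credential is absent, it helps to validate the environment up front. The following is a minimal, self-contained sketch of that idea (the `requireEnv` helper and the subset of variable names shown are illustrative, not part of mcp-confluent's actual code):

```typescript
// Hypothetical helper: fail fast when required configuration is missing,
// before the server attempts any Confluent API calls.
function requireEnv(
  env: Record<string, string | undefined>,
  vars: string[],
): Record<string, string> {
  // Treat unset or empty-string values as missing (the table requires min length 1).
  const missing = vars.filter((v) => !env[v]);
  if (missing.length > 0) {
    throw new Error(`Missing required environment variables: ${missing.join(", ")}`);
  }
  const out: Record<string, string> = {};
  for (const v of vars) out[v] = env[v] as string;
  return out;
}

// Example: check a subset of the required variables from the table above.
const config = requireEnv(
  { KAFKA_API_KEY: "key", KAFKA_API_SECRET: "secret" },
  ["KAFKA_API_KEY", "KAFKA_API_SECRET"],
);
```

In a real setup you would pass `process.env` instead of the inline object.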
This MCP server seamlessly integrates with various MCP clients, including Claude Desktop and Goose CLI/Desktop. The specific configuration and interaction will vary depending on your chosen client.
1. **Access the Claude Desktop Configuration File:**
   - macOS: `~/Library/Application Support/Claude/claude_desktop_config.json`
   - Windows: `%APPDATA%\Claude\claude_desktop_config.json`
2. **Modify the Configuration File:**

   If running from a local build:

   ```json
   {
     "mcpServers": {
       "confluent": {
         "command": "node",
         "args": [
           "/path/to/confluent-mcp-server/dist/index.js",
           "--env-file",
           "/path/to/confluent-mcp-server/.env"
         ]
       }
     }
   }
   ```

   Or, if running via `npx`:

   ```json
   {
     "mcpServers": {
       "confluent": {
         "command": "npx",
         "args": [
           "-y",
           "@confluentinc/mcp-confluent",
           "-e",
           "/path/to/confluent-mcp-server/.env"
         ]
       }
     }
   }
   ```
Replace `/path/to/confluent-mcp-server/` with the actual path where you've installed this MCP server.
Restart Claude Desktop: Close and reopen Claude Desktop for the changes to take effect.
1. **Run the Configuration Command:**

   ```shell
   goose configure
   ```

2. **Follow the Interactive Prompts:**
   - Select `Add extension`
   - Choose `Command-line Extension`
   - Enter `mcp-confluent` as the extension name
   - Enter one of the following as the command, depending on whether you run from a local build or via `npx`:

     ```shell
     node /path/to/confluent-mcp-server/dist/index.js --env-file /path/to/confluent-mcp-server/.env
     ```

     ```shell
     npx -y @confluentinc/mcp-confluent -e /path/to/confluent-mcp-server/.env
     ```
Replace `/path/to/confluent-mcp-server/` with the actual path where you've installed this MCP server.
```
mcp-confluent/
├── src/                 # Source code
│   ├── confluent/       # Code related to Confluent integration (API clients, etc.)
│   ├── tools/           # Tool implementations (each tool in a separate file)
│   ├── index.ts         # Main entry point for the server
│   └── ...              # Other server logic, utilities, etc.
├── dist/                # Compiled output (TypeScript -> JavaScript)
├── openapi.json         # OpenAPI specification for Confluent Cloud
├── .env                 # Environment variables (example - should be copied and filled)
├── README.md            # This file
└── package.json         # Node.js project metadata and dependencies
```
1. **Install Dependencies:**

   ```shell
   npm install
   ```

2. **Development Mode (watch for changes):**

   ```shell
   npm run dev
   ```

3. **Production Build (one-time compilation):**

   ```shell
   npm run build
   ```

4. **Start the Server:**

   ```shell
   npm run start
   ```
```shell
# make sure you've already built the project either in dev mode or by running npm run build
npx @modelcontextprotocol/inspector node $PATH_TO_PROJECT/dist/index.js --env-file $PATH_TO_PROJECT/.env
```
To add a new tool:

1. Add a new entry to the `ToolName` enum.
2. Register the tool in the `ToolFactory` class.
3. Create a new file with a class that implements `BaseToolHandler`.
4. Implement the `handle` method of the base class.
5. Implement the `getToolConfig` method of the base class.
6. Once satisfied, add it to the set of `enabledTools` in `index.ts`.
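The steps above can be sketched as follows. This is a simplified, hypothetical illustration: `BaseToolHandler`, `handle`, and `getToolConfig` are named in the steps, but the interfaces, the `ListTopicsHandler` example class, and the hard-coded response below are stand-ins, not the project's real types or logic:

```typescript
// Simplified stand-in for the tool configuration an MCP client sees.
interface ToolConfig {
  name: string;
  description: string;
  inputSchema: Record<string, unknown>;
}

// Simplified stand-in for the project's base handler contract.
abstract class BaseToolHandler {
  abstract handle(args: Record<string, unknown>): Promise<string>;
  abstract getToolConfig(): ToolConfig;
}

// A hypothetical new tool: list Kafka topics.
class ListTopicsHandler extends BaseToolHandler {
  async handle(_args: Record<string, unknown>): Promise<string> {
    // A real handler would call the Kafka REST endpoint here;
    // this sketch returns a canned payload instead.
    return JSON.stringify({ topics: ["orders", "payments"] });
  }

  getToolConfig(): ToolConfig {
    return {
      name: "list-topics",
      description: "List Kafka topics in the configured cluster",
      inputSchema: { type: "object", properties: {} },
    };
  }
}
```

Keeping each handler in its own file, as the structure above suggests, makes step 6 (enabling the tool in `index.ts`) a one-line change.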
```shell
# as of v7.5.2 there is a bug when using allOf w/ required https://github.com/openapi-ts/openapi-typescript/issues/1474
# need --empty-objects-unknown flag to avoid it
npx openapi-typescript ./openapi.json -o ./src/confluent/openapi-schema.d.ts --empty-objects-unknown
```
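The generated `openapi-schema.d.ts` gives compile-time types for the Confluent Cloud REST API. As a self-contained illustration of how such types get used, the `paths` interface below is a tiny hand-written stand-in (the real generated file covers the full spec, and the route and field names here are only examples):

```typescript
// Hand-written stand-in for a tiny slice of openapi-typescript output.
interface paths {
  "/kafka/v3/clusters/{cluster_id}/topics": {
    get: {
      responses: {
        200: {
          content: { "application/json": { data: { topic_name: string }[] } };
        };
      };
    };
  };
}

// Derive a response type by indexing into the schema, as consumers
// of the generated file typically do.
type ListTopicsResponse =
  paths["/kafka/v3/clusters/{cluster_id}/topics"]["get"]["responses"][200]["content"]["application/json"];

// The compiler now rejects misshapen payloads at build time.
const example: ListTopicsResponse = { data: [{ topic_name: "orders" }] };
```

Because the types are derived from `openapi.json`, regenerating the file after a spec update surfaces breaking API changes as compile errors rather than runtime failures.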
We welcome bug reports and feedback in the form of GitHub Issues. For guidelines on contributing, please see CONTRIBUTING.md.