
mcp-confluent

Confluent integration to interact with Confluent Kafka and Confluent Cloud REST APIs.

Publisher: mcp-confluent
Submitted: 4/13/2025

🚀 Unleash the Power of AI with Confluent Kafka: An Expert's Guide to mcp-confluent 🚀

Harness the power of Large Language Models (LLMs) to revolutionize your interaction with Confluent Kafka and Confluent Cloud! mcp-confluent is a cutting-edge Model Context Protocol (MCP) server implementation that empowers AI assistants to seamlessly manage your Kafka infrastructure through natural language. Imagine effortlessly creating topics, managing connectors, and executing Flink SQL statements, all with simple, intuitive commands.

Interactive Demos: Witness the Magic

Goose CLI: Command Kafka with Natural Language

Goose CLI Demo

Claude Desktop: AI-Powered Confluent Management

Claude Desktop Demo

Your Comprehensive Guide to mcp-confluent

This guide provides everything you need to get started, from initial setup to advanced development.

๐Ÿ› ๏ธ User Guide: Master the Art of AI-Driven Kafka Management

Getting Started: Your First Steps to AI-Powered Kafka

  1. Craft Your .env Configuration: Begin by creating a .env file in your project's root directory. Populate it with the necessary configuration details, drawing inspiration from the example structure provided below.

  2. Configure Your Environment: Tailor the .env file with your Confluent Cloud environment settings. Refer to the Configuration section for detailed explanations of each variable.

  3. Node.js Installation: Ensure Node.js is installed on your system. We highly recommend using NVM (Node Version Manager) for streamlined Node.js version management.

    nvm install 22
    nvm use 22

Configuration: Fine-Tuning Your AI-Kafka Connection

Create a .env file in the root directory of your project with the following configuration:

Example .env file structure:

# .env file
BOOTSTRAP_SERVERS="pkc-v12gj.us-east4.gcp.confluent.cloud:9092"
KAFKA_API_KEY="..."
KAFKA_API_SECRET="..."
KAFKA_REST_ENDPOINT="https://pkc-v12gj.us-east4.gcp.confluent.cloud:443"
KAFKA_CLUSTER_ID=""
KAFKA_ENV_ID="env-..."
FLINK_ENV_ID="env-..."
FLINK_ORG_ID=""
FLINK_REST_ENDPOINT="https://flink.us-east4.gcp.confluent.cloud"
FLINK_ENV_NAME=""
FLINK_DATABASE_NAME=""
FLINK_API_KEY=""
FLINK_API_SECRET=""
FLINK_COMPUTE_POOL_ID="lfcp-..."
CONFLUENT_CLOUD_API_KEY=""
CONFLUENT_CLOUD_API_SECRET=""
CONFLUENT_CLOUD_REST_ENDPOINT="https://api.confluent.cloud"
SCHEMA_REGISTRY_API_KEY="..."
SCHEMA_REGISTRY_API_SECRET="..."
SCHEMA_REGISTRY_ENDPOINT="https://psrc-zv01y.northamerica-northeast2.gcp.confluent.cloud"

Environment Variables Reference: A Deep Dive

| Variable | Description | Default Value | Required |
|----------|-------------|---------------|----------|
| BOOTSTRAP_SERVERS | List of Kafka broker addresses in the format `host1:port1,host2:port2` used to establish initial connection to the Kafka cluster (string) | | Yes |
| CONFIG_PATH | File system path to store and retrieve conversation-based configurations for session persistence (Future Implementation) (string) | | Yes |
| CONFLUENT_CLOUD_API_KEY | Master API key for Confluent Cloud platform administration, enabling management of resources across your organization (string, min: 1) | | Yes |
| CONFLUENT_CLOUD_API_SECRET | Master API secret paired with CONFLUENT_CLOUD_API_KEY for comprehensive Confluent Cloud platform administration (string, min: 1) | | Yes |
| FLINK_API_KEY | Authentication key for accessing Confluent Cloud's Flink services, including compute pools and SQL statement management (string, min: 1) | | Yes |
| FLINK_API_SECRET | Secret token paired with FLINK_API_KEY for authenticated access to Confluent Cloud's Flink services (string, min: 1) | | Yes |
| KAFKA_API_KEY | Authentication credential (username) required to establish secure connection with the Kafka cluster (string, min: 1) | | Yes |
| KAFKA_API_SECRET | Authentication credential (password) paired with KAFKA_API_KEY for secure Kafka cluster access (string, min: 1) | | Yes |
| SCHEMA_REGISTRY_API_KEY | Authentication key for accessing Schema Registry services to manage and validate data schemas (string, min: 1) | | Yes |
| SCHEMA_REGISTRY_API_SECRET | Authentication secret paired with SCHEMA_REGISTRY_API_KEY for secure Schema Registry access (string, min: 1) | | Yes |
| CONFLUENT_CLOUD_REST_ENDPOINT | Base URL for Confluent Cloud's REST API services (string) | https://api.confluent.cloud | No |
| FLINK_COMPUTE_POOL_ID | Unique identifier for the Flink compute pool, must start with the `lfcp-` prefix (string) | | No |
| FLINK_DATABASE_NAME | Name of the associated Kafka cluster used as a database reference in Flink SQL operations (string, min: 1) | | No |
| FLINK_ENV_ID | Unique identifier for the Flink environment, must start with the `env-` prefix (string) | | No |
| FLINK_ENV_NAME | Human-readable name for the Flink environment used for identification and display purposes (string, min: 1) | | No |
| FLINK_ORG_ID | Organization identifier within Confluent Cloud for Flink resource management (string, min: 1) | | No |
| FLINK_REST_ENDPOINT | Base URL for Confluent Cloud's Flink REST API endpoints used for SQL statement and compute pool management (string) | | No |
| KAFKA_CLUSTER_ID | Unique identifier for the Kafka cluster within the Confluent Cloud ecosystem (string, min: 1) | | No |
| KAFKA_ENV_ID | Environment identifier for the Kafka cluster, must start with the `env-` prefix (string) | | No |
| KAFKA_REST_ENDPOINT | REST API endpoint for Kafka cluster management and administration (string) | | No |
| SCHEMA_REGISTRY_ENDPOINT | URL endpoint for accessing Schema Registry services to manage data schemas (string) | | No |
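The required/optional split in the table above can be enforced with a simple startup check. The sketch below is illustrative only (the `missingRequiredVars` helper is hypothetical, not part of mcp-confluent's API); the variable names mirror the table.

```typescript
// Required variables, mirroring the "Required: Yes" rows of the table above.
const REQUIRED_VARS = [
  "BOOTSTRAP_SERVERS",
  "CONFIG_PATH",
  "CONFLUENT_CLOUD_API_KEY",
  "CONFLUENT_CLOUD_API_SECRET",
  "FLINK_API_KEY",
  "FLINK_API_SECRET",
  "KAFKA_API_KEY",
  "KAFKA_API_SECRET",
  "SCHEMA_REGISTRY_API_KEY",
  "SCHEMA_REGISTRY_API_SECRET",
] as const;

// Returns the names of required variables that are unset or empty.
function missingRequiredVars(env: Record<string, string | undefined>): string[] {
  return REQUIRED_VARS.filter((name) => !env[name]);
}

// Fail fast with a clear message before the server starts.
const missing = missingRequiredVars(process.env);
if (missing.length > 0) {
  console.error(`Missing required environment variables: ${missing.join(", ")}`);
}
```

Checking all variables up front yields one actionable error message instead of a failure deep inside the first Kafka or Flink call.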

Usage: Unleashing the Power of AI-Driven Kafka

This MCP server seamlessly integrates with various MCP clients, including Claude Desktop and Goose CLI/Desktop. The specific configuration and interaction will vary depending on your chosen client.

  1. Build the Foundation: Follow the instructions in the Developer Guide to build and run the server from source.
  2. Configure Your MCP Client: Configure your client (e.g., Claude, Goose) to connect to the address where this server is running (likely localhost with a specific port).
  3. Start the Magic: Launch your MCP client, which will automatically initiate an instance of this MCP server locally.
  4. Interact with Confluent: Use the client's intuitive interface to interact with Confluent Cloud resources.

Configuring Claude Desktop: AI Assistant at Your Service

  1. Access Claude Desktop Configuration:

    • Mac: ~/Library/Application Support/Claude/claude_desktop_config.json
    • Windows: %APPDATA%\Claude\claude_desktop_config.json
  2. Modify Configuration File:

    • Open the config file in your preferred text editor.
    • Add or modify the configuration using one of the following methods:
    Option 1: Run from source
    { "mcpServers": { "confluent": { "command": "node", "args": [ "/path/to/confluent-mcp-server/dist/index.js", "--env-file", "/path/to/confluent-mcp-server/.env", ] } } }
    Option 2: Run from npx
    { "mcpServers": { "confluent": { "command": "npx", "args": [ "-y" "@confluentinc/mcp-confluent", "-e", "/path/to/confluent-mcp-server/.env" ] } } }

    Replace /path/to/confluent-mcp-server/ with the actual path where you've installed this MCP server.

  3. Restart Claude Desktop: Close and reopen Claude Desktop for the changes to take effect.

Configuring Goose CLI: Command-Line Power Meets AI

  1. Run the Configuration Command:

    goose configure
  2. Follow the Interactive Prompts:

    • Select Add extension
    • Choose Command-line Extension
    • Enter mcp-confluent as the extension name
    • Choose one of the following configuration methods:
    Option 1: Run from source
    node /path/to/confluent-mcp-server/dist/index.js --env-file /path/to/confluent-mcp-server/.env
    Option 2: Run from npx
    npx -y @confluentinc/mcp-confluent -e /path/to/confluent-mcp-server/.env

    Replace /path/to/confluent-mcp-server/ with the actual path where you've installed this MCP server.

๐Ÿง‘โ€๐Ÿ’ป Developer Guide: Dive Deep into mcp-confluent

Project Structure: A Blueprint for Understanding

/
├── src/             # Source code
│   ├── confluent/   # Code related to Confluent integration (API clients, etc.)
│   ├── tools/       # Tool implementations (each tool in a separate file)
│   ├── index.ts     # Main entry point for the server
│   └── ...          # Other server logic, utilities, etc.
├── dist/            # Compiled output (TypeScript -> JavaScript)
├── openapi.json     # OpenAPI specification for Confluent Cloud
├── .env             # Environment variables (example - should be copied and filled)
├── README.md        # This file
└── package.json     # Node.js project metadata and dependencies

Building and Running: From Source to Execution

  1. Install Dependencies:

    npm install
  2. Development Mode (watch for changes):

    npm run dev
  3. Production Build (one-time compilation):

    npm run build
  4. Start the Server:

    npm run start

Testing: Ensuring Reliability and Performance

MCP Inspector: Your Debugging Companion
# make sure you've already built the project, either in dev mode or by running npm run build
npx @modelcontextprotocol/inspector node $PATH_TO_PROJECT/dist/index.js --env-file $PATH_TO_PROJECT/.env

Adding a New Tool: Expanding the AI-Kafka Ecosystem

  1. Add a new value to the ToolName enum.
  2. Add your new tool to the handlers map in the ToolFactory class.
  3. Create a new file, exporting the class that extends BaseToolHandler.
    1. Implement the handle method of the base class.
    2. Implement the getToolConfig method of the base class.
  4. Once satisfied, add it to the set of enabledTools in index.ts.
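The steps above can be sketched roughly as follows. The `BaseToolHandler` shape, the `ToolConfig` fields, and the `ListTopicsHandler` example are assumptions based on this guide, not the project's exact interfaces — check the source in `src/tools/` for the real signatures.

```typescript
// Step 1: add a new value to the ToolName enum.
enum ToolName {
  // ...existing tool names...
  LIST_TOPICS = "list-topics",
}

// Assumed minimal shape of a tool's configuration.
interface ToolConfig {
  name: ToolName;
  description: string;
}

// Assumed minimal shape of the base class described in step 3.
abstract class BaseToolHandler {
  abstract handle(args: Record<string, unknown>): Promise<string>;
  abstract getToolConfig(): ToolConfig;
}

// Step 3: a new file exporting a class that extends BaseToolHandler.
export class ListTopicsHandler extends BaseToolHandler {
  // Step 3.1: implement handle — call the Confluent client and return a result.
  async handle(_args: Record<string, unknown>): Promise<string> {
    return JSON.stringify({ topics: [] });
  }

  // Step 3.2: implement getToolConfig — describe the tool to the MCP client.
  getToolConfig(): ToolConfig {
    return {
      name: ToolName.LIST_TOPICS,
      description: "Lists Kafka topics in the configured cluster.",
    };
  }
}
```

After registering the handler in the ToolFactory map (step 2) and adding it to enabledTools (step 4), the new tool becomes visible to connected MCP clients.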

Generating Types: Maintaining Type Safety

# as of v7.5.2 there is a bug when using allOf w/ required
# (https://github.com/openapi-ts/openapi-typescript/issues/1474);
# the --empty-objects-unknown flag is needed to avoid it
npx openapi-typescript ./openapi.json -o ./src/confluent/openapi-schema.d.ts --empty-objects-unknown

Contributing: Join the Community

We welcome bug reports and feedback in the form of GitHub Issues. For guidelines on contributing, please see CONTRIBUTING.md.
