
keboola-mcp-server

Interact with the Keboola Connection Data Platform. This server provides tools for listing and accessing data from the Keboola Storage API.

Publisher: keboola-mcp-server
Submitted date: 4/13/2025

Bridging the Gap: Keboola MCP Server for Seamless LLM Integration

Unlock the power of your Keboola data within Large Language Model (LLM) applications with the Keboola Model Context Protocol (MCP) Server. This robust tool provides a standardized interface for connecting LLMs to your Keboola Connection data, enabling AI-powered insights and workflows.

Key Features

  • Effortless Integration: Seamlessly connect LLMs to Keboola Storage API for real-time data access.
  • Standardized Protocol: Adheres to the Model Context Protocol (MCP) for universal compatibility.
  • Versatile Applications: Ideal for AI-powered IDEs, intelligent chat interfaces, and custom AI workflows.
  • Secure Data Access: Leverage Keboola's security model with Storage API token authentication.
  • Workspace Support: Compatible with Snowflake and BigQuery Read-Only Workspaces.

Prerequisites

  • Python 3.10+
  • Valid Keboola Storage API token
  • Read-Only Workspace (Snowflake or BigQuery)
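
To quickly confirm that your local Python meets the 3.10+ requirement, you can check the interpreter version before installing (the interpreter name may differ on your system, e.g. python or python3.11):

    # prints the interpreter version; expect 3.10 or newer
    python3 --version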

Installation Guide

Option 1: Automated Installation via Smithery

Simplify the installation process with Smithery:

npx -y @smithery/cli install keboola-mcp-server --client claude

Option 2: Manual Installation

  1. Clone the Repository:

    git clone https://github.com/keboola/keboola-mcp-server.git
    cd keboola-mcp-server
  2. Create a Virtual Environment:

    python3 -m venv .venv
    source .venv/bin/activate
    pip3 install -U pip
  3. Install the Package:

    pip3 install -e .
  4. Install Development Dependencies (Optional):

    pip3 install -e ".[dev]"
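
To sanity-check the installation before wiring the server into an LLM client, you can try launching the module from the virtual environment. This is a minimal check that assumes the CLI accepts a --help flag; adjust the flags to match your installed version:

    # assumes the package exposes a --help flag; run from the repository root
    .venv/bin/python -m keboola_mcp_server --help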

Configuration for LLM Platforms

Claude Desktop

  1. Locate the Configuration File:

    • macOS: ~/Library/Application Support/Claude/claude_desktop_config.json
    • Windows: %APPDATA%\Claude\claude_desktop_config.json
  2. Add the Keboola MCP Server Configuration:

    { "mcpServers": { "keboola": { "command": "/path/to/keboola-mcp-server/.venv/bin/python", "args": [ "-m", "keboola_mcp_server", "--api-url", "https://connection.YOUR_REGION.keboola.com" ], "env": { "KBC_STORAGE_TOKEN": "your-keboola-storage-token", "KBC_WORKSPACE_SCHEMA": "your-workspace-schema" } } } }
    • Replace /path/to/keboola-mcp-server with the actual path to your cloned repository.
    • Replace YOUR_REGION with your Keboola region (e.g., north-europe.azure). Omit the region segment if your project uses the default https://connection.keboola.com URL.
    • Replace your-keboola-storage-token with your Keboola Storage API token.
    • Replace your-workspace-schema with your Snowflake schema or BigQuery dataset name.

    Important Notes:

    • If using a specific Python version (e.g., 3.11), adjust the command accordingly (e.g., /path/to/keboola-mcp-server/.venv/bin/python3.11).
    • The Read-Only Workspace can be created within your Keboola project; it provides the connection parameters (such as the workspace schema) used above.
  3. Restart Claude Desktop: Ensure a complete restart (not just closing the window).

  4. Verify Connection: Look for the hammer icon in the bottom right corner.

Cursor AI

  1. Locate the Configuration File: ~/.cursor/mcp.json

  2. Choose a Transport Method: Server-Sent Events (SSE) or Standard I/O (stdio).

Option 1: Server-Sent Events (SSE)

{ "mcpServers": { "keboola": { "url": "http://localhost:8000/sse?storage_token=YOUR-KEBOOLA-STORAGE-TOKEN&workspace_schema=YOUR-WORKSPACE-SCHEMA" } } }

Option 2a: Standard I/O (stdio)

{ "mcpServers": { "keboola": { "command": "/path/to/keboola-mcp-server/.venv/bin/python", "args": [ "-m", "keboola_mcp_server", "--transport", "stdio", "--api-url", "https://connection.YOUR_REGION.keboola.com" ], "env": { "KBC_STORAGE_TOKEN": "your-keboola-storage-token", "KBC_WORKSPACE_SCHEMA": "your-workspace-schema" } } } }

Option 2b: WSL Standard I/O (stdio)

For running the MCP server from Windows Subsystem for Linux:

{ "mcpServers": { "keboola": { "command": "wsl.exe", "args": [ "bash", "-c", "'source /wsl_path/to/keboola-mcp-server/.env", "&&", "/wsl_path/to/keboola-mcp-server/.venv/bin/python -m keboola_mcp_server.cli --transport stdio'" ] } } }

Create a .env file in your WSL environment:

export KBC_STORAGE_TOKEN="your-keboola-storage-token"
export KBC_WORKSPACE_SCHEMA="your-workspace-schema"

Important Notes:

  • Replace placeholders with your actual values.

  • For SSE, ensure the MCP server is running:

    /path/to/keboola-mcp-server/.venv/bin/python -m keboola_mcp_server --transport sse --api-url https://connection.YOUR_REGION.keboola.com
  3. Restart Cursor AI: The MCP server should be automatically detected.

BigQuery Integration

For Keboola projects using BigQuery, set the GOOGLE_APPLICATION_CREDENTIALS environment variable (see the example after these steps):

  1. Obtain Credentials: Access your Keboola BigQuery workspace and download the credentials file.
  2. Set Environment Variable: Set GOOGLE_APPLICATION_CREDENTIALS to the full path of the downloaded JSON file.
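
For example, in the shell session or startup script that launches the MCP server, the variable might be set like this (the path below is a placeholder for the credentials JSON downloaded from your BigQuery workspace):

    # placeholder path; point this at your downloaded credentials file
    export GOOGLE_APPLICATION_CREDENTIALS="/full/path/to/credentials.json"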

Available Tools

  • List buckets and tables
  • Get bucket and table information
  • Preview table data
  • Export table data to CSV
  • List components and configurations

Development Practices

  • Run Tests: pytest
  • Format Code: black . && isort .
  • Type Checking: mypy .

By leveraging the Keboola MCP Server, you can seamlessly integrate your Keboola data with LLM applications, unlocking new possibilities for AI-driven insights and automation.
