# Snowflake MCP Server

A Snowflake integration implementing read and (optional) write operations, as well as insight tracking.

This document describes a Model Context Protocol (MCP) server that connects Large Language Models (LLMs) to Snowflake, a cloud data platform. The server acts as a bridge between the LLM and Snowflake, giving models direct access to data for analysis, decision support, and context-aware applications, and turning raw query results into insights the LLM can accumulate and reuse.
## Resources

This MCP server exposes the following resources:

- `memo://insights`: A continuously updated memo of data insights derived from Snowflake. It acts as a central knowledge base, aggregating findings recorded through the `append_insight` tool; because it is updated in real time, LLMs always see the latest discoveries.
- `context://table/{table_name}`: (Optional, requires prefetch) Per-table schema summaries, including column definitions, data types, and descriptive comments. This gives LLMs a detailed understanding of the data structure, enabling more accurate and contextually relevant queries and analyses.
## Tools

The server provides a suite of tools, categorized for clarity, that enable LLMs to perform a range of operations within Snowflake.

### Query tools

- `read_query`: Executes `SELECT` statements to retrieve data from Snowflake. Results are returned as an array of objects for easy parsing by the LLM.
  - Input: `query` (string): The `SELECT` SQL query to be executed.
- `write_query` (requires `--allow-write`): Executes `INSERT`, `UPDATE`, and `DELETE` statements to modify data within Snowflake. Disabled by default for security reasons and requires explicit activation.
  - Input: `query` (string): The SQL modification query.
- `create_table` (requires `--allow-write`): Executes `CREATE TABLE` statements to create new tables. Also disabled by default and requires explicit activation.
  - Input: `query` (string): The `CREATE TABLE` SQL statement.

### Schema tools

- `list_databases`: Retrieves a list of all databases in the Snowflake instance, letting LLMs discover the available data sources.
- `list_schemas`: Retrieves a list of all schemas within a specified database, enabling LLMs to navigate the database structure.
  - Input: `database` (string): The name of the database.
- `list_tables`: Retrieves a list of all tables within a specified database and schema.
  - Inputs: `database` (string): The name of the database; `schema` (string): The name of the schema.
- `describe_table`: Retrieves detailed column information for a specified table, including names, data types, nullability, defaults, and comments.
  - Input: `table_name` (string): The fully qualified table name (`database.schema.table`).

### Analysis tools

- `append_insight`: Adds a new data insight to the `memo://insights` resource, enriching the context available to other LLMs and applications.
  - Input: `insight` (string): A data insight discovered from analysis. Triggers an update of the `memo://insights` resource.
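To illustrate the `read_query` result shape described above (the query, column names, and rows below are hypothetical examples, not real data), the tool returns an array of row objects that a client can parse directly:

```python
import json

# Hypothetical payload returned by read_query for a query like
# "SELECT region, SUM(amount) AS total FROM sales GROUP BY region":
# one object per row, keyed by column name.
payload = '[{"REGION": "EMEA", "TOTAL": 1204}, {"REGION": "APAC", "TOTAL": 981}]'

rows = json.loads(payload)                              # array of objects -> list of dicts
totals = {row["REGION"]: row["TOTAL"] for row in rows}  # index rows by region
print(totals["EMEA"])  # 1204
```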
## Usage with Claude Desktop

This section provides detailed instructions for integrating the Snowflake MCP server with the Claude Desktop application.
### Installing via Smithery

Use the Smithery CLI for a simplified installation process:

```bash
npx -y @smithery/cli install mcp_snowflake_server --client claude
```

This command automates the installation and configuration of the Snowflake server for Claude Desktop.
### Installing via `uvx`

This method provides greater control over the installation and configuration process.

Add the following configuration block to your `claude_desktop_config.json` file:

```json
"mcpServers": {
  "snowflake_pip": {
    "command": "uvx",
    "args": [
      "--python=3.12",  // Optional: specify Python version <=3.12
      "mcp_snowflake_server",
      "--account", "your_account",
      "--warehouse", "your_warehouse",
      "--user", "your_user",
      "--password", "your_password",
      "--role", "your_role",
      "--database", "your_database",
      "--schema", "your_schema"
      // Optionally: "--allow_write"
      // Optionally: "--log_dir", "/absolute/path/to/logs"
      // Optionally: "--log_level", "DEBUG"/"INFO"/"WARNING"/"ERROR"/"CRITICAL"
      // Optionally: "--exclude_tools", "{tool_name}", ["{other_tool_name}"]
    ]
  }
}
```
* **`command`**: Specifies the command to execute the server. In this case, we use `uvx` to manage the Python environment.
* **`args`**: An array of arguments passed to the command.
* `--python=3.12`: (Optional) Specifies the Python version to use. Ensure it's <=3.12.
* `mcp_snowflake_server`: The name of the MCP server package.
* `--account`, `--warehouse`, `--user`, `--password`, `--role`, `--database`, `--schema`: Snowflake connection parameters. Replace these with your actual credentials.
* `--allow_write`: (Optional) Enables write operations. Use with caution.
* `--log_dir`: (Optional) Specifies the directory for log files.
* `--log_level`: (Optional) Sets the logging level (DEBUG, INFO, WARNING, ERROR, CRITICAL).
* `--exclude_tools`: (Optional) Excludes specific tools from being exposed to the LLM.
### Installing Locally

This method allows for local development and testing of the Snowflake MCP server.

1. Install the Claude AI Desktop App: Download and install it from https://claude.ai/download.

2. Install the `uv` package manager:

```bash
curl -LsSf https://astral.sh/uv/install.sh | sh
```

3. Create a `.env` file: In the root directory of the project, add your Snowflake credentials:

```bash
SNOWFLAKE_USER="xxx@your_email.com"
SNOWFLAKE_ACCOUNT="xxx"
SNOWFLAKE_ROLE="xxx"
SNOWFLAKE_DATABASE="xxx"
SNOWFLAKE_SCHEMA="xxx"
SNOWFLAKE_WAREHOUSE="xxx"
SNOWFLAKE_PASSWORD="xxx"
# Alternatively, use external browser authentication:
# SNOWFLAKE_AUTHENTICATOR="externalbrowser"
```
* Replace the placeholder values with your actual Snowflake credentials.
* Alternatively, you can use external browser authentication by setting the `SNOWFLAKE_AUTHENTICATOR` variable to `externalbrowser`.
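The `.env` format above is plain `KEY="value"` lines. As a minimal stdlib-only sketch (using the placeholder values from the example, not real credentials), this is how those lines map to a parameter dictionary:

```python
# Parse .env-style lines into a dict, skipping blanks and comments.
env_text = '''
SNOWFLAKE_USER="xxx@your_email.com"
SNOWFLAKE_ACCOUNT="xxx"
# SNOWFLAKE_AUTHENTICATOR="externalbrowser"
'''

env = {}
for line in env_text.splitlines():
    line = line.strip()
    if not line or line.startswith("#"):
        continue  # blank line or commented-out setting
    key, _, value = line.partition("=")
    env[key] = value.strip('"')

print(env["SNOWFLAKE_USER"])  # xxx@your_email.com
```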
4. [Optional] Modify `runtime_config.json`: Customize the server's behavior by editing the `runtime_config.json` file, which lets you define exclusion patterns for databases, schemas, or tables so they are not exposed to the LLM.
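The exact schema of `runtime_config.json` depends on the server version; purely as an illustrative sketch (the key names below are assumptions, not the documented format), an exclusion configuration might take a shape like:

```json
{
  "exclude_patterns": {
    "databases": ["temp"],
    "schemas": ["information_schema"],
    "tables": ["staging"]
  }
}
```

Consult the file shipped with the server for the authoritative keys and pattern semantics.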
5. Run the server:

```bash
uv --directory /absolute/path/to/mcp_snowflake_server run mcp_snowflake_server
```
* Replace `/absolute/path/to/mcp_snowflake_server` with the actual path to the server's directory.
6. Add the server to your `claude_desktop_config.json`: Add the following configuration block to the file:

```json
"mcpServers": {
  "snowflake_local": {
    "command": "/absolute/path/to/uv",
    "args": [
      "--python=3.12",  // Optional
      "--directory", "/absolute/path/to/mcp_snowflake_server",
      "run", "mcp_snowflake_server"
      // Optionally: "--allow_write"
      // Optionally: "--log_dir", "/absolute/path/to/logs"
      // Optionally: "--log_level", "DEBUG"/"INFO"/"WARNING"/"ERROR"/"CRITICAL"
      // Optionally: "--exclude_tools", "{tool_name}", ["{other_tool_name}"]
    ]
  }
}
```
* **`command`**: Specifies the command to execute the server.
* **`args`**: An array of arguments passed to the command.
* `--python=3.12`: (Optional) Specifies the Python version to use.
* `--directory`: Specifies the directory containing the server's code.
* `run mcp_snowflake_server`: Executes the server.
* `--allow_write`: (Optional) Enables write operations. Use with caution.
* `--log_dir`: (Optional) Specifies the directory for log files.
* `--log_level`: (Optional) Sets the logging level (DEBUG, INFO, WARNING, ERROR, CRITICAL).
* `--exclude_tools`: (Optional) Excludes specific tools from being exposed to the LLM.
## Notes

- **Write operations**: Disabled by default for security reasons. To enable them, you must explicitly pass the `--allow-write` flag. Exercise caution, as write operations can modify data in your Snowflake instance.
- **Data filtering**: The server supports filtering out specific databases, schemas, or tables via exclusion patterns defined in `runtime_config.json`, letting you control what data is exposed to the LLM.
- **Prefetching**: Enabling prefetch exposes additional per-table context resources, giving LLMs more detailed information about the data structure and improving the accuracy and relevance of generated queries and analyses.
- **Dynamic insights**: The `append_insight` tool dynamically updates the `memo://insights` resource, so LLMs always have access to the latest discoveries.