Unlocking AI Potential: Connecting Snowflake to Claude with MCP in Cursor

Adrian Lee Xinhan
10 min read · Mar 31, 2025


Note: Views are my own and do not represent my employer.

Data professionals are constantly looking for ways to make their databases more useful through AI. Snowflake provides powerful, accessible tools like Cortex Analyst, which enables intuitive ‘talk-to-your-data’ interactions, and Cortex Search, which facilitates hybrid search across unstructured data.

Beyond these user-friendly capabilities, developers frequently need more flexible, code-centric ways to integrate AI into their database workflows, especially when a good portion of their time is spent in an IDE.

Today, I’m excited to share how I’ve been using Model Context Protocol (MCP) to connect Claude AI directly to Snowflake databases through Cursor IDE — opening up powerful new possibilities for data query and analysis.

What is Cursor?

Cursor is a proprietary AI-powered integrated development environment (IDE) available for Windows, macOS, and Linux. Developed by Anysphere Inc., it’s designed to enhance developer productivity by integrating advanced artificial intelligence features directly into the coding environment.

As a fork of Visual Studio Code, Cursor maintains compatibility with your favorite extensions while adding powerful AI capabilities:

  • AI-powered code generation that lets you write code using natural language instructions
  • Intelligent autocompletion that predicts your next edits
  • Codebase understanding for querying entire codebases in natural language
  • Smart rewrites for efficiently updating multiple code lines simultaneously

One of Cursor’s most impressive features is its “agent mode,” which can complete tasks end-to-end while keeping programmers in the loop.

Understanding the Model Context Protocol (MCP)

The Model Context Protocol (MCP) is an open standard that enables AI models to connect seamlessly with external tools and services. Developed and open-sourced by Anthropic in November 2024, MCP serves as a communication bridge between AI models and external systems.

Think of MCP as a “universal USB-C connector for AI”. It allows language models to fetch information, interact with APIs, and execute tasks beyond their built-in knowledge. This is crucial because even the most sophisticated AI models are constrained when isolated from data sources.

MCP follows a client-server architecture with three key components:

  • MCP Hosts: These are AI applications or interfaces (like Cursor IDE) that connect to multiple MCP servers
  • MCP Clients: These act as intermediaries managing secure connections between hosts and servers
  • MCP Servers: External programs providing specific capabilities that connect to various data sources

Why MCP Is Revolutionary for AI Integration

MCP represents a fundamental shift in how we connect AI with data sources and tools. Here’s why it matters:

  1. Standardization: MCP provides a universal way to connect AI models with external tools, eliminating the need for custom connectors for each data source.
  2. Solving Integration Complexity: MCP overcomes the M×N integration problem, where each new language model (M) would otherwise require a custom connector for each enterprise tool (N), reducing integration complexity from M×N to M+N. For example, five models and ten tools would need fifty bespoke connectors, but only fifteen MCP implementations.
  3. Enhanced AI Capabilities: By connecting models to external data sources, MCP enables AI assistants to produce better, more relevant responses based on your specific data.
  4. Security and Scalability: MCP enables secure and scalable integration with enterprise applications, with clear separation points where security controls can be applied.
  5. Community Innovation: The open-source nature of MCP has fostered community-built servers and innovations, creating compounding benefits for everyone.

Overall Architecture Flow

  • Cursor: This is the IDE where you interact with the system. You can write code, ask questions, and use the AI capabilities.
  • MCP Client: This acts as an intermediary, handling communication between Cursor and the MCP Server.
  • MCP Server (Python): This is your custom Python server that connects to the Snowflake database. It receives requests from Cursor, executes SQL queries on Snowflake, and returns the results back to Cursor.
  • Snowflake: This is where your data resides. The MCP Server interacts with Snowflake to fetch data and execute commands.
  • Claude AI: This is the AI model that can be used in conjunction with the system. It can generate code, answer questions, and assist with data analysis based on the information retrieved from Snowflake.
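
Concretely, these pieces talk JSON-RPC 2.0 over stdio. Below is a simplified, abridged sketch of what crosses the wire when the agent runs a query through the execute_query tool defined later in this post; the id and query values are purely illustrative:

{"jsonrpc": "2.0", "id": 4, "method": "tools/call",
 "params": {"name": "execute_query",
            "arguments": {"query": "SELECT CURRENT_VERSION()"}}}

{"jsonrpc": "2.0", "id": 4,
 "result": {"content": [{"type": "text",
                         "text": "Results (execution time: 0.03s): ..."}]}}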

Deploying a Snowflake MCP Server in Cursor: Step-by-Step Guide

Let’s walk through setting up an MCP server that connects Claude to your Snowflake database within the Cursor IDE:

Step 1: Prepare Your Environment

First, let’s prepare our Snowflake MCP server. This server allows Claude to execute SQL queries on Snowflake databases and automatically handles the database connection lifecycle.

There is a great repository I found for this, https://github.com/datawiz168/mcp-snowflake-service, which you can git clone to start using immediately.

Alternatively, you can start from scratch: create a folder and add the server.py below.

#!/usr/bin/env python
"""
Snowflake Model Context Protocol (MCP) Server

This server enables Claude to execute SQL queries on Snowflake databases through
the Model Context Protocol (MCP). It handles connection lifecycle management,
query execution, and result formatting.
"""
import os
import asyncio
import logging
import json
import time
from typing import Optional, Any, Dict, List

import snowflake.connector
from dotenv import load_dotenv
import mcp.server.stdio
from mcp.server import Server
from mcp.types import Tool, TextContent

# Configure logging
logging.basicConfig(
    level=logging.DEBUG,
    format='%(asctime)s - %(name)s - %(levelname)s - %(message)s'
)
logger = logging.getLogger('snowflake_server')

# Load environment variables
load_dotenv()


class SnowflakeConnection:
    """Manages Snowflake database connections and query execution."""

    def __init__(self) -> None:
        """Initialize Snowflake connection configuration from environment variables."""
        self.config = {
            "user": os.getenv("SNOWFLAKE_USER"),
            "password": os.getenv("SNOWFLAKE_PASSWORD"),
            "account": os.getenv("SNOWFLAKE_ACCOUNT"),
            "database": os.getenv("SNOWFLAKE_DATABASE"),
            "schema": os.getenv("SNOWFLAKE_SCHEMA"),
            "warehouse": os.getenv("SNOWFLAKE_WAREHOUSE"),
        }
        self.conn: Optional[snowflake.connector.SnowflakeConnection] = None

        # Log configuration (excluding password)
        safe_config = {k: v for k, v in self.config.items() if k != 'password'}
        logger.info(f"Initialized with config: {json.dumps(safe_config)}")

    def verify_link(self) -> snowflake.connector.SnowflakeConnection:
        """
        Ensure the database connection is available and valid.

        Returns:
            A valid Snowflake connection

        Raises:
            Exception: If connection cannot be established
        """
        try:
            # Create new connection if needed
            if self.conn is None:
                logger.info("Creating new Snowflake connection...")
                self.conn = snowflake.connector.connect(
                    **self.config,
                    client_session_keep_alive=True,
                    network_timeout=15,
                    login_timeout=15
                )
                self.conn.cursor().execute("ALTER SESSION SET TIMEZONE = 'UTC'")
                logger.info("New connection established and configured")

            # Test if connection is still valid; reconnect if not
            try:
                self.conn.cursor().execute("SELECT 1")
            except Exception:
                logger.info("Connection lost, reconnecting...")
                self.conn = None
                return self.verify_link()

            return self.conn

        except Exception as e:
            logger.error(f"Connection error: {str(e)}")
            raise

    def process_request(self, command: str) -> List[Dict[str, Any]]:
        """
        Execute a SQL query and return the results.

        Args:
            command: SQL query to execute

        Returns:
            List of dictionaries containing query results

        Raises:
            Exception: If query execution fails
        """
        start_time = time.time()
        command_preview = command[:200] + "..." if len(command) > 200 else command
        logger.info(f"Executing query: {command_preview}")

        try:
            conn = self.verify_link()

            # Determine if this is a data modification query
            is_write_operation = any(
                command.strip().upper().startswith(word)
                for word in ['INSERT', 'UPDATE', 'DELETE', 'CREATE', 'DROP', 'ALTER']
            )

            with conn.cursor() as cursor:
                if is_write_operation:
                    # Use a transaction for write operations
                    cursor.execute("BEGIN")
                    try:
                        cursor.execute(command)
                        conn.commit()
                        execution_time = time.time() - start_time
                        logger.info(f"Write query executed in {execution_time:.2f}s")
                        return [{"affected_rows": cursor.rowcount}]
                    except Exception:
                        conn.rollback()
                        raise
                else:
                    # Read operations
                    cursor.execute(command)
                    execution_time = time.time() - start_time

                    if cursor.description:
                        columns = [col[0] for col in cursor.description]
                        rows = cursor.fetchall()
                        results = [dict(zip(columns, row)) for row in rows]
                        logger.info(f"Read query returned {len(results)} rows in {execution_time:.2f}s")
                        return results

                    logger.info(f"Read query executed in {execution_time:.2f}s (no results)")
                    return []

        except snowflake.connector.errors.ProgrammingError as e:
            logger.error(f"SQL Error: {str(e)}")
            logger.error(f"Error Code: {getattr(e, 'errno', 'unknown')}")
            raise
        except Exception as e:
            logger.error(f"Query error: {str(e)}")
            logger.error(f"Error type: {type(e).__name__}")
            raise

    def cleanup(self) -> None:
        """Safely close the database connection."""
        if self.conn:
            try:
                self.conn.close()
                logger.info("Connection closed")
            except Exception as e:
                logger.error(f"Error closing connection: {str(e)}")
            finally:
                self.conn = None


class SnowflakeServer(Server):
    """MCP server that handles Snowflake database operations."""

    def __init__(self) -> None:
        """Initialize the MCP server and register its tool handlers."""
        super().__init__(name="snowflake-server")
        self.db = SnowflakeConnection()
        logger.info("SnowflakeServer initialized")

        @self.list_tools()
        async def get_supported_operations():
            """Return the list of available tools."""
            return [
                Tool(
                    name="execute_query",
                    description="Execute a SQL query on Snowflake",
                    inputSchema={
                        "type": "object",
                        "properties": {
                            "query": {
                                "type": "string",
                                "description": "SQL query to execute"
                            }
                        },
                        "required": ["query"]
                    }
                )
            ]

        @self.call_tool()
        async def handle_operation(name: str, arguments: Dict[str, Any]):
            """
            Handle tool call requests.

            Args:
                name: Tool name
                arguments: Tool arguments

            Returns:
                List of TextContent objects with execution results
            """
            if name == "execute_query":
                start_time = time.time()
                try:
                    result = self.db.process_request(arguments["query"])
                    execution_time = time.time() - start_time

                    return [TextContent(
                        type="text",
                        text=f"Results (execution time: {execution_time:.2f}s):\n{result}"
                    )]
                except Exception as e:
                    error_message = f"Error executing query: {str(e)}"
                    logger.error(error_message)
                    return [TextContent(
                        type="text",
                        text=error_message
                    )]
            raise ValueError(f"Unknown tool: {name}")

    def __del__(self) -> None:
        """Clean up resources when the server is deleted."""
        if hasattr(self, 'db'):
            self.db.cleanup()


async def start_service() -> None:
    """Start and run the MCP server."""
    try:
        # Initialize the server
        server = SnowflakeServer()
        initialization_options = server.create_initialization_options()
        logger.info("Starting server")

        # Run the server over stdio transport
        async with mcp.server.stdio.stdio_server() as (read_stream, write_stream):
            await server.run(
                read_stream,
                write_stream,
                initialization_options
            )
    except Exception as e:
        logger.critical(f"Server failed: {str(e)}", exc_info=True)
        raise
    finally:
        logger.info("Server shutting down")


if __name__ == "__main__":
    asyncio.run(start_service())

We also need to create a .env file in the same folder with our Snowflake connection variables:

SNOWFLAKE_ACCOUNT=your_account
SNOWFLAKE_USER=your_username
SNOWFLAKE_PASSWORD=your_password
SNOWFLAKE_DATABASE=your_database
SNOWFLAKE_SCHEMA=your_schema
SNOWFLAKE_WAREHOUSE=your_warehouse

You can copy the value for SNOWFLAKE_ACCOUNT from your account identifier in Snowflake (typically of the form <orgname>-<account_name>).

Step 2: Setting Up the MCP Server in Cursor

  1. Open the folder in Cursor IDE
  2. Navigate to: Settings > Cursor Settings > MCP > Add New global MCP Server

Configure your server with the following details:

  • Name: Give it a descriptive name (e.g., “snowflake”)
  • Command: We’ll use the Python runtime
  • Args: Point this at the actual location of server.py, i.e. wherever you cloned the repository or created your folder

Then set the command to run your server:

python server.py
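
Depending on your Cursor version, you can also register the server declaratively in an mcp.json configuration file instead of through the settings UI. A minimal sketch; the path below is a placeholder for your own clone location:

{
  "mcpServers": {
    "snowflake": {
      "command": "python",
      "args": ["/path/to/mcp-snowflake-service/server.py"]
    }
  }
}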

If everything is configured correctly, you should see the Snowflake MCP server’s status indicator turn green.

Step 3: Testing Your Snowflake MCP Connection

Now that your server is set up, let’s test its capabilities:

Open a new chat in Cursor’s “Agent” mode. To bring up the chat inside Cursor, press Command + L (on a Mac).

We will run through three scenarios in Cursor.

Scenario 1: Querying existing databases

Let’s start off by querying our existing database assets.

I start off by asking “What databases do I have access to?” and we can see that it returns the databases I have access to, which in this case are SNOWFLAKE and SNOWFLAKE_SAMPLE_DATA.

The agent immediately suggests asking what schemas are available in the SNOWFLAKE_SAMPLE_DATA database, which it can then extract. It then offers to sample some tables inside the TPCH_SF1 schema.

I then ask a follow-up question, “Can you tell me some details about the customer table?”, and we can see the details of the customer table returned to us.
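
Under the hood, the agent is just issuing ordinary Snowflake SQL through the execute_query tool. If you want to sanity-check those queries outside of Cursor, you can drive the SnowflakeConnection class from server.py directly; a minimal sketch, assuming you run it from the server’s folder with the .env in place:

# Reuses the SnowflakeConnection class defined in server.py above
from server import SnowflakeConnection

db = SnowflakeConnection()
# The same kind of introspection queries the agent runs in Scenario 1
print(db.process_request("SHOW DATABASES"))
print(db.process_request("SHOW SCHEMAS IN DATABASE SNOWFLAKE_SAMPLE_DATA"))
print(db.process_request("DESCRIBE TABLE SNOWFLAKE_SAMPLE_DATA.TPCH_SF1.CUSTOMER"))
db.cleanup()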

Scenario 2: Creating new databases and simulating data

I am now going to create a travel database and simulate some data.

“Please create a travel database for me. Inside this travel database, I would like to have 3 tables:

  1. travel_destinations, detailing the travel region (such as Asia), the country (such as Singapore), and the country’s geographical latitude and longitude
  2. travel_flights, detailing the plane’s flight number, the source country (like Singapore), and the destination country (like Germany)
  3. travel_passengers, detailing the customer_id, customer name, and the flight number

Do suggest the appropriate datatypes for each table. Generate 1000 rows for each table for me.”

We can see that the MCP then creates the tables.
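
The exact DDL Claude proposes varies from run to run. Here is a representative sketch of the kind of statement it generated for the destinations table, driven through the same SnowflakeConnection class; the TRAVEL database, PUBLIC schema, and column types are my assumptions, not its verbatim output:

# Illustrative only: database/schema names and datatypes are assumptions
from server import SnowflakeConnection

db = SnowflakeConnection()
db.process_request("""
CREATE TABLE IF NOT EXISTS TRAVEL.PUBLIC.TRAVEL_DESTINATIONS (
    REGION    VARCHAR(50),    -- e.g. 'Asia'
    COUNTRY   VARCHAR(100),   -- e.g. 'Singapore'
    LATITUDE  FLOAT,
    LONGITUDE FLOAT
)
""")
db.cleanup()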

It then starts inserting mock data into the tables.

If we go into our Snowflake environment, we can then start viewing the data generated in each table.

Travel destination table
Travel Flights table

Scenario 3: Creating a Dynamic table in Snowflake

In this scenario, we are going to create a Dynamic Table with Claude.

“Would you be able to create a dynamic table for me in Snowflake that combines TRAVEL_PASSENGERS and TRAVEL_FLIGHTS on a common key? Suggest a key after looking at the data.”

Immediately, it starts querying the relevant tables to describe them.

It also retrieves a sample of data from both tables.

Finally, it creates the Dynamic Table in Snowflake, and if we head over to our Snowflake console, we can see the Dynamic Table created.
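
For reference, the statement Claude lands on has the shape of a standard Snowflake dynamic table definition. A hedged sketch of it, again driven through SnowflakeConnection; the table name, FLIGHT_NUMBER join key, target lag, and warehouse below are illustrative assumptions:

# Illustrative dynamic-table DDL; names, lag, and warehouse are assumptions
from server import SnowflakeConnection

db = SnowflakeConnection()
db.process_request("""
CREATE OR REPLACE DYNAMIC TABLE TRAVEL.PUBLIC.PASSENGER_FLIGHT_DETAILS
  TARGET_LAG = '1 minute'
  WAREHOUSE = COMPUTE_WH
AS
SELECT p.CUSTOMER_ID,
       p.CUSTOMER_NAME,
       f.FLIGHT_NUMBER,
       f.SOURCE_COUNTRY,
       f.DESTINATION_COUNTRY
FROM TRAVEL.PUBLIC.TRAVEL_PASSENGERS p
JOIN TRAVEL.PUBLIC.TRAVEL_FLIGHTS f
  ON p.FLIGHT_NUMBER = f.FLIGHT_NUMBER
""")
db.cleanup()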

Benefits of Using MCP with Snowflake

Connecting Claude to Snowflake through MCP offers several advantages:

  • Natural Language Data Analysis: As with Cortex Analyst, you can ask questions about your data in plain English instead of writing complex SQL, this time directly from the IDE
  • Context-Aware Queries: Claude maintains context across queries, understanding references to previous results
  • Secure Access: MCP’s architecture provides secure, authenticated, and auditable access to your Snowflake data
  • Automated Documentation: Use Claude to automatically document your queries and findings

Conclusion

The Model Context Protocol represents a significant advancement in AI integration, enabling language models to interact directly with our most important data sources. By connecting Claude to Snowflake through MCP in Cursor, developers gain a more intuitive, powerful way to work with their data.

As the MCP ecosystem continues to grow, with a fast-expanding catalog of community-built servers already available, the possibilities for AI-powered data analysis will only expand. I’m excited to see how this technology evolves and how we can leverage it to build more intelligent, data-driven applications.
