# How to integrate Snowflake MCP with Pydantic AI

```json
{
  "title": "How to integrate Snowflake MCP with Pydantic AI",
  "toolkit": "Snowflake",
  "toolkit_slug": "snowflake",
  "framework": "Pydantic AI",
  "framework_slug": "pydantic-ai",
  "url": "https://composio.dev/toolkits/snowflake/framework/pydantic-ai",
  "markdown_url": "https://composio.dev/toolkits/snowflake/framework/pydantic-ai.md",
  "updated_at": "2026-05-12T10:26:42.921Z"
}
```

## Introduction

This guide walks you through connecting Snowflake to Pydantic AI using the Composio Tool Router. By the end, you'll have a working Snowflake agent that can run a SQL query to list today's new users, cancel a long-running data import statement, or show all unresolved incidents in Snowflake, all through natural language commands.
You'll also see how Composio's Snowflake MCP server gives your Pydantic AI agent real control over a Snowflake account.
Before we dive in, let's take a quick look at the key ideas and tools involved.

## Also integrate Snowflake with

- [ChatGPT](https://composio.dev/toolkits/snowflake/framework/chatgpt)
- [OpenAI Agents SDK](https://composio.dev/toolkits/snowflake/framework/open-ai-agents-sdk)
- [Claude Agent SDK](https://composio.dev/toolkits/snowflake/framework/claude-agents-sdk)
- [Claude Code](https://composio.dev/toolkits/snowflake/framework/claude-code)
- [Claude Cowork](https://composio.dev/toolkits/snowflake/framework/claude-cowork)
- [Codex](https://composio.dev/toolkits/snowflake/framework/codex)
- [Cursor](https://composio.dev/toolkits/snowflake/framework/cursor)
- [VS Code](https://composio.dev/toolkits/snowflake/framework/vscode)
- [OpenCode](https://composio.dev/toolkits/snowflake/framework/opencode)
- [OpenClaw](https://composio.dev/toolkits/snowflake/framework/openclaw)
- [Hermes](https://composio.dev/toolkits/snowflake/framework/hermes-agent)
- [CLI](https://composio.dev/toolkits/snowflake/framework/cli)
- [Google ADK](https://composio.dev/toolkits/snowflake/framework/google-adk)
- [LangChain](https://composio.dev/toolkits/snowflake/framework/langchain)
- [Vercel AI SDK](https://composio.dev/toolkits/snowflake/framework/ai-sdk)
- [Mastra AI](https://composio.dev/toolkits/snowflake/framework/mastra-ai)
- [LlamaIndex](https://composio.dev/toolkits/snowflake/framework/llama-index)
- [CrewAI](https://composio.dev/toolkits/snowflake/framework/crew-ai)

## TL;DR

Here's what you'll learn:
- How to set up your Composio API key and User ID
- How to create a Composio Tool Router session for Snowflake
- How to attach an MCP Server to a Pydantic AI agent
- How to stream responses and maintain chat history
- How to build a simple REPL-style chat interface to test your Snowflake workflows

## What is Pydantic AI?

Pydantic AI is a Python framework for building AI agents with strong typing and validation. It leverages Pydantic's data validation capabilities to create robust, type-safe AI applications.
Key features include:
- Type Safety: Built on Pydantic for automatic data validation
- MCP Support: Native support for Model Context Protocol servers
- Streaming: Built-in support for streaming responses
- Async First: Designed for async/await patterns
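
To make the type-safety point concrete, here is a minimal sketch (independent of the Snowflake setup below) of an agent whose reply is validated against a Pydantic model. The `QueryPlan` schema and the prompt are illustrative assumptions, and the `output_type` parameter and `result.output` attribute follow recent pydantic-ai releases, so check against the version you install.
```python
from pydantic import BaseModel
from pydantic_ai import Agent


class QueryPlan(BaseModel):
    """Illustrative output schema; the agent's reply is validated against it."""
    table: str
    sql: str


plan_agent = Agent(
    "openai:gpt-5",
    output_type=QueryPlan,
    instructions="Turn the user's request into a single SELECT statement.",
)

result = plan_agent.run_sync("Count yesterday's signups in ANALYTICS.PUBLIC.USERS")
print(result.output.sql)  # a validated QueryPlan instance, not free-form text
```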

## What is the Snowflake MCP server, and what's possible with it?

The Snowflake MCP server is an implementation of the Model Context Protocol that connects AI agents and assistants (such as Claude or Cursor) directly to your Snowflake account. It provides structured, secure access to your cloud data warehouse, so your agent can run complex SQL queries, monitor system health, check scheduled maintenances, and manage incidents with no manual intervention needed.
- Automated SQL execution and data retrieval: Direct your agent to execute SQL statements and instantly fetch query results from your data warehouse.
- Query management and cancellation: Have your agent monitor and cancel long-running or stuck SQL statements to keep your workflows running smoothly.
- Maintenance and system status monitoring: Let your agent check for active, upcoming, or completed scheduled maintenances and get real-time updates on system components.
- Incident detection and reporting: Enable your agent to retrieve unresolved incidents and receive summaries of any issues currently affecting your Snowflake environment.
- Integration metadata access: Fetch details about catalog integrations and system status rollups so your agent can keep tabs on the overall health of your Snowflake setup.

## Supported Tools

| Tool slug | Name | Description |
|---|---|---|
| `SNOWFLAKE_CANCEL_STATEMENT_EXECUTION` | Cancel Statement Execution | Cancels the execution of a running SQL statement. Use this action to stop a long-running query. |
| `SNOWFLAKE_CHECK_STATEMENT_STATUS` | Check Statement Status | Retrieves the status and results of a previously submitted SQL statement using its statement handle. Use this to poll async queries submitted via SNOWFLAKE_SUBMIT_SQL_STATEMENT; call repeatedly until status is no longer pending. Use SNOWFLAKE_CANCEL_STATEMENT to abort a hanging query. |
| `SNOWFLAKE_EXECUTE_SQL` | Execute SQL | Execute SQL statements in Snowflake and retrieve results. Supports SELECT queries for data retrieval, DDL statements (CREATE, ALTER, DROP) for schema management, and DML statements (INSERT, UPDATE, DELETE) for data modification. Returns comprehensive result metadata including column types, row counts, and execution status. Unquoted SQL identifiers are auto-uppercased by Snowflake — use matching case in `database`, `schema_name`, `warehouse`, and `role` parameters to avoid 'object not found' errors. Always apply explicit time-range filters and a LIMIT clause to unbounded SELECT queries to prevent large, slow result sets. |
| `SNOWFLAKE_FETCH_CATALOG_INTEGRATION` | Fetch Catalog Integration | Retrieves detailed configuration and metadata for a specific catalog integration. Catalog integrations allow Snowflake to connect to external Apache Iceberg catalogs (AWS Glue, Snowflake Open Catalog/Polaris, or Apache Iceberg REST catalogs) to query Iceberg tables managed by those external systems. |
| `SNOWFLAKE_GET_ACTIVE_SCHEDULED_MAINTENANCES` | Get Active Scheduled Maintenances | Retrieves a list of any active scheduled maintenances currently in the In Progress or Verifying state. |
| `SNOWFLAKE_GET_ALL_SCHEDULED_MAINTENANCES` | Get All Scheduled Maintenances | Retrieves a list of the 50 most recent scheduled maintenances, including those in the Completed state. |
| `SNOWFLAKE_GET_COMPONENT_STATUS` | Get Component Status | Retrieves the status of individual components, each listed with its current status. |
| `SNOWFLAKE_GET_STATUS_ROLLUP` | Get Status Rollup | Retrieves the status rollup for the entire page, including indicators and human-readable descriptions of the blended component status. |
| `SNOWFLAKE_GET_STATUS_SUMMARY` | Get Status Summary | Retrieves the current status summary from Snowflake's public status page (status.snowflake.com). Returns overall system status, operational status of all regional components (AWS, Azure, GCP regions), any unresolved incidents, and upcoming or in-progress scheduled maintenances. This is a public endpoint that provides global Snowflake service status, not account-specific information. |
| `SNOWFLAKE_GET_UNRESOLVED_INCIDENTS` | Get Unresolved Incidents | Retrieves a list of any unresolved incidents from the Snowflake status page. This endpoint returns incidents currently in the Investigating, Identified, or Monitoring state. Returns an empty list if there are no active incidents. This is a public status page API that does not require authentication. |
| `SNOWFLAKE_GET_UPCOMING_SCHEDULED_MAINTENANCES` | Get Upcoming Scheduled Maintenances | Retrieves upcoming scheduled maintenances from Snowflake's public status page. This action queries the Snowflake status API to get a list of any scheduled maintenance events that are still in the 'Scheduled' state (not yet started or completed). The response includes maintenance details such as impact level, scheduled time windows, incident updates, and direct links to the maintenance notices. Note: This uses Snowflake's public status API and does not require authentication. |
| `SNOWFLAKE_SHOW_DATABASES` | Show Databases | Lists all databases for which you have access privileges. Shows database metadata including name, creation date, owner, retention time, and more. Can filter results and include dropped databases within Time Travel retention period. |
| `SNOWFLAKE_SHOW_SCHEMAS` | Show Schemas | Lists all schemas for which you have access privileges. Shows schema metadata including name, creation date, owner, database, retention time, and more. Can filter results and include dropped schemas within Time Travel retention period. |
| `SNOWFLAKE_SHOW_TABLES` | Show Tables | Lists all tables for which you have access privileges. Shows table metadata including name, creation date, owner, database, schema, row count, size in bytes, clustering keys, and more. Can filter results and include dropped tables within Time Travel retention period. |

## Supported Triggers

None listed.

## Creating MCP Server - Stand-alone vs Composio SDK

The Snowflake MCP server is an implementation of the Model Context Protocol that connects your AI agent to Snowflake. It provides structured and secure access so your agent can perform Snowflake operations on your behalf through a secure, permission-based interface.
With Composio's managed implementation, you don't have to create your own developer app; the managed server helps you prototype quickly and get from zero to one faster. For production, or if you're building an end product, we recommend using your own credentials.

## Step-by-step Guide

### Prerequisites

Before starting, make sure you have:
- Python 3.9 or higher
- A Composio account with an active API key
- Basic familiarity with Python and async programming

### 1. Getting API Keys for OpenAI and Composio

**OpenAI API Key**
- Go to the [OpenAI dashboard](https://platform.openai.com/settings/organization/api-keys) and create an API key. You'll need credits to use the models, or you can connect to another model provider.
- Keep the API key safe.

**Composio API Key**
- Log in to the [Composio dashboard](https://dashboard.composio.dev?utm_source=toolkits&utm_medium=framework_docs).
- Navigate to your API settings and generate a new API key.
- Store this key securely as you'll need it for authentication.

### 2. Install dependencies

Install the required libraries.
What's happening:
- composio connects your agent to external SaaS tools like Snowflake
- pydantic-ai lets you create structured AI agents with tool support
- python-dotenv loads your environment variables securely from a .env file
```bash
pip install composio pydantic-ai python-dotenv
```

### 3. Set up environment variables

Create a .env file in your project root.
What's happening:
- COMPOSIO_API_KEY authenticates your agent to Composio's API
- USER_ID associates your session with your account for secure tool access
- OPENAI_API_KEY gives the agent access to OpenAI models
```bash
COMPOSIO_API_KEY=your_composio_api_key_here
USER_ID=your_user_id_here
OPENAI_API_KEY=your_openai_api_key
```

### 4. Import dependencies

What's happening:
- We load environment variables and import required modules
- Composio manages connections to Snowflake
- MCPServerStreamableHTTP connects to the Snowflake MCP server endpoint
- Agent from Pydantic AI lets you define and run the AI assistant
```python
import asyncio
import os
from dotenv import load_dotenv
from composio import Composio
from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerStreamableHTTP

load_dotenv()
```

### 5. Create a Tool Router Session

What's happening:
- We're creating a Tool Router session that gives your agent access to Snowflake tools
- The create method takes the user ID and specifies which toolkits should be available
- The returned session.mcp.url is the MCP server URL that your agent will use
```python
async def main():
    api_key = os.getenv("COMPOSIO_API_KEY")
    user_id = os.getenv("USER_ID")
    if not api_key or not user_id:
        raise RuntimeError("Set COMPOSIO_API_KEY and USER_ID in your environment")

    # Create a Composio Tool Router session for Snowflake
    composio = Composio(api_key=api_key)
    session = composio.create(
        user_id=user_id,
        toolkits=["snowflake"],
    )
    url = session.mcp.url
    if not url:
        raise ValueError("Composio session did not return an MCP URL")
```

### 6. Initialize the Pydantic AI Agent

What's happening:
- The MCP client connects to the Snowflake endpoint
- The agent uses GPT-5 to interpret user commands and perform Snowflake operations
- The instructions field defines the agent's role and behavior
```python
# Attach the MCP server to a Pydantic AI Agent
snowflake_mcp = MCPServerStreamableHTTP(url, headers={"x-api-key": api_key})
agent = Agent(
    "openai:gpt-5",
    toolsets=[snowflake_mcp],
    instructions=(
        "You are a Snowflake assistant. Use Snowflake tools to help users "
        "with their requests. Ask clarifying questions when needed."
    ),
)
```

### 7. Build the chat interface

What's happening:
- The agent reads input from the terminal and streams its response
- Snowflake API calls happen automatically under the hood
- The model keeps conversation history to maintain context across turns
```python
# Simple REPL with message history
history = []
print("Chat started! Type 'exit' or 'quit' to end.\n")
print("Try asking the agent to help you with Snowflake.\n")

while True:
    user_input = input("You: ").strip()
    if user_input.lower() in {"exit", "quit", "bye"}:
        print("\nGoodbye!")
        break
    if not user_input:
        continue

    print("\nAgent is thinking...\n", flush=True)

    async with agent.run_stream(user_input, message_history=history) as stream_result:
        collected_text = ""
        async for chunk in stream_result.stream_output():
            text_piece = None
            if isinstance(chunk, str):
                text_piece = chunk
            elif hasattr(chunk, "delta") and isinstance(chunk.delta, str):
                text_piece = chunk.delta
            elif hasattr(chunk, "text"):
                text_piece = chunk.text
            if text_piece:
                collected_text += text_piece
        result = stream_result

    print(f"Agent: {collected_text}\n")
    history = result.all_messages()
```

### 8. Run the application

What's happening:
- The asyncio loop launches the agent and keeps it running until you exit
```python
if __name__ == "__main__":
    asyncio.run(main())
```

## Complete Code

```python
import asyncio
import os
from dotenv import load_dotenv
from composio import Composio
from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerStreamableHTTP

load_dotenv()

async def main():
    api_key = os.getenv("COMPOSIO_API_KEY")
    user_id = os.getenv("USER_ID")
    if not api_key or not user_id:
        raise RuntimeError("Set COMPOSIO_API_KEY and USER_ID in your environment")

    # Create a Composio Tool Router session for Snowflake
    composio = Composio(api_key=api_key)
    session = composio.create(
        user_id=user_id,
        toolkits=["snowflake"],
    )
    url = session.mcp.url
    if not url:
        raise ValueError("Composio session did not return an MCP URL")

    # Attach the MCP server to a Pydantic AI Agent
    snowflake_mcp = MCPServerStreamableHTTP(url, headers={"x-api-key": api_key})
    agent = Agent(
        "openai:gpt-5",
        toolsets=[snowflake_mcp],
        instructions=(
            "You are a Snowflake assistant. Use Snowflake tools to help users "
            "with their requests. Ask clarifying questions when needed."
        ),
    )

    # Simple REPL with message history
    history = []
    print("Chat started! Type 'exit' or 'quit' to end.\n")
    print("Try asking the agent to help you with Snowflake.\n")

    while True:
        user_input = input("You: ").strip()
        if user_input.lower() in {"exit", "quit", "bye"}:
            print("\nGoodbye!")
            break
        if not user_input:
            continue

        print("\nAgent is thinking...\n", flush=True)

        async with agent.run_stream(user_input, message_history=history) as stream_result:
            collected_text = ""
            async for chunk in stream_result.stream_output():
                text_piece = None
                if isinstance(chunk, str):
                    text_piece = chunk
                elif hasattr(chunk, "delta") and isinstance(chunk.delta, str):
                    text_piece = chunk.delta
                elif hasattr(chunk, "text"):
                    text_piece = chunk.text
                if text_piece:
                    collected_text += text_piece
            result = stream_result

        print(f"Agent: {collected_text}\n")
        history = result.all_messages()

if __name__ == "__main__":
    asyncio.run(main())
```

## Conclusion

You've built a Pydantic AI agent that can interact with Snowflake through Composio's Tool Router. With this setup, your agent can perform real Snowflake actions through natural language.
You can extend this further by:
- Adding other toolkits like Gmail, HubSpot, or Salesforce
- Building a web-based chat interface around this agent
- Using multiple toolkits or MCP endpoints to enable cross-app workflows (for example, Gmail + Snowflake), as sketched below
This architecture makes your AI agent "agent-native", able to securely use APIs in a unified, composable way without custom integrations.
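
For example, the Tool Router session from step 5 can request more than one toolkit through the same endpoint. Here is a hedged sketch that assumes your Composio account has both Snowflake and Gmail connected for the same user; it reuses the same environment variables and APIs as the complete code above.
```python
# Sketch: same setup as above, but one Tool Router session exposing two toolkits.
# Assumes Gmail and Snowflake are both connected for this user in Composio.
import os
from composio import Composio
from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerStreamableHTTP

api_key = os.environ["COMPOSIO_API_KEY"]
user_id = os.environ["USER_ID"]

composio = Composio(api_key=api_key)
session = composio.create(
    user_id=user_id,
    toolkits=["snowflake", "gmail"],
)
cross_app_mcp = MCPServerStreamableHTTP(session.mcp.url, headers={"x-api-key": api_key})

agent = Agent(
    "openai:gpt-5",
    toolsets=[cross_app_mcp],
    instructions=(
        "You can query Snowflake and send email with Gmail. "
        "Summarize query results before emailing them."
    ),
)
```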

## How to build Snowflake MCP Agent with another framework

- [ChatGPT](https://composio.dev/toolkits/snowflake/framework/chatgpt)
- [OpenAI Agents SDK](https://composio.dev/toolkits/snowflake/framework/open-ai-agents-sdk)
- [Claude Agent SDK](https://composio.dev/toolkits/snowflake/framework/claude-agents-sdk)
- [Claude Code](https://composio.dev/toolkits/snowflake/framework/claude-code)
- [Claude Cowork](https://composio.dev/toolkits/snowflake/framework/claude-cowork)
- [Codex](https://composio.dev/toolkits/snowflake/framework/codex)
- [Cursor](https://composio.dev/toolkits/snowflake/framework/cursor)
- [VS Code](https://composio.dev/toolkits/snowflake/framework/vscode)
- [OpenCode](https://composio.dev/toolkits/snowflake/framework/opencode)
- [OpenClaw](https://composio.dev/toolkits/snowflake/framework/openclaw)
- [Hermes](https://composio.dev/toolkits/snowflake/framework/hermes-agent)
- [CLI](https://composio.dev/toolkits/snowflake/framework/cli)
- [Google ADK](https://composio.dev/toolkits/snowflake/framework/google-adk)
- [LangChain](https://composio.dev/toolkits/snowflake/framework/langchain)
- [Vercel AI SDK](https://composio.dev/toolkits/snowflake/framework/ai-sdk)
- [Mastra AI](https://composio.dev/toolkits/snowflake/framework/mastra-ai)
- [LlamaIndex](https://composio.dev/toolkits/snowflake/framework/llama-index)
- [CrewAI](https://composio.dev/toolkits/snowflake/framework/crew-ai)

## Related Toolkits

- [Firecrawl](https://composio.dev/toolkits/firecrawl) - Firecrawl automates large-scale web crawling and data extraction. It helps organizations efficiently gather, index, and analyze content from online sources.
- [Tavily](https://composio.dev/toolkits/tavily) - Tavily offers powerful search and data retrieval from documents, databases, and the web. It helps teams locate and filter information instantly, saving hours on research.
- [Exa](https://composio.dev/toolkits/exa) - Exa is a data extraction and search platform for gathering and analyzing information from websites, APIs, or databases. It helps teams quickly surface insights and automate data-driven workflows.
- [Serpapi](https://composio.dev/toolkits/serpapi) - SerpApi is a real-time API for structured search engine results. It lets you automate SERP data collection, parsing, and analysis for SEO and research.
- [Peopledatalabs](https://composio.dev/toolkits/peopledatalabs) - Peopledatalabs delivers B2B data enrichment and identity resolution APIs. Supercharge your apps with accurate, up-to-date business and contact data.
- [Posthog](https://composio.dev/toolkits/posthog) - PostHog is an open-source analytics platform for tracking user interactions and product metrics. It helps teams refine features, analyze funnels, and reduce churn with actionable insights.
- [Amplitude](https://composio.dev/toolkits/amplitude) - Amplitude is a digital analytics platform for product and behavioral data insights. It helps teams analyze user journeys and make data-driven decisions quickly.
- [Bright Data MCP](https://composio.dev/toolkits/brightdata_mcp) - Bright Data MCP is an AI-powered web scraping and data collection platform. Instantly access public web data in real time with advanced scraping tools.
- [Browseai](https://composio.dev/toolkits/browseai) - Browseai is a web automation and data extraction platform that turns any website into an API. It's perfect for monitoring websites and retrieving structured data without manual scraping.
- [ClickHouse](https://composio.dev/toolkits/clickhouse) - ClickHouse is an open-source, column-oriented database for real-time analytics and big data processing using SQL. Its lightning-fast query performance makes it ideal for handling large datasets and delivering instant insights.
- [Coinmarketcal](https://composio.dev/toolkits/coinmarketcal) - CoinMarketCal is a community-powered crypto calendar for upcoming events, announcements, and releases. It helps traders track market-moving developments and stay ahead in the crypto space.
- [Control d](https://composio.dev/toolkits/control_d) - Control d is a customizable DNS filtering and traffic redirection platform. It helps you manage internet access, enforce policies, and monitor usage across devices and networks.
- [Databox](https://composio.dev/toolkits/databox) - Databox is a business analytics platform that connects your data from any tool and device. It helps you track KPIs, build dashboards, and discover actionable insights.
- [Databricks](https://composio.dev/toolkits/databricks) - Databricks is a unified analytics platform for big data and AI on the lakehouse architecture. It empowers data teams to collaborate, analyze, and build scalable solutions efficiently.
- [Datagma](https://composio.dev/toolkits/datagma) - Datagma delivers data intelligence and analytics for business growth and market discovery. Get actionable market insights and track competitors to inform your strategy.
- [Delighted](https://composio.dev/toolkits/delighted) - Delighted is a customer feedback platform based on the Net Promoter System®. It helps you quickly gather, track, and act on customer sentiment.
- [Dovetail](https://composio.dev/toolkits/dovetail) - Dovetail is a research analysis platform for transcript review and insight generation. It helps teams code interviews, analyze feedback, and create actionable research summaries.
- [Dub](https://composio.dev/toolkits/dub) - Dub is a short link management platform with analytics and API access. Use it to easily create, manage, and track branded short links for your business.
- [Elasticsearch](https://composio.dev/toolkits/elasticsearch) - Elasticsearch is a distributed, RESTful search and analytics engine for all types of data. It delivers fast, scalable search and powerful analytics across massive datasets.
- [Fireflies](https://composio.dev/toolkits/fireflies) - Fireflies.ai is an AI-powered meeting assistant that records, transcribes, and analyzes voice conversations. It helps teams capture call notes automatically and search or summarize meetings effortlessly.

## Frequently Asked Questions

### What are the differences in Tool Router MCP and Snowflake MCP?

With a standalone Snowflake MCP server, the agents and LLMs can only access a fixed set of Snowflake tools tied to that server. However, with the Composio Tool Router, agents can dynamically load tools from Snowflake and many other apps based on the task at hand, all through a single MCP endpoint.

### Can I use Tool Router MCP with Pydantic AI?

Yes, you can. Pydantic AI fully supports MCP integration. You get structured tool calling, message history handling, and model orchestration while Tool Router takes care of discovering and serving the right Snowflake tools.

### Can I manage the permissions and scopes for Snowflake while using Tool Router?

Yes, absolutely. You can configure which Snowflake scopes and actions are allowed when connecting your account to Composio. You can also bring your own OAuth credentials or API configuration so you keep full control over what the agent can do.

### How safe is my data with Composio Tool Router?

All sensitive data such as tokens, keys, and configuration is fully encrypted at rest and in transit. Composio is SOC 2 Type 2 compliant and follows strict security practices so your Snowflake data and credentials are handled as safely as possible.

---
[See all toolkits](https://composio.dev/toolkits) · [Composio docs](https://docs.composio.dev/llms.txt)
