# How to integrate Prisma MCP with Pydantic AI

```json
{
  "title": "How to integrate Prisma MCP with Pydantic AI",
  "toolkit": "Prisma",
  "toolkit_slug": "prisma",
  "framework": "Pydantic AI",
  "framework_slug": "pydantic-ai",
  "url": "https://composio.dev/toolkits/prisma/framework/pydantic-ai",
  "markdown_url": "https://composio.dev/toolkits/prisma/framework/pydantic-ai.md",
  "updated_at": "2026-05-12T10:22:42.226Z"
}
```

## Introduction

This guide walks you through connecting Prisma to Pydantic AI using the Composio Tool Router. By the end, you'll have a working Prisma agent that can create a new Postgres database in your project, run a SQL query to list all users, and delete a database connection by name, all through natural-language commands.
In short, you'll learn how to give your Pydantic AI agent real control over a Prisma account through Composio's Prisma MCP server.
Before we dive in, let's take a quick look at the key ideas and tools involved.

## Also integrate Prisma with

- [OpenAI Agents SDK](https://composio.dev/toolkits/prisma/framework/open-ai-agents-sdk)
- [Claude Agent SDK](https://composio.dev/toolkits/prisma/framework/claude-agents-sdk)
- [Claude Code](https://composio.dev/toolkits/prisma/framework/claude-code)
- [Claude Cowork](https://composio.dev/toolkits/prisma/framework/claude-cowork)
- [Codex](https://composio.dev/toolkits/prisma/framework/codex)
- [OpenClaw](https://composio.dev/toolkits/prisma/framework/openclaw)
- [Hermes](https://composio.dev/toolkits/prisma/framework/hermes-agent)
- [CLI](https://composio.dev/toolkits/prisma/framework/cli)
- [Google ADK](https://composio.dev/toolkits/prisma/framework/google-adk)
- [LangChain](https://composio.dev/toolkits/prisma/framework/langchain)
- [Vercel AI SDK](https://composio.dev/toolkits/prisma/framework/ai-sdk)
- [Mastra AI](https://composio.dev/toolkits/prisma/framework/mastra-ai)
- [LlamaIndex](https://composio.dev/toolkits/prisma/framework/llama-index)
- [CrewAI](https://composio.dev/toolkits/prisma/framework/crew-ai)

## TL;DR

Here's what you'll learn:
- How to set up your Composio API key and User ID
- How to create a Composio Tool Router session for Prisma
- How to attach an MCP Server to a Pydantic AI agent
- How to stream responses and maintain chat history
- How to build a simple REPL-style chat interface to test your Prisma workflows

## What is Pydantic AI?

Pydantic AI is a Python framework for building AI agents with strong typing and validation. It leverages Pydantic's data validation capabilities to create robust, type-safe AI applications.
Key features include:
- Type Safety: Built on Pydantic for automatic data validation
- MCP Support: Native support for Model Context Protocol servers
- Streaming: Built-in support for streaming responses
- Async First: Designed for async/await patterns
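To illustrate the type-safety point, here's a minimal sketch of Pydantic's automatic validation and coercion, the machinery Pydantic AI builds on (the `User` model is just an example, not part of this integration):

```python
from pydantic import BaseModel, ValidationError

class User(BaseModel):
    id: int
    email: str

# Values are validated and coerced on construction
user = User(id="42", email="ada@example.com")
print(user.id)  # the string "42" was coerced to the int 42

try:
    User(id="not-a-number", email="bob@example.com")
except ValidationError as exc:
    print(f"Rejected invalid input: {len(exc.errors())} error(s)")
```

The same validation applies to tool arguments and structured outputs when an agent runs.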

## What is the Prisma MCP server, and what's possible with it?

The Prisma MCP server is an implementation of the Model Context Protocol that connects AI agents and assistants (Claude, Cursor, and others) directly to your Prisma account. It provides structured, secure access to your database management, so your agent can perform actions like creating projects, managing databases, executing SQL queries, and handling API keys on your behalf.
- Automated project and database provisioning: Instantly create new Prisma projects and managed PostgreSQL databases in your workspace, complete with connection strings and API keys for fast onboarding.
- On-demand SQL execution and analysis: Have your agent run SQL commands or select queries for reporting, data inspection, or schema changes—without manual intervention.
- API key and connection management: Programmatically generate, rotate, or revoke database API keys, ensuring secure and controlled access for all your applications.
- Workspace and resource monitoring: Retrieve detailed information about your workspaces, projects, and databases, allowing your agent to validate deployments or monitor status in real time.
- Safe resource cleanup and deletion: Direct your agent to delete databases, projects, or specific connections—helping you maintain a tidy, secure, and cost-effective data platform.

## Supported Tools

| Tool slug | Name | Description |
|---|---|---|
| `PRISMA_CREATE_CONNECTION` | Create Database Connection | Create new API key connection for database access. Creates connection string with embedded credentials for application database access. Returns complete connection details ready for immediate use. |
| `PRISMA_CREATE_DATABASE` | Create Project Database | Create new postgres database in an existing Prisma project. Creates database in specified region with connection strings and API keys. Returns complete database details ready for immediate use. |
| `PRISMA_CREATE_PROJECT` | Create Prisma Project | Create new Prisma project with managed postgres database. Creates project in authenticated user's workspace with postgres database in specified region. Returns complete project details including connection strings and API keys. |
| `PRISMA_DELETE_CONNECTION` | Delete Database Connection | Permanently delete database connection and revoke API key access. WARNING: This immediately revokes database access for any applications using this connection string. Ensure no critical systems depend on this connection. |
| `PRISMA_DELETE_DATABASE` | Delete Prisma Database | Permanently delete Prisma database and all stored data. WARNING: This action cannot be undone. All data in the database will be permanently destroyed. Default databases typically cannot be deleted. |
| `PRISMA_DELETE_PROJECT` | Delete Prisma Project | Permanently delete Prisma project and all associated resources. WARNING: This action cannot be undone. All databases, environments, and project data will be permanently destroyed. Use with extreme caution in production environments. |
| `PRISMA_EXECUTE_SQL_COMMAND` | Execute SQL Command | Execute SQL commands that modify database data or structure. Runs INSERT, UPDATE, DELETE, CREATE TABLE, and other data modification commands safely through PostgreSQL driver with parameterized query support. |
| `PRISMA_EXECUTE_SQL_QUERY` | Execute SQL Query | Execute SQL SELECT queries against Prisma Postgres databases. Runs read-only queries safely through direct PostgreSQL connection with SSL. Uses credentials from create_connection action (host, user, pass fields). Perfect for data analysis, schema inspection, and reporting operations. |
| `PRISMA_GET_DATABASE` | Get Prisma Database | Retrieve specific Prisma database by ID. Returns database details including status, project context, and regional deployment. Use for database monitoring, validation, and administrative operations. |
| `PRISMA_GET_DATABASE_USAGE` | Get Database Usage Metrics | Retrieve usage metrics for a specific Prisma database. Returns metrics including storage usage and operation counts (reads/writes) for the specified time period. Use for monitoring resource consumption, cost analysis, and capacity planning. |
| `PRISMA_GET_PROJECT` | Get Prisma Project | Retrieve specific Prisma project by ID. Returns project details including name, creation timestamp, and workspace information. Use for project detail views, validation, and administrative operations. |
| `PRISMA_INSPECT_DATABASE_SCHEMA` | Inspect Database Schema | Inspect database schema structure and table information. Returns comprehensive schema details including tables, columns, data types, constraints, and relationships. Essential for understanding database structure before executing queries. |
| `PRISMA_LIST_ACCELERATE_REGIONS` | List Prisma Accelerate Regions | Retrieve all available regions for Prisma Accelerate. Returns regions where Accelerate global database cache can be deployed. Use for cache region selection to minimize latency for your users. |
| `PRISMA_LIST_BACKUPS` | List Database Backups | Retrieve list of available backups for a specific database. Returns backup details including status, size, type, and restoration readiness. Use for backup monitoring, restoration planning, and compliance auditing. |
| `PRISMA_LIST_CONNECTIONS` | List Database Connections | Retrieve paginated list of connections for a specific database. Returns connection details including names, creation dates, and database context. Use for API key management, security audits, and access control. |
| `PRISMA_LIST_DATABASES` | List Project Databases | Retrieve paginated list of databases for a specific Prisma project. Returns database details including status, region, and project context. Use for database discovery, monitoring, and project administration. |
| `PRISMA_LIST_POSTGRES_REGIONS` | List Prisma Postgres Regions | Retrieve all available regions for Prisma Postgres. Returns regions where Prisma Postgres databases can be deployed with current availability status. Use for region selection during database creation and capacity planning. |
| `PRISMA_LIST_PROJECTS` | List Prisma Projects | Retrieve paginated list of Prisma projects accessible to authenticated user. Returns project IDs, names, workspace info, and timestamps with cursor-based pagination. Use for project discovery, UI selection flows, and administrative operations. |
| `PRISMA_LIST_WORKSPACE_INTEGRATIONS` | List Workspace Integrations | Retrieve paginated list of integrations for a specific Prisma workspace. Returns integration details including OAuth client info, granted scopes, and creator. Use for security audits, integration management, and workspace administration. |
| `PRISMA_LIST_WORKSPACES` | List Prisma Workspaces | Retrieve paginated list of Prisma workspaces accessible to authenticated user. Returns workspace IDs, names, creation timestamps with cursor-based pagination. Use for workspace discovery, UI selection flows, and administrative operations. |
| `PRISMA_RESTORE_BACKUP` | Restore Database Backup | Restore database backup to new database instance. Creates new database from existing backup with specified name. Operation is asynchronous - monitor the returned database status for completion. Restoration may take several minutes. |
| `PRISMA_TRANSFER_PROJECT` | Transfer Prisma Project | Transfer Prisma project ownership to another user's workspace. Transfers project ownership from the current authenticated user to the recipient specified by their OAuth2 access token. This is typically used in partner integrations where databases are provisioned on the partner's workspace and later transferred to end users. The project and all its databases are moved to the recipient's workspace. The current owner loses access unless the new owner explicitly grants it. Requirements: - Valid project ID owned by the current user - Valid OAuth2 access token for the recipient user - Recipient workspace must have sufficient quota for the project |
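Several of the tools above are destructive (`PRISMA_DELETE_*`). If you want an extra guardrail in your own code, one option is to keep a deny-list and filter tool slugs before acting on them. This helper is purely illustrative, not a Composio API; whether the Tool Router can restrict tools server-side is a separate configuration question:

```python
# Destructive tool slugs, taken from the table above
DESTRUCTIVE_TOOLS = {
    "PRISMA_DELETE_CONNECTION",
    "PRISMA_DELETE_DATABASE",
    "PRISMA_DELETE_PROJECT",
}

def filter_safe(slugs):
    """Drop destructive Prisma tool slugs from a list (illustrative only)."""
    return [s for s in slugs if s not in DESTRUCTIVE_TOOLS]

requested = ["PRISMA_LIST_PROJECTS", "PRISMA_DELETE_DATABASE", "PRISMA_EXECUTE_SQL_QUERY"]
print(filter_safe(requested))  # the delete tool is removed
```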

## Supported Triggers

None listed.

## Creating MCP Server - Stand-alone vs Composio SDK

The Prisma MCP server is an implementation of the Model Context Protocol that connects your AI agent to Prisma. It provides structured and secure access so your agent can perform Prisma operations on your behalf through a secure, permission-based interface.
With Composio's managed implementation, you don't have to create your own developer app, which helps you prototype fast and go from zero to one quickly. For production, if you're building an end product, we recommend using your own credentials.

## Step-by-step Guide

### Prerequisites

Before starting, make sure you have:
- Python 3.9 or higher
- A Composio account with an active API key
- Basic familiarity with Python and async programming

### 1. Getting API Keys for OpenAI and Composio

**OpenAI API Key**
- Go to the [OpenAI dashboard](https://platform.openai.com/settings/organization/api-keys) and create an API key. You'll need credits to use the models, or you can connect to another model provider.
- Keep the API key safe.

**Composio API Key**
- Log in to the [Composio dashboard](https://dashboard.composio.dev?utm_source=toolkits&utm_medium=framework_docs).
- Navigate to your API settings and generate a new API key.
- Store this key securely as you'll need it for authentication.

### 2. Install dependencies

Install the required libraries.
What's happening:
- composio connects your agent to external SaaS tools like Prisma
- pydantic-ai lets you create structured AI agents with tool support
- python-dotenv loads your environment variables securely from a .env file
```bash
pip install composio pydantic-ai python-dotenv
```

### 3. Set up environment variables

Create a .env file in your project root.
What's happening:
- COMPOSIO_API_KEY authenticates your agent to Composio's API
- USER_ID associates your session with your account for secure tool access
- OPENAI_API_KEY authenticates your requests to OpenAI's models
```bash
COMPOSIO_API_KEY=your_composio_api_key_here
USER_ID=your_user_id_here
OPENAI_API_KEY=your_openai_api_key
```
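A missing variable otherwise surfaces later as a confusing downstream error, so it can help to fail fast at startup. Here's a small stdlib-only helper you could use; `require_env` is a hypothetical name, not part of composio or pydantic-ai:

```python
import os

def require_env(*names: str) -> list[str]:
    """Return the values of the named environment variables, raising if any are unset."""
    missing = [name for name in names if not os.getenv(name)]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
    return [os.environ[name] for name in names]
```

Your `main()` could then start with `api_key, user_id, openai_key = require_env("COMPOSIO_API_KEY", "USER_ID", "OPENAI_API_KEY")`.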

### 4. Import dependencies

What's happening:
- We load environment variables and import required modules
- Composio manages connections to Prisma
- MCPServerStreamableHTTP connects to the Prisma MCP server endpoint
- Agent from Pydantic AI lets you define and run the AI assistant
```python
import asyncio
import os
from dotenv import load_dotenv
from composio import Composio
from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerStreamableHTTP

load_dotenv()
```

### 5. Create a Tool Router Session

What's happening:
- We're creating a Tool Router session that gives your agent access to Prisma tools
- The create method takes the user ID and specifies which toolkits should be available
- The returned session.mcp.url is the MCP server URL that your agent will use
```python
async def main():
    api_key = os.getenv("COMPOSIO_API_KEY")
    user_id = os.getenv("USER_ID")
    if not api_key or not user_id:
        raise RuntimeError("Set COMPOSIO_API_KEY and USER_ID in your environment")

    # Create a Composio Tool Router session for Prisma
    composio = Composio(api_key=api_key)
    session = composio.create(
        user_id=user_id,
        toolkits=["prisma"],
    )
    url = session.mcp.url
    if not url:
        raise ValueError("Composio session did not return an MCP URL")
```
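Session creation is a network call, so transient failures are possible. If you want resilience, you could wrap it in a generic retry helper like the sketch below; the `attempts` and `backoff` parameters are illustrative, and `fn` stands in for a zero-argument callable such as `lambda: composio.create(user_id=user_id, toolkits=["prisma"])`:

```python
import time

def with_retries(fn, attempts=3, backoff=0.5):
    """Call fn(), retrying on any exception with exponential backoff."""
    for attempt in range(1, attempts + 1):
        try:
            return fn()
        except Exception:
            if attempt == attempts:
                raise  # out of attempts; surface the original error
            time.sleep(backoff * 2 ** (attempt - 1))
```

In production you'd likely narrow the `except Exception` to the specific errors worth retrying.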

### 6. Initialize the Pydantic AI Agent

What's happening:
- The MCP client connects to the Prisma endpoint
- The agent uses GPT-5 to interpret user commands and perform Prisma operations
- The instructions field defines the agent's role and behavior
```python
# Attach the MCP server to a Pydantic AI Agent
prisma_mcp = MCPServerStreamableHTTP(url, headers={"x-api-key": api_key})
agent = Agent(
    "openai:gpt-5",
    toolsets=[prisma_mcp],
    instructions=(
        "You are a Prisma assistant. Use Prisma tools to help users "
        "with their requests. Ask clarifying questions when needed."
    ),
)
```

### 7. Build the chat interface

What's happening:
- The agent reads input from the terminal and streams its response
- Prisma API calls happen automatically under the hood
- The model keeps conversation history to maintain context across turns
```python
# Simple REPL with message history
history = []
print("Chat started! Type 'exit' or 'quit' to end.\n")
print("Try asking the agent to help you with Prisma.\n")

while True:
    user_input = input("You: ").strip()
    if user_input.lower() in {"exit", "quit", "bye"}:
        print("\nGoodbye!")
        break
    if not user_input:
        continue

    print("\nAgent is thinking...\n", flush=True)

    async with agent.run_stream(user_input, message_history=history) as stream_result:
        collected_text = ""
        async for chunk in stream_result.stream_output():
            text_piece = None
            if isinstance(chunk, str):
                text_piece = chunk
            elif hasattr(chunk, "delta") and isinstance(chunk.delta, str):
                text_piece = chunk.delta
            elif hasattr(chunk, "text"):
                text_piece = chunk.text
            if text_piece:
                collected_text += text_piece
    print(f"Agent: {collected_text}\n")
    history = stream_result.all_messages()
```
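The chunk-handling branches in the loop above can be factored into a single helper, which also makes them easy to unit-test with stand-in objects. This refactor is a suggestion, not part of the pydantic-ai API:

```python
def extract_text(chunk):
    """Best-effort text extraction from a streamed chunk, mirroring the loop above."""
    if isinstance(chunk, str):
        return chunk
    delta = getattr(chunk, "delta", None)
    if isinstance(delta, str):
        return delta
    text = getattr(chunk, "text", None)
    if isinstance(text, str):
        return text
    return None
```

The streaming loop then reduces to `if (piece := extract_text(chunk)): collected_text += piece`.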

### 8. Run the application

What's happening:
- The asyncio loop launches the agent and keeps it running until you exit
```python
if __name__ == "__main__":
    asyncio.run(main())
```

## Complete Code

```python
import asyncio
import os
from dotenv import load_dotenv
from composio import Composio
from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerStreamableHTTP

load_dotenv()

async def main():
    api_key = os.getenv("COMPOSIO_API_KEY")
    user_id = os.getenv("USER_ID")
    if not api_key or not user_id:
        raise RuntimeError("Set COMPOSIO_API_KEY and USER_ID in your environment")

    # Create a Composio Tool Router session for Prisma
    composio = Composio(api_key=api_key)
    session = composio.create(
        user_id=user_id,
        toolkits=["prisma"],
    )
    url = session.mcp.url
    if not url:
        raise ValueError("Composio session did not return an MCP URL")

    # Attach the MCP server to a Pydantic AI Agent
    prisma_mcp = MCPServerStreamableHTTP(url, headers={"x-api-key": api_key})
    agent = Agent(
        "openai:gpt-5",
        toolsets=[prisma_mcp],
        instructions=(
            "You are a Prisma assistant. Use Prisma tools to help users "
            "with their requests. Ask clarifying questions when needed."
        ),
    )

    # Simple REPL with message history
    history = []
    print("Chat started! Type 'exit' or 'quit' to end.\n")
    print("Try asking the agent to help you with Prisma.\n")

    while True:
        user_input = input("You: ").strip()
        if user_input.lower() in {"exit", "quit", "bye"}:
            print("\nGoodbye!")
            break
        if not user_input:
            continue

        print("\nAgent is thinking...\n", flush=True)

        async with agent.run_stream(user_input, message_history=history) as stream_result:
            collected_text = ""
            async for chunk in stream_result.stream_output():
                text_piece = None
                if isinstance(chunk, str):
                    text_piece = chunk
                elif hasattr(chunk, "delta") and isinstance(chunk.delta, str):
                    text_piece = chunk.delta
                elif hasattr(chunk, "text"):
                    text_piece = chunk.text
                if text_piece:
                    collected_text += text_piece
        print(f"Agent: {collected_text}\n")
        history = stream_result.all_messages()

if __name__ == "__main__":
    asyncio.run(main())
```

## Conclusion

You've built a Pydantic AI agent that can interact with Prisma through Composio's Tool Router. With this setup, your agent can perform real Prisma actions through natural language.
You can extend this further by:
- Adding other toolkits like Gmail, HubSpot, or Salesforce
- Building a web-based chat interface around this agent
- Using multiple MCP endpoints to enable cross-app workflows (for example, Gmail + Prisma for workflow automation)
This architecture makes your AI agent "agent-native": it can securely use APIs in a unified, composable way without custom integrations.

## How to build Prisma MCP Agent with another framework

- [OpenAI Agents SDK](https://composio.dev/toolkits/prisma/framework/open-ai-agents-sdk)
- [Claude Agent SDK](https://composio.dev/toolkits/prisma/framework/claude-agents-sdk)
- [Claude Code](https://composio.dev/toolkits/prisma/framework/claude-code)
- [Claude Cowork](https://composio.dev/toolkits/prisma/framework/claude-cowork)
- [Codex](https://composio.dev/toolkits/prisma/framework/codex)
- [OpenClaw](https://composio.dev/toolkits/prisma/framework/openclaw)
- [Hermes](https://composio.dev/toolkits/prisma/framework/hermes-agent)
- [CLI](https://composio.dev/toolkits/prisma/framework/cli)
- [Google ADK](https://composio.dev/toolkits/prisma/framework/google-adk)
- [LangChain](https://composio.dev/toolkits/prisma/framework/langchain)
- [Vercel AI SDK](https://composio.dev/toolkits/prisma/framework/ai-sdk)
- [Mastra AI](https://composio.dev/toolkits/prisma/framework/mastra-ai)
- [LlamaIndex](https://composio.dev/toolkits/prisma/framework/llama-index)
- [CrewAI](https://composio.dev/toolkits/prisma/framework/crew-ai)

## Related Toolkits

- [Supabase](https://composio.dev/toolkits/supabase) - Supabase is an open-source backend platform offering scalable Postgres databases, authentication, storage, and real-time APIs. It lets developers build modern apps without managing infrastructure.
- [Codeinterpreter](https://composio.dev/toolkits/codeinterpreter) - Codeinterpreter is a Python-based coding environment with built-in data analysis and visualization. It lets you instantly run scripts, plot results, and prototype solutions inside supported platforms.
- [GitHub](https://composio.dev/toolkits/github) - GitHub is a code hosting platform for version control and collaborative software development. It streamlines project management, code review, and team workflows in one place.
- [Ably](https://composio.dev/toolkits/ably) - Ably is a real-time messaging platform for live chat and data sync in modern apps. It offers global scale and rock-solid reliability for seamless, instant experiences.
- [AbuseIPDB](https://composio.dev/toolkits/abuselpdb) - AbuseIPDB is a central database for reporting and checking IPs linked to malicious online activity. Use it to quickly identify and report suspicious or abusive IP addresses.
- [Alchemy](https://composio.dev/toolkits/alchemy) - Alchemy is a blockchain development platform offering APIs and tools for Ethereum apps. It simplifies building and scaling Web3 projects with robust infrastructure.
- [Algolia](https://composio.dev/toolkits/algolia) - Algolia is a hosted search API that powers lightning-fast, relevant search experiences for web and mobile apps. It helps developers deliver instant, typo-tolerant, and scalable search without complex infrastructure.
- [Anchor browser](https://composio.dev/toolkits/anchor_browser) - Anchor browser is a developer platform for AI-powered web automation. It transforms complex browser actions into easy API endpoints for streamlined web interaction.
- [Apiflash](https://composio.dev/toolkits/apiflash) - Apiflash is a website screenshot API for programmatically capturing web pages. It delivers high-quality screenshots on demand for automation, monitoring, or reporting.
- [Apiverve](https://composio.dev/toolkits/apiverve) - Apiverve delivers a suite of powerful APIs that simplify integration for developers. It's designed for reliability and scalability so you can build faster, smarter applications without the integration headache.
- [Appcircle](https://composio.dev/toolkits/appcircle) - Appcircle is an enterprise-grade mobile CI/CD platform for building, testing, and publishing mobile apps. It streamlines mobile DevOps so teams ship faster and with more confidence.
- [Appdrag](https://composio.dev/toolkits/appdrag) - Appdrag is a cloud platform for building websites, APIs, and databases with drag-and-drop tools and code editing. It accelerates development and iteration by combining hosting, database management, and low-code features in one place.
- [Appveyor](https://composio.dev/toolkits/appveyor) - AppVeyor is a cloud-based continuous integration service for building, testing, and deploying applications. It helps developers automate and streamline their software delivery pipelines.
- [Backendless](https://composio.dev/toolkits/backendless) - Backendless is a backend-as-a-service platform for mobile and web apps, offering database, file storage, user authentication, and APIs. It helps developers ship scalable applications faster without managing server infrastructure.
- [Baserow](https://composio.dev/toolkits/baserow) - Baserow is an open-source no-code database platform for building collaborative data apps. It makes it easy for teams to organize data and automate workflows without writing code.
- [Bench](https://composio.dev/toolkits/bench) - Bench is a benchmarking tool for automated performance measurement and analysis. It helps you quickly evaluate, compare, and track your systems or workflows.
- [Better stack](https://composio.dev/toolkits/better_stack) - Better Stack is a monitoring, logging, and incident management solution for apps and services. It helps teams ensure application reliability and performance with real-time insights.
- [Bitbucket](https://composio.dev/toolkits/bitbucket) - Bitbucket is a Git-based code hosting and collaboration platform for teams. It enables secure repository management and streamlined code reviews.
- [Blazemeter](https://composio.dev/toolkits/blazemeter) - Blazemeter is a continuous testing platform for web and mobile app performance. It empowers teams to automate and analyze large-scale tests with ease.
- [Blocknative](https://composio.dev/toolkits/blocknative) - Blocknative delivers real-time mempool monitoring and transaction management for public blockchains. Instantly track pending transactions and optimize blockchain interactions with live data.

## Frequently Asked Questions

### What are the differences between Tool Router MCP and Prisma MCP?

With a standalone Prisma MCP server, the agents and LLMs can only access a fixed set of Prisma tools tied to that server. However, with the Composio Tool Router, agents can dynamically load tools from Prisma and many other apps based on the task at hand, all through a single MCP endpoint.

### Can I use Tool Router MCP with Pydantic AI?

Yes, you can. Pydantic AI fully supports MCP integration. You get structured tool calling, message history handling, and model orchestration while Tool Router takes care of discovering and serving the right Prisma tools.

### Can I manage the permissions and scopes for Prisma while using Tool Router?

Yes, absolutely. You can configure which Prisma scopes and actions are allowed when connecting your account to Composio. You can also bring your own OAuth credentials or API configuration so you keep full control over what the agent can do.

### How safe is my data with Composio Tool Router?

All sensitive data such as tokens, keys, and configuration is fully encrypted at rest and in transit. Composio is SOC 2 Type 2 compliant and follows strict security practices so your Prisma data and credentials are handled as safely as possible.

---
[See all toolkits](https://composio.dev/toolkits) · [Composio docs](https://docs.composio.dev/llms.txt)
