# How to integrate Amplitude MCP with Pydantic AI

```json
{
  "title": "How to integrate Amplitude MCP with Pydantic AI",
  "toolkit": "Amplitude",
  "toolkit_slug": "amplitude",
  "framework": "Pydantic AI",
  "framework_slug": "pydantic-ai",
  "url": "https://composio.dev/toolkits/amplitude/framework/pydantic-ai",
  "markdown_url": "https://composio.dev/toolkits/amplitude/framework/pydantic-ai.md",
  "updated_at": "2026-05-12T10:01:10.106Z"
}
```

## Introduction

This guide walks you through connecting Amplitude to Pydantic AI using the Composio Tool Router. By the end, you'll have a working Amplitude agent that can fetch daily active users for the last month, generate a funnel analysis for your onboarding flow, and list top events for premium users, all through natural language commands.
You'll learn how to give your Pydantic AI agent real control over an Amplitude account through Composio's Amplitude MCP server.
Before we dive in, let's take a quick look at the key ideas and tools involved.

## Also integrate Amplitude with

- [OpenAI Agents SDK](https://composio.dev/toolkits/amplitude/framework/open-ai-agents-sdk)
- [Claude Agent SDK](https://composio.dev/toolkits/amplitude/framework/claude-agents-sdk)
- [Claude Code](https://composio.dev/toolkits/amplitude/framework/claude-code)
- [Claude Cowork](https://composio.dev/toolkits/amplitude/framework/claude-cowork)
- [Codex](https://composio.dev/toolkits/amplitude/framework/codex)
- [OpenClaw](https://composio.dev/toolkits/amplitude/framework/openclaw)
- [Hermes](https://composio.dev/toolkits/amplitude/framework/hermes-agent)
- [CLI](https://composio.dev/toolkits/amplitude/framework/cli)
- [Google ADK](https://composio.dev/toolkits/amplitude/framework/google-adk)
- [LangChain](https://composio.dev/toolkits/amplitude/framework/langchain)
- [Vercel AI SDK](https://composio.dev/toolkits/amplitude/framework/ai-sdk)
- [Mastra AI](https://composio.dev/toolkits/amplitude/framework/mastra-ai)
- [LlamaIndex](https://composio.dev/toolkits/amplitude/framework/llama-index)
- [CrewAI](https://composio.dev/toolkits/amplitude/framework/crew-ai)

## TL;DR

Here's what you'll learn:
- How to set up your Composio API key and User ID
- How to create a Composio Tool Router session for Amplitude
- How to attach an MCP Server to a Pydantic AI agent
- How to stream responses and maintain chat history
- How to build a simple REPL-style chat interface to test your Amplitude workflows

## What is Pydantic AI?

Pydantic AI is a Python framework for building AI agents with strong typing and validation. It leverages Pydantic's data validation capabilities to create robust, type-safe AI applications.
Key features include:
- Type Safety: Built on Pydantic for automatic data validation
- MCP Support: Native support for Model Context Protocol servers
- Streaming: Built-in support for streaming responses
- Async First: Designed for async/await patterns

## What is the Amplitude MCP server, and what's possible with it?

The Amplitude MCP server is an implementation of the Model Context Protocol that connects your AI agents and assistants (like Claude or Cursor) directly to your Amplitude account. It provides structured, secure access to your analytics platform, so your agent can perform actions like managing event types, organizing cohorts, updating user properties, and tracking event categories on your behalf.
- Cohort and user management: Ask your agent to request, download, and check the status of specific user cohorts for advanced segmentation or analysis.
- Event type and category administration: Effortlessly create, update, or delete event types and categories, keeping your analytics taxonomy organized and up to date.
- User property updates: Direct your agent to set or modify user properties—like device information or location—without sending new events, making user profile management a breeze.
- Comprehensive analytics lookup: Retrieve detailed information about event types and categories, enabling your agent to provide insights or answer analytics questions in real time.
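For orientation, read-only tools such as `AMPLITUDE_GET_ACTIVE_USERS` map onto Amplitude's Dashboard REST API. As a rough sketch (the `/api/2/users` endpoint and its `start`/`end`/`m`/`i` parameters come from Amplitude's documented Dashboard API, not from Composio internals), the request such a tool issues can be built like this:

```python
from urllib.parse import urlencode

# Amplitude Dashboard REST API base for active/new user counts
BASE_URL = "https://amplitude.com/api/2/users"

def active_users_query(start: str, end: str, metric: str = "active", interval: int = 1) -> str:
    """Build the full request URL for an active/new users query.

    start/end are YYYYMMDD dates; metric is "active" or "new";
    interval is 1 (daily), 7 (weekly), or 30 (monthly).
    """
    if metric not in {"active", "new"}:
        raise ValueError("metric must be 'active' or 'new'")
    if interval not in {1, 7, 30}:
        raise ValueError("interval must be 1, 7, or 30")
    return f"{BASE_URL}?" + urlencode({"start": start, "end": end, "m": metric, "i": interval})
```

In practice your agent calls the Composio tool rather than the API directly; this sketch only illustrates the kind of parameters these tools accept.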

## Supported Tools

| Tool slug | Name | Description |
|---|---|---|
| `AMPLITUDE_ASSIGN_ANNOTATIONS_TO_CATEGORY` | Bulk Assign Annotations to Category | Tool to bulk assign multiple annotations to a category in Amplitude. Use when you need to organize annotations by assigning them to a specific category. |
| `AMPLITUDE_CANCEL_DELETION` | Cancel User Deletion | Cancel a pending user deletion request in Amplitude. Use this to remove a user from a scheduled deletion job before it completes. Only works on deletion jobs in 'Staging' status (not yet submitted). |
| `AMPLITUDE_CHECK_COHORT_STATUS` | Check Amplitude Cohort Status | Check the status of a cohort export request. This action allows you to: - Poll the status of an in-progress cohort download request - Determine if a cohort is ready for download |
| `AMPLITUDE_CREATE_ANNOTATION` | Create Chart Annotation in Amplitude | Create a chart annotation in Amplitude to mark important dates. Use to highlight key events like feature releases, marketing campaigns, or product updates on analytics charts. |
| `AMPLITUDE_CREATE_ANNOTATION_CATEGORY` | Create Annotation Category | Tool to create an annotation category in Amplitude to organize annotations. Use when you need to create a new category for grouping related annotations. |
| `AMPLITUDE_CREATE_EVENT_CATEGORY` | Create Amplitude Event Category | Create a new event category in Amplitude. This action allows you to: - Create a new event category to organize event types - Validate category name before creation Key features: - Creates event categories for organizing events - Returns success/failure status |
| `AMPLITUDE_CREATE_EVENT_TYPE` | Create Amplitude Event Type | Create a new event type in Amplitude. This action allows you to: - Define a new event type with various properties - Associate the event with a category - Add metadata like description, tags, and owner Key features: - Creates trackable events in your Amplitude project - Supports full event type configuration |
| `AMPLITUDE_CREATE_RELEASE` | Create Amplitude Release | Create a release to document product changes. Use when you want to track app version releases and their impact on metrics. Can be integrated into deployment workflows to automatically log releases in Amplitude. |
| `AMPLITUDE_DELETE_ANNOTATION` | Delete Amplitude Chart Annotation | Delete a chart annotation from Amplitude. Use to remove existing annotations from charts. |
| `AMPLITUDE_DELETE_ANNOTATION_CATEGORY` | Delete Amplitude Annotation Category | Delete an annotation category from Amplitude. Use when you need to remove an annotation category that is no longer needed. |
| `AMPLITUDE_DELETE_EVENT_CATEGORY` | Delete Amplitude Event Category | Delete an event category from Amplitude. This action allows you to: - Delete an existing event category - Remove category organization from events Key features: - Permanently removes event categories - Returns success/failure status |
| `AMPLITUDE_DELETE_EVENT_TYPE` | Delete Amplitude Event Type | Delete an event type from Amplitude. This action allows you to: - Remove an event type from your project - Mark live events as deleted - Remove planned events from the tracking plan Key features: - Different behavior based on event status (live, planned, etc.) - Returns success/failure status |
| `AMPLITUDE_DELETE_USERS` | Delete Amplitude Users | Submit user deletion requests for GDPR/CCPA compliance. Supports up to 100 users per request. Use when you need to delete user data from Amplitude in compliance with privacy regulations. Either amplitude_ids or user_ids must be provided. |
| `AMPLITUDE_DOWNLOAD_COHORT_FILE` | Download Amplitude Cohort File | Download the cohort file after the request is complete. Use this action after checking that the cohort status is 'JOB COMPLETED'. The download link is valid for 7 days, but the S3 link is valid for only 1 minute. |
| `AMPLITUDE_FIND_USER` | Search Amplitude User | Search for users in Amplitude by canonical identifier (Amplitude ID, device ID, user ID, or user ID prefix). Use this to find matching Amplitude IDs for deterministic user mapping. Important: This searches only canonical identifiers (Amplitude ID, device_id, user_id), NOT arbitrary user properties like email unless email is your actual user_id. |
| `AMPLITUDE_GET_ACTIVE_USERS` | Get Active or New Users | Get the number of active or new users for a date range with optional segmentation. Use when you need user count metrics aggregated by day, week, or month, optionally grouped by user properties. |
| `AMPLITUDE_GET_ANNOTATION` | Get Amplitude Annotation | Get a single chart annotation by ID from Amplitude. Use when you need to retrieve detailed information about a specific annotation including its label, timestamps, category, and associated chart. |
| `AMPLITUDE_GET_ANNOTATION_CATEGORY` | Get Amplitude Annotation Category | Get a single annotation category by ID from Amplitude. Use when you need to retrieve details about a specific annotation category. |
| `AMPLITUDE_GET_COHORT` | Request Amplitude Cohort | Get a single cohort by ID and initiate download. This action allows you to: - Request a specific cohort from Amplitude - Optionally include user properties in the response - Start the asynchronous download process Key features: - Supports filtering by specific user properties - Returns a request ID used for polling status and downloading - Supports EU data residency |
| `AMPLITUDE_GET_DELETION_REQUESTS` | Get User Deletion Requests | Get the status of user deletion requests within a date range. Use this to track GDPR/CCPA deletion compliance and monitor deletion job progress. |
| `AMPLITUDE_GET_EVENT_CATEGORIES` | Get Amplitude Event Categories | Get event categories from Amplitude. This action allows you to: - Get all event categories in your project - Get a specific category by name |
| `AMPLITUDE_GET_EVENT_PROPERTY` | Get Amplitude Event Property | Get a specific event property from Amplitude taxonomy. Use when you need to retrieve details about a specific event property including its type, validation rules, and metadata. |
| `AMPLITUDE_GET_EVENT_SEGMENTATION` | Get Event Segmentation Data | Get event segmentation data from Amplitude Analytics API. Use this to analyze event metrics over time with optional grouping by properties. Supports multiple metrics (uniques, totals, percentage of DAU, averages) and time intervals (realtime, hourly, daily, weekly). |
| `AMPLITUDE_GET_EVENT_TYPE` | Get Amplitude Event Type | Get a specific event type from Amplitude by name. This action allows you to: - Retrieve detailed information about a single event type - Get all properties and metadata for the event Key features: - Retrieves comprehensive event type details - Returns metadata like category, description, and settings - Supports lookup by exact event name |
| `AMPLITUDE_GET_EVENT_TYPES` | Get Amplitude Event Types | Get all event types from Amplitude. This action allows you to: - Retrieve all event types in your project - Optionally include deleted events |
| `AMPLITUDE_GET_FUNNEL_DATA` | Get Funnel Analysis Data | Get funnel analysis data showing user conversion through a sequence of events. Use this to analyze user drop-off rates and conversion times across multiple steps in a user journey. |
| `AMPLITUDE_GET_REALTIME_ACTIVE_USERS` | Get Real-time Active Users | Get real-time active users count from Amplitude. Returns active user counts with 5-minute granularity (configurable) for today and yesterday. Use this to monitor current user activity and compare with historical data. |
| `AMPLITUDE_GET_RETENTION` | Get User Retention Analysis | Get user retention analysis showing how users return over time after a starting action. Use when analyzing user engagement patterns, measuring feature stickiness, or understanding long-term user behavior across cohorts. |
| `AMPLITUDE_GET_REVENUE_LTV` | Get Revenue LTV Metrics | Get revenue lifetime value (LTV) metrics including ARPU, ARPPU, and total revenue. Use when you need to analyze revenue trends over time for user cohorts. |
| `AMPLITUDE_GET_SESSION_AVERAGE` | Get Session Average Length | Get average session length (in seconds) for a specified date range from Amplitude. Use when you need to analyze user engagement patterns and session duration trends over time. |
| `AMPLITUDE_GET_SESSION_LENGTH` | Get Session Length Distribution | Tool to retrieve session length distribution data for a specified date range from Amplitude. Use when you need to analyze how long users' sessions typically last or visualize session duration patterns across time buckets. |
| `AMPLITUDE_GET_SESSIONS_PER_USER` | Get Sessions Per User from Amplitude | Tool to get average number of sessions per user for each day in a date range from Amplitude. Use when analyzing user engagement patterns or session frequency over time. |
| `AMPLITUDE_GET_USER_ACTIVITY` | Get User Activity from Amplitude | Fetch a single user's profile summary and event stream by Amplitude ID. Use when you need to extract attribution data (UTM parameters, referrers) from early events or user properties, or when analyzing user behavior patterns. |
| `AMPLITUDE_GET_USER_COMPOSITION` | Get User Composition by Property | Tool to get user composition breakdown by property (platform, version, country, etc.). Use when analyzing user distribution across property values during a date range. |
| `AMPLITUDE_GET_USER_MAPPINGS` | Get User Mappings | Get the list of user mappings for provided user IDs. Use when you need to retrieve aliasing relationships between user identifiers in Amplitude. Returns mapping data showing which users map into and out of the requested user IDs. |
| `AMPLITUDE_GET_USER_PROPERTY` | Get Amplitude User Property | Get a specific user property from Amplitude taxonomy. Use when you need to retrieve details about a specific user property including its type, validation rules, and classifications. |
| `AMPLITUDE_IDENTIFY` | Update User Properties in Amplitude | Update user properties using Amplitude's Identify API. This action allows you to: - Set or update the User ID for a Device ID - Update user properties without sending an event - Perform operations on user properties (set, append, etc.) - Update user attributes like device info and location |
| `AMPLITUDE_LIST_ANNOTATION_CATEGORIES` | List Amplitude Annotation Categories | List all annotation categories from Amplitude. Use to retrieve available categories for chart annotations. |
| `AMPLITUDE_LIST_ANNOTATIONS` | List Chart Annotations | Tool to get all chart annotations with optional filtering by category, chart, and date range. Use when you need to retrieve annotations that mark important events or milestones on Amplitude charts. |
| `AMPLITUDE_LIST_COHORTS` | List Amplitude Cohorts | List all discoverable cohorts for an Amplitude project. This action allows you to: - Get a list of all cohorts in your Amplitude project - Optionally include sync information for each cohort Key features: - Returns cohort details including ID, name, size, and definition - Optionally includes sync metadata for integration with other tools - Supports EU data residency. An empty result may indicate insufficient permissions to view cohorts rather than an absence of cohorts in the project. |
| `AMPLITUDE_LIST_EVENT_PROPERTIES` | List Amplitude Event Properties | Get all event properties from Amplitude, optionally filtered by event type or property name. Use when you need to retrieve property definitions, data types, or validation rules for events. |
| `AMPLITUDE_LIST_EVENTS` | List Amplitude Events | Tool to get a list of all event types in your Amplitude project with current week's statistics. Use when you need to see all events and their recent activity metrics including totals, uniques, and DAU percentages. |
| `AMPLITUDE_LIST_USER_PROPERTIES` | List Amplitude User Properties | Tool to get all user properties in your Amplitude project. Use when you need to retrieve the complete list of user properties including both default and custom properties. |
| `AMPLITUDE_MAP_USER` | Map Users in Amplitude | Map users with different user IDs together (alias/merge users) in Amplitude. Use this to merge user identities across different identifiers or unmap previously merged users. Supports up to 2000 mappings per request with 1MB size limit. |
| `AMPLITUDE_RESTORE_EVENT_TYPE` | Restore Amplitude Event Type | Restore a deleted event type in Amplitude. This action allows you to: - Restore a previously deleted event type - Make the event available again in the UI and API Key features: - Undoes the deletion of an event type - Returns success/failure status |
| `AMPLITUDE_SEND_EVENTS` | Send Events to Amplitude | Send events to Amplitude using the HTTP V2 API. This action allows you to send events to Amplitude for tracking user behavior and analytics. It supports all Amplitude event fields, handles proper validation, and includes comprehensive error handling. |
| `AMPLITUDE_SET_GROUP_PROPERTIES` | Set Group Properties in Amplitude | Set group properties for account-level reporting without sending an event. Use this action to update group attributes like company name, industry, or plan type. Requires Enterprise plan with Accounts add-on. |
| `AMPLITUDE_UPDATE_ANNOTATION` | Update Amplitude Chart Annotation | Tool to update an existing chart annotation in Amplitude. Use when you need to modify annotation properties such as label, timestamps, category, or chart association. Supports partial updates - only include fields you want to change. |
| `AMPLITUDE_UPDATE_ANNOTATION_CATEGORY` | Update Amplitude Annotation Category | Tool to update an annotation category in Amplitude. Use when you need to rename or modify an existing annotation category for organizing chart annotations. |
| `AMPLITUDE_UPDATE_COHORT_MEMBERSHIP` | Update Amplitude Cohort Membership | Incrementally update cohort membership by adding or removing IDs. This action allows you to: - Add new IDs to an existing cohort - Remove IDs from an existing cohort - Perform multiple operations in a single request |
| `AMPLITUDE_UPDATE_EVENT_CATEGORY` | Update Amplitude Event Category | Update an existing event category in Amplitude. This action allows you to: - Update the name of an existing event category - Validate the new category name Key features: - Updates category names - Returns success/failure status |
| `AMPLITUDE_UPDATE_EVENT_TYPE` | Update Amplitude Event Type | Update an existing event type in Amplitude. This action allows you to: - Change event type properties - Update event name, category, metadata, and settings - Modify display name for ingested events Key features: - Updates event type configuration - Supports partial updates (only specified fields are changed) |
| `AMPLITUDE_UPLOAD_BATCH_EVENTS` | Batch Upload Events to Amplitude | Bulk upload events to Amplitude using the Batch Event Upload API. Supports larger payloads (20MB) and higher throttling limits than HTTP V2 API. Use when you need to send large batches of events efficiently. |
| `AMPLITUDE_UPLOAD_COHORT` | Upload Amplitude Cohort | Generate a new cohort or update an existing cohort by uploading user IDs or Amplitude IDs. Use when you need to create cohorts from a specific list of users. |

## Supported Triggers

None listed.

## Creating MCP Server - Stand-alone vs Composio SDK

The Amplitude MCP server implements the Model Context Protocol to connect your AI agent to Amplitude, giving it structured access to perform Amplitude operations on your behalf through a secure, permission-based interface.
With Composio's managed implementation, you don't have to create your own developer app, which makes it easy to prototype and go from zero to one quickly. If you're building an end product, we recommend using your own credentials in production.

## Step-by-step Guide

### Prerequisites

Before starting, make sure you have:
- Python 3.9 or higher
- A Composio account with an active API key
- Basic familiarity with Python and async programming

### 1. Getting API Keys for OpenAI and Composio

**OpenAI API Key**
- Go to the [OpenAI dashboard](https://platform.openai.com/settings/organization/api-keys) and create an API key. You'll need credits to use the models, or you can connect to another model provider.
- Keep the API key safe.

**Composio API Key**
- Log in to the [Composio dashboard](https://dashboard.composio.dev?utm_source=toolkits&utm_medium=framework_docs).
- Navigate to your API settings and generate a new API key.
- Store this key securely as you'll need it for authentication.

### 2. Install dependencies

Install the required libraries.
What's happening:
- composio connects your agent to external SaaS tools like Amplitude
- pydantic-ai lets you create structured AI agents with tool support
- python-dotenv loads your environment variables securely from a .env file
```bash
pip install composio pydantic-ai python-dotenv
```

### 3. Set up environment variables

Create a .env file in your project root.
What's happening:
- COMPOSIO_API_KEY authenticates your agent to Composio's API
- USER_ID associates your session with your account for secure tool access
- OPENAI_API_KEY authenticates requests to OpenAI models
```bash
COMPOSIO_API_KEY=your_composio_api_key_here
USER_ID=your_user_id_here
OPENAI_API_KEY=your_openai_api_key
```
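As a convenience (not part of the guide's required code), a small helper can fail fast when any of these variables is missing, which surfaces configuration errors at startup rather than mid-run:

```python
import os

def require_env(*names: str) -> dict:
    """Return the named environment variables, raising early if any are unset."""
    missing = [name for name in names if not os.getenv(name)]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
    return {name: os.environ[name] for name in names}

# Example: require_env("COMPOSIO_API_KEY", "USER_ID", "OPENAI_API_KEY")
```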

### 4. Import dependencies

What's happening:
- We load environment variables and import required modules
- Composio manages connections to Amplitude
- MCPServerStreamableHTTP connects to the Amplitude MCP server endpoint
- Agent from Pydantic AI lets you define and run the AI assistant
```python
import asyncio
import os
from dotenv import load_dotenv
from composio import Composio
from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerStreamableHTTP

load_dotenv()
```

### 5. Create a Tool Router Session

What's happening:
- We're creating a Tool Router session that gives your agent access to Amplitude tools
- The create method takes the user ID and specifies which toolkits should be available
- The returned session.mcp.url is the MCP server URL that your agent will use
```python
async def main():
    api_key = os.getenv("COMPOSIO_API_KEY")
    user_id = os.getenv("USER_ID")
    if not api_key or not user_id:
        raise RuntimeError("Set COMPOSIO_API_KEY and USER_ID in your environment")

    # Create a Composio Tool Router session for Amplitude
    composio = Composio(api_key=api_key)
    session = composio.create(
        user_id=user_id,
        toolkits=["amplitude"],
    )
    url = session.mcp.url
    if not url:
        raise ValueError("Composio session did not return an MCP URL")
```

### 6. Initialize the Pydantic AI Agent

What's happening:
- The MCP client connects to the Amplitude endpoint
- The agent uses GPT-5 to interpret user commands and perform Amplitude operations
- The instructions field defines the agent's role and behavior
```python
# Attach the MCP server to a Pydantic AI Agent
amplitude_mcp = MCPServerStreamableHTTP(url, headers={"x-api-key": api_key})
agent = Agent(
    "openai:gpt-5",
    toolsets=[amplitude_mcp],
    instructions=(
        "You are an Amplitude assistant. Use Amplitude tools to help users "
        "with their requests. Ask clarifying questions when needed."
    ),
)
```

### 7. Build the chat interface

What's happening:
- The agent reads input from the terminal and streams its response
- Amplitude API calls happen automatically under the hood
- The model keeps conversation history to maintain context across turns
```python
# Simple REPL with message history
history = []
print("Chat started! Type 'exit' or 'quit' to end.\n")
print("Try asking the agent to help you with Amplitude.\n")

while True:
    user_input = input("You: ").strip()
    if user_input.lower() in {"exit", "quit", "bye"}:
        print("\nGoodbye!")
        break
    if not user_input:
        continue

    print("\nAgent is thinking...\n", flush=True)

    async with agent.run_stream(user_input, message_history=history) as stream_result:
        collected_text = ""
        async for chunk in stream_result.stream_output():
            text_piece = None
            if isinstance(chunk, str):
                text_piece = chunk
            elif hasattr(chunk, "delta") and isinstance(chunk.delta, str):
                text_piece = chunk.delta
            elif hasattr(chunk, "text"):
                text_piece = chunk.text
            if text_piece:
                collected_text += text_piece

    print(f"Agent: {collected_text}\n")
    history = stream_result.all_messages()
```
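The chunk-handling branch inside the loop is easy to factor into a pure helper, shown here as a sketch (`extract_text` is not a Pydantic AI API, just a local function), which also makes the defensive logic unit-testable:

```python
from types import SimpleNamespace
from typing import Optional

def extract_text(chunk) -> Optional[str]:
    """Pull text out of a streamed chunk, whatever shape it arrives in.

    Handles plain strings, objects with a string .delta attribute, and
    objects with a string .text attribute, mirroring the REPL above.
    """
    if isinstance(chunk, str):
        return chunk
    delta = getattr(chunk, "delta", None)
    if isinstance(delta, str):
        return delta
    text = getattr(chunk, "text", None)
    if isinstance(text, str):
        return text
    return None

# Examples of the shapes it handles:
assert extract_text("hello") == "hello"
assert extract_text(SimpleNamespace(delta="world")) == "world"
```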

### 8. Run the application

What's happening:
- The asyncio loop launches the agent and keeps it running until you exit
```python
if __name__ == "__main__":
    asyncio.run(main())
```
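One optional refinement, shown as a sketch rather than part of the guide's code: wrapping `asyncio.run` so that Ctrl+C ends the session with a clean message instead of a traceback.

```python
import asyncio

def run_app(coro_factory) -> None:
    """Run the agent's entry coroutine, exiting cleanly on Ctrl+C."""
    try:
        asyncio.run(coro_factory())
    except KeyboardInterrupt:
        print("\nInterrupted. Goodbye!")

# Usage: replace `asyncio.run(main())` with `run_app(main)`.
```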

## Complete Code

```python
import asyncio
import os
from dotenv import load_dotenv
from composio import Composio
from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerStreamableHTTP

load_dotenv()

async def main():
    api_key = os.getenv("COMPOSIO_API_KEY")
    user_id = os.getenv("USER_ID")
    if not api_key or not user_id:
        raise RuntimeError("Set COMPOSIO_API_KEY and USER_ID in your environment")

    # Create a Composio Tool Router session for Amplitude
    composio = Composio(api_key=api_key)
    session = composio.create(
        user_id=user_id,
        toolkits=["amplitude"],
    )
    url = session.mcp.url
    if not url:
        raise ValueError("Composio session did not return an MCP URL")

    # Attach the MCP server to a Pydantic AI Agent
    amplitude_mcp = MCPServerStreamableHTTP(url, headers={"x-api-key": api_key})
    agent = Agent(
        "openai:gpt-5",
        toolsets=[amplitude_mcp],
        instructions=(
            "You are an Amplitude assistant. Use Amplitude tools to help users "
            "with their requests. Ask clarifying questions when needed."
        ),
    )

    # Simple REPL with message history
    history = []
    print("Chat started! Type 'exit' or 'quit' to end.\n")
    print("Try asking the agent to help you with Amplitude.\n")

    while True:
        user_input = input("You: ").strip()
        if user_input.lower() in {"exit", "quit", "bye"}:
            print("\nGoodbye!")
            break
        if not user_input:
            continue

        print("\nAgent is thinking...\n", flush=True)

        async with agent.run_stream(user_input, message_history=history) as stream_result:
            collected_text = ""
            async for chunk in stream_result.stream_output():
                text_piece = None
                if isinstance(chunk, str):
                    text_piece = chunk
                elif hasattr(chunk, "delta") and isinstance(chunk.delta, str):
                    text_piece = chunk.delta
                elif hasattr(chunk, "text"):
                    text_piece = chunk.text
                if text_piece:
                    collected_text += text_piece

        print(f"Agent: {collected_text}\n")
        history = stream_result.all_messages()

if __name__ == "__main__":
    asyncio.run(main())
```

## Conclusion

You've built a Pydantic AI agent that can interact with Amplitude through Composio's Tool Router. With this setup, your agent can perform real Amplitude actions through natural language.
You can extend this further by:
- Adding other toolkits like Gmail, HubSpot, or Salesforce
- Building a web-based chat interface around this agent
- Using multiple MCP endpoints to enable cross-app workflows (for example, Gmail + Amplitude for workflow automation)
This architecture makes your AI agent "agent-native": it can securely use APIs in a unified, composable way without custom integrations.

## How to build Amplitude MCP Agent with another framework

- [OpenAI Agents SDK](https://composio.dev/toolkits/amplitude/framework/open-ai-agents-sdk)
- [Claude Agent SDK](https://composio.dev/toolkits/amplitude/framework/claude-agents-sdk)
- [Claude Code](https://composio.dev/toolkits/amplitude/framework/claude-code)
- [Claude Cowork](https://composio.dev/toolkits/amplitude/framework/claude-cowork)
- [Codex](https://composio.dev/toolkits/amplitude/framework/codex)
- [OpenClaw](https://composio.dev/toolkits/amplitude/framework/openclaw)
- [Hermes](https://composio.dev/toolkits/amplitude/framework/hermes-agent)
- [CLI](https://composio.dev/toolkits/amplitude/framework/cli)
- [Google ADK](https://composio.dev/toolkits/amplitude/framework/google-adk)
- [LangChain](https://composio.dev/toolkits/amplitude/framework/langchain)
- [Vercel AI SDK](https://composio.dev/toolkits/amplitude/framework/ai-sdk)
- [Mastra AI](https://composio.dev/toolkits/amplitude/framework/mastra-ai)
- [LlamaIndex](https://composio.dev/toolkits/amplitude/framework/llama-index)
- [CrewAI](https://composio.dev/toolkits/amplitude/framework/crew-ai)

## Related Toolkits

- [Firecrawl](https://composio.dev/toolkits/firecrawl) - Firecrawl automates large-scale web crawling and data extraction. It helps organizations efficiently gather, index, and analyze content from online sources.
- [Tavily](https://composio.dev/toolkits/tavily) - Tavily offers powerful search and data retrieval from documents, databases, and the web. It helps teams locate and filter information instantly, saving hours on research.
- [Exa](https://composio.dev/toolkits/exa) - Exa is a data extraction and search platform for gathering and analyzing information from websites, APIs, or databases. It helps teams quickly surface insights and automate data-driven workflows.
- [Serpapi](https://composio.dev/toolkits/serpapi) - SerpApi is a real-time API for structured search engine results. It lets you automate SERP data collection, parsing, and analysis for SEO and research.
- [Peopledatalabs](https://composio.dev/toolkits/peopledatalabs) - Peopledatalabs delivers B2B data enrichment and identity resolution APIs. Supercharge your apps with accurate, up-to-date business and contact data.
- [Snowflake](https://composio.dev/toolkits/snowflake) - Snowflake is a cloud data warehouse built for elastic scaling, secure data sharing, and fast SQL analytics across major clouds.
- [Posthog](https://composio.dev/toolkits/posthog) - PostHog is an open-source analytics platform for tracking user interactions and product metrics. It helps teams refine features, analyze funnels, and reduce churn with actionable insights.
- [Bright Data MCP](https://composio.dev/toolkits/brightdata_mcp) - Bright Data MCP is an AI-powered web scraping and data collection platform. Instantly access public web data in real time with advanced scraping tools.
- [Browseai](https://composio.dev/toolkits/browseai) - Browseai is a web automation and data extraction platform that turns any website into an API. It's perfect for monitoring websites and retrieving structured data without manual scraping.
- [ClickHouse](https://composio.dev/toolkits/clickhouse) - ClickHouse is an open-source, column-oriented database for real-time analytics and big data processing using SQL. Its lightning-fast query performance makes it ideal for handling large datasets and delivering instant insights.
- [Coinmarketcal](https://composio.dev/toolkits/coinmarketcal) - CoinMarketCal is a community-powered crypto calendar for upcoming events, announcements, and releases. It helps traders track market-moving developments and stay ahead in the crypto space.
- [Control d](https://composio.dev/toolkits/control_d) - Control d is a customizable DNS filtering and traffic redirection platform. It helps you manage internet access, enforce policies, and monitor usage across devices and networks.
- [Databox](https://composio.dev/toolkits/databox) - Databox is a business analytics platform that connects your data from any tool and device. It helps you track KPIs, build dashboards, and discover actionable insights.
- [Databricks](https://composio.dev/toolkits/databricks) - Databricks is a unified analytics platform for big data and AI on the lakehouse architecture. It empowers data teams to collaborate, analyze, and build scalable solutions efficiently.
- [Datagma](https://composio.dev/toolkits/datagma) - Datagma delivers data intelligence and analytics for business growth and market discovery. Get actionable market insights and track competitors to inform your strategy.
- [Delighted](https://composio.dev/toolkits/delighted) - Delighted is a customer feedback platform based on the Net Promoter System®. It helps you quickly gather, track, and act on customer sentiment.
- [Dovetail](https://composio.dev/toolkits/dovetail) - Dovetail is a research analysis platform for transcript review and insight generation. It helps teams code interviews, analyze feedback, and create actionable research summaries.
- [Dub](https://composio.dev/toolkits/dub) - Dub is a short link management platform with analytics and API access. Use it to easily create, manage, and track branded short links for your business.
- [Elasticsearch](https://composio.dev/toolkits/elasticsearch) - Elasticsearch is a distributed, RESTful search and analytics engine for all types of data. It delivers fast, scalable search and powerful analytics across massive datasets.
- [Fireflies](https://composio.dev/toolkits/fireflies) - Fireflies.ai is an AI-powered meeting assistant that records, transcribes, and analyzes voice conversations. It helps teams capture call notes automatically and search or summarize meetings effortlessly.

## Frequently Asked Questions

### What are the differences between the Tool Router MCP and a standalone Amplitude MCP server?

With a standalone Amplitude MCP server, agents and LLMs can only access a fixed set of Amplitude tools tied to that server. With the Composio Tool Router, agents dynamically load tools from Amplitude and many other apps based on the task at hand, all through a single MCP endpoint.

### Can I use Tool Router MCP with Pydantic AI?

Yes, you can. Pydantic AI fully supports MCP integration. You get structured tool calling, message history handling, and model orchestration while Tool Router takes care of discovering and serving the right Amplitude tools.

### Can I manage the permissions and scopes for Amplitude while using Tool Router?

Yes, absolutely. You can configure which Amplitude scopes and actions are allowed when connecting your account to Composio. You can also bring your own OAuth credentials or API configuration so you keep full control over what the agent can do.

### How safe is my data with Composio Tool Router?

All sensitive data such as tokens, keys, and configuration is fully encrypted at rest and in transit. Composio is SOC 2 Type 2 compliant and follows strict security practices so your Amplitude data and credentials are handled as safely as possible.

---
[See all toolkits](https://composio.dev/toolkits) · [Composio docs](https://docs.composio.dev/llms.txt)
