# How to integrate Composio search MCP with Pydantic AI

```json
{
  "title": "How to integrate Composio search MCP with Pydantic AI",
  "toolkit": "Composio search",
  "toolkit_slug": "composio_search",
  "framework": "Pydantic AI",
  "framework_slug": "pydantic-ai",
  "url": "https://composio.dev/toolkits/composio_search/framework/pydantic-ai",
  "markdown_url": "https://composio.dev/toolkits/composio_search/framework/pydantic-ai.md",
  "updated_at": "2026-05-12T10:07:17.725Z"
}
```

## Introduction

This guide walks you through connecting Composio search to Pydantic AI using the Composio Tool Router. By the end, you'll have a working Composio search agent that can find recent news about electric vehicles, search for top-rated hotels in Paris, or fetch the latest stock info for Apple, all through natural language commands.
You'll also learn how to give your Pydantic AI agent real control over Composio search through Composio's MCP server.
Before we dive in, let's take a quick look at the key ideas and tools involved.

## Also integrate Composio search with

- [OpenAI Agents SDK](https://composio.dev/toolkits/composio_search/framework/open-ai-agents-sdk)
- [Claude Agent SDK](https://composio.dev/toolkits/composio_search/framework/claude-agents-sdk)
- [Claude Code](https://composio.dev/toolkits/composio_search/framework/claude-code)
- [Claude Cowork](https://composio.dev/toolkits/composio_search/framework/claude-cowork)
- [Codex](https://composio.dev/toolkits/composio_search/framework/codex)
- [OpenClaw](https://composio.dev/toolkits/composio_search/framework/openclaw)
- [Hermes](https://composio.dev/toolkits/composio_search/framework/hermes-agent)
- [CLI](https://composio.dev/toolkits/composio_search/framework/cli)
- [Google ADK](https://composio.dev/toolkits/composio_search/framework/google-adk)
- [LangChain](https://composio.dev/toolkits/composio_search/framework/langchain)
- [Vercel AI SDK](https://composio.dev/toolkits/composio_search/framework/ai-sdk)
- [Mastra AI](https://composio.dev/toolkits/composio_search/framework/mastra-ai)
- [LlamaIndex](https://composio.dev/toolkits/composio_search/framework/llama-index)
- [CrewAI](https://composio.dev/toolkits/composio_search/framework/crew-ai)

## TL;DR

Here's what you'll learn:
- How to set up your Composio API key and User ID
- How to create a Composio Tool Router session for Composio search
- How to attach an MCP Server to a Pydantic AI agent
- How to stream responses and maintain chat history
- How to build a simple REPL-style chat interface to test your Composio search workflows

## What is Pydantic AI?

Pydantic AI is a Python framework for building AI agents with strong typing and validation. It leverages Pydantic's data validation capabilities to create robust, type-safe AI applications.
Key features include:
- Type Safety: Built on Pydantic for automatic data validation
- MCP Support: Native support for Model Context Protocol servers
- Streaming: Built-in support for streaming responses
- Async First: Designed for async/await patterns

## What is the Composio search MCP server, and what's possible with it?

The Composio search MCP server is an implementation of the Model Context Protocol that connects your AI agents and assistants (such as Claude or Cursor) directly to the entire Composio Search suite. It provides structured and secure access to powerful web, travel, shopping, news, academic, and financial search tools, so your agent can perform actions like searching the web, finding events, locating places, pulling news, and fetching academic research on your behalf.
- Comprehensive web and news search: Instantly ask your agent to fetch up-to-date web pages, breaking news, or current events using Google, DuckDuckGo, or news-specific search APIs.
- Travel and local discovery: Let your agent find nearby hotels, flights, events, or map locations using Google Maps and events search for seamless travel planning and local exploration.
- E-commerce and product lookup: Have your agent search for products, deals, and reviews across major retailers like Amazon and Walmart to help you shop smarter and faster.
- Financial and market data retrieval: Direct your agent to pull real-time stock information, financial news, and market trends with just a query—no manual research needed.
- Academic and scholarly research: Empower your agent to find relevant academic papers, citations, and scholarly articles using Google Scholar and Exa Answer for research-heavy tasks.

## Supported Tools

| Tool slug | Name | Description |
|---|---|---|
| `COMPOSIO_SEARCH_AMAZON` | Amazon Product Search | Search Amazon for products across different country marketplaces. This tool searches Amazon's product catalog with support for multiple international Amazon domains/marketplaces. Each domain serves a different country with local pricing, currency, shipping options, and product availability. Perfect for product research, international price comparison, and finding products available in specific countries. Returns product details, pricing in local currency, ratings, reviews, and seller information. Examples: query="gaming laptop" + amazon_domain="amazon.com" for US marketplace query="coffee maker" + amazon_domain="amazon.de" for German marketplace query="iPhone 15" + amazon_domain="amazon.co.uk" for UK marketplace with GBP pricing |
| `COMPOSIO_SEARCH_DUCK_DUCK_GO_SEARCH` | Composio DuckDuckGo Search | The DuckDuckGoSearch class utilizes the Composio DuckDuckGo Search API to perform searches, focusing on web information and details. It leverages the DuckDuckGo search engine via the Composio DuckDuckGo Search API to retrieve relevant web data based on the provided query. |
| `COMPOSIO_SEARCH_EVENT_SEARCH` | Composio Google Events Search | Search for upcoming events, concerts, festivals, conferences, and other activities. Supports location-based search (city, neighborhood, address), date filtering (today, tomorrow, weekend, next week/month), virtual event discovery, international search with 100+ languages and countries, and pagination. If results are sparse or events_results_state is 'Fully empty', treat as limited coverage and try COMPOSIO_SEARCH_WEB instead of retrying. Rate limit: ~2 requests/second; apply exponential backoff (1s, 2s, 4s) on 429 errors. Examples: "Tech conferences" in "San Francisco, CA" with gl="us" and hl="en" \| "Virtual networking events" with htichips="event_type:Virtual-Event,date:next_week" \| "Music festivals" in "London, UK" with gl="uk" and hl="en" |
| `COMPOSIO_SEARCH_EXA_SIMILARLINK` | Composio Similarlinks | Perform a search to find similar links and retrieve a list of relevant results. The search can optionally return contents. |
| `COMPOSIO_SEARCH_FETCH_URL_CONTENT` | Fetch URL Content | Fetch and extract clean, readable page text (markdown) from public web pages (HTML content) using the Exa API. Use when you need to retrieve actual content from search results or documentation links to extract setup steps, requirements, or citations. Only works with web page URLs - does not support direct links to images, PDFs, or other binary files. |
| `COMPOSIO_SEARCH_FINANCE` | Composio Finance Search | Get real-time stock prices, market data, financial news, and company information with historical analysis. Retrieves stock quotes, market indices, cryptocurrency prices, exchange rates, and financial news. Supports discrete time windows (1D, 5D, 1M, 6M, YTD, 1Y, 5Y, MAX) for historical analysis. Returns numeric time-series graph data, summary information, and key events (6M+ windows only). Always verify finance_results_state in the response — HTTP 200 with status='Success' can still return empty data. Derived indicators (MACD, moving averages) are not included and must be computed from returned price series. Examples: query="AAPL:NASDAQ" + window="1Y" for Apple's 1-year chart query="GOOGL:NASDAQ" + window="6M" + hl="en" for Alphabet with news events query="WMT:NYSE" + window="MAX" for Walmart's full history query="BTC-USD" for Bitcoin query="EUR-USD" for Euro/USD rate |
| `COMPOSIO_SEARCH_FLIGHTS` | Flight Search | Search for flights with comprehensive pricing, schedule, and airline information. This tool finds available flights between cities/airports with detailed pricing, multiple airlines, departure/arrival times, flight duration, and booking options. Supports round-trip and one-way searches, multiple passenger types (adults, children, infants), different travel classes, and international pricing in various currencies. Perfect for travel planning, price comparison, and finding the best flight options. You can use either: 1. Natural language query: query="Lahore to San Francisco" or query="NYC to London on March 15, 2025" 2. Structured parameters: departure_id="JFK", arrival_id="LAX", outbound_date="2025-12-25" Examples: query="Lahore to San Francisco" query="New York to London on December 25, 2025" departure_id="JFK" + arrival_id="LAX" + outbound_date="2025-12-25" + return_date="2025-12-30" departure_id="LGA" + arrival_id="LHR" + outbound_date="2025-06-01" + adults=2 |
| `COMPOSIO_SEARCH_GOOGLE_MAPS` | Composio Google Maps Search | Performs a location-specific search via the Composio Google Maps Search API, returning results under `results.local_results` (multi-place queries) or `results.place_results` (single place); always handle both branches. Fields like `opening_hours`, `phone`, `rating`, and `user_reviews` may be absent — treat missing values as unknown. Check `business_status`: closed businesses (`CLOSED_PERMANENTLY`, `TEMPORARILY_CLOSED`) can appear in results; filter for `OPERATIONAL` entries. Rate-limit calls to ~1 req/sec; HTTP 429 / `OVER_QUERY_LIMIT` indicates quota exhaustion. `gps_coordinates` may appear under `place_results` or inside individual `local_results` items. |
| `COMPOSIO_SEARCH_GROQ_CHAT` | Groq Chat Completion | Execute fast LLM inference using Groq's optimized hardware and API. Groq provides ultra-fast inference for open-source models including LLaMA 3, Mixtral, and Gemma via OpenAI-compatible chat completions API. Use cases: real-time chat, content generation, Q&A, code generation, summarization, translation. For structured JSON output, instruct the model explicitly to return valid JSON with no markdown code fences or prose; validate with json.loads — batches of 150+ items are prone to malformed JSON. Strip triple-backtick wrappers from code/JSON outputs before parsing. Always validate that the response contains at least one choice with non-empty message.content before downstream use. On HTTP 429 or 5xx errors, use exponential backoff with a small retry cap; limit concurrency to ~3 concurrent calls. Model-reported character/word counts are approximate — verify independently when strict limits matter. |
| `COMPOSIO_SEARCH_HOTELS` | Hotel Search | Search for hotels and vacation rentals with comprehensive filtering and pricing. Returns results under results.properties and results.ads; extract numeric pricing from rate_per_night.extracted_lowest, total_rate.extracted_lowest, or extracted_price (fields vary by property). Deduplicate across sections using property_token. Additional pages available via serpapi_pagination.next_page_token. Start with minimal filters — combining max_price, min_price, hotel_class, free_cancellation, gl, or hl too strictly can return empty results. Examples: q="New York" + check_in_date="2025-06-01" + check_out_date="2025-06-05" + adults=2 q="Paris" + check_in_date="2025-03-15" + check_out_date="2025-03-18" + min_price=100 + max_price=300 q="Tokyo" + check_in_date="2025-12-20" + check_out_date="2025-12-25" + hotel_class="4,5" + sort_by=3 |
| `COMPOSIO_SEARCH_IMAGE` | Composio Image Search | The ImageSearch class performs an image search using the Composio Image Search API, targeting image metadata and URLs (not binary data) via Google Images. Returns results under `results.images_results`; each entry exposes `original` (full resolution) and `thumbnail` (low resolution) URLs — verify URLs are publicly accessible before passing downstream. Check `license_details_url` per result before commercial reuse. The number of results is controlled by `num` (1–100, default 20). For >100 images, paginate using `serpapi_pagination.next`. Rate limit: throttle to ~1–2 calls/second; apply exponential backoff on HTTP 429. |
| `COMPOSIO_SEARCH_NEWS` | Composio News Search | Search for the latest news articles and current events with smart filtering. Searches news-oriented sources only (not blogs, docs, or company pages). Results nest under `data.results.news_results`; a 200 response with empty `news_results` or `news_results_state: 'Fully empty'` means no results — broaden query or `when` window. Auto-correction (`showing_results_for`) can silently redirect queries; verify returned articles match intended topic. Only first page returned by default; follow `pagination.next` for more pages. Supports 100+ languages and countries. Use advanced operators like 'site:' for publisher filtering directly in your query. Examples: query="artificial intelligence" + when="w" for past week's AI news query="climate change" + gl="us" + hl="en" for US climate news query="business news" + when="d" for today's business news query="site:bbc.com tesla" + when="d" for today's BBC Tesla news |
| `COMPOSIO_SEARCH_NPPESNPI_LOOKUP` | NPPES NPI Registry Lookup | Lookup US healthcare provider details from the CMS NPI Registry (NPPES) using an NPI number or search filters. Returns normalized structured fields including provider name, taxonomy/specialty, addresses, identifiers, and endpoints. Use this for deterministic, authoritative provider lookups when you need structured NPI/NPPES data rather than noisy web search results. |
| `COMPOSIO_SEARCH_SCHOLAR` | Composio Scholar Search | Scholar API scrapes Google Scholar search results via SERP API, returning academic papers and scholarly articles. Results are nested under results.organic_results; access this key explicitly and use defensive .get() patterns as fields like DOI, citation_count, and author may be absent. Many results are paywalled — rely on titles, abstracts, and snippets when full text is unavailable. Results may include duplicate preprint and journal versions of the same work — deduplicate by DOI or normalized title. Only the first page is returned by default; follow pagination.next for additional pages. PDF links appear only under organic_results[n].resources where file_format is 'PDF'. Results may lag on recent work. |
| `COMPOSIO_SEARCH_SEC_FILINGS` | Composio SEC EDGAR Filings Search | Retrieve authoritative SEC EDGAR filing metadata (10-K/10-Q/8-K etc.) and construct primary document/index URLs using SEC's public data APIs. Use when you need to search for company financial filings, annual reports, quarterly reports, or other SEC-mandated disclosures. |
| `COMPOSIO_SEARCH_SHOPPING` | Composio Shopping Search | Search for products with advanced price filtering, location targeting, and deal discovery. This tool provides comprehensive product search with price range filtering, geographic targeting for local retailers, sorting by price (low to high, high to low), and filtering for free shipping or sale items. Perfect for product research, price comparison, finding deals, and discovering where to buy items. Returns product details, prices, availability, seller information, and reviews. Examples: query="gaming laptop" + min_price=800 + max_price=1500 + sort_by=1 query="running shoes" + location="Seattle, WA" + free_shipping=True query="coffee maker" + on_sale=True + gl="us" |
| `COMPOSIO_SEARCH_TAVILY` | Composio LLM Search | The Composio LLM Search class serves as a gateway to the Composio LLM Search API, allowing users to perform searches across a broad range of content with multiple filtering options. It accommodates complex queries, including both keyword and phrase searches, with additional parameters to fine-tune the search results. This class enables a tailored search experience by allowing users to specify the search depth, include images and direct answers, apply domain-specific filters, and control the number of results returned. It is designed to meet various search requirements, from quick lookups to in-depth research. |
| `COMPOSIO_SEARCH_TRENDS` | Composio Trends Search | Discover trending topics, search patterns, and popularity data. Analyzes search interest over time, compares multiple topics, and identifies rising trends. Returns normalized 0–100 relative interest indices (not absolute search volumes) via interest_over_time.timeline_data; values are objects with 'extracted_value' (may be string '<1'). The final timeline entry may have partial_data=true for incomplete periods. Data typically lags ~24–48 hours. Response schema varies by data_type: TIMESERIES returns timeline_data, GEO_MAP returns regional breakdowns, RELATED_TOPICS/RELATED_QUERIES return topic/query lists — parse accordingly. Ideal for market research, content planning, and SEO analysis. |
| `COMPOSIO_SEARCH_TRIP_ADVISOR` | TripAdvisor Travel Search | Search TripAdvisor for travel recommendations and itinerary planning without authentication (unlike TRIPADVISOR_CONTENT_API_SEARCH_LOCATIONS and other TripAdvisor tools requiring an active connection). Searches attractions, restaurants, hotels, tours, and activities. Returns detailed ratings, reviews, photos, and traveler recommendations. Response data is nested under results → locations. Results may include evergreen articles alongside specific venues; verify result type before use in itineraries. Examples: query="things to do in Paris" + ssrc="A" for attractions only \| query="best restaurants in Tokyo" + ssrc="r" \| query="hotels in Bali" + ssrc="h" + tripadvisor_domain="tripadvisor.com" |
| `COMPOSIO_SEARCH_VERCEL_AI_CHAT` | Vercel AI Gateway Chat | Execute LLM inference through Vercel AI Gateway's unified API. Provides access to OpenAI, Anthropic, Google, and AWS Bedrock models. Supports LaunchDarkly config for model selection, fallbacks, and routing. Use cases: - Multi-provider AI applications with automatic fallback - Content generation and text analysis - Question answering and code generation |
| `COMPOSIO_SEARCH_WALMART` | Walmart Product Search | Search Walmart for products with price filtering. This tool searches Walmart's product catalog including groceries, electronics, clothing, home goods, pharmacy, and auto services. Supports basic price range filtering for finding products within budget. Results may appear under response.data.results.shopping_results or data.products — do not hardcode a single response shape. Results mix sponsored and organic listings; check ad/sponsorship flags when ranking. Some products omit fields like bought_last_month; handle nulls in sorting logic. Examples: query="wireless headphones" + min_price=50 + max_price=200 query="gaming laptop" + max_price=800 query="organic coffee" + min_price=10 |
| `COMPOSIO_SEARCH_WEB` | Composio Web Search | Perform a web search using the Exa API. Returns a nested structure: narrative summary under results.answer, sources under results.citations (and optionally results.organic_results). Prioritize results.citations as primary evidence over results.answer, which can be vague. Only indexes publicly available content — no paywalled, login-gated, or private content. No date filters enforced; include recency terms in query and verify dates in snippets manually. Throttle to ~1–2 requests/second; bursty queries trigger HTTP 429. Large responses may be stored remotely — use structure_info to locate results.answer and results.citations. |
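
Several of the tool descriptions above recommend defensive parsing, since optional fields may be absent and response shapes vary. As a hedged sketch (not official Composio SDK code), here is one way to unpack a `COMPOSIO_SEARCH_WEB`-style payload, preferring `results.citations` as primary evidence over the narrative `results.answer`; the sample payload is illustrative only:

```python
# Hedged sketch: defensively unpack a COMPOSIO_SEARCH_WEB-style payload.
# Real responses may nest differently; treat missing fields as unknown.
def extract_web_results(payload: dict) -> tuple[str, list[str]]:
    results = payload.get("results") or {}
    answer = results.get("answer") or ""
    # Collect citation URLs defensively; entries may be missing or malformed.
    citations = [
        item["url"]
        for item in (results.get("citations") or [])
        if isinstance(item, dict) and item.get("url")
    ]
    return answer, citations

# Illustrative payload, shaped after the tool description above.
sample = {
    "results": {
        "answer": "EV sales grew strongly this quarter.",
        "citations": [{"url": "https://example.com/ev-news", "title": "EV coverage"}],
    }
}
answer, urls = extract_web_results(sample)
```

The same `.get()`-with-fallback pattern applies to the other tools above, for example `results.properties` for hotels or `results.organic_results` for Scholar.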

## Supported Triggers

None listed.

## Creating MCP Server - Stand-alone vs Composio SDK

The Composio search MCP server is an implementation of the Model Context Protocol that connects your AI agent to Composio search. It provides structured and secure access so your agent can perform Composio search operations on your behalf through a secure, permission-based interface.
With Composio's managed implementation, you don't have to create your own developer app. The managed server helps you prototype fast and go from 0 to 1 quickly; for production end products, we recommend using your own credentials.

## Step-by-step Guide

### Prerequisites

Before starting, make sure you have:
- Python 3.9 or higher
- A Composio account with an active API key
- Basic familiarity with Python and async programming

### 1. Getting API Keys for OpenAI and Composio

**OpenAI API Key**
- Go to the [OpenAI dashboard](https://platform.openai.com/settings/organization/api-keys) and create an API key. You'll need credits to use the models, or you can connect to another model provider.
- Keep the API key safe.

**Composio API Key**
- Log in to the [Composio dashboard](https://dashboard.composio.dev?utm_source=toolkits&utm_medium=framework_docs).
- Navigate to your API settings and generate a new API key.
- Store this key securely as you'll need it for authentication.

### 2. Install dependencies

Install the required libraries.
What's happening:
- composio connects your agent to external SaaS tools like Composio search
- pydantic-ai lets you create structured AI agents with tool support
- python-dotenv loads your environment variables securely from a .env file
```bash
pip install composio pydantic-ai python-dotenv
```

### 3. Set up environment variables

Create a .env file in your project root.
What's happening:
- COMPOSIO_API_KEY authenticates your agent to Composio's API
- USER_ID associates your session with your account for secure tool access
- OPENAI_API_KEY grants your agent access to OpenAI models
```bash
COMPOSIO_API_KEY=your_composio_api_key_here
USER_ID=your_user_id_here
OPENAI_API_KEY=your_openai_api_key
```
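
If you want to fail fast when a key is missing, a small helper like the following can validate your environment up front. The `require_env` function is our own illustration, not part of any SDK, and the demo values exist only so the sketch runs standalone:

```python
import os

# Hypothetical helper: fail fast with a clear message when required
# environment variables are missing, instead of erroring deep in the agent.
def require_env(*names: str) -> list[str]:
    missing = [name for name in names if not os.getenv(name)]
    if missing:
        raise RuntimeError(f"Missing environment variables: {', '.join(missing)}")
    return [os.environ[name] for name in names]

# Demo values so the sketch runs standalone; in your project these come from .env.
os.environ.setdefault("COMPOSIO_API_KEY", "demo-composio-key")
os.environ.setdefault("USER_ID", "demo-user")
api_key, user_id = require_env("COMPOSIO_API_KEY", "USER_ID")
```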

### 4. Import dependencies

What's happening:
- We load environment variables and import required modules
- Composio manages connections to Composio search
- MCPServerStreamableHTTP connects to the Composio search MCP server endpoint
- Agent from Pydantic AI lets you define and run the AI assistant
```python
import asyncio
import os
from dotenv import load_dotenv
from composio import Composio
from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerStreamableHTTP

load_dotenv()
```

### 5. Create a Tool Router Session

What's happening:
- We're creating a Tool Router session that gives your agent access to Composio search tools
- The create method takes the user ID and specifies which toolkits should be available
- The returned session.mcp.url is the MCP server URL that your agent will use
```python
async def main():
    api_key = os.getenv("COMPOSIO_API_KEY")
    user_id = os.getenv("USER_ID")
    if not api_key or not user_id:
        raise RuntimeError("Set COMPOSIO_API_KEY and USER_ID in your environment")

    # Create a Composio Tool Router session for Composio search
    composio = Composio(api_key=api_key)
    session = composio.create(
        user_id=user_id,
        toolkits=["composio_search"],
    )
    url = session.mcp.url
    if not url:
        raise ValueError("Composio session did not return an MCP URL")
```

### 6. Initialize the Pydantic AI Agent

What's happening:
- The MCP client connects to the Composio search endpoint
- The agent uses GPT-5 to interpret user commands and perform Composio search operations
- The instructions field defines the agent's role and behavior
```python
# Attach the MCP server to a Pydantic AI Agent
composio_search_mcp = MCPServerStreamableHTTP(url, headers={"x-api-key": api_key})
agent = Agent(
    "openai:gpt-5",
    toolsets=[composio_search_mcp],
    instructions=(
        "You are a Composio search assistant. Use Composio search tools to help users "
        "with their requests. Ask clarifying questions when needed."
    ),
)
```

### 7. Build the chat interface

What's happening:
- The agent reads input from the terminal and streams its response
- Composio search API calls happen automatically under the hood
- The model keeps conversation history to maintain context across turns
```python
# Simple REPL with message history
history = []
print("Chat started! Type 'exit' or 'quit' to end.\n")
print("Try asking the agent to help you with Composio search.\n")

while True:
    user_input = input("You: ").strip()
    if user_input.lower() in {"exit", "quit", "bye"}:
        print("\nGoodbye!")
        break
    if not user_input:
        continue

    print("\nAgent is thinking...\n", flush=True)

    async with agent.run_stream(user_input, message_history=history) as stream_result:
        collected_text = ""
        async for chunk in stream_result.stream_output():
            text_piece = None
            if isinstance(chunk, str):
                text_piece = chunk
            elif hasattr(chunk, "delta") and isinstance(chunk.delta, str):
                text_piece = chunk.delta
            elif hasattr(chunk, "text"):
                text_piece = chunk.text
            if text_piece:
                collected_text += text_piece
        result = stream_result

    print(f"Agent: {collected_text}\n")
    history = result.all_messages()
```

### 8. Run the application

What's happening:
- The asyncio loop launches the agent and keeps it running until you exit
```python
if __name__ == "__main__":
    asyncio.run(main())
```

## Complete Code

```python
import asyncio
import os
from dotenv import load_dotenv
from composio import Composio
from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerStreamableHTTP

load_dotenv()

async def main():
    api_key = os.getenv("COMPOSIO_API_KEY")
    user_id = os.getenv("USER_ID")
    if not api_key or not user_id:
        raise RuntimeError("Set COMPOSIO_API_KEY and USER_ID in your environment")

    # Create a Composio Tool Router session for Composio search
    composio = Composio(api_key=api_key)
    session = composio.create(
        user_id=user_id,
        toolkits=["composio_search"],
    )
    url = session.mcp.url
    if not url:
        raise ValueError("Composio session did not return an MCP URL")

    # Attach the MCP server to a Pydantic AI Agent
    composio_search_mcp = MCPServerStreamableHTTP(url, headers={"x-api-key": api_key})
    agent = Agent(
        "openai:gpt-5",
        toolsets=[composio_search_mcp],
        instructions=(
            "You are a Composio search assistant. Use Composio search tools to help users "
            "with their requests. Ask clarifying questions when needed."
        ),
    )

    # Simple REPL with message history
    history = []
    print("Chat started! Type 'exit' or 'quit' to end.\n")
    print("Try asking the agent to help you with Composio search.\n")

    while True:
        user_input = input("You: ").strip()
        if user_input.lower() in {"exit", "quit", "bye"}:
            print("\nGoodbye!")
            break
        if not user_input:
            continue

        print("\nAgent is thinking...\n", flush=True)

        async with agent.run_stream(user_input, message_history=history) as stream_result:
            collected_text = ""
            async for chunk in stream_result.stream_output():
                text_piece = None
                if isinstance(chunk, str):
                    text_piece = chunk
                elif hasattr(chunk, "delta") and isinstance(chunk.delta, str):
                    text_piece = chunk.delta
                elif hasattr(chunk, "text"):
                    text_piece = chunk.text
                if text_piece:
                    collected_text += text_piece
            result = stream_result

        print(f"Agent: {collected_text}\n")
        history = result.all_messages()

if __name__ == "__main__":
    asyncio.run(main())
```

## Conclusion

You've built a Pydantic AI agent that can interact with Composio search through Composio's Tool Router. With this setup, your agent can perform real Composio search actions through natural language.
You can extend this further by:
- Adding other toolkits like Gmail, HubSpot, or Salesforce
- Building a web-based chat interface around this agent
- Using multiple MCP endpoints to enable cross-app workflows (for example, Gmail + Composio search for workflow automation)
This architecture makes your AI agent "agent-native": able to securely use APIs in a unified, composable way without custom integrations.
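
As a hypothetical sketch of the first extension, a single Tool Router session can span several toolkits using the same session-creation call from this guide. The extra toolkit slugs ("gmail", "hubspot") are assumptions; check your Composio dashboard for the exact slugs available to your account:

```python
# Hypothetical sketch mirroring the session-creation call used earlier in
# this guide, extended to multiple toolkits in one Tool Router session.
def create_multi_toolkit_session(api_key: str, user_id: str):
    from composio import Composio  # lazy import keeps the sketch importable

    composio = Composio(api_key=api_key)
    # Toolkit slugs beyond "composio_search" are illustrative assumptions.
    return composio.create(
        user_id=user_id,
        toolkits=["composio_search", "gmail", "hubspot"],
    )
```

The returned session exposes a single MCP URL, so the agent setup in step 6 stays unchanged while gaining access to every listed toolkit.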

## How to build Composio search MCP Agent with another framework

- [OpenAI Agents SDK](https://composio.dev/toolkits/composio_search/framework/open-ai-agents-sdk)
- [Claude Agent SDK](https://composio.dev/toolkits/composio_search/framework/claude-agents-sdk)
- [Claude Code](https://composio.dev/toolkits/composio_search/framework/claude-code)
- [Claude Cowork](https://composio.dev/toolkits/composio_search/framework/claude-cowork)
- [Codex](https://composio.dev/toolkits/composio_search/framework/codex)
- [OpenClaw](https://composio.dev/toolkits/composio_search/framework/openclaw)
- [Hermes](https://composio.dev/toolkits/composio_search/framework/hermes-agent)
- [CLI](https://composio.dev/toolkits/composio_search/framework/cli)
- [Google ADK](https://composio.dev/toolkits/composio_search/framework/google-adk)
- [LangChain](https://composio.dev/toolkits/composio_search/framework/langchain)
- [Vercel AI SDK](https://composio.dev/toolkits/composio_search/framework/ai-sdk)
- [Mastra AI](https://composio.dev/toolkits/composio_search/framework/mastra-ai)
- [LlamaIndex](https://composio.dev/toolkits/composio_search/framework/llama-index)
- [CrewAI](https://composio.dev/toolkits/composio_search/framework/crew-ai)

## Related Toolkits

- [Composio](https://composio.dev/toolkits/composio) - Composio is an integration platform that connects AI agents with hundreds of business tools. It streamlines authentication and lets you trigger actions across services—no custom code needed.
- [Perplexityai](https://composio.dev/toolkits/perplexityai) - Perplexityai delivers natural, conversational AI models for generating human-like text. Instantly get context-aware, high-quality responses for chat, search, or complex workflows.
- [Browser tool](https://composio.dev/toolkits/browser_tool) - Browser tool is a virtual browser integration that lets AI agents interact with the web programmatically. It enables automated browsing, scraping, and action-taking from any AI workflow.
- [Ai ml api](https://composio.dev/toolkits/ai_ml_api) - Ai ml api is a suite of AI/ML models for natural language and image tasks. It provides fast, scalable access to advanced AI capabilities for your apps and workflows.
- [Aivoov](https://composio.dev/toolkits/aivoov) - Aivoov is an AI-powered text-to-speech platform offering 1,000+ voices in over 150 languages. Instantly turn written content into natural, human-like audio for any application.
- [All images ai](https://composio.dev/toolkits/all_images_ai) - All-Images.ai is an AI-powered image generation and management platform. It helps you create, search, and organize images effortlessly with advanced AI capabilities.
- [Anthropic administrator](https://composio.dev/toolkits/anthropic_administrator) - Anthropic administrator is an API for managing Anthropic organizational resources like members, workspaces, and API keys. It helps you automate admin tasks and streamline resource management across your Anthropic organization.
- [Api labz](https://composio.dev/toolkits/api_labz) - Api labz is a platform offering a suite of AI-driven APIs and workflow tools. It helps developers automate tasks and build smarter, more efficient applications.
- [Apipie ai](https://composio.dev/toolkits/apipie_ai) - Apipie ai is an AI model aggregator offering a single API for accessing top AI models from multiple providers. It helps developers build cost-efficient, latency-optimized AI solutions without juggling multiple integrations.
- [Astica ai](https://composio.dev/toolkits/astica_ai) - Astica ai provides APIs for computer vision, NLP, and voice synthesis. Integrate advanced AI features into your app with a single API key.
- [Bigml](https://composio.dev/toolkits/bigml) - BigML is a machine learning platform that lets you build, train, and deploy predictive models from your data. Its intuitive interface and robust API make machine learning accessible and efficient.
- [Botbaba](https://composio.dev/toolkits/botbaba) - Botbaba is a platform for building, managing, and deploying conversational AI chatbots across messaging channels. It streamlines chatbot automation, making it easier to integrate AI into customer interactions.
- [Botpress](https://composio.dev/toolkits/botpress) - Botpress is an open-source platform for building, deploying, and managing chatbots. It helps teams automate conversations and deliver rich, interactive messaging experiences.
- [Chatbotkit](https://composio.dev/toolkits/chatbotkit) - Chatbotkit is a platform for building and managing AI-powered chatbots using robust APIs and SDKs. It lets you easily add conversational AI to your apps for better user engagement.
- [Cody](https://composio.dev/toolkits/cody) - Cody is an AI assistant built for businesses, trained on your company's knowledge and data. It delivers instant answers and insights, tailored for your team.
- [Context7 MCP](https://composio.dev/toolkits/context7_mcp) - Context7 MCP delivers live, version-specific code docs and examples right from the source. It helps developers and AI agents instantly retrieve authoritative programming info—no more out-of-date docs.
- [Customgpt](https://composio.dev/toolkits/customgpt) - CustomGPT.ai lets you build and deploy chatbots tailored to your own data and business needs. Get precise and context-aware AI conversations without writing code.
- [Datarobot](https://composio.dev/toolkits/datarobot) - Datarobot is a machine learning platform that automates model development, deployment, and monitoring. It empowers organizations to quickly gain predictive insights from large datasets.
- [Deepgram](https://composio.dev/toolkits/deepgram) - Deepgram is an AI-powered speech recognition platform for accurate audio transcription and understanding. It enables fast, scalable speech-to-text with advanced audio intelligence features.
- [DeepImage](https://composio.dev/toolkits/deepimage) - DeepImage is an AI-powered image enhancer and upscaler. Get higher-quality images with just a few clicks.

## Frequently Asked Questions

### What are the differences between Tool Router MCP and Composio search MCP?

With a standalone Composio search MCP server, the agents and LLMs can only access a fixed set of Composio search tools tied to that server. However, with the Composio Tool Router, agents can dynamically load tools from Composio search and many other apps based on the task at hand, all through a single MCP endpoint.

### Can I use Tool Router MCP with Pydantic AI?

Yes, you can. Pydantic AI fully supports MCP integration. You get structured tool calling, message history handling, and model orchestration while Tool Router takes care of discovering and serving the right Composio search tools.

### Can I manage the permissions and scopes for Composio search while using Tool Router?

Yes, absolutely. You can configure which Composio search scopes and actions are allowed when connecting your account to Composio. You can also bring your own OAuth credentials or API configuration so you keep full control over what the agent can do.

### How safe is my data with Composio Tool Router?

All sensitive data such as tokens, keys, and configuration is fully encrypted at rest and in transit. Composio is SOC 2 Type 2 compliant and follows strict security practices so your Composio search data and credentials are handled as safely as possible.

---
[See all toolkits](https://composio.dev/toolkits) · [Composio docs](https://docs.composio.dev/llms.txt)
