How to integrate Scrape.do MCP with Pydantic AI


Introduction

This guide walks you through connecting Scrape.do to Pydantic AI using the Composio Tool Router. By the end, you'll have a working Scrape.do agent that can scrape product prices from dynamic websites, extract news headlines with JavaScript rendering, bypass Cloudflare to retrieve full page HTML, and scrape the mobile version of a web page, all through natural language commands.

You'll learn how to give your Pydantic AI agent real control over a Scrape.do account through Composio's Scrape.do MCP server.

Before we dive in, let's take a quick look at the key ideas and tools involved.

TL;DR

Here's what you'll learn:
  • How to set up your Composio API key and User ID
  • How to create a Composio Tool Router session for Scrape.do
  • How to attach an MCP Server to a Pydantic AI agent
  • How to stream responses and maintain chat history
  • How to build a simple REPL-style chat interface to test your Scrape.do workflows

What is Pydantic AI?

Pydantic AI is a Python framework for building AI agents with strong typing and validation. It leverages Pydantic's data validation capabilities to create robust, type-safe AI applications.

Key features include:

  • Type Safety: Built on Pydantic for automatic data validation
  • MCP Support: Native support for Model Context Protocol servers
  • Streaming: Built-in support for streaming responses
  • Async First: Designed for async/await patterns
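
To make the type-safety point concrete, here's a minimal sketch of a typed agent. It assumes a recent pydantic-ai release (where Agent accepts output_type and results expose .output) and an OPENAI_API_KEY in your environment; the Product model and prompt are illustrative:

python
from pydantic import BaseModel
from pydantic_ai import Agent

class Product(BaseModel):
    name: str
    price: float

# The model's response is validated against the Product schema before you see it.
typed_agent = Agent("openai:gpt-5", output_type=Product)

result = typed_agent.run_sync("Extract the product from: 'Acme Widget - $19.99'")
print(result.output)  # e.g. Product(name='Acme Widget', price=19.99)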

What is the Scrape.do MCP server, and what's possible with it?

The Scrape.do MCP server is an implementation of the Model Context Protocol that connects AI agents and assistants like Claude and Cursor directly to your Scrape.do account. It provides structured and secure access to robust web scraping tools, so your agent can perform actions like scraping dynamic pages, managing sessions, setting custom headers or proxies, and extracting structured data from any website on your behalf.

  • Dynamic page scraping with headless browsers: Retrieve fully rendered HTML content from JavaScript-heavy or protected websites by leveraging advanced browser emulation and proxy rotation.
  • Custom scraping session management: Set device type, cookies, wait times, and custom headers to imitate different users, maintain sessions, or access device-specific content for tailored data extraction.
  • Proxy and anti-bot bypass control: Enable super or proxy modes to utilize residential, mobile, or datacenter proxies, helping your agent bypass strict anti-bot systems and geo-restrictions seamlessly.
  • Targeted resource filtering: Block specific URLs like ads or analytics scripts during scraping to increase speed, avoid distractions, and improve privacy.
  • Account usage and statistics retrieval: Access real-time usage stats, subscription status, and remaining request limits so your agent can monitor scraping quotas and avoid interruptions.

Supported Tools

Tools
  • Get Account Information: Retrieves account information and usage statistics from scrape.do.
  • Get rendered page content: Scrapes web pages with JavaScript rendering enabled.
  • Scrape webpage using scrape.do: Scrapes web pages through the scrape.do API.
  • Use Scrape.do Proxy Mode: Implements scrape.do's proxy mode functionality.
  • Set Cookies for Scraping: Sets specific cookies on scraping requests to a target website.
  • Set Scrape.do Super Mode: Enables enhanced scraping using residential and mobile proxies, bypassing blocks and restrictions associated with datacenter IPs.
  • Block specific URLs during scraping: Blocks specific URLs during the scraping process.
  • Set custom headers for scrape.do request: Sends custom headers with scrape.do requests.
  • Set Custom Wait Time: Sets a custom wait time in milliseconds after page load when using the render option.
  • Set Device Type for Scraping: Sets the device type (desktop, mobile, or tablet) for scraping requests.
  • Set Disable Redirection: Controls scrape.do's automatic redirection behavior.
  • Set Pure Cookies Mode: Returns the original Set-Cookie headers from target websites instead of the processed scrape.do versions.
  • Set Regional Geolocation for Scraping: Sets broader geographical targeting by specifying a region code instead of a specific country code.
  • Set Retry Timeout: Sets the maximum wait time (in milliseconds) before retrying a failed request.
  • Set Screenshot Capture for Scraping: Enables scrape.do's screenshot functionality.
  • Set Session ID for Sticky Sessions: Implements scrape.do's session ID functionality for sticky sessions.
  • Set Wait For Selector: Sets a CSS selector to wait for before considering the page load complete.
  • Set Wait Until Condition: Sets the waitUntil parameter for scrape.do render requests.
  • Monitor WebSocket requests using scrape.do: Views WebSocket requests made by a webpage.

What is the Composio Tool Router, and how does it fit here?

Composio's Tool Router helps agents find the right tools for a task at runtime. You can plug in multiple toolkits (like Gmail, HubSpot, and GitHub), and the agent will identify the relevant app and action to complete multi-step workflows. This can reduce token usage and improve the reliability of tool calls. Read more here: Getting started with Tool Router

The Tool Router generates a secure MCP URL that your agents can access to perform actions.

How the Tool Router works

The Tool Router follows a three-phase workflow:

  1. Discovery: Searches for tools matching your task and returns relevant toolkits with their details.
  2. Authentication: Checks for active connections. If missing, creates an auth config and returns a connection URL via Auth Link.
  3. Execution: Executes the action using the authenticated connection.
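
In practice, the authentication phase surfaces directly in the conversation: on the first request against an unconnected Scrape.do account, the agent replies with an auth link instead of executing the tool. The exchange looks roughly like this (illustrative wording and placeholder URL; the actual output will differ):

text
You: How many scraping credits do I have left?
Agent: Your Scrape.do account isn't connected yet. Open this link to authorize:
       https://<auth-link-from-composio>
       Once you've connected, ask me again and I'll fetch your usage stats.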

Step-by-step Guide

Prerequisites

Before starting, make sure you have:
  • Python 3.9 or higher
  • A Composio account with an active API key
  • Basic familiarity with Python and async programming

Getting API Keys for OpenAI and Composio

OpenAI API Key
  • Go to the OpenAI dashboard and create an API key. You'll need credits to use the models, or you can connect to another model provider.
  • Keep the API key safe.
Composio API Key
  • Log in to the Composio dashboard.
  • Navigate to your API settings and generate a new API key.
  • Store this key securely as you'll need it for authentication.

Install dependencies

bash
pip install composio pydantic-ai python-dotenv

Install the required libraries.

What's happening:

  • composio connects your agent to external SaaS tools like Scrape.do
  • pydantic-ai lets you create structured AI agents with tool support
  • python-dotenv loads your environment variables securely from a .env file
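
If you want an isolated environment (optional, but it keeps project dependencies tidy), create and activate a virtual environment before installing:

bash
python -m venv .venv
source .venv/bin/activate   # on Windows: .venv\Scripts\activate
pip install composio pydantic-ai python-dotenv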

Set up environment variables

bash
COMPOSIO_API_KEY=your_composio_api_key_here
USER_ID=your_user_id_here
OPENAI_API_KEY=your_openai_api_key

Create a .env file in your project root.

What's happening:

  • COMPOSIO_API_KEY authenticates your agent to Composio's API
  • USER_ID associates your session with your account for secure tool access
  • OPENAI_API_KEY gives Pydantic AI access to OpenAI models

Import dependencies

python
import asyncio
import os
from dotenv import load_dotenv
from composio import Composio
from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerStreamableHTTP

load_dotenv()
What's happening:
  • We load environment variables and import required modules
  • Composio manages connections to Scrape.do
  • MCPServerStreamableHTTP connects to the Scrape do MCP server endpoint
  • Agent from Pydantic AI lets you define and run the AI assistant

Create a Tool Router Session

python
async def main():
    api_key = os.getenv("COMPOSIO_API_KEY")
    user_id = os.getenv("USER_ID")
    if not api_key or not user_id:
        raise RuntimeError("Set COMPOSIO_API_KEY and USER_ID in your environment")

    # Create a Composio Tool Router session for Scrape.do
    composio = Composio(api_key=api_key)
    session = composio.create(
        user_id=user_id,
        toolkits=["scrape_do"],
    )
    url = session.mcp.url
    if not url:
        raise ValueError("Composio session did not return an MCP URL")
What's happening:
  • We're creating a Tool Router session that gives your agent access to Scrape.do tools
  • The create method takes the user ID and specifies which toolkits should be available
  • The returned session.mcp.url is the MCP server URL that your agent will use

Initialize the Pydantic AI Agent

python
# Attach the MCP server to a Pydantic AI Agent
scrape_do_mcp = MCPServerStreamableHTTP(url, headers={"x-api-key": api_key})
agent = Agent(
    "openai:gpt-5",
    toolsets=[scrape_do_mcp],
    instructions=(
        "You are a Scrape do assistant. Use Scrape do tools to help users "
        "with their requests. Ask clarifying questions when needed."
    ),
)
What's happening:
  • The MCP client connects to the Scrape.do MCP endpoint
  • The agent uses GPT-5 to interpret user commands and perform Scrape do operations
  • The instructions field defines the agent's role and behavior
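
Before wiring up the full REPL, you can sanity-check the setup with a one-shot request inside main(). This is a minimal sketch; the prompt and URL are just examples:

python
# One-off request: the agent decides which Scrape.do tools to call via the MCP server.
result = await agent.run("Fetch https://example.com and summarize the page")
print(result.output)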

Build the chat interface

python
# Simple REPL with message history
history = []
print("Chat started! Type 'exit' or 'quit' to end.\n")
print("Try asking the agent to help you with Scrape do.\n")

while True:
    user_input = input("You: ").strip()
    if user_input.lower() in {"exit", "quit", "bye"}:
        print("\nGoodbye!")
        break
    if not user_input:
        continue

    print("\nAgent is thinking...\n", flush=True)

    async with agent.run_stream(user_input, message_history=history) as stream_result:
        collected_text = ""
        # stream_text(delta=True) yields only the newly generated text for each
        # chunk, so concatenating the deltas rebuilds the full reply exactly once.
        async for delta in stream_result.stream_text(delta=True):
            collected_text += delta

    print(f"Agent: {collected_text}\n")
    history = stream_result.all_messages()
What's happening:
  • The agent reads input from the terminal, streams its response, and prints it once complete
  • Scrape.do API calls happen automatically under the hood via the MCP server
  • Passing message_history back into each run keeps conversation context across turns
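
Once the loop is running, try the use cases from the introduction as plain-language prompts, for example:

  • "Scrape the product prices from this page: <URL>. It's a dynamic site, so render the JavaScript."
  • "Extract the news headlines from <URL> with JavaScript rendering enabled."
  • "Enable super mode to bypass Cloudflare and return the full page HTML for <URL>."
  • "Set the device type to mobile and scrape the mobile version of <URL>."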

Run the application

python
if __name__ == "__main__":
    asyncio.run(main())
What's happening:
  • The asyncio loop launches the agent and keeps it running until you exit

Complete Code

Here's the complete code to get you started with Scrape.do and Pydantic AI:

python
import asyncio
import os
from dotenv import load_dotenv
from composio import Composio
from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerStreamableHTTP

load_dotenv()

async def main():
    api_key = os.getenv("COMPOSIO_API_KEY")
    user_id = os.getenv("USER_ID")
    if not api_key or not user_id:
        raise RuntimeError("Set COMPOSIO_API_KEY and USER_ID in your environment")

    # Create a Composio Tool Router session for Scrape.do
    composio = Composio(api_key=api_key)
    session = composio.create(
        user_id=user_id,
        toolkits=["scrape_do"],
    )
    url = session.mcp.url
    if not url:
        raise ValueError("Composio session did not return an MCP URL")

    # Attach the MCP server to a Pydantic AI Agent
    scrape_do_mcp = MCPServerStreamableHTTP(url, headers={"x-api-key": api_key})
    agent = Agent(
        "openai:gpt-5",
        toolsets=[scrape_do_mcp],
        instructions=(
            "You are a Scrape do assistant. Use Scrape do tools to help users "
            "with their requests. Ask clarifying questions when needed."
        ),
    )

    # Simple REPL with message history
    history = []
    print("Chat started! Type 'exit' or 'quit' to end.\n")
    print("Try asking the agent to help you with Scrape do.\n")

    while True:
        user_input = input("You: ").strip()
        if user_input.lower() in {"exit", "quit", "bye"}:
            print("\nGoodbye!")
            break
        if not user_input:
            continue

        print("\nAgent is thinking...\n", flush=True)

        async with agent.run_stream(user_input, message_history=history) as stream_result:
            collected_text = ""
            # stream_text(delta=True) yields only the newly generated text for each
            # chunk, so concatenating the deltas rebuilds the full reply exactly once.
            async for delta in stream_result.stream_text(delta=True):
                collected_text += delta

        print(f"Agent: {collected_text}\n")
        history = stream_result.all_messages()

if __name__ == "__main__":
    asyncio.run(main())
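
Save the script (for example as agent.py; the filename is up to you) and run it:

bash
python agent.py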

Conclusion

You've built a Pydantic AI agent that can interact with Scrape.do through Composio's Tool Router. With this setup, your agent can perform real Scrape.do actions through natural language. You can extend this further by:
  • Adding other toolkits like Gmail, HubSpot, or Salesforce
  • Building a web-based chat interface around this agent
  • Using multiple MCP endpoints to enable cross-app workflows (for example, Gmail + Scrape.do for workflow automation), as sketched below
This architecture makes your AI agent "agent-native", able to securely use APIs in a unified, composable way without custom integrations.
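
For instance, exposing several toolkits through one Tool Router session is a one-line change to the session creation from earlier (the gmail slug is an assumption; verify the exact toolkit slugs in your Composio dashboard):

python
# One Tool Router session can serve multiple toolkits over a single MCP URL.
session = composio.create(
    user_id=user_id,
    toolkits=["scrape_do", "gmail"],  # slug names assumed; check the dashboard
)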

FAQ

What are the differences between the Tool Router MCP and the Scrape.do MCP?

With a standalone Scrape.do MCP server, agents and LLMs can only access the fixed set of Scrape.do tools tied to that server. With the Composio Tool Router, agents can dynamically load tools from Scrape.do and many other apps based on the task at hand, all through a single MCP endpoint.

Can I use Tool Router MCP with Pydantic AI?

Yes, you can. Pydantic AI fully supports MCP integration. You get structured tool calling, message history handling, and model orchestration, while the Tool Router takes care of discovering and serving the right Scrape.do tools.

Can I manage the permissions and scopes for Scrape.do while using Tool Router?

Yes, absolutely. You can configure which Scrape.do scopes and actions are allowed when connecting your account to Composio. You can also bring your own OAuth credentials or API configuration so you keep full control over what the agent can do.

How safe is my data with Composio Tool Router?

All sensitive data such as tokens, keys, and configuration is fully encrypted at rest and in transit. Composio is SOC 2 Type 2 compliant and follows strict security practices, so your Scrape.do data and credentials are handled as safely as possible.
