How to integrate ScrapeGraphAI MCP with CrewAI

Introduction

This guide walks you through connecting ScrapeGraphAI to CrewAI using the Composio Tool Router. By the end, you'll have a working ScrapeGraphAI agent that can extract product prices from Amazon search results, summarize the latest news headlines from the BBC homepage, and convert a Wikipedia article to Markdown, all through natural language commands.

This guide will help you understand how to give your CrewAI agent real control over a ScrapeGraphAI account through Composio's ScrapeGraphAI MCP server.

Before we dive in, let's take a quick look at the key ideas and tools involved.

TL;DR

Here's what you'll learn:
  • Get a Composio API key and configure your ScrapeGraphAI connection
  • Set up CrewAI with an MCP-enabled agent
  • Create a Tool Router session or a standalone MCP server for ScrapeGraphAI
  • Build a conversational loop where your agent can execute ScrapeGraphAI operations

What is CrewAI?

CrewAI is a powerful framework for building multi-agent AI systems. It provides primitives for defining agents with specific roles, creating tasks, and orchestrating workflows through crews; a minimal sketch of these primitives follows the feature list below.

Key features include:

  • Agent Roles: Define specialized agents with specific goals and backstories
  • Task Management: Create tasks with clear descriptions and expected outputs
  • Crew Orchestration: Combine agents and tasks into collaborative workflows
  • MCP Integration: Connect to external tools through Model Context Protocol
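
To see how these primitives fit together, here's a minimal, self-contained example with no external tools. The role, goal, and task text are illustrative placeholders, and it assumes OPENAI_API_KEY is set so CrewAI can use its default model:

python
from crewai import Agent, Task, Crew

# Illustrative example: one agent, one task, no external tools.
researcher = Agent(
    role="Research Assistant",                        # the agent's specialized role
    goal="Summarize technical topics clearly",        # what the agent optimizes for
    backstory="You are a concise technical writer.",  # persona shaping its answers
)

summary_task = Task(
    description="Summarize the Model Context Protocol in two sentences.",
    expected_output="A two-sentence summary.",
    agent=researcher,
)

# A crew orchestrates agents and tasks into a single workflow.
crew = Crew(agents=[researcher], tasks=[summary_task])
result = crew.kickoff()
print(result)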

What is the ScrapeGraphAI MCP server, and what's possible with it?

The ScrapeGraphAI MCP server is an implementation of the Model Context Protocol that connects your AI agents and assistants (such as Claude or Cursor) directly to your ScrapeGraphAI account. It provides structured, secure access to powerful web scraping and data extraction tools, so your agent can run AI-powered scrapers, convert webpages to Markdown, monitor job statuses, and manage your account usage with ease.

  • AI-powered web scraping and search: Instruct your agent to extract structured data from any website or perform detailed web searches with parsed, organized results.
  • Webpage to markdown conversion: Let your agent instantly convert any webpage into clean, readable markdown for easy documentation or analysis.
  • Automated job status tracking: Check on the progress and results of ongoing scraping, crawling, or conversion jobs to stay updated without manual effort.
  • Smart multi-page crawling: Direct the agent to launch intelligent crawlers that gather data across multiple linked pages in a single workflow.
  • Account usage monitoring and feedback: Retrieve your remaining credits, track API usage, and submit feedback on completed tasks—all through your AI agent.

Supported Tools

Tools
Convert Webpage to Markdown (V2): Convert any webpage into clean, well-formatted Markdown with full parameter control.
Generate Schema: Generate or modify a JSON schema based on a search query for structured data extraction.
Get Agentic Scraper History: Retrieve paginated history of agentic scraper jobs.
Get Crawler History: Retrieve the history of crawler jobs for your account.
Get Credits: Retrieve remaining and used credits for your ScrapeGraphAI account.
Get Endpoint Suggestions: Get AI-powered suggestions for creating scraping endpoints.
Get Live Session URL: Get a URL for a live browser session.
Get Markdownify History: Retrieve the history of markdownify webpage-to-Markdown conversion jobs.
Get Scrape History: Retrieve the history of scrape jobs from your ScrapeGraphAI account.
Get Searchscraper History: Get the history of searchscraper jobs with pagination support.
Get Sitemap History: Retrieve the history of sitemap extraction jobs.
Get Smartscraper History: Retrieve the history of smartscraper jobs.
Get Usage Timeline: Retrieve usage timeline statistics for your ScrapeGraphAI account.
Get Webhook Logs: Retrieve webhook delivery logs for a crawler job.
List Scheduled Jobs: Retrieve a paginated list of all scheduled scraping jobs for your account.
Markdownify Status: Check the status and retrieve results of a Markdownify webpage-to-Markdown conversion job.
Save Endpoint Configuration: Save custom scraping endpoint configurations to ScrapeGraphAI.
Search Scraper: Perform AI-powered web searches with structured, parsed results.
Check SearchScraper Status: Check the status and results of an asynchronous SearchScraper job.
SmartCrawler Status: Check the status and retrieve results of a SmartCrawler web crawling job.
Start Smart Scraper: Start AI-powered web scraping with natural language extraction prompts.
SmartScraper Status: Check the status and retrieve results of a SmartScraper web scraping job.
Start Smart Crawler (Async): Start a multi-page web crawl using SmartCrawler for AI-powered data extraction.
Submit Feedback: Submit feedback and ratings for completed ScrapeGraphAI requests.
Submit Product Feedback: Submit product feedback for ScrapeGraphAI.
Convert JSON to TOON Format: Convert JSON data to TOON (Token-Oriented Object Notation) format.
Validate API Key: Validate your ScrapeGraphAI API key to ensure it is active and authorized.

What is the Composio Tool Router, and how does it fit here?

What is the Composio SDK?

The Composio SDK, together with its Tool Router, helps agents find the right tools for a task at runtime. You can plug in multiple toolkits (like Gmail, HubSpot, and GitHub), and the agent will identify the relevant app and action to complete multi-step workflows. This can reduce token usage and improve the reliability of tool calls. Read more here: Getting started with Composio SDK

The Tool Router generates a secure MCP URL that your agents can access to perform actions.

How the Composio SDK works

The Composio SDK follows a three-phase workflow:

  1. Discovery: Searches for tools matching your task and returns relevant toolkits with their details.
  2. Authentication: Checks for active connections. If missing, creates an auth config and returns a connection URL via Auth Link.
  3. Execution: Executes the action using the authenticated connection.
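
In practice you rarely drive these phases by hand: creating a Tool Router session (as this guide does below) handles discovery and authentication behind a single MCP URL. Here's a rough sketch of the flow using the same session call this guide uses; the comment about the auth link describes the behavior at a high level, not an exact SDK surface:

python
from composio import Composio

composio = Composio(api_key="your_composio_api_key")

# Phases 1-2: the session scopes tool discovery to the listed toolkits and
# relies on an active ScrapeGraphAI connection for the given user. If the
# toolkit is not yet connected, Composio provides an auth link to authorize it.
session = composio.create(user_id="your_user_id", toolkits=["scrapegraph_ai"])

# Phase 3: agents execute actions through the session's MCP endpoint.
print(session.mcp.url)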

Step-by-step Guide

Prerequisites

Before starting, make sure you have:
  • Python 3.9 or higher
  • A Composio account and API key
  • A ScrapeGraphAI connection authorized in Composio
  • An OpenAI API key for the CrewAI LLM
  • Basic familiarity with Python

Getting API Keys for OpenAI and Composio

OpenAI API Key
  • Go to the OpenAI dashboard and create an API key. You'll need credits to use the models, or you can connect to another model provider.
  • Keep the API key safe.
Composio API Key
  • Log in to the Composio dashboard.
  • Navigate to your API settings and generate a new API key.
  • Store this key securely as you'll need it for authentication.

Install dependencies

bash
pip install composio crewai "crewai-tools[mcp]" python-dotenv
What's happening:
  • composio is the Composio SDK, used to create the Tool Router session that exposes ScrapeGraphAI over MCP
  • crewai provides the Agent, Task, Crew, and LLM primitives
  • crewai-tools[mcp] includes the MCP helpers (the extra is quoted so shells like zsh don't expand the brackets)
  • python-dotenv loads environment variables from .env

Set up environment variables

Create a .env file in your project root:

bash
COMPOSIO_API_KEY=your_composio_api_key_here
COMPOSIO_USER_ID=your_user_id_here
OPENAI_API_KEY=your_openai_api_key_here

What's happening:

  • COMPOSIO_API_KEY authenticates with Composio
  • COMPOSIO_USER_ID scopes the session to your account
  • OPENAI_API_KEY lets CrewAI use your chosen OpenAI model

Import dependencies

python
import os
from composio import Composio
from crewai import Agent, Task, Crew
from crewai_tools import MCPServerAdapter
import dotenv

dotenv.load_dotenv()

COMPOSIO_API_KEY = os.getenv("COMPOSIO_API_KEY")
COMPOSIO_USER_ID = os.getenv("COMPOSIO_USER_ID")

if not COMPOSIO_API_KEY:
    raise ValueError("COMPOSIO_API_KEY is not set")
if not COMPOSIO_USER_ID:
    raise ValueError("COMPOSIO_USER_ID is not set")
What's happening:
  • The CrewAI classes define agents and tasks, and run the workflow
  • MCPServerAdapter (from crewai_tools) connects the agent to an MCP endpoint
  • Composio will give you a short-lived ScrapeGraphAI MCP URL

Create a Composio Tool Router session for ScrapeGraphAI

python
composio_client = Composio(api_key=COMPOSIO_API_KEY)
session = composio_client.create(user_id=COMPOSIO_USER_ID, toolkits=["scrapegraph_ai"])

url = session.mcp.url
What's happening:
  • You create a ScrapeGraphAI-only session through Composio
  • Composio returns an MCP HTTP URL that exposes the ScrapeGraphAI tools
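
If your app serves multiple end users, each one can get a session scoped to their own connected account. A small sketch, reusing the same call with per-user IDs (the IDs shown are illustrative):

python
# Illustrative multi-tenant usage: one scoped session per end user.
for end_user in ["alice@example.com", "bob@example.com"]:
    user_session = composio_client.create(
        user_id=end_user,
        toolkits=["scrapegraph_ai"],
    )
    print(end_user, "->", user_session.mcp.url)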

Initialize the MCP Server

python
server_params = {
    "url": url,
    "transport": "streamable-http",
    "headers": {"x-api-key": COMPOSIO_API_KEY},
}

with MCPServerAdapter(server_params) as tools:
    agent = Agent(
        role="Web Scraping Assistant",
        goal="Help users extract and summarize web data",
        backstory="You are a helpful assistant with access to ScrapeGraphAI web scraping tools.",
        tools=tools,
        verbose=False,
        max_iter=10,
    )
What's Happening:
  • Server Configuration: The code sets up connection parameters including the MCP server URL, the streamable HTTP transport, and Composio API key authentication.
  • MCP Adapter Bridge: MCPServerAdapter acts as a context manager that converts the Composio MCP tools into a CrewAI-compatible format.
  • Agent Setup: Creates a CrewAI Agent with a defined role (Web Scraping Assistant), a goal (help users extract web data), and access to the MCP tools.
  • Configuration Options: The agent includes settings like verbose=False for clean output and max_iter=10 to prevent infinite loops.
  • Dynamic Tool Usage: Once created, the agent automatically has access to all the ScrapeGraphAI tools exposed by the session and decides when to use them based on user queries (you can list them with the sketch below).
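
As a quick sanity check, you can print the names of the tools the adapter exposes. This assumes each adapted tool follows CrewAI's BaseTool interface with a name attribute:

python
with MCPServerAdapter(server_params) as tools:
    # Assumption: each adapted tool exposes a `name`, like CrewAI's BaseTool.
    for tool in tools:
        print(tool.name)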

Create a CLI Chatloop and define the Crew

python
print("Chat started! Type 'exit' or 'quit' to end.\n")

conversation_context = ""

while True:
    user_input = input("You: ").strip()

    if user_input.lower() in ["exit", "quit", "bye"]:
        print("\nGoodbye!")
        break

    if not user_input:
        continue

    conversation_context += f"\nUser: {user_input}\n"
    print("\nAgent is thinking...\n")

    task = Task(
        description=(
            f"Conversation history:\n{conversation_context}\n\n"
            f"Current request: {user_input}"
        ),
        expected_output="A helpful response addressing the user's request",
        agent=agent,
    )

    crew = Crew(agents=[agent], tasks=[task], verbose=False)
    result = crew.kickoff()
    response = str(result)

    conversation_context += f"Agent: {response}\n"
    print(f"Agent: {response}\n")
What's Happening:
  • Interactive CLI Setup: The code creates an infinite loop that continuously prompts for user input and maintains the entire conversation history in a string variable.
  • Input Validation: Empty inputs are ignored to prevent processing blank messages and keep the conversation clean.
  • Context Building: Each user message is appended to the conversation context, which preserves the full dialogue history for better agent responses (see the note on capping this below).
  • Dynamic Task Creation: For every user input, a new Task is created that includes both the full conversation history and the current request as context.
  • Crew Execution: A Crew is instantiated with the agent and task, then kicked off to process the request and generate a response.
  • Response Management: The agent's response is converted to a string, added to the conversation context, and displayed to the user, maintaining conversational continuity.
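
One caveat: conversation_context grows without bound, which eventually bloats prompts and token costs. A minimal mitigation sketch, keeping only the most recent characters (the limit is an illustrative value to tune for your model's context window):

python
MAX_CONTEXT_CHARS = 8000  # illustrative limit, not a CrewAI or Composio setting

# Trim the oldest part of the history before building the next task.
if len(conversation_context) > MAX_CONTEXT_CHARS:
    conversation_context = conversation_context[-MAX_CONTEXT_CHARS:]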

Complete Code

Here's the complete code to get you started with ScrapeGraphAI and CrewAI:

python
from crewai import Agent, Task, Crew, LLM
from crewai_tools import MCPServerAdapter
from composio import Composio
from dotenv import load_dotenv
import os

load_dotenv()

OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")
COMPOSIO_API_KEY = os.getenv("COMPOSIO_API_KEY")
COMPOSIO_USER_ID = os.getenv("COMPOSIO_USER_ID")

if not OPENAI_API_KEY:
    raise ValueError("OPENAI_API_KEY is not set in the environment.")
if not COMPOSIO_API_KEY:
    raise ValueError("COMPOSIO_API_KEY is not set in the environment.")
if not COMPOSIO_USER_ID:
    raise ValueError("COMPOSIO_USER_ID is not set in the environment.")

# Initialize Composio and create a session
composio = Composio(api_key=COMPOSIO_API_KEY)
session = composio.create(
    user_id=COMPOSIO_USER_ID,
    toolkits=["scrapegraph_ai"],
)
url = session.mcp.url

# Configure the LLM
llm = LLM(
    model="gpt-5",
    api_key=OPENAI_API_KEY,
)

server_params = {
    "url": url,
    "transport": "streamable-http",
    "headers": {"x-api-key": COMPOSIO_API_KEY},
}

with MCPServerAdapter(server_params) as tools:
    agent = Agent(
        role="Web Scraping Assistant",
        goal="Help users extract and summarize web data",
        backstory="You are an expert assistant with access to ScrapeGraphAI web scraping tools.",
        tools=tools,
        llm=llm,
        verbose=False,
        max_iter=10,
    )

    print("Chat started! Type 'exit' or 'quit' to end.\n")

    conversation_context = ""

    while True:
        user_input = input("You: ").strip()

        if user_input.lower() in ["exit", "quit", "bye"]:
            print("\nGoodbye!")
            break

        if not user_input:
            continue

        conversation_context += f"\nUser: {user_input}\n"
        print("\nAgent is thinking...\n")

        task = Task(
            description=(
                f"Conversation history:\n{conversation_context}\n\n"
                f"Current request: {user_input}"
            ),
            expected_output="A helpful response addressing the user's request",
            agent=agent,
        )

        crew = Crew(agents=[agent], tasks=[task], verbose=False)
        result = crew.kickoff()
        response = str(result)

        conversation_context += f"Agent: {response}\n"
        print(f"Agent: {response}\n")

Conclusion

You now have a CrewAI agent connected to ScrapeGraphAI through Composio's Tool Router. The agent can perform ScrapeGraphAI operations through natural language commands.

Next steps:

  • Add role-specific instructions to customize agent behavior
  • Plug in more toolkits for multi-app workflows
  • Chain tasks for complex multi-step operations (see the sketch below)
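
For example, a multi-app session with chained tasks might look like the sketch below. The "gmail" toolkit slug and the task wiring are illustrative, and it relies on CrewAI's Task context parameter to pass one task's output into the next:

python
# Illustrative multi-app session: ScrapeGraphAI plus a second toolkit.
session = composio.create(
    user_id=COMPOSIO_USER_ID,
    toolkits=["scrapegraph_ai", "gmail"],  # "gmail" is an illustrative slug
)

# Chained tasks: the second task receives the first task's output as context.
scrape_task = Task(
    description="Convert https://example.com to markdown and summarize it.",
    expected_output="A short summary of the page.",
    agent=agent,
)
email_task = Task(
    description="Draft an email sharing the summary with the team.",
    expected_output="A drafted email body.",
    agent=agent,
    context=[scrape_task],  # pass the scrape result into this task
)

crew = Crew(agents=[agent], tasks=[scrape_task, email_task], verbose=False)
print(crew.kickoff())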

FAQ

What are the differences between the Tool Router MCP and a standalone ScrapeGraphAI MCP server?

With a standalone ScrapeGraphAI MCP server, agents and LLMs can only access a fixed set of ScrapeGraphAI tools tied to that server. With the Composio Tool Router, agents can dynamically load tools from ScrapeGraphAI and many other apps based on the task at hand, all through a single MCP endpoint.

Can I use Tool Router MCP with CrewAI?

Yes, you can. CrewAI fully supports MCP integration. You get structured tool calling, message-history handling, and model orchestration, while the Tool Router takes care of discovering and serving the right ScrapeGraphAI tools.

Can I manage the permissions and scopes for ScrapeGraphAI while using the Tool Router?

Yes, absolutely. You can configure which ScrapeGraphAI scopes and actions are allowed when connecting your account to Composio. You can also bring your own OAuth credentials or API configuration, so you keep full control over what the agent can do.

How safe is my data with Composio Tool Router?

All sensitive data such as tokens, keys, and configuration is fully encrypted at rest and in transit. Composio is SOC 2 Type 2 compliant and follows strict security practices, so your ScrapeGraphAI data and credentials are handled as safely as possible.
