# Scrapfly

```json
{
  "name": "Scrapfly",
  "slug": "scrapfly",
  "url": "https://composio.dev/toolkits/scrapfly",
  "markdown_url": "https://composio.dev/toolkits/scrapfly.md",
  "logo_url": "https://logos.composio.dev/api/scrapfly",
  "categories": [
    "data & analytics"
  ],
  "is_composio_managed": false,
  "updated_at": "2026-05-12T10:24:48.991Z"
}
```

![Scrapfly logo](https://logos.composio.dev/api/scrapfly)

## Description

Securely connect your AI agents and chatbots (Claude, ChatGPT, Cursor, etc.) to Scrapfly via MCP or the direct API to scrape web pages, extract structured data, render JavaScript-heavy sites, and bypass anti-bot protection through natural language.

## Summary

Scrapfly is a powerful web scraping API that extracts data from any website. It lets developers handle tough sites through JavaScript rendering, rotating proxies, and anti-bot bypass.

## Categories

- data & analytics

## Toolkit Details

- Tools: 12

## Images

- Logo: https://logos.composio.dev/api/scrapfly

## Authentication

- **API Key**
  - Type: `api_key`
  - Description: API key authentication for Scrapfly.
  - Setup:
    - Configure your Scrapfly API key credentials.
    - Use the credentials when creating an auth config in Composio (see the sketch below).
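
If you prefer to do this in code rather than through the dashboard, the sketch below shows one way to register a Scrapfly API key with the Composio Python SDK. Treat it as a minimal sketch: it follows Composio's generic API-key flow, and the exact option and credential field names are assumptions, so verify them against the Composio docs before use.

```python
from composio import Composio

composio = Composio(api_key="your-composio-api-key")

# Create an auth config for Scrapfly backed by your own Scrapfly API key.
# The options/credentials shape below is illustrative; field names may differ.
auth_config = composio.auth_configs.create(
    toolkit="scrapfly",
    options={
        "type": "use_custom_auth",
        "auth_scheme": "API_KEY",
        "credentials": {"api_key": "your-scrapfly-api-key"},
    },
)

# Link the auth config to a specific user so agents can act on that user's behalf.
connection = composio.connected_accounts.initiate(
    user_id="your-user-id",
    auth_config_id=auth_config.id,
)
print(connection.id)
```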

## Suggested Prompts

- Extract product prices from Amazon listings
- Scrape job postings from LinkedIn search
- Get latest news headlines from BBC homepage
- Collect real estate listings from Zillow

## Supported Tools

| Tool slug | Name | Description |
|---|---|---|
| `SCRAPFLY_CAPTURE_SCREENSHOT` | Capture Website Screenshot | Tool to capture a full-page or viewport screenshot of a website. Use when you need to take a screenshot with options like JS rendering, custom resolution, or accessibility testing. Returns the screenshot image directly. Supports vision deficiency simulations and dark mode. |
| `SCRAPFLY_CAPTURE_SCREENSHOT_HEAD` | Capture Screenshot Metadata (HEAD) | Tool to capture screenshot metadata without downloading the image body. Use this for async screenshot workflows where you need the URL to retrieve the image later. Returns the screenshot URL in response, saving bandwidth compared to full screenshot retrieval. |
| `SCRAPFLY_CREATE_CRAWLER` | Create Scrapfly Crawler | Tool to create a new web crawler to recursively crawl an entire website. Returns a crawler UUID for tracking progress. Use when you need to crawl multiple pages from a website with configurable limits and extraction rules. |
| `SCRAPFLY_EXTRACT_DATA` | Extract Structured Data | Tool to extract structured data from HTML or other content using AI models, LLM prompts, or custom templates. Use when you need to parse web pages or documents into structured JSON data. Supports predefined extraction models for common types (articles, products, events) or custom extraction via prompts/templates. |
| `SCRAPFLY_GET_ACCOUNT_INFO` | Get Scrapfly Account Information | Tool to retrieve Scrapfly account information. Use after authenticating to get API credit balance and usage stats. Returns comprehensive account data including subscription plan, usage statistics, billing info, and project settings. |
| `SCRAPFLY_GET_CRAWLER_ARTIFACT` | Get Crawler Artifact | Tool to download crawler artifact files in WARC or HAR format. Use when you need to retrieve the complete crawl results as an archive file. WARC format is recommended for large crawls as it includes gzip compression. |
| `SCRAPFLY_GET_CRAWLER_CONTENTS` | Get Crawler Contents | Tool to retrieve extracted content from crawled pages. Supports multiple output formats including markdown, text, HTML, and JSON. Use when you need to access the actual content extracted during a crawl, with optional filtering by URL and format selection. |
| `SCRAPFLY_GET_CRAWLER_STATUS` | Get Crawler Status | Tool to get the current status of a crawler including progress, pages crawled, and completion state. Use for polling workflow to monitor crawl progress. |
| `SCRAPFLY_GET_CRAWLER_URLS` | Get Crawler URLs | Tool to retrieve the list of discovered and crawled URLs from a crawler. Use when you need to get all URLs found during a crawl or filter by status to analyze failed URLs with error codes. Supports pagination for large result sets. |
| `SCRAPFLY_SCRAPE` | Scrapfly Scrape | Tool to perform a web scraping request. Use when you need to fetch a page with custom configuration like JS rendering, proxies, and extraction. |
| `SCRAPFLY_SCRAPE_POST` | Scrapfly Scrape POST | Tool to scrape web pages using POST method to send data in the request body. Use when you need to scrape endpoints that require POST requests, such as form submissions or APIs that expect data payload. |
| `SCRAPFLY_SCRAPE_WITH_PUT` | Scrape With PUT | Tool to scrape web pages using PUT method with body payload. Use when the target API requires PUT requests with data in the request body. Forwards PUT request with custom body to the target URL. If not specified, content-type defaults to application/x-www-form-urlencoded. |
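
Any of these tools can also be executed directly, with no LLM in the loop, which is handy for verifying a connection before handing the toolkit to an agent. The snippet below is a minimal sketch using the Composio Python SDK's generic `tools.execute` entry point; the `url` and `render_js` argument names are assumptions about the `SCRAPFLY_SCRAPE` input schema, so inspect the tool's schema before relying on them.

```python
from composio import Composio

composio = Composio(api_key="your-composio-api-key")

# Direct tool execution without an agent. Argument names (url, render_js) are
# illustrative; confirm the exact input fields in the tool's published schema.
result = composio.tools.execute(
    "SCRAPFLY_SCRAPE",
    user_id="your-user-id",
    arguments={
        "url": "https://web-scraping.dev/products",  # example target page
        "render_js": True,
    },
)
print(result)
```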

## Supported Triggers

None listed.

## Installation and MCP Setup

### Path 1: SDK Installation

#### Path 1, Step 1: Install Composio

Install the Composio SDK
```python
pip install composio_openai
```

```typescript
npm install @composio/openai
```

#### Path 1, Step 2: Initialize Composio and Create Tool Router Session

Import and initialize the Composio client, then create a Tool Router session
```python
from openai import OpenAI
from composio import Composio
from composio_openai import OpenAIResponsesProvider

composio = Composio(provider=OpenAIResponsesProvider())
openai = OpenAI()
session = composio.create(user_id='your-user-id')
```

```typescript
import OpenAI from 'openai';
import { Composio } from '@composio/core';
import { OpenAIResponsesProvider } from '@composio/openai';

const composio = new Composio({
  provider: new OpenAIResponsesProvider(),
});
const openai = new OpenAI({});
const session = await composio.create('your-user-id');
```

#### Path 1, Step 3: Execute Scrapfly Tools via Tool Router with Your Agent

Get tools from the Tool Router session and execute Scrapfly actions with your agent
```python
tools = session.tools
response = openai.responses.create(
  model='gpt-4.1',
  tools=tools,
  input=[{
    'role': 'user',
    'content': 'Scrape product prices from amazon.com laptops section'
  }]
)
result = composio.provider.handle_tool_calls(
  response=response,
  user_id='your-user-id'
)
print(result)
```

```typescript
const tools = session.tools;
const response = await openai.responses.create({
  model: 'gpt-4.1',
  tools: tools,
  input: [{
    role: 'user',
    content: 'Scrape product prices from amazon.com laptops section'
  }],
});
const result = await composio.provider.handleToolCalls(
  'your-user-id',
  response.output
);
console.log(result);
```

### Path 2: MCP Server Setup

#### Path 2, Step 1: Install Composio

Install the Composio SDK and Claude Agent SDK
```python
pip install composio claude-agent-sdk
```

```typescript
npm install @composio/core ai @ai-sdk/openai @ai-sdk/mcp
```

#### Path 2, Step 2: Create Tool Router Session

Initialize the Composio client and create a Tool Router session
```python
from composio import Composio
from claude_agent_sdk import ClaudeSDKClient, ClaudeAgentOptions

composio = Composio(api_key='your-composio-api-key')
session = composio.create(user_id='your-user-id')
url = session.mcp.url
```

```typescript
import { Composio } from '@composio/core';

const composio = new Composio({ apiKey: 'your-api-key' });

console.log("Creating Tool Router session...");
const { mcp } = await composio.create('your-user-id');
console.log(`Tool Router session created: ${mcp.url}`);
```

#### Path 2, Step 3: Connect to AI Agent

Use the MCP server with your AI agent
```python
import asyncio

options = ClaudeAgentOptions(
    permission_mode='bypassPermissions',
    mcp_servers={
        'tool_router': {
            'type': 'http',
            'url': url,
            'headers': {
                'x-api-key': 'your-composio-api-key'
            }
        }
    },
    system_prompt='You are a helpful assistant with access to Scrapfly tools.',
    max_turns=10
)

async def main():
    async with ClaudeSDKClient(options=options) as client:
        await client.query('Scrape the latest news headlines from bbc.com homepage')
        async for message in client.receive_response():
            if hasattr(message, 'content'):
                for block in message.content:
                    if hasattr(block, 'text'):
                        print(block.text)

asyncio.run(main())
```

```typescript
import { openai } from '@ai-sdk/openai';
import { experimental_createMCPClient as createMCPClient } from '@ai-sdk/mcp';
import { generateText, stepCountIs } from 'ai';

const client = await createMCPClient({
  transport: {
    type: 'http',
    url: mcp.url,
    headers: { 'x-api-key': 'your-composio-api-key' }
  }
});

const tools = await client.tools();

const { text } = await generateText({
  model: openai('gpt-4o'),
  tools,
  messages: [{ role: 'user', content: 'Scrape the latest news headlines from bbc.com homepage' }],
  stopWhen: stepCountIs(5)
});

console.log(`Agent: ${text}`);
```

## Why Use Composio?

### 1. AI Native Scrapfly Integration

- Supports both Scrapfly MCP and direct API-based integrations
- Structured, LLM-friendly schemas for reliable tool execution
- Rich coverage for reading, writing, and querying your Scrapfly data

### 2. Managed Auth

- Secure, managed storage of your Scrapfly API key, with OAuth token refresh and rotation for toolkits that use it
- Central place to manage, scope, and revoke Scrapfly access
- Per-user and per-environment credentials instead of hard-coded keys

### 3. Agent Optimized Design

- Tools are tuned using real error and success rates to improve reliability over time
- Comprehensive execution logs so you always know what ran, when, and on whose behalf

### 4. Enterprise Grade Security

- Fine-grained RBAC so you control which agents and users can access Scrapfly
- Scoped, least privilege access to Scrapfly resources
- Full audit trail of agent actions to support review and compliance

## Use Scrapfly with any AI Agent Framework

Choose a framework you want to connect Scrapfly with:

- [OpenAI Agents SDK](https://composio.dev/toolkits/scrapfly/framework/open-ai-agents-sdk)
- [Claude Agent SDK](https://composio.dev/toolkits/scrapfly/framework/claude-agents-sdk)
- [Claude Code](https://composio.dev/toolkits/scrapfly/framework/claude-code)
- [Claude Cowork](https://composio.dev/toolkits/scrapfly/framework/claude-cowork)
- [Codex](https://composio.dev/toolkits/scrapfly/framework/codex)
- [OpenClaw](https://composio.dev/toolkits/scrapfly/framework/openclaw)
- [Hermes](https://composio.dev/toolkits/scrapfly/framework/hermes-agent)
- [Google ADK](https://composio.dev/toolkits/scrapfly/framework/google-adk)
- [LangChain](https://composio.dev/toolkits/scrapfly/framework/langchain)
- [Vercel AI SDK](https://composio.dev/toolkits/scrapfly/framework/ai-sdk)
- [Mastra AI](https://composio.dev/toolkits/scrapfly/framework/mastra-ai)
- [LlamaIndex](https://composio.dev/toolkits/scrapfly/framework/llama-index)
- [CrewAI](https://composio.dev/toolkits/scrapfly/framework/crew-ai)
- [Pydantic AI](https://composio.dev/toolkits/scrapfly/framework/pydantic-ai)
- [AutoGen](https://composio.dev/toolkits/scrapfly/framework/autogen)

## Related Toolkits

- [Excel](https://composio.dev/toolkits/excel) - Microsoft Excel is a robust spreadsheet application for organizing, analyzing, and visualizing data. It's the go-to tool for calculations, reporting, and flexible data management.
- [21risk](https://composio.dev/toolkits/_21risk) - 21RISK is a web app built for easy checklist, audit, and compliance management. It streamlines risk processes so teams can focus on what matters.
- [Abstract](https://composio.dev/toolkits/abstract) - Abstract provides a suite of APIs for automating data validation and enrichment tasks. It helps developers streamline workflows and ensure data quality with minimal effort.
- [Addressfinder](https://composio.dev/toolkits/addressfinder) - Addressfinder is a data quality platform for verifying addresses, emails, and phone numbers. It helps you ensure accurate customer and contact data every time.
- [Agenty](https://composio.dev/toolkits/agenty) - Agenty is a web scraping and automation platform for extracting data and automating browser tasks—no coding needed. It streamlines data collection, monitoring, and repetitive online actions.
- [Ambee](https://composio.dev/toolkits/ambee) - Ambee is an environmental data platform providing real-time, hyperlocal APIs for air quality, weather, and pollen. Get precise environmental insights to power smarter decisions in your apps and workflows.
- [Ambient weather](https://composio.dev/toolkits/ambient_weather) - Ambient Weather is a platform for personal weather stations with a robust API for accessing local, real-time, and historical weather data. Get detailed environmental insights directly from your own sensors for smarter apps and automations.
- [Anonyflow](https://composio.dev/toolkits/anonyflow) - Anonyflow is a service for encryption-based data anonymization and secure data sharing. It helps organizations meet GDPR, CCPA, and HIPAA data privacy compliance requirements.
- [Api ninjas](https://composio.dev/toolkits/api_ninjas) - Api ninjas offers 120+ public APIs spanning categories like weather, finance, sports, and more. Developers use it to supercharge apps with real-time data and actionable endpoints.
- [Api sports](https://composio.dev/toolkits/api_sports) - Api sports is a comprehensive sports data platform covering 2,000+ competitions with live scores and 15+ years of stats. Instantly access up-to-date sports information for analysis, apps, or chatbots.
- [Apify](https://composio.dev/toolkits/apify) - Apify is a cloud platform for building, deploying, and managing web scraping and automation tools called Actors. It lets you automate data extraction and workflow tasks at scale—no infrastructure headaches.
- [Autom](https://composio.dev/toolkits/autom) - Autom is a lightning-fast search engine results data platform for Google, Bing, and Brave. Developers use it to access fresh, low-latency SERP data on demand.
- [Beaconchain](https://composio.dev/toolkits/beaconchain) - Beaconchain is a real-time analytics platform for Ethereum 2.0's Beacon Chain. It provides detailed insights into validators, blocks, and overall network performance.
- [Big data cloud](https://composio.dev/toolkits/big_data_cloud) - BigDataCloud provides APIs for geolocation, reverse geocoding, and address validation. Instantly access reliable location intelligence to enhance your applications and workflows.
- [Bigpicture io](https://composio.dev/toolkits/bigpicture_io) - BigPicture.io offers APIs for accessing detailed company and profile data. Instantly enrich your applications with up-to-date insights on 20M+ businesses.
- [Bitquery](https://composio.dev/toolkits/bitquery) - Bitquery is a blockchain data platform offering indexed, real-time, and historical data from 40+ blockchains via GraphQL APIs. Get unified, reliable access to complex on-chain data for analytics, trading, and research.
- [Brightdata](https://composio.dev/toolkits/brightdata) - Brightdata is a leading web data platform offering advanced scraping, SERP APIs, and anti-bot tools. It lets you collect public web data at scale, bypassing blocks and friction.
- [Builtwith](https://composio.dev/toolkits/builtwith) - BuiltWith is a web technology profiler that uncovers the technologies powering any website. Gain actionable insights into analytics, hosting, and content management stacks for smarter research and lead generation.
- [Byteforms](https://composio.dev/toolkits/byteforms) - Byteforms is an all-in-one platform for creating forms, managing submissions, and integrating data. It streamlines workflows by centralizing form data collection and automation.
- [Cabinpanda](https://composio.dev/toolkits/cabinpanda) - Cabinpanda is a data collection platform for building and managing online forms. It helps streamline how you gather, organize, and analyze responses.

## Frequently Asked Questions

### Do I need my own developer credentials to use Scrapfly with Composio?

Yes, Scrapfly requires you to configure your own API key credentials. Once set up, Composio handles secure credential storage and API request handling for you.

### Can I use multiple toolkits together?

Yes! Composio's Tool Router enables agents to use multiple toolkits. [Learn more](https://docs.composio.dev/tool-router/overview).

### Is Composio secure?

Composio is SOC 2 and ISO 27001 compliant with all data encrypted in transit and at rest. [Learn more](https://trust.composio.dev).

### What if the API changes?

Composio maintains and updates all toolkit integrations automatically, so your agents always work with the latest API versions.

---
[See all toolkits](https://composio.dev/toolkits) · [Composio docs](https://docs.composio.dev/llms.txt)
