# Scrapingant

```json
{
  "name": "Scrapingant",
  "slug": "scrapingant",
  "url": "https://composio.dev/toolkits/scrapingant",
  "markdown_url": "https://composio.dev/toolkits/scrapingant.md",
  "logo_url": "https://logos.composio.dev/api/scrapingant",
  "categories": [
    "data & analytics"
  ],
  "is_composio_managed": false,
  "updated_at": "2026-05-12T10:24:50.976Z"
}
```

![Scrapingant logo](https://logos.composio.dev/api/scrapingant)

## Description

Securely connect your AI agents and chatbots (Claude, ChatGPT, Cursor, etc.) with Scrapingant via MCP or the direct API to extract web data, render JavaScript-heavy pages, rotate proxies, and scrape at scale through natural language.

## Summary

ScrapingAnt is a web scraping API with Chrome rendering and proxy rotation. It enables reliable, scalable extraction from any website, even those with JavaScript or anti-bot protection.

## Categories

- data & analytics

## Toolkit Details

- Tools: 7

## Images

- Logo: https://logos.composio.dev/api/scrapingant

## Authentication

- **API Key**
  - Type: `api_key`
  - Description: API key authentication for Scrapingant.
  - Setup:
    - Configure API key credentials for Scrapingant.
    - Use the credentials when creating an auth config in Composio.
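Outside of Composio, the same key can be used to call ScrapingAnt directly. The sketch below only *builds* a request for ScrapingAnt's v2 `general` endpoint rather than sending it; the endpoint path, `browser` parameter, and `x-api-key` header are assumptions based on ScrapingAnt's public API documentation, not something this toolkit guarantees.

```python
import os
from urllib.parse import urlencode

# Assumed ScrapingAnt v2 endpoint; confirm the current path in ScrapingAnt's API docs.
SCRAPINGANT_ENDPOINT = "https://api.scrapingant.com/v2/general"

def build_scrape_request(target_url: str, browser: bool = True) -> tuple[str, dict]:
    """Return (request_url, headers) for a direct ScrapingAnt scrape call."""
    api_key = os.environ.get("SCRAPINGANT_API_KEY", "your-api-key")
    query = urlencode({"url": target_url, "browser": str(browser).lower()})
    headers = {"x-api-key": api_key}  # the key travels in a header, not the URL
    return f"{SCRAPINGANT_ENDPOINT}?{query}", headers

url, headers = build_scrape_request("https://example.com")
# The target URL is percent-encoded into the query string.
```

Keeping the key in an environment variable mirrors what Composio's managed auth does for you: the credential never appears in the request URL or in agent prompts.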

## Suggested Prompts

- Extract product prices from Amazon search page
- Convert a blog post to markdown format
- Get API usage stats for my account
- Scrape latest headlines and summaries from CNN

## Supported Tools

| Tool slug | Name | Description |
|---|---|---|
| `SCRAPINGANT_EXTRACT_CONTENT_AS_MARKDOWN` | Extract Content as Markdown | This tool extracts content from a given URL and converts it into Markdown format. It is particularly useful for preparing text for Large Language Models (LLMs) and Retrieval-Augmented Generation (RAG) systems. It supports GET, POST, PUT, and DELETE methods. |
| `SCRAPINGANT_EXTRACT_DATA_WITH_AI` | Extract Data with AI | This tool allows you to extract structured data from a web page using ScrapingAnt's AI-powered extraction capabilities. You provide a URL and an AI query (prompt) describing what data you want to extract, and the tool returns the extracted data in a structured format. It supports additional parameters for browser rendering, proxies, and cookies to handle dynamic content and localization. |
| `SCRAPINGANT_GET_API_CREDITS_USAGE` | Get API Credits Usage | This tool retrieves the current API credit usage status for the authenticated ScrapingAnt account. It enables users to monitor their consumption of API credits, check their current usage against the subscription limits, and manage their API credits effectively. |
| `SCRAPINGANT_SCRAPE_WEB_PAGE` | Scrape Web Page | This tool scrapes a web page using the ScrapingAnt API. It fetches the HTML content of the specified URL. Users can customize the scraping behavior by enabling a headless browser, using proxies, waiting for specific elements, executing JavaScript, passing cookies, and blocking certain resources. |
| `SCRAPINGANT_SCRAPE_WEBPAGE_POST` | Scrape Webpage via POST | Tool to perform a POST request through ScrapingAnt's proxy to scrape a webpage. Use when you need to scrape pages that require POST method, such as form submissions or APIs that only accept POST requests. Data is forwarded transparently to the target web page. |
| `SCRAPINGANT_SCRAPE_WEBPAGE_PUT` | Scrape Webpage with PUT | Tool to perform a PUT request through ScrapingAnt's proxy to scrape a webpage that requires PUT method. Use when the target webpage requires PUT method for data submission. Data is forwarded transparently to the target web page. |
| `SCRAPINGANT_SCRAPE_WITH_EXTENDED_JSON_OUTPUT` | Scrape with Extended JSON Output | Scrapes a web page and returns comprehensive data via ScrapingAnt's extended endpoint, which provides much richer output than standard scraping: full HTML and extracted plain text; all cookies and HTTP response headers from the target page; captured XHR/Fetch API requests made by the page (useful for finding hidden APIs); and content from embedded iframes. Best used when you need more than just the HTML, such as analyzing cookies, headers, or JavaScript API calls made by a page. For simple HTML scraping, consider the basic scrape tool instead for lower API credit usage. |
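The tools above overlap at different cost/richness points, so an agent harness often routes requests to a slug before execution. A minimal sketch of such a selection helper, assuming the slugs from the table (the `need` keys and routing logic are illustrative, not part of the toolkit):

```python
# Illustrative mapping from a caller's need to a Scrapingant tool slug.
# Trade-off mirrors the table: richer output costs more API credits.
TOOL_BY_NEED = {
    "markdown": "SCRAPINGANT_EXTRACT_CONTENT_AS_MARKDOWN",    # LLM/RAG-ready text
    "structured": "SCRAPINGANT_EXTRACT_DATA_WITH_AI",         # AI-guided extraction
    "html": "SCRAPINGANT_SCRAPE_WEB_PAGE",                    # cheapest: raw HTML
    "full": "SCRAPINGANT_SCRAPE_WITH_EXTENDED_JSON_OUTPUT",   # cookies, headers, XHR
}

def pick_tool(need: str, method: str = "GET") -> str:
    """Pick a tool slug; non-GET scrapes route to the method-specific tools."""
    if method.upper() == "POST":
        return "SCRAPINGANT_SCRAPE_WEBPAGE_POST"
    if method.upper() == "PUT":
        return "SCRAPINGANT_SCRAPE_WEBPAGE_PUT"
    # Fall back to the cheapest general-purpose scraper.
    return TOOL_BY_NEED.get(need, "SCRAPINGANT_SCRAPE_WEB_PAGE")
```

In practice the Tool Router makes this choice for the agent; a helper like this is mainly useful when invoking tools programmatically with fixed budgets.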

## Supported Triggers

None listed.

## Installation and MCP Setup

### Path 1: SDK Installation

#### Path 1, Step 1: Install Composio

Install the Composio SDK
```python
pip install composio_openai
```

```typescript
npm install @composio/openai
```

#### Path 1, Step 2: Initialize Composio and Create Tool Router Session

Import and initialize Composio client, then create a Tool Router session
```python
from openai import OpenAI
from composio import Composio
from composio_openai import OpenAIResponsesProvider

composio = Composio(provider=OpenAIResponsesProvider())
openai = OpenAI()
session = composio.create(user_id='your-user-id')
```

```typescript
import OpenAI from 'openai';
import { Composio } from '@composio/core';
import { OpenAIResponsesProvider } from '@composio/openai';

const composio = new Composio({
  provider: new OpenAIResponsesProvider(),
});
const openai = new OpenAI({});
const session = await composio.create('your-user-id');
```

#### Path 1, Step 3: Execute Scrapingant Tools via Tool Router with Your Agent

Get tools from Tool Router session and execute Scrapingant actions with your Agent
```python
tools = session.tools
response = openai.responses.create(
  model='gpt-4.1',
  tools=tools,
  input=[{
    'role': 'user',
    'content': 'Extract all product titles and prices from this Amazon search results page.'
  }]
)
result = composio.provider.handle_tool_calls(
  response=response,
  user_id='your-user-id'
)
print(result)
```

```typescript
const tools = session.tools;
const response = await openai.responses.create({
  model: 'gpt-4.1',
  tools: tools,
  input: [{
    role: 'user',
    content: 'Extract all product titles and prices from this Amazon search results page.'
  }],
});
const result = await composio.provider.handleToolCalls(
  'your-user-id',
  response.output
);
console.log(result);
```

### Path 2: MCP Server Setup

#### Path 2, Step 1: Install Composio

Install the Composio SDK and Claude Agent SDK
```python
pip install composio claude-agent-sdk
```

```typescript
npm install @composio/core ai @ai-sdk/openai @ai-sdk/mcp
```

#### Path 2, Step 2: Create Tool Router Session

Initialize the Composio client and create a Tool Router session
```python
from composio import Composio
from claude_agent_sdk import ClaudeSDKClient, ClaudeAgentOptions

composio = Composio(api_key='your-composio-api-key')
session = composio.create(user_id='your-user-id')
url = session.mcp.url
```

```typescript
import { Composio } from '@composio/core';

const composio = new Composio({ apiKey: 'your-api-key' });

console.log("Creating Tool Router session...");
const { mcp } = await composio.create('your-user-id');
console.log(`Tool Router session created: ${mcp.url}`);
```

#### Path 2, Step 3: Connect to AI Agent

Use the MCP server with your AI agent
```python
import asyncio

options = ClaudeAgentOptions(
    permission_mode='bypassPermissions',
    mcp_servers={
        'tool_router': {
            'type': 'http',
            'url': url,
            'headers': {
                'x-api-key': 'your-composio-api-key'
            }
        }
    },
    system_prompt='You are a helpful assistant with access to Scrapingant tools.',
    max_turns=10
)

async def main():
    async with ClaudeSDKClient(options=options) as client:
        await client.query('Extract content as markdown from https://news.ycombinator.com/')
        async for message in client.receive_response():
            if hasattr(message, 'content'):
                for block in message.content:
                    if hasattr(block, 'text'):
                        print(block.text)

asyncio.run(main())
```

```typescript
import { openai } from '@ai-sdk/openai';
import { experimental_createMCPClient as createMCPClient } from '@ai-sdk/mcp';
import { generateText, stepCountIs } from 'ai';

const client = await createMCPClient({
  transport: {
    type: 'http',
    url: mcp.url,
    headers: { 'x-api-key': 'your-composio-api-key' }
  }
});

const tools = await client.tools();

const { text } = await generateText({
  model: openai('gpt-4o'),
  tools,
  messages: [{ role: 'user', content: 'Extract content as markdown from https://news.ycombinator.com/' }],
  stopWhen: stepCountIs(5)
});

console.log(`Agent: ${text}`);
```

## Why Use Composio?

### 1. AI Native Scrapingant Integration

- Supports both Scrapingant MCP and direct API-based integrations
- Structured, LLM-friendly schemas for reliable tool execution
- Rich coverage for reading, writing, and querying your Scrapingant data

### 2. Managed Auth

- Built-in API key handling with secure storage
- Central place to manage, scope, and revoke Scrapingant keys
- Per user and per environment credentials instead of hard-coded keys

### 3. Agent Optimized Design

- Tools are tuned using real error and success rates to improve reliability over time
- Comprehensive execution logs so you always know what ran, when, and on whose behalf

### 4. Enterprise Grade Security

- Fine-grained RBAC so you control which agents and users can access Scrapingant
- Scoped, least privilege access to Scrapingant resources
- Full audit trail of agent actions to support review and compliance

## Use Scrapingant with any AI Agent Framework

Choose a framework you want to connect Scrapingant with:

- [OpenAI Agents SDK](https://composio.dev/toolkits/scrapingant/framework/open-ai-agents-sdk)
- [Claude Agent SDK](https://composio.dev/toolkits/scrapingant/framework/claude-agents-sdk)
- [Claude Code](https://composio.dev/toolkits/scrapingant/framework/claude-code)
- [Claude Cowork](https://composio.dev/toolkits/scrapingant/framework/claude-cowork)
- [Codex](https://composio.dev/toolkits/scrapingant/framework/codex)
- [OpenClaw](https://composio.dev/toolkits/scrapingant/framework/openclaw)
- [Hermes](https://composio.dev/toolkits/scrapingant/framework/hermes-agent)
- [Google ADK](https://composio.dev/toolkits/scrapingant/framework/google-adk)
- [LangChain](https://composio.dev/toolkits/scrapingant/framework/langchain)
- [Vercel AI SDK](https://composio.dev/toolkits/scrapingant/framework/ai-sdk)
- [Mastra AI](https://composio.dev/toolkits/scrapingant/framework/mastra-ai)
- [LlamaIndex](https://composio.dev/toolkits/scrapingant/framework/llama-index)
- [CrewAI](https://composio.dev/toolkits/scrapingant/framework/crew-ai)
- [Pydantic AI](https://composio.dev/toolkits/scrapingant/framework/pydantic-ai)
- [AutoGen](https://composio.dev/toolkits/scrapingant/framework/autogen)

## Related Toolkits

- [Excel](https://composio.dev/toolkits/excel) - Microsoft Excel is a robust spreadsheet application for organizing, analyzing, and visualizing data. It's the go-to tool for calculations, reporting, and flexible data management.
- [21risk](https://composio.dev/toolkits/_21risk) - 21RISK is a web app built for easy checklist, audit, and compliance management. It streamlines risk processes so teams can focus on what matters.
- [Abstract](https://composio.dev/toolkits/abstract) - Abstract provides a suite of APIs for automating data validation and enrichment tasks. It helps developers streamline workflows and ensure data quality with minimal effort.
- [Addressfinder](https://composio.dev/toolkits/addressfinder) - Addressfinder is a data quality platform for verifying addresses, emails, and phone numbers. It helps you ensure accurate customer and contact data every time.
- [Agenty](https://composio.dev/toolkits/agenty) - Agenty is a web scraping and automation platform for extracting data and automating browser tasks—no coding needed. It streamlines data collection, monitoring, and repetitive online actions.
- [Ambee](https://composio.dev/toolkits/ambee) - Ambee is an environmental data platform providing real-time, hyperlocal APIs for air quality, weather, and pollen. Get precise environmental insights to power smarter decisions in your apps and workflows.
- [Ambient weather](https://composio.dev/toolkits/ambient_weather) - Ambient Weather is a platform for personal weather stations with a robust API for accessing local, real-time, and historical weather data. Get detailed environmental insights directly from your own sensors for smarter apps and automations.
- [Anonyflow](https://composio.dev/toolkits/anonyflow) - Anonyflow is a service for encryption-based data anonymization and secure data sharing. It helps organizations meet GDPR, CCPA, and HIPAA data privacy compliance requirements.
- [Api ninjas](https://composio.dev/toolkits/api_ninjas) - Api ninjas offers 120+ public APIs spanning categories like weather, finance, sports, and more. Developers use it to supercharge apps with real-time data and actionable endpoints.
- [Api sports](https://composio.dev/toolkits/api_sports) - Api sports is a comprehensive sports data platform covering 2,000+ competitions with live scores and 15+ years of stats. Instantly access up-to-date sports information for analysis, apps, or chatbots.
- [Apify](https://composio.dev/toolkits/apify) - Apify is a cloud platform for building, deploying, and managing web scraping and automation tools called Actors. It lets you automate data extraction and workflow tasks at scale—no infrastructure headaches.
- [Autom](https://composio.dev/toolkits/autom) - Autom is a lightning-fast search engine results data platform for Google, Bing, and Brave. Developers use it to access fresh, low-latency SERP data on demand.
- [Beaconchain](https://composio.dev/toolkits/beaconchain) - Beaconchain is a real-time analytics platform for Ethereum 2.0's Beacon Chain. It provides detailed insights into validators, blocks, and overall network performance.
- [Big data cloud](https://composio.dev/toolkits/big_data_cloud) - BigDataCloud provides APIs for geolocation, reverse geocoding, and address validation. Instantly access reliable location intelligence to enhance your applications and workflows.
- [Bigpicture io](https://composio.dev/toolkits/bigpicture_io) - BigPicture.io offers APIs for accessing detailed company and profile data. Instantly enrich your applications with up-to-date insights on 20M+ businesses.
- [Bitquery](https://composio.dev/toolkits/bitquery) - Bitquery is a blockchain data platform offering indexed, real-time, and historical data from 40+ blockchains via GraphQL APIs. Get unified, reliable access to complex on-chain data for analytics, trading, and research.
- [Brightdata](https://composio.dev/toolkits/brightdata) - Brightdata is a leading web data platform offering advanced scraping, SERP APIs, and anti-bot tools. It lets you collect public web data at scale, bypassing blocks and friction.
- [Builtwith](https://composio.dev/toolkits/builtwith) - BuiltWith is a web technology profiler that uncovers the technologies powering any website. Gain actionable insights into analytics, hosting, and content management stacks for smarter research and lead generation.
- [Byteforms](https://composio.dev/toolkits/byteforms) - Byteforms is an all-in-one platform for creating forms, managing submissions, and integrating data. It streamlines workflows by centralizing form data collection and automation.
- [Cabinpanda](https://composio.dev/toolkits/cabinpanda) - Cabinpanda is a data collection platform for building and managing online forms. It helps streamline how you gather, organize, and analyze responses.

## Frequently Asked Questions

### Do I need my own developer credentials to use Scrapingant with Composio?

Yes. Scrapingant requires you to supply your own API key. Once configured, Composio stores the credential securely and attaches it to API requests on your behalf.

### Can I use multiple toolkits together?

Yes! Composio's Tool Router enables agents to use multiple toolkits. [Learn more](https://docs.composio.dev/tool-router/overview).

### Is Composio secure?

Composio is SOC 2 and ISO 27001 compliant with all data encrypted in transit and at rest. [Learn more](https://trust.composio.dev).

### What if the API changes?

Composio maintains and updates all toolkit integrations automatically, so your agents always work with the latest API versions.

---
[See all toolkits](https://composio.dev/toolkits) · [Composio docs](https://docs.composio.dev/llms.txt)
