# Hyperbrowser

```json
{
  "name": "Hyperbrowser",
  "slug": "hyperbrowser",
  "url": "https://composio.dev/toolkits/hyperbrowser",
  "markdown_url": "https://composio.dev/toolkits/hyperbrowser.md",
  "logo_url": "https://logos.composio.dev/api/hyperbrowser",
  "categories": [
    "workflow automation"
  ],
  "is_composio_managed": false,
  "updated_at": "2026-05-06T08:16:05.497Z"
}
```

![Hyperbrowser logo](https://logos.composio.dev/api/hyperbrowser)

## Description

Securely connect your AI agents and chatbots (Claude, ChatGPT, Cursor, etc.) with Hyperbrowser MCP or the direct API to automate browsing, extract website data, fill forms, and control browser sessions through natural language.

## Summary

Hyperbrowser is a next-generation platform for scalable browser automation. It empowers AI agents to interact with web apps, automate workflows, and handle browser sessions at scale.

## Categories

- workflow automation

## Toolkit Details

- Tools: 25

## Images

- Logo: https://logos.composio.dev/api/hyperbrowser

## Authentication

- **API Key**
  - Type: `api_key`
  - Description: API key authentication for Hyperbrowser.
  - Setup:
    - Configure API key credentials for Hyperbrowser.
    - Use the credentials when creating an auth config in Composio.
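In practice the API key is usually read from an environment variable rather than hard-coded. A minimal sketch (the variable name `HYPERBROWSER_API_KEY` is an illustrative choice, not one mandated by Composio or Hyperbrowser):

```python
import os

def load_hyperbrowser_key(env_var: str = "HYPERBROWSER_API_KEY") -> str:
    """Read the Hyperbrowser API key from the environment, failing early if absent."""
    key = os.environ.get(env_var, "").strip()
    if not key:
        raise RuntimeError(f"Set {env_var} before creating the Composio auth config.")
    return key
```

Failing early here gives a clearer error than letting an empty credential surface later as an authentication failure deep inside an agent run.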

## Suggested Prompts

- Start a browser session with stealth mode
- Extract all product titles from this URL
- Check status of my ongoing scrape job
- Delete an unused Hyperbrowser profile
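Each suggested prompt corresponds to one or more tool slugs from the Supported Tools table. An illustrative (non-authoritative) mapping of prompt to the tools an agent would typically invoke:

```python
# Illustrative mapping from suggested prompts to Hyperbrowser tool slugs
# (slugs are taken from the Supported Tools table; the pairings are a sketch
# of a typical agent plan, not a fixed routing rule).
PROMPT_TO_TOOLS = {
    "Start a browser session with stealth mode": ["HYPERBROWSER_CREATE_SESSION"],
    "Extract all product titles from this URL": [
        "HYPERBROWSER_START_EXTRACT_JOB",
        "HYPERBROWSER_GET_EXTRACT_JOB_RESULT",
    ],
    "Check status of my ongoing scrape job": ["HYPERBROWSER_GET_SCRAPE_JOB_STATUS"],
    "Delete an unused Hyperbrowser profile": [
        "HYPERBROWSER_LIST_PROFILES",
        "HYPERBROWSER_DELETE_PROFILE",
    ],
}
```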

## Supported Tools

| Tool slug | Name | Description |
|---|---|---|
| `HYPERBROWSER_CREATE_PROFILE` | Create Hyperbrowser Profile | Tool to create a new profile. Use when you need to initialize a Hyperbrowser profile before analysis. |
| `HYPERBROWSER_CREATE_SCRAPE_JOB` | Create Scrape Job | Tool to initiate a new scrape job. Use when you need to extract structured content from a target URL with custom session and scrape settings. |
| `HYPERBROWSER_CREATE_SESSION` | Create Session | Tool to create a new browser session with custom stealth, proxy, and privacy settings. Use when initializing an automated browsing session with specific configuration. |
| `HYPERBROWSER_DELETE_PROFILE` | Delete Profile | Tool to delete a profile. Use when you need to remove a profile by its unique identifier after confirming its existence. |
| `HYPERBROWSER_GET_BROWSER_USE_TASK_STATUS` | Get browser-use task status | Tool to retrieve the current status of a browser-use task. Use when checking if a browser automation task has completed or is still pending. |
| `HYPERBROWSER_GET_CLAUDE_COMPUTER_USE_TASK_STATUS` | Get Claude Computer Use Task Status | Tool to retrieve the status of a Claude Computer Use task. Use after creating a task to poll its status. |
| `HYPERBROWSER_GET_CRAWL_JOB_RESULT` | Get Crawl Job Result | Tool to retrieve the result of a completed crawl job. Use after confirming crawl job completion to fetch current page batch and status. Supports pagination via page and batchSize. |
| `HYPERBROWSER_GET_CRAWL_JOB_STATUS` | Get Crawl Job Status | Tool to retrieve the status and results of a specific crawl job. Use after submitting a crawl job to check its progress or fetch results. |
| `HYPERBROWSER_GET_EXTRACT_JOB_RESULT` | Get Extract Job Result | Tool to fetch the status and results of a specific extract job. Use after initiating an extract job to monitor progress and retrieve final data. |
| `HYPERBROWSER_GET_EXTRACT_JOB_STATUS` | Get Extract Job Status | Tool to retrieve the status of an extract job. Use after submitting an extract job to poll its status. |
| `HYPERBROWSER_GET_PROFILE_BY_ID` | Get Profile By ID | Tool to retrieve profile details by ID. Use after confirming the profile ID. |
| `HYPERBROWSER_GET_SCRAPE_JOB_RESULT` | Get Scrape Job Result | Tool to fetch the status and results of a specific scrape job. Use after initiating a scrape job to monitor its progress and retrieve final data. |
| `HYPERBROWSER_GET_SCRAPE_JOB_STATUS` | Get Scrape Job Status | Tool to retrieve the current status of a specific scrape job. Use after initiating a scrape job to poll its status. |
| `HYPERBROWSER_GET_SESSION_DETAILS` | Get Session Details | Tool to retrieve session details by ID. Use after confirming the session ID. |
| `HYPERBROWSER_GET_SESSION_DOWNLOADS_URL` | Get Session Downloads URL | Tool to retrieve the downloads URL for a session. Use when you need the signed URL for session downloads after processing is complete. |
| `HYPERBROWSER_GET_SESSION_RECORDING` | Get Session Recording | Tool to retrieve the recording URL of a session. Use after confirming the session ID and when the recording is expected to be ready. |
| `HYPERBROWSER_LIST_PROFILES` | List Profiles | Tool to list profiles. Use when you need to fetch paginated profiles and optionally filter by name. |
| `HYPERBROWSER_LIST_SESSIONS` | List Sessions | Tool to list sessions with optional status filter. Use when you need a paginated overview of browser sessions before acting on them. |
| `HYPERBROWSER_START_BROWSER_USE_TASK` | Start Browser Use Task | Tool to start an asynchronous browser-use task. Use when you need to automate web interactions given a task instruction. |
| `HYPERBROWSER_START_CLAUDE_COMPUTER_USE_TASK` | Start Claude Computer Use Task | Tool to start a Claude Computer Use task. Use when you need AI-driven automated browser interactions. Call after you have your task prompt and any session preferences configured. |
| `HYPERBROWSER_START_CRAWL_JOB` | Start Crawl Job | Tool to start a new crawl job for a specified URL. Use when you need to initiate a web crawl before checking job status. |
| `HYPERBROWSER_START_EXTRACT_JOB` | Start Extract Job | Tool to start an extract job. Use when you need to initiate a new extraction with custom prompts, schema, and session options. Call after preparing URLs and desired extraction schema. |
| `HYPERBROWSER_STOP_BROWSER_USE_TASK` | Stop Browser Use Task | Tool to stop a running browser-use task. Use when halting an in-progress browser automation task after confirming its task ID. |
| `HYPERBROWSER_STOP_CLAUDE_COMPUTER_USE_TASK` | Stop Claude Computer Use Task | Tool to stop a running Claude computer use task. Use when a Claude computer use task is in progress and needs to be terminated. |
| `HYPERBROWSER_STOP_SESSION` | Stop Session | Tool to stop a running session by ID. Use after confirming the session is active. |
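The scrape, crawl, and extract tools follow an asynchronous start/poll/fetch pattern: start a job, poll its status tool until the job reaches a terminal state, then call the result tool. A minimal, framework-agnostic sketch of the polling loop (the `get_status` callable and the status strings are illustrative assumptions; check Hyperbrowser's API for the exact status values it returns):

```python
import time
from typing import Callable, FrozenSet

def wait_for_job(
    get_status: Callable[[], str],
    done_states: FrozenSet[str] = frozenset({"completed", "failed"}),
    interval_s: float = 2.0,
    timeout_s: float = 120.0,
) -> str:
    """Poll a job (e.g. via HYPERBROWSER_GET_SCRAPE_JOB_STATUS) until it
    reaches a terminal state, then return that state."""
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        status = get_status()
        if status in done_states:
            return status
        time.sleep(interval_s)  # back off between polls to avoid hammering the API
    raise TimeoutError("job did not reach a terminal state within the timeout")
```

An agent would wire `get_status` to the appropriate status tool, and only call the matching result tool (e.g. `HYPERBROWSER_GET_SCRAPE_JOB_RESULT`) once `wait_for_job` returns a successful terminal state.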

## Supported Triggers

None listed.

## Installation and MCP Setup

### Path 1: SDK Installation

#### Path 1, Step 1: Install Composio

Install the Composio SDK
```shell
pip install composio_openai
```

```shell
npm install @composio/openai
```

#### Path 1, Step 2: Initialize Composio and Create Tool Router Session

Import and initialize Composio client, then create a Tool Router session
```python
from openai import OpenAI
from composio import Composio
from composio_openai import OpenAIResponsesProvider

composio = Composio(provider=OpenAIResponsesProvider())
openai = OpenAI()
session = composio.create(user_id='your-user-id')
```

```typescript
import OpenAI from 'openai';
import { Composio } from '@composio/core';
import { OpenAIResponsesProvider } from '@composio/openai';

const composio = new Composio({
  provider: new OpenAIResponsesProvider(),
});
const openai = new OpenAI({});
const session = await composio.create('your-user-id');
```

#### Path 1, Step 3: Execute Hyperbrowser Tools via Tool Router with Your Agent

Get tools from Tool Router session and execute Hyperbrowser actions with your Agent
```python
tools = session.tools
response = openai.responses.create(
  model='gpt-4.1',
  tools=tools,
  input=[{
    'role': 'user',
    'content': 'Scrape product details from a given e-commerce page'
  }]
)
result = composio.provider.handle_tool_calls(
  response=response,
  user_id='your-user-id'
)
print(result)
```

```typescript
const tools = session.tools;
const response = await openai.responses.create({
  model: 'gpt-4.1',
  tools: tools,
  input: [{
    role: 'user',
    content: 'Scrape product details from a given e-commerce page'
  }],
});
const result = await composio.provider.handleToolCalls(
  'your-user-id',
  response.output
);
console.log(result);
```

### Path 2: MCP Server Setup

#### Path 2, Step 1: Install Composio

Install the Composio SDK and Claude Agent SDK
```shell
pip install composio claude-agent-sdk
```

```shell
npm install @composio/core ai @ai-sdk/openai @ai-sdk/mcp
```

#### Path 2, Step 2: Create Tool Router Session

Initialize the Composio client and create a Tool Router session
```python
from composio import Composio
from claude_agent_sdk import ClaudeSDKClient, ClaudeAgentOptions

composio = Composio(api_key='your-composio-api-key')
session = composio.create(user_id='your-user-id')
url = session.mcp.url
```

```typescript
import { Composio } from '@composio/core';

const composio = new Composio({ apiKey: 'your-api-key' });

console.log("Creating Tool Router session...");
const { mcp } = await composio.create('your-user-id');
console.log(`Tool Router session created: ${mcp.url}`);
```

#### Path 2, Step 3: Connect to AI Agent

Use the MCP server with your AI agent
```python
import asyncio

options = ClaudeAgentOptions(
    permission_mode='bypassPermissions',
    mcp_servers={
        'tool_router': {
            'type': 'http',
            'url': url,
            'headers': {
                'x-api-key': 'your-composio-api-key'
            }
        }
    },
    system_prompt='You are a helpful assistant with access to Hyperbrowser tools.',
    max_turns=10
)

async def main():
    async with ClaudeSDKClient(options=options) as client:
        await client.query('Start a new scrape job for https://news.ycombinator.com/')
        async for message in client.receive_response():
            if hasattr(message, 'content'):
                for block in message.content:
                    if hasattr(block, 'text'):
                        print(block.text)

asyncio.run(main())
```

```typescript
import { openai } from '@ai-sdk/openai';
import { experimental_createMCPClient as createMCPClient } from '@ai-sdk/mcp';
import { generateText, stepCountIs } from 'ai';

const client = await createMCPClient({
  transport: {
    type: 'http',
    url: mcp.url,
    headers: { 'x-api-key': 'your-composio-api-key' }
  }
});

const tools = await client.tools();

const { text } = await generateText({
  model: openai('gpt-4o'),
  tools,
  messages: [{ role: 'user', content: 'Start a new scrape job for https://news.ycombinator.com/' }],
  stopWhen: stepCountIs(5)
});

console.log(`Agent: ${text}`);
```

## Why Use Composio?

### 1. AI Native Hyperbrowser Integration

- Supports both Hyperbrowser MCP and direct API based integrations
- Structured, LLM-friendly schemas for reliable tool execution
- Rich coverage for launching sessions, automating navigation, and extracting web data

### 2. Managed Auth

- Built-in API key handling with secure storage and rotation
- Central place to manage, scope, and revoke Hyperbrowser access
- Per user and per environment credentials so you never hard-code keys

### 3. Agent Optimized Design

- Tools tuned for high reliability and robust error handling
- Comprehensive execution logs—track which agent did what, when, and why

### 4. Enterprise Grade Security

- Granular RBAC to control which agents/users can run Hyperbrowser automations
- Scoped, least privilege access for safer automation
- Full audit trail of all agent browser actions for compliance and review

## Use Hyperbrowser with any AI Agent Framework

Choose a framework you want to connect Hyperbrowser with:

- [OpenAI Agents SDK](https://composio.dev/toolkits/hyperbrowser/framework/open-ai-agents-sdk)
- [Claude Agent SDK](https://composio.dev/toolkits/hyperbrowser/framework/claude-agents-sdk)
- [Claude Code](https://composio.dev/toolkits/hyperbrowser/framework/claude-code)
- [Claude Cowork](https://composio.dev/toolkits/hyperbrowser/framework/claude-cowork)
- [Codex](https://composio.dev/toolkits/hyperbrowser/framework/codex)
- [OpenClaw](https://composio.dev/toolkits/hyperbrowser/framework/openclaw)
- [Hermes](https://composio.dev/toolkits/hyperbrowser/framework/hermes-agent)
- [Google ADK](https://composio.dev/toolkits/hyperbrowser/framework/google-adk)
- [LangChain](https://composio.dev/toolkits/hyperbrowser/framework/langchain)
- [Vercel AI SDK](https://composio.dev/toolkits/hyperbrowser/framework/ai-sdk)
- [Mastra AI](https://composio.dev/toolkits/hyperbrowser/framework/mastra-ai)
- [LlamaIndex](https://composio.dev/toolkits/hyperbrowser/framework/llama-index)
- [CrewAI](https://composio.dev/toolkits/hyperbrowser/framework/crew-ai)
- [Pydantic AI](https://composio.dev/toolkits/hyperbrowser/framework/pydantic-ai)
- [AutoGen](https://composio.dev/toolkits/hyperbrowser/framework/autogen)

## Related Toolkits

- [Apilio](https://composio.dev/toolkits/apilio) - Apilio is a home automation platform that lets you connect and control smart devices from different brands. It helps you build flexible automations with complex conditions, schedules, and integrations.
- [Basin](https://composio.dev/toolkits/basin) - Basin is a no-code form backend for quickly setting up reliable contact forms. It lets you collect and manage form submissions without writing any server-side code.
- [Bouncer](https://composio.dev/toolkits/bouncer) - Bouncer is an email validation platform that verifies the authenticity of email addresses in real-time and batch. It helps boost deliverability and reduce bounce rates for your communications.
- [Conveyor](https://composio.dev/toolkits/conveyor) - Conveyor is a platform that automates security reviews with a Trust Center and AI-driven questionnaire automation. It streamlines compliance and vendor security processes for faster, hassle-free reviews.
- [Crowdin](https://composio.dev/toolkits/crowdin) - Crowdin is a localization management platform that streamlines translation workflows and collaboration. It helps teams centralize multilingual content, boost productivity, and automate translation processes.
- [Databox](https://composio.dev/toolkits/databox) - Databox is a business analytics platform that connects your data from any tool and device. It helps you track KPIs, build dashboards, and discover actionable insights.
- [Detrack](https://composio.dev/toolkits/detrack) - Detrack is a delivery management platform for real-time tracking and proof of delivery. It helps businesses automate notifications and keep customers updated every step of the way.
- [Dnsfilter](https://composio.dev/toolkits/dnsfilter) - Dnsfilter is a cloud-based DNS security and content filtering solution. It helps organizations block online threats and manage safe internet access with ease.
- [Faraday](https://composio.dev/toolkits/faraday) - Faraday lets you embed AI in workflows across your stack for smarter automation. It boosts your favorite tools with actionable intelligence and seamless integration.
- [Feathery](https://composio.dev/toolkits/feathery) - Feathery is an AI-powered platform for building dynamic data intake forms with advanced logic. It helps teams automate complex workflows and collect structured data with ease.
- [Fillout forms](https://composio.dev/toolkits/fillout_forms) - Fillout forms is an online platform for building and managing forms with a flexible API. It lets you create, distribute, and collect responses from forms with ease.
- [Formdesk](https://composio.dev/toolkits/formdesk) - Formdesk is an online form builder for creating and managing professional forms. It's perfect for collecting data, automating workflows, and integrating form submissions with your favorite services.
- [Formsite](https://composio.dev/toolkits/formsite) - Formsite lets you build online forms and surveys with drag-and-drop simplicity. Capture, manage, and integrate form responses securely for streamlined workflows.
- [Graphhopper](https://composio.dev/toolkits/graphhopper) - GraphHopper is an enterprise-grade Directions API for routing, optimization, and geocoding across multiple vehicle types. It enables fast, reliable route planning and logistics automation for businesses.
- [La Growth Machine](https://composio.dev/toolkits/lagrowthmachine) - La Growth Machine automates multi-channel sales outreach and routine tasks for sales teams. Streamline your workflow and focus on closing more deals.
- [Leverly](https://composio.dev/toolkits/leverly) - Leverly is a workflow automation platform that connects and coordinates actions across your apps. It streamlines repetitive processes so your business runs smoother, faster, and with fewer manual steps.
- [Maintainx](https://composio.dev/toolkits/maintainx) - Maintainx is a cloud-based CMMS for centralizing maintenance data, communication, and workflows. It helps organizations streamline maintenance operations and improve team coordination.
- [Make](https://composio.dev/toolkits/make) - Make is an automation platform that connects your favorite apps and services. Build powerful, custom workflows without writing code.
- [Ntfy](https://composio.dev/toolkits/ntfy) - Ntfy is a notification service to send push messages to phones or desktops. Instantly deliver alerts and updates to users, devices, or teams.
- [Persona](https://composio.dev/toolkits/persona) - Persona offers identity infrastructure to automate user verification and compliance. It helps organizations securely verify users and reduce fraud risk.

## Frequently Asked Questions

### Do I need my own developer credentials to use Hyperbrowser with Composio?

Yes, Hyperbrowser requires you to configure your own API key credentials. Once set up, Composio handles secure credential storage and API request handling for you.

### Can I use multiple toolkits together?

Yes! Composio's Tool Router enables agents to use multiple toolkits. [Learn more](https://docs.composio.dev/tool-router/overview).

### Is Composio secure?

Composio is SOC 2 and ISO 27001 compliant with all data encrypted in transit and at rest. [Learn more](https://trust.composio.dev).

### What if the API changes?

Composio maintains and updates all toolkit integrations automatically, so your agents always work with the latest API versions.

---
[See all toolkits](https://composio.dev/toolkits) · [Composio docs](https://docs.composio.dev/llms.txt)
