Codex is one of the most popular coding harnesses out there, and MCP makes the experience even better. With the Scrape do MCP integration, you can scrape dynamic pages, manage sessions, and extract structured data from the web, and much more, all without leaving the terminal or the app, whichever you prefer.
Connect Scrape do without Auth hassles
We manage OAuth, API keys, token refresh, and scopes; you just build.
Introduction
Why use Composio?
Apart from a managed and hosted MCP server, you will get:
- CodeAct: A dedicated workbench that lets the model write its own code to handle complex tool chaining, reducing back-and-forth with the LLM for frequent tool calls.
- Large tool responses: Oversized tool outputs are handled for you to minimise context rot.
- Dynamic just-in-time access to 20,000+ tools across 870+ apps for cross-app workflows. Only the tools you need are loaded, so the model isn't overwhelmed by tools you don't.
How to install Scrape do MCP in Codex
Run the setup command
Run this command in your terminal to add the Composio MCP server to Codex.
This opens a browser window to start authentication; authorize Codex to access your Composio account.
(Optional) Authenticate with OAuth
To authenticate manually, run the login command to open a browser window and authorize Codex to access your Composio account.
Verify the connection
Run codex mcp list to confirm Composio appears as a registered MCP server.
Codex App
Codex App follows the same approach as VS Code.
- Click ⚙️ on the bottom left → MCP Servers → + Add servers → Streamable HTTP:
- Fill the header and key fields with { "x-consumer-api-key": "ck_*******" }.
- The key is your Composio API key, which you can find on connect.composio.dev
- Click Authenticate, authorize Codex to access your Composio account, and you're all set.
- Restart Codex and verify that the server appears in .codex/config.toml
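After the restart, the entry should be visible in .codex/config.toml. As a rough sketch only: the [mcp_servers.*] table name follows Codex's config convention, but the URL and header-table key below are placeholders, and the exact schema depends on your Codex version, so compare against what the setup actually wrote rather than copying this verbatim.

```toml
# Illustrative sketch of an MCP server entry in .codex/config.toml.
# The URL is a placeholder, and the headers sub-table key is an assumption;
# check your generated config for the real field names.
[mcp_servers.composio]
url = "https://example-composio-mcp.invalid/mcp"  # replace with the endpoint Composio gives you

[mcp_servers.composio.headers]
"x-consumer-api-key" = "ck_*******"  # your Composio API key from connect.composio.dev
```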
What is the Scrape do MCP server, and what's possible with it?
The Scrape do MCP server is an implementation of the Model Context Protocol that connects AI agents and assistants such as Claude and Cursor directly to your Scrape do account. It provides structured, secure access to robust web scraping tools, so your agent can perform actions like scraping dynamic pages, managing sessions, setting custom headers or proxies, and extracting structured data from any website on your behalf.
- Dynamic page scraping with headless browsers: Retrieve fully rendered HTML content from JavaScript-heavy or protected websites by leveraging advanced browser emulation and proxy rotation.
- Custom scraping session management: Set device type, cookies, wait times, and custom headers to imitate different users, maintain sessions, or access device-specific content for tailored data extraction.
- Proxy and anti-bot bypass control: Enable super or proxy modes to utilize residential, mobile, or datacenter proxies, helping your agent bypass strict anti-bot systems and geo-restrictions seamlessly.
- Targeted resource filtering: Block specific URLs like ads or analytics scripts during scraping to increase speed, avoid distractions, and improve privacy.
- Account usage and statistics retrieval: Access real-time usage stats, subscription status, and remaining request limits so your agent can monitor scraping quotas and avoid interruptions.
Supported Tools & Triggers
Conclusion
You've successfully integrated Scrape do with Codex using Composio's MCP server. Now you can interact with Scrape do directly from your terminal, VS Code, or the Codex App using natural language commands.
Key benefits of this setup:
- Seamless integration across CLI, VS Code, and standalone app
- Natural language commands for Scrape do operations
- Managed authentication through Composio
- Access to 20,000+ tools across 870+ apps for cross-app workflows
- CodeAct workbench for complex tool chaining
Next steps:
- Try asking Codex to perform various Scrape do operations
- Explore cross-app workflows by connecting more toolkits
- Build automation scripts that leverage Codex's AI capabilities