How to integrate Bigml MCP with Codex


Introduction

Codex is one of the most popular coding harnesses out there, and MCP makes the experience even better. With the Bigml MCP integration, you can create projects, manage data connectors, inspect resources, and much more, all without leaving the terminal or the app, whichever you prefer.

Why use Composio?

Apart from a managed and hosted MCP server, you will get:

  • CodeAct: A dedicated workbench that lets the model write code to handle complex tool chaining, reducing back-and-forth LLM round trips when tools are called frequently.
  • Large tool-response handling: Oversized tool outputs are managed for you to minimise context rot.
  • Dynamic just-in-time access to 20,000+ tools across 870+ apps for cross-app workflows. It loads only the tools you need, so models aren't overwhelmed by tools you don't.

How to install Bigml MCP in Codex

Run the setup command

Run this command in your terminal to add the Composio MCP server to Codex.

Terminal

This initiates authentication in a browser window; authorize Codex to access your Composio account.

Composio authentication page
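If the setup command isn't available in your version of the Codex CLI, you can register the server manually. A minimal sketch of the entry to add to ~/.codex/config.toml, using the connect.composio.dev endpoint and x-consumer-api-key header shown later in this guide (replace the placeholder key with your own Composio API key):

```toml
# Manual registration of the Composio MCP server (placeholder API key)
[mcp_servers.composio]
url = "https://connect.composio.dev/mcp"
http_headers = { "x-consumer-api-key" = "ck_*******" }
```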

(Optional) Authenticate with OAuth

To authenticate manually, run the login command to open a browser window and authorize Codex to access your Composio account.

bash
codex mcp login composio

Verify the connection

Run codex mcp list to confirm Composio appears as a registered MCP server.

bash
codex mcp list

Codex App

Codex App follows the same approach as VS Code.

  1. Click ⚙️ on the bottom left → MCP Servers → + Add servers → Streamable HTTP.
  2. Fill the header Key and Value fields with x-consumer-api-key and ck_*******.
  3. The key is your Composio API key, which you can find at connect.composio.dev.
  4. Click Authenticate and authorize Codex to access your Composio account, and you're all set.
Codex App MCP setup
  5. Restart Codex and verify that the entry appears in ~/.codex/config.toml:
toml
[mcp_servers.composio]
url = "https://connect.composio.dev/mcp"
http_headers = { "x-consumer-api-key" = "ck_*******" }

What is the Bigml MCP server, and what's possible with it?

The Bigml MCP server is an implementation of the Model Context Protocol that connects your AI agents and assistants, such as Claude and Cursor, directly to your Bigml account. It provides structured and secure access to your machine learning environment, so your agent can perform actions like creating projects, managing data connectors, inspecting resources, and analyzing correlations on your behalf.

  • Project creation and organization: Easily direct your agent to create new projects to group related BigML resources for streamlined workflows.
  • External data connector management: Have your agent set up and retrieve external connectors to bring in data from external sources and databases.
  • Resource inspection and retrieval: Let your agent fetch detailed metadata about projects or connectors, helping you monitor and audit your ML assets.
  • Automated project cleanup: Instruct your agent to delete obsolete or unused projects, ensuring your workspace stays organized and efficient.
  • Correlation browsing and analysis: Ask your agent to list and paginate correlation resources, uncovering relationships among your datasets for deeper insights.

Supported Tools & Triggers

Tools
Create External Connector: Tool to create a new external connector for data sources.
Create Project: Tool to create a new project.
Delete Project: Tool to delete an existing project.
Get External Connector: Tool to retrieve details of an external connector.
Get Project: Tool to retrieve details of a project by id.
List Correlations: Tool to list correlation resources.
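To make the tool list concrete, here is a rough sketch of the kind of REST call the Create Project tool boils down to. This is illustrative only: the endpoint shape follows BigML's public API, while the credentials and project name are placeholders, and in practice Composio manages authentication for you.

```python
# Illustrative sketch: the HTTP request a "Create Project" tool would send
# to BigML's REST API. Credentials and names below are placeholders.
import json
import urllib.request

BIGML_API = "https://bigml.io/project"

def build_request(name, description, username, api_key):
    """Build (but don't send) the POST request that creates a project."""
    # BigML authenticates via username/api_key query parameters.
    url = f"{BIGML_API}?username={username};api_key={api_key}"
    payload = {"name": name, "description": description}
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# Usage: inspect the request the agent's tool call would produce.
req = build_request("churn-analysis", "Demo project", "alice", "ak_123")
print(req.get_method())                 # POST
print(json.loads(req.data)["name"])     # churn-analysis
```

When you ask Codex in natural language to "create a BigML project called churn-analysis", the MCP tool performs the equivalent of this request on your behalf, with your real credentials supplied by Composio.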

Conclusion

You've successfully integrated Bigml with Codex using Composio's MCP server. Now you can interact with Bigml directly from your terminal, VS Code, or the Codex App using natural language commands.

Key benefits of this setup:

  • Seamless integration across CLI, VS Code, and standalone app
  • Natural language commands for Bigml operations
  • Managed authentication through Composio
  • Access to 20,000+ tools across 870+ apps for cross-app workflows
  • CodeAct workbench for complex tool chaining

Next steps:

  • Try asking Codex to perform various Bigml operations
  • Explore cross-app workflows by connecting more toolkits
  • Build automation scripts that leverage Codex's AI capabilities

FAQ

What is the difference between the Tool Router MCP and the Bigml MCP server?

With a standalone Bigml MCP server, the agents and LLMs can only access a fixed set of Bigml tools tied to that server. However, with the Composio Tool Router, agents can dynamically load tools from Bigml and many other apps based on the task at hand, all through a single MCP endpoint.

Can I use Tool Router MCP with Codex?

Yes, you can. Codex fully supports MCP integration. You get structured tool calling, message history handling, and model orchestration while Tool Router takes care of discovering and serving the right Bigml tools.

Can I manage the permissions and scopes for Bigml while using Tool Router?

Yes, absolutely. You can configure which Bigml scopes and actions are allowed when connecting your account to Composio. You can also bring your own OAuth credentials or API configuration so you keep full control over what the agent can do.

How safe is my data with Composio Tool Router?

All sensitive data such as tokens, keys, and configuration is fully encrypted at rest and in transit. Composio is SOC 2 Type 2 compliant and follows strict security practices so your Bigml data and credentials are handled as safely as possible.

Used by agents from

Context
Letta
glean
HubSpot
Agent.ai
Altera
DataStax
Entelligence
Rolai

Never worry about agent reliability

We handle tool reliability, observability, and security so you never have to second-guess an agent action.