How to integrate Gemini MCP with Codex


Introduction

Codex is one of the most popular coding harnesses out there, and MCP makes the experience even better. With the Gemini MCP integration, you can generate text, images, and videos, analyze content, and much more, all without leaving the terminal or the app, whichever you prefer.


Why use Composio?

Apart from a managed and hosted MCP server, you will get:

  • CodeAct: A dedicated workbench that lets the model write code to handle complex tool chaining, reducing back-and-forth with the LLM for frequent tool calls.
  • Large tool response handling: Composio manages oversized tool responses to minimize context rot.
  • Dynamic just-in-time access to 20,000 tools across 870+ other apps for cross-app workflows. It loads only the tools you need, so the model isn't overwhelmed by tools you don't.

How to install Gemini MCP in Codex

Run the setup command

Run this command in your terminal to add the Composio MCP server to Codex.

Terminal

This opens a browser window for authentication; authorize Codex to access your Composio account.

Composio authentication page

(Optional) Authenticate with OAuth

To authenticate manually, run the login command to open a browser window and authorize Codex to access your Composio account.

bash
codex mcp login composio

Verify the connection

Run codex mcp list to confirm Composio appears as a registered MCP server.

bash
codex mcp list

Codex App

Codex App follows the same approach as VS Code.

  1. Click ⚙️ on the bottom left → MCP Servers → + Add servers → Streamable HTTP.
  2. Fill the header field with { "x-consumer-api-key" = "ck_*******" }.
  3. The key value is your Composio API key, which you can find at connect.composio.dev.
  4. Click Authenticate and authorize Codex to access your Composio account, and you're all set.
Codex App MCP setup
  5. Restart Codex and verify the entry appears in .codex/config.toml:
toml
[mcp_servers.composio]
url = "https://connect.composio.dev/mcp"
http_headers = { "x-consumer-api-key" = "ck_*******" }

What is the Gemini MCP server, and what's possible with it?

The Gemini MCP server is an implementation of the Model Context Protocol that connects your AI agents and assistants, such as Claude and Cursor, directly to your Gemini account. It provides structured and secure access to Gemini's multimodal AI features, so your agent can generate text, images, and videos, analyze content, and manage model resources on your behalf.

  • Text and content generation: Instruct your agent to create high-quality, customized text using Gemini's advanced generative models—great for brainstorming, drafting, or summarizing information.
  • Creative image and video generation: Ask the agent to generate original images or high-quality videos from text prompts using Gemini 2.5 Flash and Veo models, with fine control over style and format.
  • Embedding and semantic analysis: Let your agent transform any text into rich semantic embeddings for similarity search, clustering, or classification tasks.
  • Model discovery and optimization: Have the agent list available Gemini and Veo models, check their capabilities, and select the best fit for your project or workflow.
  • Efficient resource management: Enable the agent to track video generation operations, download final assets, and optimize prompt inputs by counting tokens—all without manual intervention.
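Embedding output from the server can be consumed with ordinary vector math. A minimal sketch of downstream similarity scoring, using hypothetical short vectors (real Gemini embeddings have hundreds of dimensions):

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine similarity between two embedding vectors, in [-1, 1]."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical 4-dimensional embeddings for two similar documents:
doc_a = [0.1, 0.3, 0.5, 0.2]
doc_b = [0.1, 0.28, 0.52, 0.19]
print(round(cosine_similarity(doc_a, doc_b), 3))
```

The same scoring works for clustering or classification: embed each item once, then compare pairs with this function.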

Supported Tools & Triggers

Tools
Count Tokens (Gemini): Counts the number of tokens in text using Gemini tokenization.
Download Video (Veo): Downloads a generated Veo video to local storage.
Embed Content (Gemini): Generates text embeddings using Gemini embedding models.
Generate Content (Gemini): Generates text content from prompts using Gemini models.
Generate Image (Gemini 2.5 Flash): Generates images from text prompts using Gemini 2.5 Flash.
Generate Videos (Veo): Generates videos from text prompts using Google's Veo models.
Get Videos Operation (Veo): Checks the status of a Veo video generation operation.
List Models (Gemini API): Lists available Gemini and Veo models with their capabilities and limits.
Wait For Video (Veo): Polls a Veo video generation operation until completion or timeout.
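The Wait For Video tool wraps a standard poll-until-done loop. A minimal sketch of that pattern, with a stand-in status callable instead of a real Veo operation:

```python
import time

def wait_for_operation(get_status, timeout_s: float = 600, poll_interval_s: float = 1.0) -> str:
    """Poll get_status() until it returns 'done' or the timeout expires.

    get_status is any zero-argument callable returning the operation state;
    here it is a stand-in for a real Veo operation-status check.
    """
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        status = get_status()
        if status == "done":
            return status
        time.sleep(poll_interval_s)
    raise TimeoutError("video generation did not finish in time")

# Simulated operation that completes on the third poll:
states = iter(["pending", "running", "done"])
print(wait_for_operation(lambda: next(states), timeout_s=5, poll_interval_s=0.01))  # prints "done"
```

With the MCP server, the agent gets this loop for free: it calls Wait For Video once instead of issuing repeated Get Videos Operation checks.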

Conclusion

You've successfully integrated Gemini with Codex using Composio's MCP server. Now you can interact with Gemini directly from your terminal, VS Code, or the Codex App using natural language commands.

Key benefits of this setup:

  • Seamless integration across CLI, VS Code, and standalone app
  • Natural language commands for Gemini operations
  • Managed authentication through Composio
  • Access to 20,000+ tools across 870+ apps for cross-app workflows
  • CodeAct workbench for complex tool chaining

Next steps:

  • Try asking Codex to perform various Gemini operations
  • Explore cross-app workflows by connecting more toolkits
  • Build automation scripts that leverage Codex's AI capabilities


FAQ

What are the differences between Tool Router MCP and Gemini MCP?

With a standalone Gemini MCP server, the agents and LLMs can only access a fixed set of Gemini tools tied to that server. However, with the Composio Tool Router, agents can dynamically load tools from Gemini and many other apps based on the task at hand, all through a single MCP endpoint.

Can I use Tool Router MCP with Codex?

Yes, you can. Codex fully supports MCP integration. You get structured tool calling, message history handling, and model orchestration while Tool Router takes care of discovering and serving the right Gemini tools.

Can I manage the permissions and scopes for Gemini while using Tool Router?

Yes, absolutely. You can configure which Gemini scopes and actions are allowed when connecting your account to Composio. You can also bring your own OAuth credentials or API configuration so you keep full control over what the agent can do.

How safe is my data with Composio Tool Router?

All sensitive data such as tokens, keys, and configuration is fully encrypted at rest and in transit. Composio is SOC 2 Type 2 compliant and follows strict security practices so your Gemini data and credentials are handled as safely as possible.

Used by agents from

Context
Letta
glean
HubSpot
Agent.ai
Altera
DataStax
Entelligence
Rolai
