# How to integrate Customgpt MCP with LlamaIndex

```json
{
  "title": "How to integrate Customgpt MCP with LlamaIndex",
  "toolkit": "Customgpt",
  "toolkit_slug": "customgpt",
  "framework": "LlamaIndex",
  "framework_slug": "llama-index",
  "url": "https://composio.dev/toolkits/customgpt/framework/llama-index",
  "markdown_url": "https://composio.dev/toolkits/customgpt/framework/llama-index.md",
  "updated_at": "2026-05-12T10:08:02.794Z"
}
```

## Introduction

This guide walks you through connecting Customgpt to LlamaIndex using the Composio Tool Router. By the end, you'll have a working agent that can list your active CustomGPT projects, show chat history from your latest conversation, and get usage limits for your account, all through natural-language commands. Along the way you'll see how Composio's Customgpt MCP server gives your LlamaIndex agent real control over a CustomGPT account.
Before we dive in, let's take a quick look at the key ideas and tools involved.

## Also integrate Customgpt with

- [OpenAI Agents SDK](https://composio.dev/toolkits/customgpt/framework/open-ai-agents-sdk)
- [Claude Agent SDK](https://composio.dev/toolkits/customgpt/framework/claude-agents-sdk)
- [Claude Code](https://composio.dev/toolkits/customgpt/framework/claude-code)
- [Claude Cowork](https://composio.dev/toolkits/customgpt/framework/claude-cowork)
- [Codex](https://composio.dev/toolkits/customgpt/framework/codex)
- [OpenClaw](https://composio.dev/toolkits/customgpt/framework/openclaw)
- [Hermes](https://composio.dev/toolkits/customgpt/framework/hermes-agent)
- [CLI](https://composio.dev/toolkits/customgpt/framework/cli)
- [Google ADK](https://composio.dev/toolkits/customgpt/framework/google-adk)
- [LangChain](https://composio.dev/toolkits/customgpt/framework/langchain)
- [Vercel AI SDK](https://composio.dev/toolkits/customgpt/framework/ai-sdk)
- [Mastra AI](https://composio.dev/toolkits/customgpt/framework/mastra-ai)
- [CrewAI](https://composio.dev/toolkits/customgpt/framework/crew-ai)

## TL;DR

Here's what you'll learn:
- Set your OpenAI and Composio API keys
- Install LlamaIndex and Composio packages
- Create a Composio Tool Router session for Customgpt
- Connect LlamaIndex to the Customgpt MCP server
- Build a Customgpt-powered agent using LlamaIndex
- Interact with Customgpt through natural language

## What is LlamaIndex?

LlamaIndex is a data framework for building LLM applications. It provides tools for connecting LLMs to external data sources and services through agents and tools.
Key features include:
- ReAct Agent: reasoning-and-acting pattern for tool-using agents
- MCP Tools: native support for the Model Context Protocol
- Context Management: maintain conversation context across interactions
- Async Support: built for async/await patterns
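To make the ReAct idea concrete, here's a minimal, framework-free sketch of the reason-act-observe loop. The tool names and the hard-coded "reasoning" are purely illustrative; LlamaIndex's `ReActAgent` does this with a real LLM choosing which tool to call at each step.

```python
# A toy ReAct-style loop: a hard-coded policy stands in for the LLM.
# It decides which tool to call, observes the result, and answers.

def list_projects() -> list[str]:
    # Stand-in for a real Customgpt tool call
    return ["Support Bot", "Docs Bot"]

TOOLS = {"list_projects": list_projects}

def react_loop(question: str) -> str:
    observations = []
    # "Reason": a real agent asks the LLM which tool (if any) to use
    if "projects" in question:
        # "Act": invoke the chosen tool and record the observation
        observations.append(TOOLS["list_projects"]())
    # "Answer": synthesize a response from the observations
    if observations:
        return f"You have {len(observations[0])} projects: " + ", ".join(observations[0])
    return "I don't have a tool for that."

print(react_loop("How many projects do I have?"))
```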

## What is the Customgpt MCP server, and what's possible with it?

The Customgpt MCP server is an implementation of the Model Context Protocol that connects your AI agents and assistants (Claude, Cursor, etc.) directly to your CustomGPT.ai account. It provides structured and secure access to your chatbot projects, so your agent can list, manage, update, and analyze your AI-powered chatbots and their licenses on your behalf.
- Project and agent management: Effortlessly list all your CustomGPT projects, retrieve their details, and even delete agents you no longer need.
- Comprehensive license handling: Let your agent fetch, update, or remove licenses attached to any of your chatbot projects, ensuring you always have the right access and compliance.
- Chat conversation insights: Retrieve complete chat histories from your AI chatbot conversations to analyze user interactions or debug sessions.
- User profile and usage monitoring: Automatically fetch your account profile and check on your usage limits, including agents, words, and queries, so you never exceed your quotas.
- Project settings inspection: Quickly pull and review configuration details for any chatbot project to audit or adjust your bot's setup.
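Under the hood, each of these capabilities is exposed as an MCP tool, and a client invokes one by sending a JSON-RPC `tools/call` request. Here's a sketch of what such a request looks like for the project-listing tool; the `page` argument is illustrative:

```python
import json

def build_tool_call(name: str, arguments: dict, request_id: int = 1) -> str:
    """Build a JSON-RPC 2.0 tools/call request as used by MCP."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    })

# e.g. ask for the first page of projects
payload = build_tool_call("CUSTOMGPT_LIST_PROJECTS", {"page": 1})
print(payload)
```

In practice you never build these requests by hand; LlamaIndex's MCP client does it for you when the agent decides to call a tool.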

## Supported Tools

| Tool slug | Name | Description |
|---|---|---|
| `CUSTOMGPT_ACTIVATE_PERSONA_VERSION` | Activate Persona Version | Restore a previous persona version for a CustomGPT agent. Activates a previous persona version, making it the current active persona. This creates a new version entry in the history (it doesn't overwrite), preserving the full audit trail. Use this to roll back to a known-good configuration. Requires Custom plan. |
| `CUSTOMGPT_ADD_SOURCE` | Add Source to Project | Add a data source to a CustomGPT agent's knowledge base. Connects content via sitemap URL, file upload, or integration. The system begins indexing immediately after creation. Use when adding documentation, FAQs, or knowledge content to an agent. |
| `CUSTOMGPT_CLONE_PROJECT` | Clone CustomGPT Project | Tool to clone a CustomGPT agent (project). Creates a complete copy of an existing agent, including its knowledge base, persona, and settings. Use this to create variations of an agent for testing, or to use an existing agent as a template for a new one. |
| `CUSTOMGPT_CREATE_CONVERSATION` | Create Conversation | Tool to create a new conversation session for a CustomGPT agent. Use this when starting a new chat interaction - it returns a session ID that you'll use to send messages. Optionally provide a name to help identify the conversation later. |
| `CUSTOMGPT_CREATE_PROJECT` | Create CustomGPT Project | Tool to create a new CustomGPT agent from a sitemap URL or file upload. The agent immediately begins processing the content to build its knowledge base. Use when you need to create a new AI agent with custom knowledge from web content or documents. Either sitemap_path or file must be provided. |
| `CUSTOMGPT_DELETE_PAGE` | Delete Page from Agent | Tool to delete a document from a CustomGPT agent's knowledge base. Permanently removes a document and the agent will no longer reference this content when answering questions. Use this to remove outdated or incorrect information. Warning: This action cannot be undone. |
| `CUSTOMGPT_DELETE_PROJECT` | Delete CustomGPT Project | Tool to delete a CustomGPT project by ID. Use when you need to permanently remove an existing agent after confirming the ID. |
| `CUSTOMGPT_DELETE_PROJECT_LICENSE` | Delete CustomGPT Project License | Deletes a license from a CustomGPT project/agent. Requires numeric project ID and license ID. This action is idempotent - it succeeds even if the license doesn't exist (404). The project must have licenses enabled in its plan for this endpoint to work properly. |
| `CUSTOMGPT_DELETE_SOURCE` | Delete CustomGPT Source | Tool to delete a data source from a CustomGPT agent. Removes the source and all its documents from the agent's knowledge base. Use this to disconnect content that's no longer relevant or to clean up after testing. |
| `CUSTOMGPT_EXPORT_LEADS` | Export Leads | Export leads from a CustomGPT project. Returns lead information captured from conversations including email addresses, names, phone numbers, and custom fields. Supports pagination and date range filtering. Use this to sync leads with CRM or marketing tools. |
| `CUSTOMGPT_GET_MESSAGE` | Get Message | Tool to get message details from a CustomGPT conversation. Returns the complete details for a single message, including the user's prompt, the agent's response, timestamps, citations, and any attached metadata. |
| `CUSTOMGPT_GET_MESSAGE_TRUST_SCORE` | Get Message Trust Score | Tool to retrieve verification trust score for a message in a CustomGPT conversation. Returns a score calculated by checking how well the agent's claims are supported by source documents. Higher scores indicate better-grounded responses with stronger evidence. |
| `CUSTOMGPT_GET_PAGE_METADATA` | Get Page Metadata | Tool to get document metadata including title, source URL, word count, and custom metadata fields. Use this to display document information or manage your knowledge base. |
| `CUSTOMGPT_GET_PLUGINS` | Get Agent Plugins | Tool to retrieve plugin details for a specific CustomGPT agent (project). Use when you need to inspect plugin configuration, status, and metadata for an agent. |
| `CUSTOMGPT_GET_PROJECT` | Get CustomGPT Project | Tool to get agent details. Returns the full configuration and current status for a specific agent. Use this to check processing status, view settings, or retrieve metadata about the agent. |
| `CUSTOMGPT_GET_PROJECT_LICENSE` | Get Project License | Tool to retrieve a license for a specific project. Use when you need to fetch license details by license ID. |
| `CUSTOMGPT_GET_PROJECT_SETTINGS` | Get Project Settings | Retrieve configuration settings for a specific CustomGPT agent/project. Returns settings including: chatbot avatar, background, default prompt, example questions, response source, language, and branding preferences. Use this to inspect agent configuration, audit settings, or retrieve values before making updates. Note: Some newly created projects may not have settings initialized yet and will return a 404. |
| `CUSTOMGPT_GET_REPORT_ANALYSIS` | Get Analytics Chart Data | Tool to retrieve analytics chart data for a CustomGPT project. Returns time-series data formatted for charts, with daily or weekly breakdowns of key metrics including conversation counts, query counts, and queries-per-conversation ratios. Use this to generate usage reports, track project engagement over time, or visualize chatbot performance trends. |
| `CUSTOMGPT_GET_REPORT_CONVERSATIONS` | Get Conversation Analytics | Tool to get conversation analytics for a CustomGPT project. Returns conversation metrics including total conversations, average queries per conversation, and other engagement statistics. Use this to understand how users engage with your agent and analyze conversation patterns over time. |
| `CUSTOMGPT_GET_REPORT_INTELLIGENCE` | Get Customer Intelligence Report | Tool to get customer intelligence for a CustomGPT project. Returns AI-analyzed insights about users including common intents, emotional sentiment, frequently discussed topics, and emerging trends. Use this to understand what users are asking about and identify patterns in user behavior. |
| `CUSTOMGPT_GET_REPORT_TRAFFIC` | Get Traffic Analytics Report | Tool to retrieve traffic analytics for a CustomGPT agent/project. Returns user traffic metrics including unique visitors, session counts, geographic distribution, and device types. Use this to understand who's using your agent and how they're accessing it. |
| `CUSTOMGPT_GET_STATS` | Get Agent Statistics | Tool to get agent statistics. Returns usage metrics and performance statistics for an agent, including total conversations, query counts, document statistics, and processing information. Use when you need to monitor agent performance or generate usage reports. |
| `CUSTOMGPT_GET_USAGE_LIMITS` | Get Usage Limits | Get account usage limits showing current usage vs. maximum allowed for projects, storage credits, and API queries. This returns how many projects, storage credits (characters indexed), and queries you've used compared to your account's maximum limits. Use this to monitor quota consumption. |
| `CUSTOMGPT_GET_USER_PROFILE` | Get Current User Profile | Tool to retrieve the current user's profile information. Use when you need to display or verify authenticated user details after login. |
| `CUSTOMGPT_LIST_CONVERSATION_MESSAGES` | List Conversation Messages | Retrieves all messages from a CustomGPT conversation, including both user queries and AI responses. Use this to view the complete chat history for a specific conversation session. Returns an empty list if the conversation doesn't exist or has no messages. |
| `CUSTOMGPT_LIST_PAGES` | List Agent Documents | Lists all documents in a CustomGPT agent's knowledge base. Returns indexed content including webpages, PDFs, and uploaded files that the agent can reference. Supports filtering by crawl/index status and pagination. Use this to audit knowledge sources or verify successful document ingestion. |
| `CUSTOMGPT_LIST_PERSONAS` | List Persona Versions | Tool to list persona versions for a CustomGPT agent. Use when you need to view the version history of an agent's persona. Every time the persona is updated, a snapshot is automatically saved, allowing you to view changes over time or restore a previous version. Results are paginated. Requires Custom plan. |
| `CUSTOMGPT_LIST_PROJECT_LICENSES` | List CustomGPT Project Licenses | List all licenses for a CustomGPT project/agent. Returns an array of license objects with details like ID, type, status, and timestamps. Returns an empty array if the project has no licenses or if licenses are not enabled for the project. Use this when you need to check what licenses exist for a specific project/agent. |
| `CUSTOMGPT_LIST_PROJECTS` | List CustomGPT Projects | Lists all CustomGPT projects (agents) for the authenticated user. Returns projects with full details including ID, name, type, chat status, and timestamps. Supports pagination via the 'page' parameter. Use this to discover available projects or iterate through all projects. |
| `CUSTOMGPT_LIST_SOURCES` | List Agent Sources | Tool to list all data sources connected to an agent. Returns sources from various origins like sitemaps, Google Drive folders, SharePoint sites, or uploaded files. Use this to manage what content feeds into an agent's knowledge base. |
| `CUSTOMGPT_REINDEX_PAGE` | Reindex Page | Tool to reindex a document in CustomGPT knowledge base. Re-crawls and re-indexes a URL-based document to update its content. Use this when the source content has changed and you want the agent to use the updated version. Only works for URL-based documents. |
| `CUSTOMGPT_SEARCH_TEAM_MEMBERS` | Search Team Members | Tool to search for team members by email address or user ID. Use this to find users when assigning permissions or managing team access. Requires Owner or Admin role to execute. |
| `CUSTOMGPT_SUBMIT_MESSAGE_FEEDBACK` | Submit Message Feedback | Tool to submit feedback (thumbs up/down) for a message in a CustomGPT conversation. Use this to record user satisfaction signals that help identify which AI responses are helpful and which need improvement. Feedback can be changed by submitting a new reaction value. |
| `CUSTOMGPT_UPDATE_PAGE_METADATA` | Update Page Metadata | Update document metadata for a specific page in a CustomGPT project. Updates custom metadata fields such as title, description, URL, and image that help organize and manage your knowledge base. Use when you need to add tags, categories, or other organizational information to documents. |
| `CUSTOMGPT_UPDATE_PROJECT` | Update Project | Updates an existing CustomGPT agent's name or configuration settings. Use this to rename an agent or modify its basic properties without affecting its knowledge base. Returns the complete updated project details including all metadata. |
| `CUSTOMGPT_UPDATE_PROJECT_LICENSE` | Update Project License | Updates the name of an existing license for a CustomGPT project/agent. Prerequisites: - The project must have licenses enabled in its plan - Both project ID and license ID must be valid and exist - Use List Projects to get valid project IDs - Use List Project Licenses to get valid license IDs for a project This action only updates the license name. Other license properties cannot be modified through this endpoint. |
| `CUSTOMGPT_UPDATE_PROJECT_SETTINGS` | Update Project Settings | Update CustomGPT agent configuration settings. Updates persona instructions, response format, citation style, branding, and deployment settings. Only include fields you want to change - omitted fields retain their current values. Use this to configure agent behavior, customize appearance, or adjust user experience settings. |
| `CUSTOMGPT_UPDATE_SOURCE` | Update Source Settings | Update source settings for a CustomGPT agent data source. Configure how the source is indexed and kept up to date by adjusting auto-sync frequency, crawl depth, file filters, and refresh behavior. Use this to fine-tune sitemap crawling (JavaScript execution, image extraction), control which pages are added or removed during syncs, and set up custom refresh schedules. |
| `CUSTOMGPT_UPDATE_USER_PROFILE` | Update User Profile | Updates the authenticated user's profile information in CustomGPT. Use this action to modify profile details such as the user's display name, email address, or profile photo URL. All fields are optional - only the fields you provide will be updated. The action returns the complete updated user profile. |
| `CUSTOMGPT_VERIFY_MESSAGE` | Verify Message Accuracy | Tool to verify message accuracy by triggering a fact-checking verification process. Use when you need to verify claims in a conversation message against source documents. The system compares each claim and reports which claims are supported, partially supported, or unsupported. |
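If you'd rather not hand the agent destructive tools like `CUSTOMGPT_DELETE_PROJECT`, one option is to filter the discovered tool list down to read-only operations before building the agent. A minimal sketch of that filtering step, operating on tool slugs (the objects returned by `McpToolSpec` wrap these names, so you'd adapt the predicate accordingly):

```python
# Illustrative allowlist: keep only tools that fetch data rather than mutate it
READ_ONLY_PREFIXES = ("CUSTOMGPT_GET_", "CUSTOMGPT_LIST_", "CUSTOMGPT_SEARCH_")

def is_read_only(slug: str) -> bool:
    return slug.startswith(READ_ONLY_PREFIXES)

slugs = [
    "CUSTOMGPT_LIST_PROJECTS",
    "CUSTOMGPT_DELETE_PROJECT",
    "CUSTOMGPT_GET_USAGE_LIMITS",
    "CUSTOMGPT_UPDATE_PROJECT",
]
safe = [s for s in slugs if is_read_only(s)]
print(safe)  # ['CUSTOMGPT_LIST_PROJECTS', 'CUSTOMGPT_GET_USAGE_LIMITS']
```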

## Supported Triggers

None listed.

## Creating MCP Server - Stand-alone vs Composio SDK

The Customgpt MCP server is an implementation of the Model Context Protocol that connects your AI agent to Customgpt. It provides structured and secure access so your agent can perform Customgpt operations on your behalf through a secure, permission-based interface.
With Composio's managed implementation, you don't have to create your own developer app, which helps you prototype fast and go from 0 to 1 quickly. For production, if you're building an end product, we recommend using your own credentials.

## Step-by-step Guide

### Prerequisites

Before you begin, make sure you have:
- Python 3.8+ or Node.js 16+ installed
- A Composio account with the API key
- An OpenAI API key
- A Customgpt account and project
- Basic familiarity with async Python or TypeScript

### 1. Getting API Keys for OpenAI, Composio, and Customgpt

You'll need an OpenAI API key from your OpenAI dashboard and a Composio API key from your Composio dashboard. Customgpt credentials are handled through Composio: when the agent first uses the toolkit, you'll be prompted to connect and authorize your CustomGPT.ai account.

### 2. Installing dependencies

Install the LlamaIndex and Composio packages for your language of choice:
```bash
# Python
pip install composio-llamaindex llama-index llama-index-llms-openai llama-index-tools-mcp python-dotenv
```

```bash
# TypeScript
npm install @composio/core @composio/llamaindex @llamaindex/openai @llamaindex/tools @llamaindex/workflow dotenv
```

### 3. Set environment variables

Create a `.env` file in your project root. These credentials will be used to:
- Authenticate with OpenAI's GPT-5 model
- Connect to Composio's Tool Router
- Identify your Composio user session for Customgpt access
```bash
OPENAI_API_KEY=your-openai-api-key
COMPOSIO_API_KEY=your-composio-api-key
COMPOSIO_USER_ID=your-user-id
```

### 4. Import modules

Import everything the agent needs. Note that `signal` is imported here because it's used later for graceful Ctrl+C handling.
```python
import asyncio
import os
import signal
import dotenv

from composio import Composio
from composio_llamaindex import LlamaIndexProvider
from llama_index.core.agent.workflow import ReActAgent
from llama_index.core.workflow import Context
from llama_index.llms.openai import OpenAI
from llama_index.tools.mcp import BasicMCPClient, McpToolSpec

dotenv.load_dotenv()
```

```typescript
import "dotenv/config";
import readline from "node:readline/promises";
import { stdin as input, stdout as output } from "node:process";

import { Composio } from "@composio/core";
import { LlamaindexProvider } from "@composio/llamaindex";

import { mcp } from "@llamaindex/tools";
import { agent as createAgent } from "@llamaindex/workflow";
import { openai } from "@llamaindex/openai";
```

### 5. Load environment variables and initialize Composio

Read the credentials from the environment and fail fast if any are missing:
```python
OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")
COMPOSIO_API_KEY = os.getenv("COMPOSIO_API_KEY")
COMPOSIO_USER_ID = os.getenv("COMPOSIO_USER_ID")

if not OPENAI_API_KEY:
    raise ValueError("OPENAI_API_KEY is not set in the environment")
if not COMPOSIO_API_KEY:
    raise ValueError("COMPOSIO_API_KEY is not set in the environment")
if not COMPOSIO_USER_ID:
    raise ValueError("COMPOSIO_USER_ID is not set in the environment")
```

```typescript
const OPENAI_API_KEY = process.env.OPENAI_API_KEY;
const COMPOSIO_API_KEY = process.env.COMPOSIO_API_KEY;
const COMPOSIO_USER_ID = process.env.COMPOSIO_USER_ID;

if (!OPENAI_API_KEY) throw new Error("OPENAI_API_KEY is not set");
if (!COMPOSIO_API_KEY) throw new Error("COMPOSIO_API_KEY is not set");
if (!COMPOSIO_USER_ID) throw new Error("COMPOSIO_USER_ID is not set");
```

### 6. Create a Tool Router session and build the agent function

What's happening here:
- We create a Composio client using your API key and configure it with the LlamaIndex provider
- We then create a tool router MCP session for your user, specifying the toolkits we want to use (in this case, customgpt)
- The session returns an MCP HTTP endpoint URL that acts as a gateway to all your configured tools
- LlamaIndex connects to this endpoint to dynamically discover and use the available Customgpt tools
- The MCP tools are mapped to LlamaIndex-compatible tools and plugged into the agent
```python
async def build_agent() -> ReActAgent:
    composio_client = Composio(
        api_key=COMPOSIO_API_KEY,
        provider=LlamaIndexProvider(),
    )

    session = composio_client.create(
        user_id=COMPOSIO_USER_ID,
        toolkits=["customgpt"],
    )

    mcp_url = session.mcp.url
    print(f"Composio MCP URL: {mcp_url}")

    mcp_client = BasicMCPClient(mcp_url, headers={"x-api-key": COMPOSIO_API_KEY})
    mcp_tool_spec = McpToolSpec(client=mcp_client)
    tools = await mcp_tool_spec.to_tool_list_async()

    llm = OpenAI(model="gpt-5")

    description = "An agent that uses Composio Tool Router MCP tools to perform Customgpt actions."
    system_prompt = """
    You are a helpful assistant connected to Composio Tool Router.
    Use the available tools to answer user queries and perform Customgpt actions.
    """
    return ReActAgent(tools=tools, llm=llm, description=description, system_prompt=system_prompt, verbose=True)
```

```typescript
async function buildAgent() {

  console.log(`Initializing Composio client for user ${COMPOSIO_USER_ID!}...`);

  const composio = new Composio({
    apiKey: COMPOSIO_API_KEY,
    provider: new LlamaindexProvider(),
  });

  const session = await composio.create(
    COMPOSIO_USER_ID!,
    {
      toolkits: ["customgpt"],
    },
  );

  const mcpUrl = session.mcp.url;
  console.log(`Composio Tool Router MCP URL: ${mcpUrl}`);

  const server = mcp({
    url: mcpUrl,
    clientName: "composio_tool_router_with_llamaindex",
    requestInit: {
      headers: {
        "x-api-key": COMPOSIO_API_KEY!,
      },
    },
    // verbose: true,
  });

  const tools = await server.tools();

  const llm = openai({ apiKey: OPENAI_API_KEY, model: "gpt-5" });

  const agent = createAgent({
    name: "composio_tool_router_with_llamaindex",
    description:
      "An agent that uses Composio Tool Router MCP tools to perform actions.",
    systemPrompt:
      "You are a helpful assistant connected to Composio Tool Router. " +
      "Use the available tools to answer user queries and perform Customgpt actions.",
    llm,
    tools,
  });

  return agent;
}
```

### 7. Create an interactive chat loop

Run an interactive loop that streams the agent's tokens and surfaces tool calls as they happen:
```python
async def chat_loop(agent: ReActAgent) -> None:
    ctx = Context(agent)
    print("Type 'quit', 'exit', or Ctrl+C to stop.")

    while True:
        try:
            user_input = input("\nYou: ").strip()
        except (KeyboardInterrupt, EOFError):
            print("\nBye!")
            break

        if not user_input or user_input.lower() in {"quit", "exit"}:
            print("Bye!")
            break

        try:
            print("Agent: ", end="", flush=True)
            handler = agent.run(user_input, ctx=ctx)

            async for event in handler.stream_events():
                # Stream token-by-token from LLM responses
                if hasattr(event, "delta") and event.delta:
                    print(event.delta, end="", flush=True)
                # Show tool calls as they happen
                elif hasattr(event, "tool_name"):
                    print(f"\n[Using tool: {event.tool_name}]", flush=True)

            # Wait for the run to complete
            await handler
            print()  # Newline after streaming
        except KeyboardInterrupt:
            print("\n[Interrupted]")
            continue
        except Exception as e:
            print(f"\nError: {e}")
```

```typescript
async function chatLoop(agent: ReturnType<typeof createAgent>) {
  const rl = readline.createInterface({ input, output });

  console.log("Type 'quit' or 'exit' to stop.");

  while (true) {
    let userInput: string;

    try {
      userInput = (await rl.question("\nYou: ")).trim();
    } catch {
      console.log("\nAgent: Bye!");
      break;
    }

    if (!userInput) {
      continue;
    }

    const lower = userInput.toLowerCase();
    if (lower === "quit" || lower === "exit") {
      console.log("Agent: Bye!");
      break;
    }

    try {
      process.stdout.write("Agent: ");

      const stream = agent.runStream(userInput);
      let finalResult: any = null;

      for await (const event of stream) {
        // The event.data contains the streamed content
        const data: any = event.data;

        // Check for streaming delta content
        if (data?.delta) {
          process.stdout.write(data.delta);
        }

        // Store final result for fallback
        if (data?.result || data?.message) {
          finalResult = data;
        }
      }

      // If no streaming happened, show the final result
      if (finalResult) {
        const answer =
          finalResult.result ??
          finalResult.message?.content ??
          finalResult.message ??
          "";
        if (answer && typeof answer === "string" && !answer.includes("[object")) {
          process.stdout.write(answer);
        }
      }

      console.log(); // New line after streaming completes
    } catch (err: any) {
      console.error("\nAgent error:", err?.message ?? err);
    }
  }

  rl.close();
}
```

### 8. Define the main entry point

What's happening here:
- We're orchestrating the entire application flow
- The agent gets built with proper error handling
- Then we kick off the interactive chat loop so you can start talking to Customgpt
```python
async def main() -> None:
    agent = await build_agent()
    await chat_loop(agent)

if __name__ == "__main__":
    # Handle Ctrl+C gracefully
    signal.signal(signal.SIGINT, lambda s, f: (print("\nBye!"), exit(0)))
    try:
        asyncio.run(main())
    except KeyboardInterrupt:
        print("\nBye!")
```

```typescript
async function main() {
  try {
    const agent = await buildAgent();
    await chatLoop(agent);
  } catch (err) {
    console.error("Failed to start agent:", err);
    process.exit(1);
  }
}

main();
```

### 9. Run the agent

When prompted, authenticate and authorize your agent with Customgpt, then start asking questions.
```bash
python llamaindex_agent.py
```

```bash
npx ts-node llamaindex-agent.ts
```

## Complete Code

```python
import asyncio
import os
import signal
import dotenv

from composio import Composio
from composio_llamaindex import LlamaIndexProvider
from llama_index.core.agent.workflow import ReActAgent
from llama_index.core.workflow import Context
from llama_index.llms.openai import OpenAI
from llama_index.tools.mcp import BasicMCPClient, McpToolSpec

dotenv.load_dotenv()

OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")
COMPOSIO_API_KEY = os.getenv("COMPOSIO_API_KEY")
COMPOSIO_USER_ID = os.getenv("COMPOSIO_USER_ID")

if not OPENAI_API_KEY:
    raise ValueError("OPENAI_API_KEY is not set")
if not COMPOSIO_API_KEY:
    raise ValueError("COMPOSIO_API_KEY is not set")
if not COMPOSIO_USER_ID:
    raise ValueError("COMPOSIO_USER_ID is not set")

async def build_agent() -> ReActAgent:
    composio_client = Composio(
        api_key=COMPOSIO_API_KEY,
        provider=LlamaIndexProvider(),
    )

    session = composio_client.create(
        user_id=COMPOSIO_USER_ID,
        toolkits=["customgpt"],
    )

    mcp_url = session.mcp.url
    print(f"Composio MCP URL: {mcp_url}")

    mcp_client = BasicMCPClient(mcp_url, headers={"x-api-key": COMPOSIO_API_KEY})
    mcp_tool_spec = McpToolSpec(client=mcp_client)
    tools = await mcp_tool_spec.to_tool_list_async()

    llm = OpenAI(model="gpt-5")
    description = "An agent that uses Composio Tool Router MCP tools to perform Customgpt actions."
    system_prompt = """
    You are a helpful assistant connected to Composio Tool Router.
    Use the available tools to answer user queries and perform Customgpt actions.
    """
    return ReActAgent(
        tools=tools,
        llm=llm,
        description=description,
        system_prompt=system_prompt,
        verbose=True,
    )

async def chat_loop(agent: ReActAgent) -> None:
    ctx = Context(agent)
    print("Type 'quit', 'exit', or Ctrl+C to stop.")

    while True:
        try:
            user_input = input("\nYou: ").strip()
        except (KeyboardInterrupt, EOFError):
            print("\nBye!")
            break

        if not user_input or user_input.lower() in {"quit", "exit"}:
            print("Bye!")
            break

        try:
            print("Agent: ", end="", flush=True)
            handler = agent.run(user_input, ctx=ctx)

            async for event in handler.stream_events():
                # Stream token-by-token from LLM responses
                if hasattr(event, "delta") and event.delta:
                    print(event.delta, end="", flush=True)
                # Show tool calls as they happen
                elif hasattr(event, "tool_name"):
                    print(f"\n[Using tool: {event.tool_name}]", flush=True)

            # Wait for the run to complete
            await handler
            print()  # Newline after streaming
        except KeyboardInterrupt:
            print("\n[Interrupted]")
            continue
        except Exception as e:
            print(f"\nError: {e}")

async def main() -> None:
    agent = await build_agent()
    await chat_loop(agent)

if __name__ == "__main__":
    # Handle Ctrl+C gracefully
    signal.signal(signal.SIGINT, lambda s, f: (print("\nBye!"), exit(0)))
    try:
        asyncio.run(main())
    except KeyboardInterrupt:
        print("\nBye!")
```

```typescript
import "dotenv/config";
import readline from "node:readline/promises";
import { stdin as input, stdout as output } from "node:process";

import { Composio } from "@composio/core";
import { LlamaindexProvider } from "@composio/llamaindex";

import { mcp } from "@llamaindex/tools";
import { agent as createAgent } from "@llamaindex/workflow";
import { openai } from "@llamaindex/openai";


const OPENAI_API_KEY = process.env.OPENAI_API_KEY;
const COMPOSIO_API_KEY = process.env.COMPOSIO_API_KEY;
const COMPOSIO_USER_ID = process.env.COMPOSIO_USER_ID;

if (!OPENAI_API_KEY) {
  throw new Error("OPENAI_API_KEY is not set in the environment");
}
if (!COMPOSIO_API_KEY) {
  throw new Error("COMPOSIO_API_KEY is not set in the environment");
}
if (!COMPOSIO_USER_ID) {
  throw new Error("COMPOSIO_USER_ID is not set in the environment");
}

async function buildAgent() {

  console.log(`Initializing Composio client for user ${COMPOSIO_USER_ID!}...`);

  const composio = new Composio({
    apiKey: COMPOSIO_API_KEY,
    provider: new LlamaindexProvider(),
  });

  const session = await composio.create(
    COMPOSIO_USER_ID!,
    {
      toolkits: ["customgpt"],
    },
  );

  const mcpUrl = session.mcp.url;
  console.log(`Composio Tool Router MCP URL: ${mcpUrl}`);

  const server = mcp({
    url: mcpUrl,
    clientName: "composio_tool_router_with_llamaindex",
    requestInit: {
      headers: {
        "x-api-key": COMPOSIO_API_KEY!,
      },
    },
    // verbose: true,
  });

  const tools = await server.tools();

  const llm = openai({ apiKey: OPENAI_API_KEY, model: "gpt-5" });

  const agent = createAgent({
    name: "composio_tool_router_with_llamaindex",
    description:
      "An agent that uses Composio Tool Router MCP tools to perform actions.",
    systemPrompt:
      "You are a helpful assistant connected to Composio Tool Router. " +
      "Use the available tools to answer user queries and perform Customgpt actions.",
    llm,
    tools,
  });

  return agent;
}

async function chatLoop(agent: ReturnType<typeof createAgent>) {
  const rl = readline.createInterface({ input, output });

  console.log("Type 'quit' or 'exit' to stop.");

  while (true) {
    let userInput: string;

    try {
      userInput = (await rl.question("\nYou: ")).trim();
    } catch {
      console.log("\nAgent: Bye!");
      break;
    }

    if (!userInput) {
      continue;
    }

    const lower = userInput.toLowerCase();
    if (lower === "quit" || lower === "exit") {
      console.log("Agent: Bye!");
      break;
    }

    try {
      process.stdout.write("Agent: ");

      const stream = agent.runStream(userInput);
      let finalResult: any = null;

      for await (const event of stream) {
        // The event.data contains the streamed content
        const data: any = event.data;

        // Check for streaming delta content
        if (data?.delta) {
          process.stdout.write(data.delta);
        }

        // Store final result for fallback
        if (data?.result || data?.message) {
          finalResult = data;
        }
      }

      // If no streaming happened, show the final result
      if (finalResult) {
        const answer =
          finalResult.result ??
          finalResult.message?.content ??
          finalResult.message ??
          "";
        if (answer && typeof answer === "string" && !answer.includes("[object")) {
          process.stdout.write(answer);
        }
      }

      console.log(); // New line after streaming completes
    } catch (err: any) {
      console.error("\nAgent error:", err?.message ?? err);
    }
  }

  rl.close();
}

async function main() {
  try {
    const agent = await buildAgent();
    await chatLoop(agent);
  } catch (err: any) {
    console.error("Failed to start agent:", err?.message ?? err);
    process.exit(1);
  }
}

main();
```
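The fallback at the end of `chatLoop` guards against the final event carrying a non-string payload. That logic can be pulled into a small, testable helper; a sketch, where the type and function names are illustrative and not part of the LlamaIndex or Composio SDKs:

```typescript
// Shape of the final event data the chat loop falls back on.
// This mirrors the fields checked in chatLoop above; it is an assumption,
// not a published SDK type.
type FinalData = {
  result?: string;
  message?: { content?: string } | string;
};

// Extract a printable answer, preferring `result`, then `message.content`,
// then a raw string message; anything non-string or "[object ...]" is dropped.
function extractAnswer(data: FinalData): string {
  const answer =
    data.result ??
    (typeof data.message === "string" ? data.message : data.message?.content) ??
    "";
  return typeof answer === "string" && !answer.includes("[object") ? answer : "";
}

console.log(extractAnswer({ result: "All projects listed." })); // prints "All projects listed."
```

Isolating the fallback this way makes the stream-handling branch of the loop easier to unit test without spinning up an agent.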

## Conclusion

You've successfully connected Customgpt to LlamaIndex through Composio's Tool Router MCP layer.

Key takeaways:

- Tool Router dynamically exposes Customgpt tools through a single MCP endpoint
- LlamaIndex's agent abstractions handle reasoning and orchestration; Composio handles integrations
- The agent gains capabilities without growing the prompt
- Async execution keeps both the Python and TypeScript chat loops clean and efficient

You can extend this to other toolkits such as Gmail, Notion, Stripe, or GitHub by adding their slugs to the `toolkits` parameter.
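For example, widening the session to cover several apps is just a change to the options object passed when the Tool Router session is created. The extra slugs below are illustrative; verify the exact slug for each toolkit in the Composio dashboard:

```typescript
// Hypothetical session options spanning several toolkits.
// Slugs other than "customgpt" are assumptions — confirm them before use.
const sessionOptions = {
  toolkits: ["customgpt", "gmail", "notion", "github"],
};

// Passing these options in place of { toolkits: ["customgpt"] } in the
// session-creation call above would expose tools from all four apps
// through the same single MCP endpoint.
console.log(sessionOptions.toolkits.join(", ")); // prints "customgpt, gmail, notion, github"
```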

## How to build Customgpt MCP Agent with another framework

- [OpenAI Agents SDK](https://composio.dev/toolkits/customgpt/framework/open-ai-agents-sdk)
- [Claude Agent SDK](https://composio.dev/toolkits/customgpt/framework/claude-agents-sdk)
- [Claude Code](https://composio.dev/toolkits/customgpt/framework/claude-code)
- [Claude Cowork](https://composio.dev/toolkits/customgpt/framework/claude-cowork)
- [Codex](https://composio.dev/toolkits/customgpt/framework/codex)
- [OpenClaw](https://composio.dev/toolkits/customgpt/framework/openclaw)
- [Hermes](https://composio.dev/toolkits/customgpt/framework/hermes-agent)
- [CLI](https://composio.dev/toolkits/customgpt/framework/cli)
- [Google ADK](https://composio.dev/toolkits/customgpt/framework/google-adk)
- [LangChain](https://composio.dev/toolkits/customgpt/framework/langchain)
- [Vercel AI SDK](https://composio.dev/toolkits/customgpt/framework/ai-sdk)
- [Mastra AI](https://composio.dev/toolkits/customgpt/framework/mastra-ai)
- [CrewAI](https://composio.dev/toolkits/customgpt/framework/crew-ai)

## Related Toolkits

- [Composio](https://composio.dev/toolkits/composio) - Composio is an integration platform that connects AI agents with hundreds of business tools. It streamlines authentication and lets you trigger actions across services—no custom code needed.
- [Composio search](https://composio.dev/toolkits/composio_search) - Composio search is a unified web search toolkit spanning travel, e-commerce, news, financial markets, images, and more. It lets you and your apps tap into up-to-date web data from a single, easy-to-integrate service.
- [Perplexityai](https://composio.dev/toolkits/perplexityai) - Perplexityai delivers natural, conversational AI models for generating human-like text. Instantly get context-aware, high-quality responses for chat, search, or complex workflows.
- [Browser tool](https://composio.dev/toolkits/browser_tool) - Browser tool is a virtual browser integration that lets AI agents interact with the web programmatically. It enables automated browsing, scraping, and action-taking from any AI workflow.
- [Ai ml api](https://composio.dev/toolkits/ai_ml_api) - Ai ml api is a suite of AI/ML models for natural language and image tasks. It provides fast, scalable access to advanced AI capabilities for your apps and workflows.
- [Aivoov](https://composio.dev/toolkits/aivoov) - Aivoov is an AI-powered text-to-speech platform offering 1,000+ voices in over 150 languages. Instantly turn written content into natural, human-like audio for any application.
- [All images ai](https://composio.dev/toolkits/all_images_ai) - All-Images.ai is an AI-powered image generation and management platform. It helps you create, search, and organize images effortlessly with advanced AI capabilities.
- [Anthropic administrator](https://composio.dev/toolkits/anthropic_administrator) - Anthropic administrator is an API for managing Anthropic organizational resources like members, workspaces, and API keys. It helps you automate admin tasks and streamline resource management across your Anthropic organization.
- [Api labz](https://composio.dev/toolkits/api_labz) - Api labz is a platform offering a suite of AI-driven APIs and workflow tools. It helps developers automate tasks and build smarter, more efficient applications.
- [Apipie ai](https://composio.dev/toolkits/apipie_ai) - Apipie ai is an AI model aggregator offering a single API for accessing top AI models from multiple providers. It helps developers build cost-efficient, latency-optimized AI solutions without juggling multiple integrations.
- [Astica ai](https://composio.dev/toolkits/astica_ai) - Astica ai provides APIs for computer vision, NLP, and voice synthesis. Integrate advanced AI features into your app with a single API key.
- [Bigml](https://composio.dev/toolkits/bigml) - BigML is a machine learning platform that lets you build, train, and deploy predictive models from your data. Its intuitive interface and robust API make machine learning accessible and efficient.
- [Botbaba](https://composio.dev/toolkits/botbaba) - Botbaba is a platform for building, managing, and deploying conversational AI chatbots across messaging channels. It streamlines chatbot automation, making it easier to integrate AI into customer interactions.
- [Botpress](https://composio.dev/toolkits/botpress) - Botpress is an open-source platform for building, deploying, and managing chatbots. It helps teams automate conversations and deliver rich, interactive messaging experiences.
- [Chatbotkit](https://composio.dev/toolkits/chatbotkit) - Chatbotkit is a platform for building and managing AI-powered chatbots using robust APIs and SDKs. It lets you easily add conversational AI to your apps for better user engagement.
- [Cody](https://composio.dev/toolkits/cody) - Cody is an AI assistant built for businesses, trained on your company's knowledge and data. It delivers instant answers and insights, tailored for your team.
- [Context7 MCP](https://composio.dev/toolkits/context7_mcp) - Context7 MCP delivers live, version-specific code docs and examples right from the source. It helps developers and AI agents instantly retrieve authoritative programming info—no more out-of-date docs.
- [Datarobot](https://composio.dev/toolkits/datarobot) - Datarobot is a machine learning platform that automates model development, deployment, and monitoring. It empowers organizations to quickly gain predictive insights from large datasets.
- [Deepgram](https://composio.dev/toolkits/deepgram) - Deepgram is an AI-powered speech recognition platform for accurate audio transcription and understanding. It enables fast, scalable speech-to-text with advanced audio intelligence features.
- [DeepImage](https://composio.dev/toolkits/deepimage) - DeepImage is an AI-powered image enhancer and upscaler. Get higher-quality images with just a few clicks.

## Frequently Asked Questions

### What is the difference between Tool Router MCP and Customgpt MCP?

With a standalone Customgpt MCP server, agents and LLMs can only access the fixed set of Customgpt tools tied to that server. With the Composio Tool Router, agents dynamically load tools from Customgpt and many other apps based on the task at hand, all through a single MCP endpoint.

### Can I use Tool Router MCP with LlamaIndex?

Yes, you can. LlamaIndex fully supports MCP integration. You get structured tool calling, message history handling, and model orchestration while Tool Router takes care of discovering and serving the right Customgpt tools.

### Can I manage the permissions and scopes for Customgpt while using Tool Router?

Yes, absolutely. You can configure which Customgpt scopes and actions are allowed when connecting your account to Composio. You can also bring your own OAuth credentials or API configuration so you keep full control over what the agent can do.

### How safe is my data with Composio Tool Router?

All sensitive data such as tokens, keys, and configuration is encrypted at rest and in transit. Composio is SOC 2 Type II compliant and follows strict security practices, so your Customgpt data and credentials are handled as safely as possible.

---
[See all toolkits](https://composio.dev/toolkits) · [Composio docs](https://docs.composio.dev/llms.txt)
