# How to integrate Superchat MCP with LlamaIndex

```json
{
  "title": "How to integrate Superchat MCP with LlamaIndex",
  "toolkit": "Superchat",
  "toolkit_slug": "superchat",
  "framework": "LlamaIndex",
  "framework_slug": "llama-index",
  "url": "https://composio.dev/toolkits/superchat/framework/llama-index",
  "markdown_url": "https://composio.dev/toolkits/superchat/framework/llama-index.md",
  "updated_at": "2026-05-12T10:27:31.364Z"
}
```

## Introduction

This guide walks you through connecting Superchat to LlamaIndex using the Composio Tool Router. By the end, you'll have a working Superchat agent that can list today's WhatsApp conversations, create a contact for an incoming lead, or fetch details for a contact such as "John Smith", all through natural-language commands. The goal is to give your LlamaIndex agent real control over a Superchat account through Composio's Superchat MCP server.
Before we dive in, let's take a quick look at the key ideas and tools involved.

## Also integrate Superchat with

- [OpenAI Agents SDK](https://composio.dev/toolkits/superchat/framework/open-ai-agents-sdk)
- [Claude Agent SDK](https://composio.dev/toolkits/superchat/framework/claude-agents-sdk)
- [Claude Code](https://composio.dev/toolkits/superchat/framework/claude-code)
- [Claude Cowork](https://composio.dev/toolkits/superchat/framework/claude-cowork)
- [Codex](https://composio.dev/toolkits/superchat/framework/codex)
- [OpenClaw](https://composio.dev/toolkits/superchat/framework/openclaw)
- [Hermes](https://composio.dev/toolkits/superchat/framework/hermes-agent)
- [CLI](https://composio.dev/toolkits/superchat/framework/cli)
- [Google ADK](https://composio.dev/toolkits/superchat/framework/google-adk)
- [LangChain](https://composio.dev/toolkits/superchat/framework/langchain)
- [Vercel AI SDK](https://composio.dev/toolkits/superchat/framework/ai-sdk)
- [Mastra AI](https://composio.dev/toolkits/superchat/framework/mastra-ai)
- [CrewAI](https://composio.dev/toolkits/superchat/framework/crew-ai)

## TL;DR

Here's what you'll learn:
- Set your OpenAI and Composio API keys
- Install LlamaIndex and Composio packages
- Create a Composio Tool Router session for Superchat
- Connect LlamaIndex to the Superchat MCP server
- Build a Superchat-powered agent using LlamaIndex
- Interact with Superchat through natural language

## What is LlamaIndex?

LlamaIndex is a data framework for building LLM applications. It provides tools for connecting LLMs to external data sources and services through agents and tools.
Key features include:
- ReAct Agent: Reasoning and acting pattern for tool-using agents
- MCP Tools: Native support for Model Context Protocol
- Context Management: Maintain conversation context across interactions
- Async Support: Built for async/await patterns
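The ReAct pattern behind these agents can be sketched in plain Python: the model alternates between reasoning about the next step and calling a tool, feeding each observation back in. This is a conceptual sketch, not LlamaIndex's API; `llm_decide` stands in for the LLM's decision step.

```python
def react_loop(question, llm_decide, tools, max_steps=5):
    """Minimal ReAct skeleton: reason, act, observe, repeat until an answer."""
    observations = []
    for _ in range(max_steps):
        step = llm_decide(question, observations)
        if step[0] == "answer":           # ("answer", final_text)
            return step[1]
        _, tool_name, args = step         # ("act", tool_name, kwargs)
        observations.append(tools[tool_name](**args))
    return None  # gave up after max_steps
```

LlamaIndex's `ReActAgent` automates this loop, with the LLM producing the thought/action steps and (in this guide) MCP supplying the tools.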

## What is the Superchat MCP server, and what's possible with it?

The Superchat MCP server is an implementation of the Model Context Protocol that connects your AI agents and assistants (Claude, Cursor, etc.) directly to your Superchat account. It provides structured and secure access to your unified messaging platform, so your agent can perform actions like managing contacts, listing conversations, organizing templates, and retrieving channel information across messaging channels on your behalf.
- Unified contact management: Easily create, fetch, and list contacts, allowing your agent to manage your customer database and keep your communications up to date.
- Conversation and channel insights: Ask your agent to list all ongoing conversations and available messaging channels, making it easy to monitor activity and streamline engagement across platforms.
- Template and folder organization: Have your agent create new template folders to organize message templates for efficient, consistent communication with customers.
- Custom attribute retrieval: Let your agent pull all custom contact attributes, enabling dynamic personalization and tailored messaging workflows.
- Webhook and file management: Direct your agent to delete obsolete webhooks or retrieve file metadata, keeping your integrations clean and your resources easily accessible.
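Under the hood, an MCP client talks to a server like this one using JSON-RPC 2.0 messages: `tools/list` to discover the available tools and `tools/call` to invoke one (method names per the MCP specification; the tool name and arguments below are illustrative):

```python
import json

def mcp_request(method, params=None, req_id=1):
    """Build a JSON-RPC 2.0 message of the kind an MCP client sends."""
    msg = {"jsonrpc": "2.0", "id": req_id, "method": method}
    if params is not None:
        msg["params"] = params
    return json.dumps(msg)

# Discover the server's tools, then call one:
list_req = mcp_request("tools/list")
call_req = mcp_request(
    "tools/call",
    {"name": "SUPERCHAT_LIST_CONTACTS", "arguments": {"limit": 10}},
    req_id=2,
)
```

You won't usually construct these by hand; LlamaIndex's MCP client does it for you, as shown in the steps below.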

## Supported Tools

| Tool slug | Name | Description |
|---|---|---|
| `SUPERCHAT_CREATE_CONTACT` | Create Contact | Create a new contact in Superchat with phone or email handles. Use this tool to register contact details before sending messages. You must provide at least one handle (phone or email). Optional fields include first name, last name, gender, and custom attributes (if predefined in your account settings). |
| `SUPERCHAT_CREATE_TEMPLATE_FOLDER` | Create Template Folder | Create a new template folder in SuperChat for organizing message templates. Template folders help organize WhatsApp and other messaging templates into logical groups (e.g., Marketing, Sales, Customer Support). Use this action before creating templates when you want to keep them organized. |
| `SUPERCHAT_DELETE_WEBHOOK` | Delete Webhook | Tool to delete a specific webhook by its ID. Use when you need to remove an obsolete or unwanted webhook subscription. First use SUPERCHAT_LIST_WEBHOOKS to retrieve the webhook ID, then use this action to delete it. Example: Delete webhook wh_UPYSN0Etofjl7lhhQ9yhL. |
| `SUPERCHAT_GET_CONTACT` | Get Contact | Retrieve a specific contact's details by ID. Returns the contact's name, handles (phone, email, social), custom attributes, and timestamps. Use this to look up contact information before sending messages or updating contact details. Example: "Get contact details for co_abc123" or "Fetch info for contact co_xyz789". |
| `SUPERCHAT_GET_FILE` | Get File | Retrieve metadata for a specific file by its ID. Returns file details including the file name, MIME type, API resource URL, and a temporary download link with expiration. Use this to get information about uploaded files before downloading or processing them. |
| `SUPERCHAT_GET_USER` | Get User | Retrieve details of a specific user in the Superchat workspace by their user ID. Use this tool when you need to fetch profile information for a known user. Note: You can obtain user IDs by first calling the List Users action. Example user_id format: 'us_abc123xyz' |
| `SUPERCHAT_LIST_CHANNELS` | List Channels | Lists all communication channels in your Superchat workspace. Channels represent different messaging mediums (WhatsApp, email, SMS, Telegram, Instagram, Facebook) through which conversations occur. Results are sorted by creation date (newest first) and can be paginated using 'limit', 'after', or 'before' parameters. |
| `SUPERCHAT_LIST_CONTACTS` | List Contacts | List all contacts in the Superchat workspace with cursor-based pagination. Use this tool to browse contacts, find contact IDs, or export contact data. Results are sorted by creation date (newest first). |
| `SUPERCHAT_LIST_CONVERSATIONS` | List Conversations | Retrieves a paginated list of all conversations in your Superchat account. Each conversation includes channel info, status (open/snoozed/done), assigned users, contacts, labels, and inbox details. Use this to browse conversations, find specific ones by status or channel, or get conversation IDs for further operations. Supports cursor-based pagination to handle large result sets. |
| `SUPERCHAT_LIST_CUSTOM_ATTRIBUTES` | List Custom Attributes | List all custom attributes defined for contacts in your Superchat account. Custom attributes are user-defined fields that extend contact data beyond standard fields like name and email. Use this action to discover available custom attributes before reading or updating contact information. Supports cursor-based pagination for accounts with many custom attributes. Common use cases: - Discover available custom fields before updating contacts - Retrieve custom attribute types (string, number, date, select, etc.) - Get custom attribute IDs for use in other API calls |
| `SUPERCHAT_LIST_INBOXES` | List Inboxes | Tool to list all inboxes. Use when you need to retrieve inbox IDs and metadata before sending or organizing messages. |
| `SUPERCHAT_LIST_LABELS` | List Labels | List all labels in the Superchat workspace. Labels are used to categorize and organize conversations. Use this tool to retrieve available labels and their IDs, which can then be used to assign labels to conversations via the update conversation endpoint. Supports cursor-based pagination for workspaces with many labels. |
| `SUPERCHAT_LIST_TEMPLATES` | List Templates | Tool to list all message templates. Use when you need to fetch available message templates. |
| `SUPERCHAT_LIST_USERS` | List Users | Retrieve all users in the workspace. Returns user profiles including names, emails, roles, and contact info. Supports pagination for large workspaces. Use cases: - Get a directory of all workspace members - Find user IDs for other API operations - Audit user roles and permissions |
| `SUPERCHAT_LIST_WEBHOOKS` | List Webhooks | Tool to list all webhooks configured in the workspace. Use this tool to: - Retrieve all active and paused webhooks - Get webhook IDs for use with update or delete operations - Check webhook status and event subscriptions Supports pagination via 'limit', 'after', and 'before' parameters. |
| `SUPERCHAT_UPDATE_CONTACT` | Update Contact | Update information for a specific contact in Superchat. Use this tool to modify a contact's name, gender, handles (phone/email), or custom attributes. Requires the contact_id (prefixed with 'ct_') which can be obtained from List Contacts or Create Contact. Examples: - Update first name: {"contact_id": "ct_abc123", "first_name": "Jane"} - Update gender: {"contact_id": "ct_abc123", "gender": "female"} - Update phone handle: {"contact_id": "ct_abc123", "handles": [{"type": "phone", "value": "+1234567890"}]} |
| `SUPERCHAT_UPDATE_WEBHOOK` | Update Webhook | Update an existing webhook's target URL and/or event subscriptions. Use this tool to: - Change the webhook delivery URL - Add or remove event subscriptions - Update event types the webhook listens to Note: Webhook status (ACTIVE/PAUSED) is automatically managed by the API and cannot be manually changed. Webhooks become PAUSED after 7 days of consistent delivery failures. |
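Several of the list tools above (`SUPERCHAT_LIST_CONTACTS`, `SUPERCHAT_LIST_CONVERSATIONS`, and others) use cursor-based pagination via `limit`, `after`, and `before`. A generic way to drain such a listing can be sketched in plain Python; note that the `fetch_page` callable and the `data`/`next_cursor` response keys are assumptions for illustration, not Superchat's documented response shape:

```python
def iter_all(fetch_page, limit=50):
    """Walk a cursor-paginated listing until the cursor runs out."""
    after = None
    while True:
        page = fetch_page(limit=limit, after=after)
        yield from page["data"]           # assumed key for the page's items
        after = page.get("next_cursor")   # assumed key for the next cursor
        if not after:
            break
```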

## Supported Triggers

None listed.

## Creating MCP Server - Stand-alone vs Composio SDK

The Superchat MCP server is an implementation of the Model Context Protocol that connects your AI agent to Superchat. It provides structured and secure access so your agent can perform Superchat operations on your behalf through a secure, permission-based interface.
With Composio's managed implementation, you don't have to create your own developer app, which helps you prototype quickly and go from zero to one faster. For production, if you're building an end product, we recommend using your own credentials.

## Step-by-step Guide

### Prerequisites

Before you begin, make sure you have:
- Python 3.8+ (or Node.js 16+) installed
- A Composio account and API key
- An OpenAI API key
- A Superchat account and project
- Basic familiarity with async Python or TypeScript

### 1. Getting API Keys for OpenAI, Composio, and Superchat

You'll need three credentials for this guide: an OpenAI API key, a Composio API key, and a Superchat account connected through Composio. Generate the API keys from the respective dashboards and keep them handy for the next steps.

### 2. Installing dependencies

Install the LlamaIndex and Composio packages for your language of choice:
```bash
# Python
pip install composio-llamaindex llama-index llama-index-llms-openai llama-index-tools-mcp python-dotenv
```

```bash
# TypeScript
npm install @composio/core @composio/llamaindex @llamaindex/openai @llamaindex/tools @llamaindex/workflow dotenv
```

### 3. Set environment variables

Create a .env file in your project root:
These credentials will be used to:
- Authenticate with OpenAI's GPT-5 model
- Connect to Composio's Tool Router
- Identify your Composio user session for Superchat access
```bash
OPENAI_API_KEY=your-openai-api-key
COMPOSIO_API_KEY=your-composio-api-key
COMPOSIO_USER_ID=your-user-id
```

### 4. Import modules

Import the required modules and load your environment variables:
```python
import asyncio
import os
import signal
import dotenv

from composio import Composio
from composio_llamaindex import LlamaIndexProvider
from llama_index.core.agent.workflow import ReActAgent
from llama_index.core.workflow import Context
from llama_index.llms.openai import OpenAI
from llama_index.tools.mcp import BasicMCPClient, McpToolSpec

dotenv.load_dotenv()
```

```typescript
import "dotenv/config";
import readline from "node:readline/promises";
import { stdin as input, stdout as output } from "node:process";

import { Composio } from "@composio/core";
import { LlamaindexProvider } from "@composio/llamaindex";

import { mcp } from "@llamaindex/tools";
import { agent as createAgent } from "@llamaindex/workflow";
import { openai } from "@llamaindex/openai";
```

### 5. Load environment variables and initialize Composio

Read the credentials and fail fast if any are missing:
```python
OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")
COMPOSIO_API_KEY = os.getenv("COMPOSIO_API_KEY")
COMPOSIO_USER_ID = os.getenv("COMPOSIO_USER_ID")

if not OPENAI_API_KEY:
    raise ValueError("OPENAI_API_KEY is not set in the environment")
if not COMPOSIO_API_KEY:
    raise ValueError("COMPOSIO_API_KEY is not set in the environment")
if not COMPOSIO_USER_ID:
    raise ValueError("COMPOSIO_USER_ID is not set in the environment")
```

```typescript
const OPENAI_API_KEY = process.env.OPENAI_API_KEY;
const COMPOSIO_API_KEY = process.env.COMPOSIO_API_KEY;
const COMPOSIO_USER_ID = process.env.COMPOSIO_USER_ID;

if (!OPENAI_API_KEY) throw new Error("OPENAI_API_KEY is not set");
if (!COMPOSIO_API_KEY) throw new Error("COMPOSIO_API_KEY is not set");
if (!COMPOSIO_USER_ID) throw new Error("COMPOSIO_USER_ID is not set");
```

### 6. Create a Tool Router session and build the agent function

What's happening here:
- We create a Composio client with your API key and configure it with the LlamaIndex provider
- We then create a Tool Router MCP session for your user, specifying the toolkits to enable (in this case, `superchat`)
- The session returns an MCP HTTP endpoint URL that acts as a gateway to all your configured tools
- LlamaIndex connects to this endpoint to dynamically discover the available Superchat tools
- The MCP tools are mapped to LlamaIndex-compatible tools and plugged into the agent
```python
async def build_agent() -> ReActAgent:
    composio_client = Composio(
        api_key=COMPOSIO_API_KEY,
        provider=LlamaIndexProvider(),
    )

    session = composio_client.create(
        user_id=COMPOSIO_USER_ID,
        toolkits=["superchat"],
    )

    mcp_url = session.mcp.url
    print(f"Composio MCP URL: {mcp_url}")

    mcp_client = BasicMCPClient(mcp_url, headers={"x-api-key": COMPOSIO_API_KEY})
    mcp_tool_spec = McpToolSpec(client=mcp_client)
    tools = await mcp_tool_spec.to_tool_list_async()

    llm = OpenAI(model="gpt-5")

    description = "An agent that uses Composio Tool Router MCP tools to perform Superchat actions."
    system_prompt = """
    You are a helpful assistant connected to Composio Tool Router.
    Use the available tools to answer user queries and perform Superchat actions.
    """
    return ReActAgent(tools=tools, llm=llm, description=description, system_prompt=system_prompt, verbose=True)
```

```typescript
async function buildAgent() {

  console.log(`Initializing Composio client for user ${COMPOSIO_USER_ID}...`);

  const composio = new Composio({
    apiKey: COMPOSIO_API_KEY,
    provider: new LlamaindexProvider(),
  });

  const session = await composio.create(
    COMPOSIO_USER_ID!,
    {
      toolkits: ["superchat"],
    },
  );

  const mcpUrl = session.mcp.url;
  console.log(`Composio Tool Router MCP URL: ${mcpUrl}`);

  const server = mcp({
    url: mcpUrl,
    clientName: "composio_tool_router_with_llamaindex",
    requestInit: {
      headers: {
        "x-api-key": COMPOSIO_API_KEY!,
      },
    },
    // verbose: true,
  });

  const tools = await server.tools();

  const llm = openai({ apiKey: OPENAI_API_KEY, model: "gpt-5" });

  const agent = createAgent({
    name: "composio_tool_router_with_llamaindex",
    description:
      "An agent that uses Composio Tool Router MCP tools to perform actions.",
    systemPrompt:
      "You are a helpful assistant connected to Composio Tool Router. " +
      "Use the available tools to answer user queries and perform Superchat actions.",
    llm,
    tools,
  });

  return agent;
}
```

### 7. Create an interactive chat loop

Build a simple REPL that streams the agent's responses and surfaces tool calls as they happen:
```python
async def chat_loop(agent: ReActAgent) -> None:
    ctx = Context(agent)
    print("Type 'quit', 'exit', or Ctrl+C to stop.")

    while True:
        try:
            user_input = input("\nYou: ").strip()
        except (KeyboardInterrupt, EOFError):
            print("\nBye!")
            break

        if not user_input or user_input.lower() in {"quit", "exit"}:
            print("Bye!")
            break

        try:
            print("Agent: ", end="", flush=True)
            handler = agent.run(user_input, ctx=ctx)

            async for event in handler.stream_events():
                # Stream token-by-token from LLM responses
                if hasattr(event, "delta") and event.delta:
                    print(event.delta, end="", flush=True)
                # Show tool calls as they happen
                elif hasattr(event, "tool_name"):
                    print(f"\n[Using tool: {event.tool_name}]", flush=True)

            # Wait for the run to complete
            await handler
            print()  # Newline after streaming
        except KeyboardInterrupt:
            print("\n[Interrupted]")
            continue
        except Exception as e:
            print(f"\nError: {e}")
```

```typescript
async function chatLoop(agent: ReturnType<typeof createAgent>) {
  const rl = readline.createInterface({ input, output });

  console.log("Type 'quit' or 'exit' to stop.");

  while (true) {
    let userInput: string;

    try {
      userInput = (await rl.question("\nYou: ")).trim();
    } catch {
      console.log("\nAgent: Bye!");
      break;
    }

    if (!userInput) {
      continue;
    }

    const lower = userInput.toLowerCase();
    if (lower === "quit" || lower === "exit") {
      console.log("Agent: Bye!");
      break;
    }

    try {
      process.stdout.write("Agent: ");

      const stream = agent.runStream(userInput);
      let finalResult: any = null;

      for await (const event of stream) {
        // The event.data contains the streamed content
        const data: any = event.data;

        // Check for streaming delta content
        if (data?.delta) {
          process.stdout.write(data.delta);
        }

        // Store final result for fallback
        if (data?.result || data?.message) {
          finalResult = data;
        }
      }

      // If no streaming happened, show the final result
      if (finalResult) {
        const answer =
          finalResult.result ??
          finalResult.message?.content ??
          finalResult.message ??
          "";
        if (answer && typeof answer === "string" && !answer.includes("[object")) {
          process.stdout.write(answer);
        }
      }

      console.log(); // New line after streaming completes
    } catch (err: any) {
      console.error("\nAgent error:", err?.message ?? err);
    }
  }

  rl.close();
}
```

### 8. Define the main entry point

What's happening here:
- We're orchestrating the entire application flow
- The agent gets built with proper error handling
- Then we kick off the interactive chat loop so you can start talking to Superchat
```python
async def main() -> None:
    agent = await build_agent()
    await chat_loop(agent)

if __name__ == "__main__":
    # Handle Ctrl+C gracefully
    signal.signal(signal.SIGINT, lambda s, f: (print("\nBye!"), exit(0)))
    try:
        asyncio.run(main())
    except KeyboardInterrupt:
        print("\nBye!")
```

```typescript
async function main() {
  try {
    const agent = await buildAgent();
    await chatLoop(agent);
  } catch (err) {
    console.error("Failed to start agent:", err);
    process.exit(1);
  }
}

main();
```

### 9. Run the agent

When prompted, authenticate and authorize your agent with Superchat, then start asking questions.
```bash
python llamaindex_agent.py
```

```bash
npx ts-node llamaindex-agent.ts
```

## Complete Code

```python
import asyncio
import os
import signal
import dotenv

from composio import Composio
from composio_llamaindex import LlamaIndexProvider
from llama_index.core.agent.workflow import ReActAgent
from llama_index.core.workflow import Context
from llama_index.llms.openai import OpenAI
from llama_index.tools.mcp import BasicMCPClient, McpToolSpec

dotenv.load_dotenv()

OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")
COMPOSIO_API_KEY = os.getenv("COMPOSIO_API_KEY")
COMPOSIO_USER_ID = os.getenv("COMPOSIO_USER_ID")

if not OPENAI_API_KEY:
    raise ValueError("OPENAI_API_KEY is not set")
if not COMPOSIO_API_KEY:
    raise ValueError("COMPOSIO_API_KEY is not set")
if not COMPOSIO_USER_ID:
    raise ValueError("COMPOSIO_USER_ID is not set")

async def build_agent() -> ReActAgent:
    composio_client = Composio(
        api_key=COMPOSIO_API_KEY,
        provider=LlamaIndexProvider(),
    )

    session = composio_client.create(
        user_id=COMPOSIO_USER_ID,
        toolkits=["superchat"],
    )

    mcp_url = session.mcp.url
    print(f"Composio MCP URL: {mcp_url}")

    mcp_client = BasicMCPClient(mcp_url, headers={"x-api-key": COMPOSIO_API_KEY})
    mcp_tool_spec = McpToolSpec(client=mcp_client)
    tools = await mcp_tool_spec.to_tool_list_async()

    llm = OpenAI(model="gpt-5")
    description = "An agent that uses Composio Tool Router MCP tools to perform Superchat actions."
    system_prompt = """
    You are a helpful assistant connected to Composio Tool Router.
    Use the available tools to answer user queries and perform Superchat actions.
    """
    return ReActAgent(
        tools=tools,
        llm=llm,
        description=description,
        system_prompt=system_prompt,
        verbose=True,
    )

async def chat_loop(agent: ReActAgent) -> None:
    ctx = Context(agent)
    print("Type 'quit', 'exit', or Ctrl+C to stop.")

    while True:
        try:
            user_input = input("\nYou: ").strip()
        except (KeyboardInterrupt, EOFError):
            print("\nBye!")
            break

        if not user_input or user_input.lower() in {"quit", "exit"}:
            print("Bye!")
            break

        try:
            print("Agent: ", end="", flush=True)
            handler = agent.run(user_input, ctx=ctx)

            async for event in handler.stream_events():
                # Stream token-by-token from LLM responses
                if hasattr(event, "delta") and event.delta:
                    print(event.delta, end="", flush=True)
                # Show tool calls as they happen
                elif hasattr(event, "tool_name"):
                    print(f"\n[Using tool: {event.tool_name}]", flush=True)

            # Wait for the run to complete
            await handler
            print()  # Newline after streaming
        except KeyboardInterrupt:
            print("\n[Interrupted]")
            continue
        except Exception as e:
            print(f"\nError: {e}")

async def main() -> None:
    agent = await build_agent()
    await chat_loop(agent)

if __name__ == "__main__":
    # Handle Ctrl+C gracefully
    signal.signal(signal.SIGINT, lambda s, f: (print("\nBye!"), exit(0)))
    try:
        asyncio.run(main())
    except KeyboardInterrupt:
        print("\nBye!")
```

```typescript
import "dotenv/config";
import readline from "node:readline/promises";
import { stdin as input, stdout as output } from "node:process";

import { Composio } from "@composio/core";
import { LlamaindexProvider } from "@composio/llamaindex";

import { mcp } from "@llamaindex/tools";
import { agent as createAgent } from "@llamaindex/workflow";
import { openai } from "@llamaindex/openai";

const OPENAI_API_KEY = process.env.OPENAI_API_KEY;
const COMPOSIO_API_KEY = process.env.COMPOSIO_API_KEY;
const COMPOSIO_USER_ID = process.env.COMPOSIO_USER_ID;

if (!OPENAI_API_KEY) {
  throw new Error("OPENAI_API_KEY is not set in the environment");
}
if (!COMPOSIO_API_KEY) {
  throw new Error("COMPOSIO_API_KEY is not set in the environment");
}
if (!COMPOSIO_USER_ID) {
  throw new Error("COMPOSIO_USER_ID is not set in the environment");
}

async function buildAgent() {

  console.log(`Initializing Composio client for user ${COMPOSIO_USER_ID}...`);

  const composio = new Composio({
    apiKey: COMPOSIO_API_KEY,
    provider: new LlamaindexProvider(),
  });

  const session = await composio.create(
    COMPOSIO_USER_ID!,
    {
      toolkits: ["superchat"],
    },
  );

  const mcpUrl = session.mcp.url;
  console.log(`Composio Tool Router MCP URL: ${mcpUrl}`);

  const server = mcp({
    url: mcpUrl,
    clientName: "composio_tool_router_with_llamaindex",
    requestInit: {
      headers: {
        "x-api-key": COMPOSIO_API_KEY!,
      },
    },
    // verbose: true,
  });

  const tools = await server.tools();

  const llm = openai({ apiKey: OPENAI_API_KEY, model: "gpt-5" });

  const agent = createAgent({
    name: "composio_tool_router_with_llamaindex",
    description:
      "An agent that uses Composio Tool Router MCP tools to perform actions.",
    systemPrompt:
      "You are a helpful assistant connected to Composio Tool Router. " +
      "Use the available tools to answer user queries and perform Superchat actions.",
    llm,
    tools,
  });

  return agent;
}

async function chatLoop(agent: ReturnType<typeof createAgent>) {
  const rl = readline.createInterface({ input, output });

  console.log("Type 'quit' or 'exit' to stop.");

  while (true) {
    let userInput: string;

    try {
      userInput = (await rl.question("\nYou: ")).trim();
    } catch {
      console.log("\nAgent: Bye!");
      break;
    }

    if (!userInput) {
      continue;
    }

    const lower = userInput.toLowerCase();
    if (lower === "quit" || lower === "exit") {
      console.log("Agent: Bye!");
      break;
    }

    try {
      process.stdout.write("Agent: ");

      const stream = agent.runStream(userInput);
      let finalResult: any = null;

      for await (const event of stream) {
        // The event.data contains the streamed content
        const data: any = event.data;

        // Check for streaming delta content
        if (data?.delta) {
          process.stdout.write(data.delta);
        }

        // Store final result for fallback
        if (data?.result || data?.message) {
          finalResult = data;
        }
      }

      // If no streaming happened, show the final result
      if (finalResult) {
        const answer =
          finalResult.result ??
          finalResult.message?.content ??
          finalResult.message ??
          "";
        if (answer && typeof answer === "string" && !answer.includes("[object")) {
          process.stdout.write(answer);
        }
      }

      console.log(); // New line after streaming completes
    } catch (err: any) {
      console.error("\nAgent error:", err?.message ?? err);
    }
  }

  rl.close();
}

async function main() {
  try {
    const agent = await buildAgent();
    await chatLoop(agent);
  } catch (err: any) {
    console.error("Failed to start agent:", err?.message ?? err);
    process.exit(1);
  }
}

main();
```

## Conclusion

You've successfully connected Superchat to LlamaIndex through Composio's Tool Router MCP layer.
Key takeaways:
- Tool Router dynamically exposes Superchat tools through an MCP endpoint
- LlamaIndex's ReActAgent handles reasoning and orchestration; Composio handles integrations
- The agent becomes more capable without increasing prompt size
- Async Python provides clean, efficient execution of agent workflows
You can easily extend this to other toolkits like Gmail, Notion, Stripe, or GitHub by adding them to the `toolkits` parameter.
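As a sketch, the only change is the session call from step 6. The toolkit slugs other than `superchat` are illustrative; check your Composio dashboard for the exact slugs of the apps you've connected.

```python
def create_multi_toolkit_session(composio_client, user_id):
    """Same Tool Router session call as the guide, widened to several toolkits."""
    return composio_client.create(
        user_id=user_id,
        toolkits=["superchat", "gmail", "github"],  # illustrative toolkit slugs
    )
```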

## How to build Superchat MCP Agent with another framework

- [OpenAI Agents SDK](https://composio.dev/toolkits/superchat/framework/open-ai-agents-sdk)
- [Claude Agent SDK](https://composio.dev/toolkits/superchat/framework/claude-agents-sdk)
- [Claude Code](https://composio.dev/toolkits/superchat/framework/claude-code)
- [Claude Cowork](https://composio.dev/toolkits/superchat/framework/claude-cowork)
- [Codex](https://composio.dev/toolkits/superchat/framework/codex)
- [OpenClaw](https://composio.dev/toolkits/superchat/framework/openclaw)
- [Hermes](https://composio.dev/toolkits/superchat/framework/hermes-agent)
- [CLI](https://composio.dev/toolkits/superchat/framework/cli)
- [Google ADK](https://composio.dev/toolkits/superchat/framework/google-adk)
- [LangChain](https://composio.dev/toolkits/superchat/framework/langchain)
- [Vercel AI SDK](https://composio.dev/toolkits/superchat/framework/ai-sdk)
- [Mastra AI](https://composio.dev/toolkits/superchat/framework/mastra-ai)
- [CrewAI](https://composio.dev/toolkits/superchat/framework/crew-ai)

## Related Toolkits

- [Gmail](https://composio.dev/toolkits/gmail) - Gmail is Google's email service with powerful spam protection, search, and G Suite integration. It keeps your inbox organized and makes communication fast and reliable.
- [Outlook](https://composio.dev/toolkits/outlook) - Outlook is Microsoft's email and calendaring platform for unified communications and scheduling. It helps users stay organized with powerful email, contacts, and calendar management.
- [Slack](https://composio.dev/toolkits/slack) - Slack is a channel-based messaging platform for teams and organizations. It helps people collaborate in real time, share files, and connect all their tools in one place.
- [Gong](https://composio.dev/toolkits/gong) - Gong is a platform for video meetings, call recording, and team collaboration. It helps teams capture conversations, analyze calls, and turn insights into action.
- [Microsoft teams](https://composio.dev/toolkits/microsoft_teams) - Microsoft Teams is a collaboration platform that combines chat, meetings, and file sharing within Microsoft 365. It keeps distributed teams connected and productive through seamless virtual communication.
- [Slackbot](https://composio.dev/toolkits/slackbot) - Slackbot is a conversational automation tool for Slack that handles reminders, notifications, and automated responses. It boosts team productivity by streamlining onboarding, answering FAQs, and managing timely alerts—all right inside Slack.
- [2chat](https://composio.dev/toolkits/_2chat) - 2chat is an API platform for WhatsApp and multichannel text messaging. It streamlines chat automation, group management, and real-time messaging for developers.
- [Agent mail](https://composio.dev/toolkits/agent_mail) - Agent mail provides AI agents with dedicated email inboxes for sending, receiving, and managing emails. It empowers agents to communicate autonomously with people, services, and other agents—no human intervention needed.
- [Basecamp](https://composio.dev/toolkits/basecamp) - Basecamp is a project management and team collaboration tool by 37signals. It helps teams organize tasks, share files, and communicate efficiently in one place.
- [Chatwork](https://composio.dev/toolkits/chatwork) - Chatwork is a team communication platform with group chats, file sharing, and task management. It helps businesses boost collaboration and streamline productivity.
- [Clickmeeting](https://composio.dev/toolkits/clickmeeting) - ClickMeeting is a cloud-based platform for running online meetings and webinars. It helps businesses and individuals host, manage, and engage virtual audiences with ease.
- [Confluence](https://composio.dev/toolkits/confluence) - Confluence is Atlassian's team collaboration and knowledge management platform. It helps your team organize, share, and update documents and project content in one secure workspace.
- [Dailybot](https://composio.dev/toolkits/dailybot) - DailyBot streamlines team collaboration with chat-based standups, reminders, and polls. It keeps work flowing smoothly in your favorite messaging platforms.
- [Dialmycalls](https://composio.dev/toolkits/dialmycalls) - Dialmycalls is a mass notification service for sending voice and text messages to contacts. It helps teams and organizations quickly broadcast urgent alerts and updates.
- [Dialpad](https://composio.dev/toolkits/dialpad) - Dialpad is a cloud-based business phone and contact center system for teams. It unifies voice, video, messaging, and meetings across your devices.
- [Discord](https://composio.dev/toolkits/discord) - Discord is a real-time messaging and VoIP platform for communities and teams. It lets users chat, share media, and collaborate across public and private channels.
- [Discordbot](https://composio.dev/toolkits/discordbot) - Discordbot is an automation tool for Discord servers that handles moderation, messaging, and user engagement. It helps communities run smoothly by automating routine and complex tasks.
- [Echtpost](https://composio.dev/toolkits/echtpost) - Echtpost is a secure digital communication platform for encrypted document and message exchange. It ensures confidential data stays private and protected during transmission.
- [Egnyte](https://composio.dev/toolkits/egnyte) - Egnyte is a cloud-based platform for secure file sharing, storage, and governance. It helps teams collaborate efficiently while maintaining data compliance and security.
- [Google Meet](https://composio.dev/toolkits/googlemeet) - Google Meet is a secure video conferencing platform for virtual meetings, chat, and screen sharing. It helps teams connect, collaborate, and communicate seamlessly from anywhere.

## Frequently Asked Questions

### What are the differences in Tool Router MCP and Superchat MCP?

With a standalone Superchat MCP server, the agents and LLMs can only access a fixed set of Superchat tools tied to that server. However, with the Composio Tool Router, agents can dynamically load tools from Superchat and many other apps based on the task at hand, all through a single MCP endpoint.

### Can I use Tool Router MCP with LlamaIndex?

Yes, you can. LlamaIndex fully supports MCP integration. You get structured tool calling, message history handling, and model orchestration while Tool Router takes care of discovering and serving the right Superchat tools.

### Can I manage the permissions and scopes for Superchat while using Tool Router?

Yes, absolutely. You can configure which Superchat scopes and actions are allowed when connecting your account to Composio. You can also bring your own OAuth credentials or API configuration so you keep full control over what the agent can do.

### How safe is my data with Composio Tool Router?

All sensitive data such as tokens, keys, and configuration is fully encrypted at rest and in transit. Composio is SOC 2 Type 2 compliant and follows strict security practices so your Superchat data and credentials are handled as safely as possible.

---
[See all toolkits](https://composio.dev/toolkits) · [Composio docs](https://docs.composio.dev/llms.txt)
