# How to integrate Conveyor MCP with Vercel AI SDK v6

```json
{
  "title": "How to integrate Conveyor MCP with Vercel AI SDK v6",
  "toolkit": "Conveyor",
  "toolkit_slug": "conveyor",
  "framework": "Vercel AI SDK",
  "framework_slug": "ai-sdk",
  "url": "https://composio.dev/toolkits/conveyor/framework/ai-sdk",
  "markdown_url": "https://composio.dev/toolkits/conveyor/framework/ai-sdk.md",
  "updated_at": "2026-05-06T08:07:22.514Z"
}
```

## Introduction

This guide walks you through connecting Conveyor to Vercel AI SDK v6 using the Composio Tool Router. By the end, you'll have a working Conveyor agent that can list pending authorization requests, fetch the documents in your trust center, and delete a folder by its ID, all through natural language commands.
In other words, you'll learn how to give your Vercel AI SDK agent real control over a Conveyor account through Composio's Conveyor MCP server.
Before we dive in, let's take a quick look at the key ideas and tools involved.

## Also integrate Conveyor with

- [OpenAI Agents SDK](https://composio.dev/toolkits/conveyor/framework/open-ai-agents-sdk)
- [Claude Agent SDK](https://composio.dev/toolkits/conveyor/framework/claude-agents-sdk)
- [Claude Code](https://composio.dev/toolkits/conveyor/framework/claude-code)
- [Claude Cowork](https://composio.dev/toolkits/conveyor/framework/claude-cowork)
- [Codex](https://composio.dev/toolkits/conveyor/framework/codex)
- [OpenClaw](https://composio.dev/toolkits/conveyor/framework/openclaw)
- [Hermes](https://composio.dev/toolkits/conveyor/framework/hermes-agent)
- [CLI](https://composio.dev/toolkits/conveyor/framework/cli)
- [Google ADK](https://composio.dev/toolkits/conveyor/framework/google-adk)
- [LangChain](https://composio.dev/toolkits/conveyor/framework/langchain)
- [Mastra AI](https://composio.dev/toolkits/conveyor/framework/mastra-ai)
- [LlamaIndex](https://composio.dev/toolkits/conveyor/framework/llama-index)
- [CrewAI](https://composio.dev/toolkits/conveyor/framework/crew-ai)

## TL;DR

Here's what you'll learn:
- Setting up and configuring a Vercel AI SDK agent with Conveyor integration
- Using Composio's Tool Router to dynamically load and access Conveyor tools
- Creating an MCP client connection using HTTP transport
- Building an interactive CLI chat interface with conversation history management
- Handling tool calls and results within the Vercel AI SDK framework

## What is Vercel AI SDK?

The Vercel AI SDK is a TypeScript library for building AI-powered applications. It provides tools for creating agents that can use external services and maintain conversation state.
Key features include:
- streamText: Core function for streaming responses with real-time tool support
- MCP Client: Built-in support for Model Context Protocol via @ai-sdk/mcp
- Step Counting: Control multi-step tool execution with stopWhen: stepCountIs()
- OpenAI Provider: Native integration with OpenAI models
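
To make the step-counting feature concrete, here is an illustrative sketch of how a stop condition like `stopWhen: stepCountIs(10)` behaves. This is not the SDK's actual implementation; the real `stepCountIs` ships with the `ai` package, and this local reimplementation only models the idea of a predicate over the steps executed so far.

```typescript
// Illustrative sketch only: the real `stepCountIs` comes from the `ai` package.
// A stop condition is modeled as a predicate over the steps executed so far.
type StopCondition = (steps: { length: number }) => boolean;

// Returns a predicate that tells the agent loop to stop once `n` steps have run.
function stepCountIs(n: number): StopCondition {
  return (steps) => steps.length >= n;
}

const stop = stepCountIs(3);
console.log(stop({ length: 2 })); // false: keep going
console.log(stop({ length: 3 })); // true: stop
```

The agent loop keeps calling the model (and executing any requested tools) until the predicate returns true, which is why a higher limit allows longer multi-tool workflows.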

## What is the Conveyor MCP server, and what's possible with it?

The Conveyor MCP server is an implementation of the Model Context Protocol that connects AI agents and assistants such as Claude or Cursor directly to your Conveyor account. It provides structured, secure access to your security reviews and compliance workflows, so your agent can retrieve documents, manage authorization requests, track connections, and automate security questionnaire processes on your behalf.
- Authorization request management: Fetch, list, and review details of all authorization requests, making it easy for your agent to help you track and respond to security and compliance requests in real time.
- Document and folder automation: Retrieve, organize, or delete specific documents and folders, ensuring your Trust Center stays tidy and up to date without manual effort.
- Connection insights and tracking: Access a complete list of your Conveyor connections, letting your agent monitor integrations and stay on top of your security ecosystem.
- Interaction history by document: Instantly pull all interactions related to a specific document, so your agent can summarize or audit user activity for compliance needs.
- API token validation and guidance: Use AI-driven guidance to validate API tokens and get structured support for access issues, helping keep your Conveyor integration secure and running smoothly.
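
To make these capabilities concrete, here is a hypothetical mapping from natural-language requests to the Conveyor tool slugs an agent would typically invoke. The slugs come from the Supported Tools list in this guide; the mapping itself is illustrative, since the model decides at runtime which tools to call.

```typescript
// Hypothetical request-to-tool mapping (illustrative only; the model picks
// tools at runtime). Slugs are taken from the Supported Tools table.
const taskToTools: Record<string, string[]> = {
  "list pending authorization requests": ["CONVEYOR_GET_AUTHORIZATION_REQUESTS"],
  "fetch all trust center documents": ["CONVEYOR_GET_DOCUMENTS"],
  // Deleting usually needs a lookup first to resolve the folder's ID.
  "delete a folder by name": ["CONVEYOR_GET_FOLDERS", "CONVEYOR_DELETE_FOLDER"],
};
```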

## Supported Tools

| Tool slug | Name | Description |
|---|---|---|
| `CONVEYOR_DELETE_DOCUMENT` | Delete a Conveyor document | Tool to delete a specific document. Use when you need to remove a document by its ID. |
| `CONVEYOR_DELETE_FOLDER` | Delete folder | Tool to delete a folder by its ID. Use when you need to remove a specific folder after confirming its ID. |
| `CONVEYOR_GENERATE_API_TOKEN` | Generate/Validate API Token Guidance | Tool to validate API token and provide guidance. Conveyor does not support API-based token creation; tokens must be created in the Conveyor UI. This action performs a real API call (using the provided metadata) to validate the existing API token and returns structured guidance. |
| `CONVEYOR_GET_AUTHORIZATION_REQUEST` | Get Authorization Request | Tool to fetch details of a specific authorization request. Use when you need to retrieve metadata by authorization_request_id. |
| `CONVEYOR_GET_AUTHORIZATION_REQUESTS` | Get Authorization Requests | Tool to fetch authorization requests. Use when you need to list authorization requests, optionally filtered by status. |
| `CONVEYOR_GET_AUTHORIZATIONS` | Get all authorization requests | Tool to retrieve all authorization requests. Use when you need to list all authorizations; optionally filter by status. Use after authenticating with a valid API token. |
| `CONVEYOR_GET_CONNECTIONS` | Get all Conveyor connections | Tool to retrieve all connections. Use when you need to fetch the complete list of your Conveyor connections. Use after authenticating with a valid API key. |
| `CONVEYOR_GET_DOCUMENTS` | Get all Conveyor documents | Tool to retrieve all documents. Use after authenticating with a valid API key. |
| `CONVEYOR_GET_FOLDERS` | Get all Conveyor folders | Tool to retrieve all folders. Use after authenticating with a valid API key to fetch the complete list of your Conveyor folders. |
| `CONVEYOR_GET_INTERACTIONS_BY_DOCUMENT_ID` | Get interactions by document ID | Tool to fetch interactions associated with a specific document. Use when you need to list all interactions for a given document after validating its existence. |
| `CONVEYOR_GET_KNOWLEDGE_BASE_QUESTIONS` | Get Knowledge Base Questions | Tool to retrieve knowledge base questions. Use when you need to fetch all questions from the Conveyor knowledge base. |
| `CONVEYOR_GET_PRODUCT_LINES` | Get product lines | Tool to fetch all product lines. Use when you need to retrieve product lines after confirming API key validity. |
| `CONVEYOR_PATCH_AUTHORIZATION` | Patch authorization | Tool to update or revoke an existing authorization. Use when managing authorization access groups or revoking access. |
| `CONVEYOR_PATCH_DOCUMENT` | Patch Conveyor document | Tool to update document attributes. Use when you need to modify fields of an existing document by its ID. |
| `CONVEYOR_POST_AUTHORIZATION` | Create new authorization | Tool to create a new authorization. Use when you need to grant access by email or from a prior authorization request. |
| `CONVEYOR_POST_DOCUMENT` | Upload new document | Tool to upload a new document. Use when you have a local file (<=100MB) to send to Conveyor. |
| `CONVEYOR_POST_FOLDER` | Create new folder | Tool to create a new folder in Conveyor Exchange. Use when you need to organize items into folders programmatically after obtaining an API key. |
| `CONVEYOR_POST_SINGLE_QUESTION` | Submit single question | Tool to submit a single question. Use when you need an immediate AI-generated answer for a specific product line question. |

## Supported Triggers

None listed.

## Creating MCP Server - Stand-alone vs Composio SDK

The Conveyor MCP server is an implementation of the Model Context Protocol that connects your AI agent to Conveyor. It provides structured and secure access so your agent can perform Conveyor operations on your behalf through a secure, permission-based interface.
With Composio's managed implementation, you don't have to create your own developer app, which lets you prototype quickly and get from zero to one faster. For production, if you're building an end product, we recommend using your own credentials.

## Step-by-step Guide

### Prerequisites

Before you begin, make sure you have:
- Node.js and npm installed
- A Composio account with API key
- An OpenAI API key

### 1. Getting API Keys for OpenAI and Composio

OpenAI API Key
- Go to the [OpenAI dashboard](https://platform.openai.com/settings/organization/api-keys) and create an API key. You'll need credits to use the models, or you can connect to another model provider.
- Keep the API key safe.
Composio API Key
- Log in to the [Composio dashboard](https://dashboard.composio.dev?utm_source=toolkits&utm_medium=framework_docs).
- Navigate to your API settings and generate a new API key.
- Store this key securely as you'll need it for authentication.

### 2. Install required dependencies

First, install the necessary packages for your project.
What you're installing:
- @ai-sdk/openai: Vercel AI SDK's OpenAI provider
- @ai-sdk/mcp: MCP client for Vercel AI SDK
- @composio/core: Composio SDK for tool integration
- ai: Core Vercel AI SDK
- dotenv: Environment variable management
```bash
npm install @ai-sdk/openai @ai-sdk/mcp @composio/core ai dotenv
```

### 3. Set up environment variables

Create a .env file in your project root.
What's needed:
- OPENAI_API_KEY: Your OpenAI API key for GPT model access
- COMPOSIO_API_KEY: Your Composio API key for tool access
- COMPOSIO_USER_ID: A unique identifier for the user session
```bash
OPENAI_API_KEY=your_openai_api_key_here
COMPOSIO_API_KEY=your_composio_api_key_here
COMPOSIO_USER_ID=your_user_id_here
```

### 4. Import required modules and validate environment

What's happening:
- We're importing all necessary libraries including Vercel AI SDK's OpenAI provider and Composio
- The dotenv/config import automatically loads environment variables
- The MCP client import enables connection to Composio's tool server
```typescript
import "dotenv/config";
import { openai } from "@ai-sdk/openai";
import { Composio } from "@composio/core";
import * as readline from "readline";
import { streamText, type ModelMessage, stepCountIs } from "ai";
import { createMCPClient } from "@ai-sdk/mcp";

const composioAPIKey = process.env.COMPOSIO_API_KEY;
const composioUserID = process.env.COMPOSIO_USER_ID;

if (!process.env.OPENAI_API_KEY) throw new Error("OPENAI_API_KEY is not set");
if (!composioAPIKey) throw new Error("COMPOSIO_API_KEY is not set");
if (!composioUserID) throw new Error("COMPOSIO_USER_ID is not set");

const composio = new Composio({
  apiKey: composioAPIKey,
});
```

### 5. Create Tool Router session and initialize MCP client

What's happening:
- We're creating a Tool Router session that gives your agent access to Conveyor tools
- The create method takes the user ID and specifies which toolkits should be available
- The returned mcp object contains the URL and authentication headers needed to connect to the MCP server
- This session provides access to all Conveyor-related tools through the MCP protocol
```typescript
async function main() {
  // Create a tool router session for the user
  const session = await composio.create(composioUserID!, {
    toolkits: ["conveyor"],
  });

  const mcpUrl = session.mcp.url;
```
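
The rest of the guide relies only on `session.mcp.url` and `session.mcp.headers`. The sketch below shows the minimal shape this assumes; the actual session object may carry additional fields, and the sample values are placeholders, not a real endpoint or token.

```typescript
// Assumed minimal shape of the Tool Router session used in this guide.
// Only `mcp.url` and `mcp.headers` are relied on; other fields may exist.
interface ToolRouterSession {
  mcp: {
    url: string;                     // MCP endpoint for this session
    headers: Record<string, string>; // auth headers for the MCP server
  };
}

// Example value with placeholder data (not a real endpoint or token):
const example: ToolRouterSession = {
  mcp: {
    url: "https://example.invalid/mcp/session-id",
    headers: { Authorization: "Bearer <token>" },
  },
};
```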

### 6. Connect to MCP server and retrieve tools

What's happening:
- We're creating an MCP client that connects to our Composio Tool Router session via HTTP
- The mcp.url provides the endpoint, and mcp.headers contains authentication credentials
- Setting type: "http" matters, since Composio requires HTTP transport
- tools() retrieves all available Conveyor tools that the agent can use
```typescript
const mcpClient = await createMCPClient({
  transport: {
    type: "http",
    url: mcpUrl,
    headers: session.mcp.headers, // Authentication headers for the Composio MCP server
  },
});

const tools = await mcpClient.tools();
```
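
Because `tools()` returns a record keyed by tool name, you can hand `streamText` only a subset of it. The sketch below, which assumes the record is keyed by the Conveyor tool slugs, filters out everything except read-only `CONVEYOR_GET_*` tools so the agent cannot delete or modify anything.

```typescript
// Sketch: restrict the agent to read-only Conveyor tools by filtering the
// record returned by mcpClient.tools(). Assumes keys are tool slugs.
function pickReadOnlyTools<T>(tools: Record<string, T>): Record<string, T> {
  return Object.fromEntries(
    Object.entries(tools).filter(([name]) => name.startsWith("CONVEYOR_GET_")),
  );
}

// e.g. pass `pickReadOnlyTools(tools)` to streamText instead of `tools`
```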

### 7. Initialize conversation and CLI interface

What's happening:
- We initialize an empty messages array to maintain conversation history
- A readline interface is created to accept user input from the command line
- Instructions are displayed to guide the user on how to interact with the agent
```typescript
let messages: ModelMessage[] = [];

console.log("Chat started! Type 'exit' or 'quit' to end the conversation.\n");
console.log(
  "Ask anything Conveyor-related, like 'list my pending authorization requests' or 'fetch all documents in my trust center'.\n",
);

const rl = readline.createInterface({
  input: process.stdin,
  output: process.stdout,
  prompt: "> ",
});

rl.prompt();
```
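
The line handler in the next step checks for exit commands inline. Factored into a pure helper, that check looks like the sketch below, which makes it easy to test and to extend with more commands if you want.

```typescript
// Sketch: the exit-command check from the line handler, as a pure function.
// Matches the same commands, case-insensitively and ignoring whitespace.
function isExitCommand(input: string): boolean {
  return ["exit", "quit", "bye"].includes(input.trim().toLowerCase());
}
```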

### 8. Handle user input and stream responses with real-time tool feedback

What's happening:
- We use streamText instead of generateText to stream responses in real-time
- toolChoice: "auto" allows the model to decide when to use Conveyor tools
- stopWhen: stepCountIs(10) allows up to 10 steps for complex multi-tool operations
- onStepFinish callback displays which tools are being used in real-time
- We iterate through the text stream to create a typewriter effect as the agent responds
- The complete response is added to conversation history to maintain context
- Errors are caught and displayed with helpful retry suggestions
```typescript
  rl.on("line", async (userInput: string) => {
    const trimmedInput = userInput.trim();

    if (["exit", "quit", "bye"].includes(trimmedInput.toLowerCase())) {
      console.log("\nGoodbye!");
      rl.close();
      process.exit(0);
    }

    if (!trimmedInput) {
      rl.prompt();
      return;
    }

    messages.push({ role: "user", content: trimmedInput });
    console.log("\nAgent is thinking...\n");

    try {
      const stream = streamText({
        model: openai("gpt-5"),
        messages,
        tools,
        toolChoice: "auto",
        stopWhen: stepCountIs(10),
        onStepFinish: (step) => {
          for (const toolCall of step.toolCalls) {
            console.log(`[Using tool: ${toolCall.toolName}]`);
          }
          if (step.toolCalls.length > 0) {
            console.log(""); // Add space after tool calls
          }
        },
      });

      for await (const chunk of stream.textStream) {
        process.stdout.write(chunk);
      }

      console.log("\n\n---\n");

      // Get final result for message history
      const response = await stream.response;
      if (response?.messages?.length) {
        messages.push(...response.messages);
      }
    } catch (error) {
      console.error("\nAn error occurred while talking to the agent:");
      console.error(error);
      console.log(
        "\nYou can try again or restart the app if it keeps happening.\n",
      );
    } finally {
      rl.prompt();
    }
  });

  rl.on("close", async () => {
    await mcpClient.close();
    console.log("\n👋 Session ended.");
    process.exit(0);
  });
}

main().catch((err) => {
  console.error("Fatal error:", err);
  process.exit(1);
});
```
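
One thing the loop above does not do is bound the `messages` array, so a long session can eventually exceed the model's context window. A simple mitigation is to trim history to the most recent N messages after each turn, as in this sketch (a hypothetical helper, not part of the SDK):

```typescript
// Sketch: bound conversation history so long sessions don't outgrow the
// model's context window. Keeps the most recent `max` messages, in order.
function trimHistory<T>(messages: T[], max: number): T[] {
  return messages.length <= max
    ? messages
    : messages.slice(messages.length - max);
}

// e.g. after pushing the response: messages = trimHistory(messages, 40);
```

A fancier approach would summarize older turns instead of dropping them, but simple truncation is usually enough for a CLI prototype.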

## Complete Code

```typescript
import "dotenv/config";
import { openai } from "@ai-sdk/openai";
import { Composio } from "@composio/core";
import * as readline from "readline";
import { streamText, type ModelMessage, stepCountIs } from "ai";
import { createMCPClient } from "@ai-sdk/mcp";

const composioAPIKey = process.env.COMPOSIO_API_KEY;
const composioUserID = process.env.COMPOSIO_USER_ID;

if (!process.env.OPENAI_API_KEY) throw new Error("OPENAI_API_KEY is not set");
if (!composioAPIKey) throw new Error("COMPOSIO_API_KEY is not set");
if (!composioUserID) throw new Error("COMPOSIO_USER_ID is not set");

const composio = new Composio({
  apiKey: composioAPIKey,
});

async function main() {
  // Create a tool router session for the user
  const session = await composio.create(composioUserID!, {
    toolkits: ["conveyor"],
  });

  const mcpUrl = session.mcp.url;

  const mcpClient = await createMCPClient({
    transport: {
      type: "http",
      url: mcpUrl,
      headers: session.mcp.headers, // Authentication headers for the Composio MCP server
    },
  });

  const tools = await mcpClient.tools();

  let messages: ModelMessage[] = [];

  console.log("Chat started! Type 'exit' or 'quit' to end the conversation.\n");
  console.log(
    "Ask anything Conveyor-related, like 'list my pending authorization requests' or 'fetch all documents in my trust center'.\n",
  );

  const rl = readline.createInterface({
    input: process.stdin,
    output: process.stdout,
    prompt: "> ",
  });

  rl.prompt();

  rl.on("line", async (userInput: string) => {
    const trimmedInput = userInput.trim();

    if (["exit", "quit", "bye"].includes(trimmedInput.toLowerCase())) {
      console.log("\nGoodbye!");
      rl.close();
      process.exit(0);
    }

    if (!trimmedInput) {
      rl.prompt();
      return;
    }

    messages.push({ role: "user", content: trimmedInput });
    console.log("\nAgent is thinking...\n");

    try {
      const stream = streamText({
        model: openai("gpt-5"),
        messages,
        tools,
        toolChoice: "auto",
        stopWhen: stepCountIs(10),
        onStepFinish: (step) => {
          for (const toolCall of step.toolCalls) {
            console.log(`[Using tool: ${toolCall.toolName}]`);
          }
          if (step.toolCalls.length > 0) {
            console.log(""); // Add space after tool calls
          }
        },
      });

      for await (const chunk of stream.textStream) {
        process.stdout.write(chunk);
      }

      console.log("\n\n---\n");

      // Get final result for message history
      const response = await stream.response;
      if (response?.messages?.length) {
        messages.push(...response.messages);
      }
    } catch (error) {
      console.error("\nAn error occurred while talking to the agent:");
      console.error(error);
      console.log(
        "\nYou can try again or restart the app if it keeps happening.\n",
      );
    } finally {
      rl.prompt();
    }
  });

  rl.on("close", async () => {
    await mcpClient.close();
    console.log("\n👋 Session ended.");
    process.exit(0);
  });
}

main().catch((err) => {
  console.error("Fatal error:", err);
  process.exit(1);
});
```

## Conclusion

You've successfully built a Conveyor agent using the Vercel AI SDK with streaming capabilities! This implementation provides a powerful foundation for building AI applications with natural language interfaces and real-time feedback.
Key features of this implementation:
- Real-time streaming responses for a better user experience with typewriter effect
- Live tool execution feedback showing which tools are being used as the agent works
- Dynamic tool loading through Composio's Tool Router with secure authentication
- Multi-step tool execution with configurable step limits (up to 10 steps)
- Comprehensive error handling for robust agent execution
- Conversation history maintenance for context-aware responses
You can extend this further by adding custom error handling, implementing specific business logic, or integrating additional Composio toolkits to create multi-app workflows.
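
For the multi-app direction, and assuming the same session API shown in this guide, you would pass additional toolkit slugs when creating the Tool Router session. The "gmail" and "slack" slugs below are illustrative; use the slugs for whichever toolkits you've connected in your Composio dashboard.

```typescript
// Sketch: request multiple toolkits in one Tool Router session. The extra
// slugs ("gmail", "slack") are illustrative, not required.
const sessionConfig = {
  toolkits: ["conveyor", "gmail", "slack"],
};

// e.g. const session = await composio.create(userId, sessionConfig);
```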

## How to build Conveyor MCP Agent with another framework

- [OpenAI Agents SDK](https://composio.dev/toolkits/conveyor/framework/open-ai-agents-sdk)
- [Claude Agent SDK](https://composio.dev/toolkits/conveyor/framework/claude-agents-sdk)
- [Claude Code](https://composio.dev/toolkits/conveyor/framework/claude-code)
- [Claude Cowork](https://composio.dev/toolkits/conveyor/framework/claude-cowork)
- [Codex](https://composio.dev/toolkits/conveyor/framework/codex)
- [OpenClaw](https://composio.dev/toolkits/conveyor/framework/openclaw)
- [Hermes](https://composio.dev/toolkits/conveyor/framework/hermes-agent)
- [CLI](https://composio.dev/toolkits/conveyor/framework/cli)
- [Google ADK](https://composio.dev/toolkits/conveyor/framework/google-adk)
- [LangChain](https://composio.dev/toolkits/conveyor/framework/langchain)
- [Mastra AI](https://composio.dev/toolkits/conveyor/framework/mastra-ai)
- [LlamaIndex](https://composio.dev/toolkits/conveyor/framework/llama-index)
- [CrewAI](https://composio.dev/toolkits/conveyor/framework/crew-ai)

## Related Toolkits

- [Apilio](https://composio.dev/toolkits/apilio) - Apilio is a home automation platform that lets you connect and control smart devices from different brands. It helps you build flexible automations with complex conditions, schedules, and integrations.
- [Basin](https://composio.dev/toolkits/basin) - Basin is a no-code form backend for quickly setting up reliable contact forms. It lets you collect and manage form submissions without writing any server-side code.
- [Bouncer](https://composio.dev/toolkits/bouncer) - Bouncer is an email validation platform that verifies the authenticity of email addresses in real-time and batch. It helps boost deliverability and reduce bounce rates for your communications.
- [Crowdin](https://composio.dev/toolkits/crowdin) - Crowdin is a localization management platform that streamlines translation workflows and collaboration. It helps teams centralize multilingual content, boost productivity, and automate translation processes.
- [Databox](https://composio.dev/toolkits/databox) - Databox is a business analytics platform that connects your data from any tool and device. It helps you track KPIs, build dashboards, and discover actionable insights.
- [Detrack](https://composio.dev/toolkits/detrack) - Detrack is a delivery management platform for real-time tracking and proof of delivery. It helps businesses automate notifications and keep customers updated every step of the way.
- [Dnsfilter](https://composio.dev/toolkits/dnsfilter) - Dnsfilter is a cloud-based DNS security and content filtering solution. It helps organizations block online threats and manage safe internet access with ease.
- [Faraday](https://composio.dev/toolkits/faraday) - Faraday lets you embed AI in workflows across your stack for smarter automation. It boosts your favorite tools with actionable intelligence and seamless integration.
- [Feathery](https://composio.dev/toolkits/feathery) - Feathery is an AI-powered platform for building dynamic data intake forms with advanced logic. It helps teams automate complex workflows and collect structured data with ease.
- [Fillout forms](https://composio.dev/toolkits/fillout_forms) - Fillout forms is an online platform for building and managing forms with a flexible API. It lets you create, distribute, and collect responses from forms with ease.
- [Formdesk](https://composio.dev/toolkits/formdesk) - Formdesk is an online form builder for creating and managing professional forms. It's perfect for collecting data, automating workflows, and integrating form submissions with your favorite services.
- [Formsite](https://composio.dev/toolkits/formsite) - Formsite lets you build online forms and surveys with drag-and-drop simplicity. Capture, manage, and integrate form responses securely for streamlined workflows.
- [Graphhopper](https://composio.dev/toolkits/graphhopper) - GraphHopper is an enterprise-grade Directions API for routing, optimization, and geocoding across multiple vehicle types. It enables fast, reliable route planning and logistics automation for businesses.
- [Hyperbrowser](https://composio.dev/toolkits/hyperbrowser) - Hyperbrowser is a next-generation platform for scalable browser automation. It empowers AI agents to interact with web apps, automate workflows, and handle browser sessions at scale.
- [La Growth Machine](https://composio.dev/toolkits/lagrowthmachine) - La Growth Machine automates multi-channel sales outreach and routine tasks for sales teams. Streamline your workflow and focus on closing more deals.
- [Leverly](https://composio.dev/toolkits/leverly) - Leverly is a workflow automation platform that connects and coordinates actions across your apps. It streamlines repetitive processes so your business runs smoother, faster, and with fewer manual steps.
- [Maintainx](https://composio.dev/toolkits/maintainx) - Maintainx is a cloud-based CMMS for centralizing maintenance data, communication, and workflows. It helps organizations streamline maintenance operations and improve team coordination.
- [Make](https://composio.dev/toolkits/make) - Make is an automation platform that connects your favorite apps and services. Build powerful, custom workflows without writing code.
- [Ntfy](https://composio.dev/toolkits/ntfy) - Ntfy is a notification service to send push messages to phones or desktops. Instantly deliver alerts and updates to users, devices, or teams.
- [Persona](https://composio.dev/toolkits/persona) - Persona offers identity infrastructure to automate user verification and compliance. It helps organizations securely verify users and reduce fraud risk.

## Frequently Asked Questions

### What are the differences between Tool Router MCP and Conveyor MCP?

With a standalone Conveyor MCP server, the agents and LLMs can only access a fixed set of Conveyor tools tied to that server. However, with the Composio Tool Router, agents can dynamically load tools from Conveyor and many other apps based on the task at hand, all through a single MCP endpoint.

### Can I use Tool Router MCP with Vercel AI SDK v6?

Yes, you can. Vercel AI SDK v6 fully supports MCP integration. You get structured tool calling, message history handling, and model orchestration while Tool Router takes care of discovering and serving the right Conveyor tools.

### Can I manage the permissions and scopes for Conveyor while using Tool Router?

Yes, absolutely. You can configure which Conveyor scopes and actions are allowed when connecting your account to Composio. You can also bring your own OAuth credentials or API configuration so you keep full control over what the agent can do.

### How safe is my data with Composio Tool Router?

All sensitive data such as tokens, keys, and configuration is fully encrypted at rest and in transit. Composio is SOC 2 Type 2 compliant and follows strict security practices so your Conveyor data and credentials are handled as safely as possible.

---
[See all toolkits](https://composio.dev/toolkits) · [Composio docs](https://docs.composio.dev/llms.txt)
