# How to integrate Wachete MCP with Vercel AI SDK v6

```json
{
  "title": "How to integrate Wachete MCP with Vercel AI SDK v6",
  "toolkit": "Wachete",
  "toolkit_slug": "wachete",
  "framework": "Vercel AI SDK",
  "framework_slug": "ai-sdk",
  "url": "https://composio.dev/toolkits/wachete/framework/ai-sdk",
  "markdown_url": "https://composio.dev/toolkits/wachete/framework/ai-sdk.md",
  "updated_at": "2026-05-12T10:29:55.603Z"
}
```

## Introduction

This guide walks you through connecting Wachete to Vercel AI SDK v6 using the Composio Tool Router. By the end, you'll have a working Wachete agent that can monitor a webpage for price changes, list all your active web watchers, or delete a watcher that monitors an outdated URL, all through natural language commands. In other words, your Vercel AI SDK agent gets real control over a Wachete account through Composio's Wachete MCP server.
Before we dive in, let's take a quick look at the key ideas and tools involved.

## Also integrate Wachete with

- [OpenAI Agents SDK](https://composio.dev/toolkits/wachete/framework/open-ai-agents-sdk)
- [Claude Agent SDK](https://composio.dev/toolkits/wachete/framework/claude-agents-sdk)
- [Claude Code](https://composio.dev/toolkits/wachete/framework/claude-code)
- [Claude Cowork](https://composio.dev/toolkits/wachete/framework/claude-cowork)
- [Codex](https://composio.dev/toolkits/wachete/framework/codex)
- [OpenClaw](https://composio.dev/toolkits/wachete/framework/openclaw)
- [Hermes](https://composio.dev/toolkits/wachete/framework/hermes-agent)
- [CLI](https://composio.dev/toolkits/wachete/framework/cli)
- [Google ADK](https://composio.dev/toolkits/wachete/framework/google-adk)
- [LangChain](https://composio.dev/toolkits/wachete/framework/langchain)
- [Mastra AI](https://composio.dev/toolkits/wachete/framework/mastra-ai)
- [LlamaIndex](https://composio.dev/toolkits/wachete/framework/llama-index)
- [CrewAI](https://composio.dev/toolkits/wachete/framework/crew-ai)

## TL;DR

Here's what you'll learn:
- How to set up and configure a Vercel AI SDK agent with Wachete integration
- Using Composio's Tool Router to dynamically load and access Wachete tools
- Creating an MCP client connection using HTTP transport
- Building an interactive CLI chat interface with conversation history management
- Handling tool calls and results within the Vercel AI SDK framework

## What is Vercel AI SDK?

The Vercel AI SDK is a TypeScript library for building AI-powered applications. It provides tools for creating agents that can use external services and maintain conversation state.
Key features include:
- `streamText`: Core function for streaming responses with real-time tool support
- MCP client: Built-in support for the Model Context Protocol via `@ai-sdk/mcp`
- Step counting: Control multi-step tool execution with `stopWhen: stepCountIs()`
- OpenAI provider: Native integration with OpenAI models
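To build intuition for how `stopWhen: stepCountIs(10)` bounds a multi-step tool loop, here is a toy, dependency-free simulation. Everything except the name `stepCountIs` is illustrative; the real SDK's stop conditions operate on richer step objects, but the core idea is a predicate over completed steps:

```typescript
// Toy model of the AI SDK's stopWhen mechanism: a predicate over the number
// of completed steps decides when a multi-step tool loop halts.
type StopCondition = (completedSteps: number) => boolean;

// Conceptually mirrors the SDK's stepCountIs(n) helper: stop after n steps.
const stepCountIs = (n: number): StopCondition => (completed) => completed >= n;

function runSteps(stop: StopCondition, doStep: (i: number) => void): number {
  let completed = 0;
  while (!stop(completed)) {
    doStep(completed); // a real agent would call a tool or generate text here
    completed++;
  }
  return completed;
}

const steps = runSteps(stepCountIs(10), () => {});
```

This is why a higher count lets the agent chain more tool calls (e.g. list watchers, then delete one) before it must produce a final answer.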

## What is the Wachete MCP server, and what's possible with it?

The Wachete MCP server is an implementation of the Model Context Protocol that connects AI agents and assistants like Claude, Cursor, and more directly to your Wachete account. It provides structured and secure access to your web monitoring setup, so your agent can create watchers, monitor webpages for changes, manage your folders, and keep you notified about updates, all automatically.
- Automated webpage monitoring: Let your agent create new watchers to track changes on any web page or specific elements, so you never miss an update.
- Watcher management and cleanup: Effortlessly remove obsolete monitors by deleting watchers when you no longer need to track certain content.
- Folder structure navigation: Retrieve and explore the content of your Wachete folders, listing all subfolders and active watchers for better organization.
- Real-time change notifications: Instantly pull notifications about detected changes across all your monitored pages, keeping you up to date at a glance.
- Comprehensive watcher overview: Ask your agent to list all configured watchers, making it easy to review, audit, or adjust your monitoring strategy as your needs evolve.

## Supported Tools

| Tool slug | Name | Description |
|---|---|---|
| `WACHETE_CREATE_UPDATE_FOLDER` | Create or update folder | Create a new folder or update an existing folder in Wachete. Folders help organize watchers into hierarchical structures. Omit the id parameter to create a new folder, or provide an id to update an existing one. |
| `WACHETE_CREATE_WATCHER` | Create Watcher | Create or update a Wachete watcher to monitor web page changes. Watchers check pages at specified intervals and send alerts when changes are detected. Use SinglePage mode for monitoring a single page, or Portal mode to crawl and monitor multiple linked pages. |
| `WACHETE_DELETE_FOLDER` | Delete folder | Permanently deletes a folder along with all nested subfolders and watchers (monitoring tasks). This is a destructive operation that cannot be undone. Use when you need to remove an entire folder structure. All subfolders and monitoring tasks within the folder will be permanently deleted. Obtain the folder ID from the Get Folder Content action before calling. Example: "Delete the folder with ID 576b3f7e-e126-4e92-9b95-f72a8d187a18" |
| `WACHETE_DELETE_WATCHER` | Delete watcher | Deletes a website monitoring watcher (task) by its unique ID. This operation is idempotent - deleting a non-existent or already-deleted watcher will succeed without error. Use when you need to permanently remove a monitoring task. Obtain the watcher ID from List Watchers or Create Watcher actions before calling. Example: "Delete the watcher with ID 974b65b5-6ccb-4996-812c-5a678c2455e8" |
| `WACHETE_GET_CRAWLER_PAGES` | Get crawler pages | Retrieves all pages monitored by a crawler watcher (portal monitor). Use this to get detailed information about each page being tracked including URLs, last check timestamps, content changes, and error states. Only works with portal-type watchers that monitor multiple pages. |
| `WACHETE_GET_DATA_HISTORY` | Get Data History | Retrieve history for a wachet (monitor). Returns timestamped snapshots of monitored content showing when changes occurred. Supports time range filtering and optional diff with previous value. Use continuationToken for pagination when retrieving large histories. |
| `WACHETE_GET_FOLDER_CONTENT` | Get folder content | Retrieves the contents of a Wachete folder, including subfolders and watcher tasks. Use this tool to: - List all subfolders and tasks in the root folder (omit parentId) - List contents of a specific folder (provide parentId) - Navigate the folder hierarchy using the path breadcrumb - Check task statuses and last check data Returns subfolders, tasks with their monitoring details, folder path, and pagination token. |
| `WACHETE_GET_WATCHER` | Get watcher by ID | Retrieve complete watcher (monitor) definition by ID. Use this to get detailed configuration and current status of a specific monitoring task including URL, XPath selector, alerts, notification endpoints, and latest check results. |
| `WACHETE_LIST_NOTIFICATIONS` | List notifications | Retrieves notifications from Wachete watchers. Returns notifications for all watchers or filtered by specific watcher ID and/or time range. Useful for checking recent changes detected by your web page monitors. |
| `WACHETE_LIST_WATCHERS` | List watchers | List all monitoring watchers (tasks) configured in your Wachete account. Optionally filter by search query. Returns up to 500 watchers with details including name, URL, monitoring settings, and notification configuration. |
| `WACHETE_MOVE_ITEMS_TO_FOLDER` | Move Items to Folder | Move tasks (watchers) and folders to a specified destination folder. Use this to organize your monitoring structure by relocating items within the folder hierarchy. Provide at least one of folderIds or taskIds to move items. Set folderId to null to move items to root level. |

## Supported Triggers

None listed.

## Creating MCP Server - Stand-alone vs Composio SDK

The Wachete MCP server is an implementation of the Model Context Protocol that connects your AI agent to Wachete. It provides structured and secure access so your agent can perform Wachete operations on your behalf through a secure, permission-based interface.
With Composio's managed implementation, you don't have to create your own developer app, which helps you prototype quickly and go from zero to one faster. For production, if you're building an end product, we recommend using your own credentials.

## Step-by-step Guide

### Prerequisites

Before you begin, make sure you have:
- Node.js and npm installed
- A Composio account with API key
- An OpenAI API key

### 1. Getting API Keys for OpenAI and Composio

**OpenAI API key**
- Go to the [OpenAI dashboard](https://platform.openai.com/settings/organization/api-keys) and create an API key. You'll need credits to use the models, or you can connect to another model provider.
- Keep the API key safe.

**Composio API key**
- Log in to the [Composio dashboard](https://dashboard.composio.dev?utm_source=toolkits&utm_medium=framework_docs).
- Navigate to your API settings and generate a new API key.
- Store this key securely, as you'll need it for authentication.

### 2. Install required dependencies

First, install the necessary packages for your project.
What you're installing:
- `@ai-sdk/openai`: Vercel AI SDK's OpenAI provider
- `@ai-sdk/mcp`: MCP client for the Vercel AI SDK
- `@composio/core`: Composio SDK for tool integration
- `ai`: Core Vercel AI SDK
- `dotenv`: Environment variable management
```bash
npm install @ai-sdk/openai @ai-sdk/mcp @composio/core ai dotenv
```

### 3. Set up environment variables

Create a `.env` file in your project root.
What's needed:
- `OPENAI_API_KEY`: Your OpenAI API key for GPT model access
- `COMPOSIO_API_KEY`: Your Composio API key for tool access
- `COMPOSIO_USER_ID`: A unique identifier for the user session
```bash
OPENAI_API_KEY=your_openai_api_key_here
COMPOSIO_API_KEY=your_composio_api_key_here
COMPOSIO_USER_ID=your_user_id_here
```
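If you end up validating several variables, a small helper can centralize the fail-fast checks done in the next step. This is purely illustrative, not part of any SDK:

```typescript
// Reads a required variable from an environment-like record and fails fast
// with a descriptive error if it is missing or empty.
function requireEnv(
  name: string,
  env: Record<string, string | undefined> = process.env,
): string {
  const value = env[name];
  if (!value) throw new Error(`${name} is not set`);
  return value;
}

// Usage mirroring the checks in this guide:
// const composioAPIKey = requireEnv("COMPOSIO_API_KEY");
// const composioUserID = requireEnv("COMPOSIO_USER_ID");
```

The return type is plain `string` (not `string | undefined`), so downstream code needs no non-null assertions.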

### 4. Import required modules and validate environment

What's happening:
- We're importing all necessary libraries including Vercel AI SDK's OpenAI provider and Composio
- The dotenv/config import automatically loads environment variables
- The MCP client import enables connection to Composio's tool server
```typescript
import "dotenv/config";
import { openai } from "@ai-sdk/openai";
import { Composio } from "@composio/core";
import * as readline from "readline";
import { streamText, type ModelMessage, stepCountIs } from "ai";
import { createMCPClient } from "@ai-sdk/mcp";

const composioAPIKey = process.env.COMPOSIO_API_KEY;
const composioUserID = process.env.COMPOSIO_USER_ID;

if (!process.env.OPENAI_API_KEY) throw new Error("OPENAI_API_KEY is not set");
if (!composioAPIKey) throw new Error("COMPOSIO_API_KEY is not set");
if (!composioUserID) throw new Error("COMPOSIO_USER_ID is not set");

const composio = new Composio({
  apiKey: composioAPIKey,
});
```

### 5. Create Tool Router session and initialize MCP client

What's happening:
- We're creating a Tool Router session that gives your agent access to Wachete tools
- The create method takes the user ID and specifies which toolkits should be available
- The returned mcp object contains the URL and authentication headers needed to connect to the MCP server
- This session provides access to all Wachete-related tools through the MCP protocol
```typescript
async function main() {
  // Create a tool router session for the user
  const session = await composio.create(composioUserID!, {
    toolkits: ["wachete"],
  });

  const mcpUrl = session.mcp.url;
```
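The session object's exact type comes from the Composio SDK; based only on the fields this guide actually uses (`session.mcp.url` and `session.mcp.headers`), its relevant shape looks roughly like this. This is an inferred sketch, not the SDK's published type:

```typescript
// Inferred from usage in this guide; the real SDK type may carry more fields.
interface ToolRouterSession {
  mcp: {
    url: string; // HTTP endpoint of the MCP server
    headers: Record<string, string>; // auth headers for that endpoint
  };
}

// A mock that satisfies the shape, useful for wiring tests without the network:
const mockSession: ToolRouterSession = {
  mcp: {
    url: "https://example.invalid/mcp",
    headers: { Authorization: "Bearer example-token" },
  },
};
```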

### 6. Connect to MCP server and retrieve tools

What's happening:
- We're creating an MCP client that connects to our Composio Tool Router session via HTTP
- The mcp.url provides the endpoint, and mcp.headers contains authentication credentials
- The `type: "http"` setting is important: Composio requires HTTP transport
- tools() retrieves all available Wachete tools that the agent can use
```typescript
const mcpClient = await createMCPClient({
  transport: {
    type: "http",
    url: mcpUrl,
    headers: session.mcp.headers, // Authentication headers for the Composio MCP server
  },
});

const tools = await mcpClient.tools();
```
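`tools()` returns a record keyed by tool slug, so you can quickly inspect what the agent will be able to call. The helper below works against any such record; a mock stands in for the real Wachete tools so the snippet runs offline:

```typescript
// Returns the sorted tool slugs from a tools record. The mock below stands in
// for the record returned by mcpClient.tools(); real values are tool definitions.
function listToolSlugs(tools: Record<string, unknown>): string[] {
  return Object.keys(tools).sort();
}

const mockTools = {
  WACHETE_LIST_WATCHERS: {},
  WACHETE_CREATE_WATCHER: {},
};

console.log(listToolSlugs(mockTools).join("\n"));
```

Logging the slugs once at startup is a cheap sanity check that the session actually loaded the Wachete toolkit.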

### 7. Initialize conversation and CLI interface

What's happening:
- We initialize an empty messages array to maintain conversation history
- A readline interface is created to accept user input from the command line
- Instructions are displayed to guide the user on how to interact with the agent
```typescript
let messages: ModelMessage[] = [];

console.log("Chat started! Type 'exit' or 'quit' to end the conversation.\n");
console.log(
  "Ask anything Wachete-related, e.g. 'list my watchers', 'create a watcher for a page', or 'show recent change notifications'.\n",
);

const rl = readline.createInterface({
  input: process.stdin,
  output: process.stdout,
  prompt: "> ",
});

rl.prompt();
```

### 8. Handle user input and stream responses with real-time tool feedback

What's happening:
- We use streamText instead of generateText to stream responses in real-time
- toolChoice: "auto" allows the model to decide when to use Wachete tools
- stopWhen: stepCountIs(10) allows up to 10 steps for complex multi-tool operations
- onStepFinish callback displays which tools are being used in real-time
- We iterate through the text stream to create a typewriter effect as the agent responds
- The complete response is added to conversation history to maintain context
- Errors are caught and displayed with helpful retry suggestions
```typescript
rl.on("line", async (userInput: string) => {
  const trimmedInput = userInput.trim();

  if (["exit", "quit", "bye"].includes(trimmedInput.toLowerCase())) {
    console.log("\nGoodbye!");
    rl.close();
    process.exit(0);
  }

  if (!trimmedInput) {
    rl.prompt();
    return;
  }

  messages.push({ role: "user", content: trimmedInput });
  console.log("\nAgent is thinking...\n");

  try {
    const stream = streamText({
      model: openai("gpt-5"),
      messages,
      tools,
      toolChoice: "auto",
      stopWhen: stepCountIs(10),
      onStepFinish: (step) => {
        for (const toolCall of step.toolCalls) {
          console.log(`[Using tool: ${toolCall.toolName}]`);
          }
          if (step.toolCalls.length > 0) {
            console.log(""); // Add space after tool calls
          }
        },
      });

      for await (const chunk of stream.textStream) {
        process.stdout.write(chunk);
      }

      console.log("\n\n---\n");

      // Get final result for message history
      const response = await stream.response;
      if (response?.messages?.length) {
        messages.push(...response.messages);
      }
    } catch (error) {
      console.error("\nAn error occurred while talking to the agent:");
      console.error(error);
      console.log(
        "\nYou can try again or restart the app if it keeps happening.\n",
      );
    } finally {
      rl.prompt();
    }
  });

  rl.on("close", async () => {
    await mcpClient.close();
    console.log("\n👋 Session ended.");
    process.exit(0);
  });
}

main().catch((err) => {
  console.error("Fatal error:", err);
  process.exit(1);
});
```
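Conversation history grows on every turn, and nothing above bounds it. For long sessions you may want to cap how many messages are sent to the model. A minimal sketch, not required by the SDK; the cap size is arbitrary, and a local `Message` type stands in for the SDK's `ModelMessage` so the snippet runs on its own:

```typescript
// Keeps only the most recent maxMessages entries so the prompt stays bounded.
// Message is a minimal stand-in for the "ai" package's ModelMessage type.
type Message = { role: string; content: string };

function trimHistory(messages: Message[], maxMessages = 40): Message[] {
  return messages.length <= maxMessages
    ? messages
    : messages.slice(messages.length - maxMessages);
}

// e.g. call before each streamText invocation:
// messages = trimHistory(messages);
```

A fancier approach would summarize older turns instead of dropping them, but a simple sliding window is often enough for a CLI demo.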

## Complete Code

```typescript
import "dotenv/config";
import { openai } from "@ai-sdk/openai";
import { Composio } from "@composio/core";
import * as readline from "readline";
import { streamText, type ModelMessage, stepCountIs } from "ai";
import { createMCPClient } from "@ai-sdk/mcp";

const composioAPIKey = process.env.COMPOSIO_API_KEY;
const composioUserID = process.env.COMPOSIO_USER_ID;

if (!process.env.OPENAI_API_KEY) throw new Error("OPENAI_API_KEY is not set");
if (!composioAPIKey) throw new Error("COMPOSIO_API_KEY is not set");
if (!composioUserID) throw new Error("COMPOSIO_USER_ID is not set");

const composio = new Composio({
  apiKey: composioAPIKey,
});

async function main() {
  // Create a tool router session for the user
  const session = await composio.create(composioUserID!, {
    toolkits: ["wachete"],
  });

  const mcpUrl = session.mcp.url;

  const mcpClient = await createMCPClient({
    transport: {
      type: "http",
      url: mcpUrl,
      headers: session.mcp.headers, // Authentication headers for the Composio MCP server
    },
  });

  const tools = await mcpClient.tools();

  let messages: ModelMessage[] = [];

  console.log("Chat started! Type 'exit' or 'quit' to end the conversation.\n");
  console.log(
    "Ask anything Wachete-related, e.g. 'list my watchers', 'create a watcher for a page', or 'show recent change notifications'.\n",
  );

  const rl = readline.createInterface({
    input: process.stdin,
    output: process.stdout,
    prompt: "> ",
  });

  rl.prompt();

  rl.on("line", async (userInput: string) => {
    const trimmedInput = userInput.trim();

    if (["exit", "quit", "bye"].includes(trimmedInput.toLowerCase())) {
      console.log("\nGoodbye!");
      rl.close();
      process.exit(0);
    }

    if (!trimmedInput) {
      rl.prompt();
      return;
    }

    messages.push({ role: "user", content: trimmedInput });
    console.log("\nAgent is thinking...\n");

    try {
      const stream = streamText({
        model: openai("gpt-5"),
        messages,
        tools,
        toolChoice: "auto",
        stopWhen: stepCountIs(10),
        onStepFinish: (step) => {
          for (const toolCall of step.toolCalls) {
            console.log(`[Using tool: ${toolCall.toolName}]`);
          }
          if (step.toolCalls.length > 0) {
            console.log(""); // Add space after tool calls
          }
        },
      });

      for await (const chunk of stream.textStream) {
        process.stdout.write(chunk);
      }

      console.log("\n\n---\n");

      // Get final result for message history
      const response = await stream.response;
      if (response?.messages?.length) {
        messages.push(...response.messages);
      }
    } catch (error) {
      console.error("\nAn error occurred while talking to the agent:");
      console.error(error);
      console.log(
        "\nYou can try again or restart the app if it keeps happening.\n",
      );
    } finally {
      rl.prompt();
    }
  });

  rl.on("close", async () => {
    await mcpClient.close();
    console.log("\n👋 Session ended.");
    process.exit(0);
  });
}

main().catch((err) => {
  console.error("Fatal error:", err);
  process.exit(1);
});
```

## Conclusion

You've successfully built a Wachete agent using the Vercel AI SDK with streaming capabilities! This implementation provides a powerful foundation for building AI applications with natural language interfaces and real-time feedback.
Key features of this implementation:
- Real-time streaming responses for a better user experience with typewriter effect
- Live tool execution feedback showing which tools are being used as the agent works
- Dynamic tool loading through Composio's Tool Router with secure authentication
- Multi-step tool execution with configurable step limits (up to 10 steps)
- Comprehensive error handling for robust agent execution
- Conversation history maintenance for context-aware responses
You can extend this further by adding custom error handling, implementing specific business logic, or integrating additional Composio toolkits to create multi-app workflows.

## How to build Wachete MCP Agent with another framework

- [OpenAI Agents SDK](https://composio.dev/toolkits/wachete/framework/open-ai-agents-sdk)
- [Claude Agent SDK](https://composio.dev/toolkits/wachete/framework/claude-agents-sdk)
- [Claude Code](https://composio.dev/toolkits/wachete/framework/claude-code)
- [Claude Cowork](https://composio.dev/toolkits/wachete/framework/claude-cowork)
- [Codex](https://composio.dev/toolkits/wachete/framework/codex)
- [OpenClaw](https://composio.dev/toolkits/wachete/framework/openclaw)
- [Hermes](https://composio.dev/toolkits/wachete/framework/hermes-agent)
- [CLI](https://composio.dev/toolkits/wachete/framework/cli)
- [Google ADK](https://composio.dev/toolkits/wachete/framework/google-adk)
- [LangChain](https://composio.dev/toolkits/wachete/framework/langchain)
- [Mastra AI](https://composio.dev/toolkits/wachete/framework/mastra-ai)
- [LlamaIndex](https://composio.dev/toolkits/wachete/framework/llama-index)
- [CrewAI](https://composio.dev/toolkits/wachete/framework/crew-ai)

## Related Toolkits

- [Apilio](https://composio.dev/toolkits/apilio) - Apilio is a home automation platform that lets you connect and control smart devices from different brands. It helps you build flexible automations with complex conditions, schedules, and integrations.
- [Basin](https://composio.dev/toolkits/basin) - Basin is a no-code form backend for quickly setting up reliable contact forms. It lets you collect and manage form submissions without writing any server-side code.
- [Bouncer](https://composio.dev/toolkits/bouncer) - Bouncer is an email validation platform that verifies the authenticity of email addresses in real-time and batch. It helps boost deliverability and reduce bounce rates for your communications.
- [Conveyor](https://composio.dev/toolkits/conveyor) - Conveyor is a platform that automates security reviews with a Trust Center and AI-driven questionnaire automation. It streamlines compliance and vendor security processes for faster, hassle-free reviews.
- [Crowdin](https://composio.dev/toolkits/crowdin) - Crowdin is a localization management platform that streamlines translation workflows and collaboration. It helps teams centralize multilingual content, boost productivity, and automate translation processes.
- [Databox](https://composio.dev/toolkits/databox) - Databox is a business analytics platform that connects your data from any tool and device. It helps you track KPIs, build dashboards, and discover actionable insights.
- [Detrack](https://composio.dev/toolkits/detrack) - Detrack is a delivery management platform for real-time tracking and proof of delivery. It helps businesses automate notifications and keep customers updated every step of the way.
- [Dnsfilter](https://composio.dev/toolkits/dnsfilter) - Dnsfilter is a cloud-based DNS security and content filtering solution. It helps organizations block online threats and manage safe internet access with ease.
- [Faraday](https://composio.dev/toolkits/faraday) - Faraday lets you embed AI in workflows across your stack for smarter automation. It boosts your favorite tools with actionable intelligence and seamless integration.
- [Feathery](https://composio.dev/toolkits/feathery) - Feathery is an AI-powered platform for building dynamic data intake forms with advanced logic. It helps teams automate complex workflows and collect structured data with ease.
- [Fillout forms](https://composio.dev/toolkits/fillout_forms) - Fillout forms is an online platform for building and managing forms with a flexible API. It lets you create, distribute, and collect responses from forms with ease.
- [Formdesk](https://composio.dev/toolkits/formdesk) - Formdesk is an online form builder for creating and managing professional forms. It's perfect for collecting data, automating workflows, and integrating form submissions with your favorite services.
- [Formsite](https://composio.dev/toolkits/formsite) - Formsite lets you build online forms and surveys with drag-and-drop simplicity. Capture, manage, and integrate form responses securely for streamlined workflows.
- [Graphhopper](https://composio.dev/toolkits/graphhopper) - GraphHopper is an enterprise-grade Directions API for routing, optimization, and geocoding across multiple vehicle types. It enables fast, reliable route planning and logistics automation for businesses.
- [Hyperbrowser](https://composio.dev/toolkits/hyperbrowser) - Hyperbrowser is a next-generation platform for scalable browser automation. It empowers AI agents to interact with web apps, automate workflows, and handle browser sessions at scale.
- [La Growth Machine](https://composio.dev/toolkits/lagrowthmachine) - La Growth Machine automates multi-channel sales outreach and routine tasks for sales teams. Streamline your workflow and focus on closing more deals.
- [Leverly](https://composio.dev/toolkits/leverly) - Leverly is a workflow automation platform that connects and coordinates actions across your apps. It streamlines repetitive processes so your business runs smoother, faster, and with fewer manual steps.
- [Maintainx](https://composio.dev/toolkits/maintainx) - Maintainx is a cloud-based CMMS for centralizing maintenance data, communication, and workflows. It helps organizations streamline maintenance operations and improve team coordination.
- [Make](https://composio.dev/toolkits/make) - Make is an automation platform that connects your favorite apps and services. Build powerful, custom workflows without writing code.
- [Ntfy](https://composio.dev/toolkits/ntfy) - Ntfy is a notification service to send push messages to phones or desktops. Instantly deliver alerts and updates to users, devices, or teams.

## Frequently Asked Questions

### What are the differences in Tool Router MCP and Wachete MCP?

With a standalone Wachete MCP server, the agents and LLMs can only access a fixed set of Wachete tools tied to that server. However, with the Composio Tool Router, agents can dynamically load tools from Wachete and many other apps based on the task at hand, all through a single MCP endpoint.

### Can I use Tool Router MCP with Vercel AI SDK v6?

Yes, you can. Vercel AI SDK v6 fully supports MCP integration. You get structured tool calling, message history handling, and model orchestration while Tool Router takes care of discovering and serving the right Wachete tools.

### Can I manage the permissions and scopes for Wachete while using Tool Router?

Yes, absolutely. You can configure which Wachete scopes and actions are allowed when connecting your account to Composio. You can also bring your own OAuth credentials or API configuration so you keep full control over what the agent can do.

### How safe is my data with Composio Tool Router?

All sensitive data such as tokens, keys, and configuration is fully encrypted at rest and in transit. Composio is SOC 2 Type 2 compliant and follows strict security practices so your Wachete data and credentials are handled as safely as possible.

---
[See all toolkits](https://composio.dev/toolkits) · [Composio docs](https://docs.composio.dev/llms.txt)
