How to integrate Google BigQuery MCP with Pydantic AI


Introduction

This guide walks you through connecting Google BigQuery to Pydantic AI using the Composio tool router. By the end, you'll have a working Google BigQuery agent that can run yesterday's sales summary query, find your top 10 customers by revenue, or analyze last quarter's traffic data, all through natural language commands.

This guide will help you understand how to give your Pydantic AI agent real control over a Google BigQuery account through Composio's Google BigQuery MCP server.

Before we dive in, let's take a quick look at the key ideas and tools involved.


TL;DR

Here's what you'll learn:
  • How to set up your Composio API key and User ID
  • How to create a Composio Tool Router session for Google BigQuery
  • How to attach an MCP Server to a Pydantic AI agent
  • How to stream responses and maintain chat history
  • How to build a simple REPL-style chat interface to test your Google BigQuery workflows

What is Pydantic AI?

Pydantic AI is a Python framework for building AI agents with strong typing and validation. It leverages Pydantic's data validation capabilities to create robust, type-safe AI applications.

Key features include:

  • Type Safety: Built on Pydantic for automatic data validation
  • MCP Support: Native support for Model Context Protocol servers
  • Streaming: Built-in support for streaming responses
  • Async First: Designed for async/await patterns

What is the Google BigQuery MCP server, and what's possible with it?

The Google BigQuery MCP server is an implementation of the Model Context Protocol that connects AI agents and assistants, such as Claude and Cursor, directly to your Google BigQuery account. It provides structured and secure access to your data warehouse, so your agent can perform actions like running SQL queries, analyzing datasets, extracting insights, and automating reporting on your behalf.

  • Instant SQL query execution: Have your agent run complex analytical queries on any of your BigQuery datasets and get results in real time (see the sketch after this list).
  • Custom data analysis and reporting: Instruct your agent to generate summaries, trends, or statistics by querying specific tables or views.
  • Automated data extraction: Let your agent fetch and transform data for integration with other tools or for further analysis.
  • Interactive business intelligence: Enable your agent to answer ad hoc data questions, visualize aggregated data, or pull specific metrics from massive datasets instantly.
  • Streamlined workflow automation: Use your agent to automate recurring BigQuery tasks, such as daily audits or data slice generation, without manual effort.
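
To make that first capability concrete, here is roughly what the agent's query tool saves you from writing by hand: a minimal sketch using Google's official google-cloud-bigquery client, where the project, dataset, and table names are hypothetical.

python
from google.cloud import bigquery  # pip install google-cloud-bigquery

# Hypothetical project/dataset/table; uses Application Default Credentials.
client = bigquery.Client(project="my_project")
query = """
    SELECT customer_id, SUM(amount) AS revenue
    FROM `my_project.sales.orders`
    GROUP BY customer_id
    ORDER BY revenue DESC
    LIMIT 10
"""
for row in client.query(query).result():
    print(row["customer_id"], row["revenue"])

With the MCP server in place, a prompt like "find my top 10 customers by revenue" triggers the equivalent tool call for you.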

Supported Tools

  • Cancel BigQuery Job: Cancel a running BigQuery job.
  • Create Capacity Commitment: Create a new capacity commitment resource in BigQuery Reservation.
  • Create BigQuery Connection: Create a new BigQuery connection to external data sources using the BigQuery Connection API.
  • Create Analytics Hub Data Exchange: Create a new Analytics Hub data exchange for sharing BigQuery datasets.
  • Create Analytics Hub Listing: Create a new listing in a BigQuery Analytics Hub data exchange.
  • Create BigQuery Dataset: Create a new BigQuery dataset with explicit location, labels, and description using the BigQuery Datasets API.
  • Create Analytics Hub Listing: Create a new listing in a data exchange using the Analytics Hub API.
  • Create BigQuery Data Policy (v2beta1): Create a new data policy under a project with a specified location using the v2beta1 BigQuery Data Policy API.
  • Create Analytics Hub Query Template: Create a new query template in a BigQuery Analytics Hub Data Clean Room (DCR) data exchange.
  • Create BigQuery Reservation: Create a new BigQuery reservation resource to guarantee compute capacity (slots) for query and pipeline jobs.
  • Create BigQuery Reservation Assignment: Create a BigQuery reservation assignment that allows a project, folder, or organization to submit jobs using slots from a specified reservation.
  • Create BigQuery Routine: Create a new user-defined routine (function or procedure) in a BigQuery dataset.
  • Create BigQuery Table: Create a new, empty table in a BigQuery dataset.
  • Delete BigQuery Dataset: Delete a BigQuery dataset specified by datasetId via the Datasets API.
  • Delete BigQuery Job Metadata: Delete the metadata of a BigQuery job.
  • Delete BigQuery ML Model: Delete a BigQuery ML model from a dataset.
  • Delete BigQuery Routine: Delete a BigQuery routine by its ID.
  • Delete BigQuery Table: Delete a BigQuery table from a dataset.
  • Get BigQuery ML Model: Retrieve a specific BigQuery ML model resource by model ID.
  • Get BigQuery Connection IAM Policy: Get the IAM access control policy for a BigQuery connection resource.
  • Get BigQuery Dataset Metadata: Retrieve BigQuery dataset metadata, including location, via the Datasets API.
  • Get BigQuery Job: Retrieve information about a specific BigQuery job.
  • Get BigQuery Query Results: Get the results of a BigQuery query job via RPC.
  • Get BigQuery Routine: Retrieve a BigQuery routine (user-defined function or stored procedure) by its ID.
  • Get BigQuery Routine IAM Policy: Retrieve the IAM access control policy for a BigQuery routine resource.
  • Get BigQuery Service Account: Get the service account for a project, used for interactions with Google Cloud KMS.
  • Get BigQuery Table IAM Policy: Retrieve the IAM access control policy for a BigQuery table resource.
  • Get BigQuery Table Schema: Fetch a BigQuery table's schema and metadata without querying row data.
  • Insert Data into BigQuery Table: Stream data into BigQuery one record at a time without running a load job.
  • Insert BigQuery Job: Start a new asynchronous BigQuery job (query, load, extract, or copy).
  • Insert BigQuery Job with Upload: Start a new BigQuery load job with file upload.
  • List Analytics Hub Listings: List all listings in a given Analytics Hub data exchange.
  • List BigQuery Connections: List BigQuery connections in a given project and location.
  • List BigQuery Capacity Commitments: List all capacity commitments for the admin project.
  • List Data Exchange Listings: List all listings in a given Analytics Hub data exchange using the v1beta1 API.
  • List BigQuery Datasets: List datasets in a specific BigQuery project, including dataset locations.
  • List BigQuery Jobs: List all jobs that you started in a BigQuery project.
  • List BigQuery Data Transfer Locations: List information about supported locations for the BigQuery Data Transfer Service.
  • List Connections in Location: List BigQuery connections in a given project and location using the v1beta1 API.
  • List BigQuery Location Data Policies: List all data policies in a specified parent project and location using the v2beta1 API.
  • List BigQuery Models: List all BigQuery ML models in a specified dataset.
  • List Organization Data Exchanges: List all data exchanges from projects in a given organization and location using the Analytics Hub API.
  • List BigQuery Projects: List BigQuery projects to which the user has been granted any project role.
  • List Analytics Hub Query Templates: List all query templates in a given Analytics Hub data exchange.
  • List BigQuery Reservation Assignments: List BigQuery reservation assignments.
  • List BigQuery Reservation Groups: List all BigQuery reservation groups for a project in a specified location.
  • List BigQuery Reservations: List all BigQuery reservations for a project in a specified location.
  • List BigQuery Routines: List all routines (user-defined functions and stored procedures) in a BigQuery dataset.
  • List BigQuery Row Access Policies: List all row access policies on a specified BigQuery table.
  • List BigQuery Table Data: List the content of a BigQuery table in rows via the REST API.
  • List BigQuery Tables: List tables in a BigQuery dataset via the REST API.
  • Patch BigQuery Dataset: Update an existing BigQuery dataset using RFC 5789 PATCH semantics.
  • Patch BigQuery ML Model: Update specific fields in an existing BigQuery ML model using PATCH semantics.
  • Patch BigQuery Table: Update specific fields in an existing BigQuery table using RFC 5789 PATCH semantics.
  • Query: Run a SQL query in BigQuery using the REST API.
  • Search All BigQuery Reservation Assignments: Search all BigQuery reservation assignments for a specified resource in a particular region.
  • Set BigQuery Routine IAM Policy: Set the IAM access control policy for a BigQuery routine resource.
  • Test BigQuery Routine IAM Permissions: Test which IAM permissions the caller has on a BigQuery routine.
  • Undelete BigQuery Dataset: Undelete a BigQuery dataset within the time travel window.
  • Update BigQuery Connection: Update a specified BigQuery connection using the BigQuery Connection API.
  • Update BigQuery Dataset: Update information in an existing BigQuery dataset using the PUT method.
  • Update BigQuery Routine: Update an existing BigQuery routine (function or stored procedure).
  • Update BigQuery Table: Update an existing BigQuery table.

What is the Composio tool router, and how does it fit here?

What is Composio SDK?

The Composio SDK helps agents find the right tools for a task at runtime. You can plug in multiple toolkits (such as Gmail, HubSpot, and GitHub), and the agent will identify the relevant app and action to complete multi-step workflows. This can reduce token usage and improve the reliability of tool calls. Read more here: Getting started with Composio SDK

The tool router generates a secure MCP URL that your agents can access to perform actions.

How the Composio SDK works

The Composio SDK follows a three-phase workflow:

  1. Discovery: Searches for tools matching your task and returns relevant toolkits with their details.
  2. Authentication: Checks for active connections. If missing, creates an auth config and returns a connection URL via Auth Link.
  3. Execution: Executes the action using the authenticated connection.
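
A minimal sketch of how those phases map onto the calls used later in this guide (the toolkit slug and session attributes follow the step-by-step code below):

python
from composio import Composio

composio = Composio(api_key="your_composio_api_key")

# Phases 1 and 2: the session scopes discovery to the googlebigquery toolkit;
# if no active connection exists, Composio returns an auth link to complete it.
session = composio.create(user_id="your_user_id", toolkits=["googlebigquery"])

# Phase 3: every tool execution flows through this single MCP endpoint.
print(session.mcp.url)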

Step-by-step Guide

Prerequisites

Before starting, make sure you have:
  • Python 3.9 or higher
  • A Composio account with an active API key
  • Basic familiarity with Python and async programming

Getting API Keys for OpenAI and Composio

OpenAI API Key
  • Go to the OpenAI dashboard and create an API key. You'll need credits to use the models, or you can connect to another model provider.
  • Keep the API key safe.
Composio API Key
  • Log in to the Composio dashboard.
  • Navigate to your API settings and generate a new API key.
  • Store this key securely as you'll need it for authentication.

Install dependencies

bash
pip install composio pydantic-ai python-dotenv

Install the required libraries.

What's happening:

  • composio connects your agent to external SaaS tools like Google BigQuery
  • pydantic-ai lets you create structured AI agents with tool support
  • python-dotenv loads your environment variables securely from a .env file
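
As a quick sanity check that the installation succeeded, try importing all three packages (note that python-dotenv is imported as dotenv):

python
# Run this once; an ImportError means the corresponding pip install failed.
import composio
import pydantic_ai
import dotenv

print("All dependencies are importable.")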

Set up environment variables

bash
COMPOSIO_API_KEY=your_composio_api_key_here
USER_ID=your_user_id_here
OPENAI_API_KEY=your_openai_api_key

Create a .env file in your project root.

What's happening:

  • COMPOSIO_API_KEY authenticates your agent to Composio's API
  • USER_ID associates your session with your account for secure tool access
  • OPENAI_API_KEY lets the agent access OpenAI models

Import dependencies

python
import asyncio
import os
from dotenv import load_dotenv
from composio import Composio
from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerStreamableHTTP

load_dotenv()
What's happening:
  • We load environment variables and import required modules
  • Composio manages connections to Google BigQuery
  • MCPServerStreamableHTTP connects to the Google BigQuery MCP server endpoint
  • Agent from Pydantic AI lets you define and run the AI assistant

Create a Tool Router Session

python
async def main():
    api_key = os.getenv("COMPOSIO_API_KEY")
    user_id = os.getenv("USER_ID")
    if not api_key or not user_id:
        raise RuntimeError("Set COMPOSIO_API_KEY and USER_ID in your environment")

    # Create a Composio Tool Router session for Google BigQuery
    composio = Composio(api_key=api_key)
    session = composio.create(
        user_id=user_id,
        toolkits=["googlebigquery"],
    )
    url = session.mcp.url
    if not url:
        raise ValueError("Composio session did not return an MCP URL")
What's happening:
  • We're creating a Tool Router session that gives your agent access to Google BigQuery tools
  • The create method takes the user ID and specifies which toolkits should be available
  • The returned session.mcp.url is the MCP server URL that your agent will use
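
One session can also expose several toolkits at once, which is how the cross-app workflows mentioned later in this guide work. A sketch, assuming a "gmail" toolkit slug (check the exact slugs in your Composio dashboard):

python
# Hypothetical multi-toolkit session; the "gmail" slug is an assumption.
session = composio.create(
    user_id=user_id,
    toolkits=["googlebigquery", "gmail"],
)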

Initialize the Pydantic AI Agent

python
# Attach the MCP server to a Pydantic AI Agent
googlebigquery_mcp = MCPServerStreamableHTTP(url, headers={"x-api-key": api_key})
agent = Agent(
    "openai:gpt-5",
    toolsets=[googlebigquery_mcp],
    instructions=(
        "You are a Google BigQuery assistant. Use Google BigQuery tools to help users "
        "with their requests. Ask clarifying questions when needed."
    ),
)
What's happening:
  • The MCP client connects to the Google BigQuery endpoint
  • The agent uses GPT-5 to interpret user commands and perform Google BigQuery operations
  • The instructions field defines the agent's role and behavior
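
Pydantic AI identifies models with provider:model strings, so switching providers is a one-line change. A sketch, assuming an Anthropic API key is configured and that the model identifier is valid for your installed version (check the Pydantic AI model docs):

python
# Hypothetical alternative model; the identifier is an assumption.
agent = Agent(
    "anthropic:claude-3-5-sonnet-latest",
    toolsets=[googlebigquery_mcp],
    instructions="You are a Google BigQuery assistant.",
)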

Build the chat interface

python
# Simple REPL with message history
history = []
print("Chat started! Type 'exit' or 'quit' to end.\n")
print("Try asking the agent to help you with Google BigQuery.\n")

while True:
    user_input = input("You: ").strip()
    if user_input.lower() in {"exit", "quit", "bye"}:
        print("\nGoodbye!")
        break
    if not user_input:
        continue

    print("\nAgent is thinking...\n", flush=True)

    async with agent.run_stream(user_input, message_history=history) as stream_result:
        collected_text = ""
        async for chunk in stream_result.stream_output():
            text_piece = None
            if isinstance(chunk, str):
                text_piece = chunk
            elif hasattr(chunk, "delta") and isinstance(chunk.delta, str):
                text_piece = chunk.delta
            elif hasattr(chunk, "text"):
                text_piece = chunk.text
            if text_piece:
                collected_text += text_piece
        result = stream_result

    print(f"Agent: {collected_text}\n")
    history = result.all_messages()
What's happening:
  • The agent reads input from the terminal and streams its response
  • Google BigQuery API calls happen automatically under the hood
  • The model keeps conversation history to maintain context across turns
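
If you don't need token-by-token streaming, agent.run returns the final answer in a single await. A minimal sketch; the .output attribute follows current Pydantic AI releases (older versions exposed .data):

python
# Inside main(), after the agent is created:
result = await agent.run("List my BigQuery datasets", message_history=history)
print("Agent:", result.output)
history = result.all_messages()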

Run the application

python
if __name__ == "__main__":
    asyncio.run(main())
What's happening:
  • The asyncio loop launches the agent and keeps it running until you exit

Complete Code

Here's the complete code to get you started with Google BigQuery and Pydantic AI:

python
import asyncio
import os
from dotenv import load_dotenv
from composio import Composio
from pydantic_ai import Agent
from pydantic_ai.mcp import MCPServerStreamableHTTP

load_dotenv()

async def main():
    api_key = os.getenv("COMPOSIO_API_KEY")
    user_id = os.getenv("USER_ID")
    if not api_key or not user_id:
        raise RuntimeError("Set COMPOSIO_API_KEY and USER_ID in your environment")

    # Create a Composio Tool Router session for Google BigQuery
    composio = Composio(api_key=api_key)
    session = composio.create(
        user_id=user_id,
        toolkits=["googlebigquery"],
    )
    url = session.mcp.url
    if not url:
        raise ValueError("Composio session did not return an MCP URL")

    # Attach the MCP server to a Pydantic AI Agent
    googlebigquery_mcp = MCPServerStreamableHTTP(url, headers={"x-api-key": api_key})
    agent = Agent(
        "openai:gpt-5",
        toolsets=[googlebigquery_mcp],
        instructions=(
            "You are a Google BigQuery assistant. Use Google BigQuery tools to help users "
            "with their requests. Ask clarifying questions when needed."
        ),
    )

    # Simple REPL with message history
    history = []
    print("Chat started! Type 'exit' or 'quit' to end.\n")
    print("Try asking the agent to help you with Google BigQuery.\n")

    while True:
        user_input = input("You: ").strip()
        if user_input.lower() in {"exit", "quit", "bye"}:
            print("\nGoodbye!")
            break
        if not user_input:
            continue

        print("\nAgent is thinking...\n", flush=True)

        async with agent.run_stream(user_input, message_history=history) as stream_result:
            collected_text = ""
            async for chunk in stream_result.stream_output():
                text_piece = None
                if isinstance(chunk, str):
                    text_piece = chunk
                elif hasattr(chunk, "delta") and isinstance(chunk.delta, str):
                    text_piece = chunk.delta
                elif hasattr(chunk, "text"):
                    text_piece = chunk.text
                if text_piece:
                    collected_text += text_piece
            result = stream_result

        print(f"Agent: {collected_text}\n")
        history = result.all_messages()

if __name__ == "__main__":
    asyncio.run(main())

Conclusion

You've built a Pydantic AI agent that can interact with Google BigQuery through Composio's Tool Router. With this setup, your agent can perform real Google BigQuery actions through natural language. You can extend this further by:
  • Adding other toolkits like Gmail, HubSpot, or Salesforce
  • Building a web-based chat interface around this agent
  • Using multiple MCP endpoints to enable cross-app workflows (for example, Gmail + Google BigQuery for workflow automation)

This architecture makes your AI agent "agent-native": able to securely use APIs in a unified, composable way without custom integrations.


FAQ

What is the difference between the Tool Router MCP and the Google BigQuery MCP?

With a standalone Google BigQuery MCP server, agents and LLMs can only access a fixed set of Google BigQuery tools tied to that server. With the Composio Tool Router, agents can dynamically load tools from Google BigQuery and many other apps based on the task at hand, all through a single MCP endpoint.

Can I use Tool Router MCP with Pydantic AI?

Yes, you can. Pydantic AI fully supports MCP integration. You get structured tool calling, message history handling, and model orchestration while Tool Router takes care of discovering and serving the right Google BigQuery tools.

Can I manage the permissions and scopes for Google BigQuery while using Tool Router?

Yes, absolutely. You can configure which Google BigQuery scopes and actions are allowed when connecting your account to Composio. You can also bring your own OAuth credentials or API configuration so you keep full control over what the agent can do.

How safe is my data with Composio Tool Router?

All sensitive data such as tokens, keys, and configuration is fully encrypted at rest and in transit. Composio is SOC 2 Type 2 compliant and follows strict security practices so your Google BigQuery data and credentials are handled as safely as possible.

We handle tool reliability, observability, and security so you never have to second-guess an agent action.