# Kaggle

```json
{
  "name": "Kaggle",
  "slug": "kaggle",
  "url": "https://composio.dev/toolkits/kaggle",
  "markdown_url": "https://composio.dev/toolkits/kaggle.md",
  "logo_url": "https://logos.composio.dev/api/kaggle",
  "categories": [
    "ai & machine learning"
  ],
  "is_composio_managed": false,
  "updated_at": "2026-05-12T10:16:40.031Z"
}
```

![Kaggle logo](https://logos.composio.dev/api/kaggle)

## Description

Securely connect your AI agents and chatbots (Claude, ChatGPT, Cursor, etc.) with Kaggle via MCP or the direct API to search datasets, download notebooks, submit competition entries, and manage your Kaggle profile through natural language.

## Summary

Kaggle is a platform for data science and machine learning competitions, datasets, and collaborative notebooks. It makes it easy to find data, participate in challenges, and share insights with a global data community.

## Categories

- ai & machine learning

## Toolkit Details

- Tools: 35

## Images

- Logo: https://logos.composio.dev/api/kaggle

## Authentication

- **API Key**
  - Type: `api_key`
  - Description: API key authentication for Kaggle.
  - Setup:
    - Configure API key credentials for Kaggle.
    - Use the credentials when creating an auth config in Composio.

## Suggested Prompts

- Download data files for the Titanic competition
- Create a new version of my COVID-19 dataset
- Check processing status of my uploaded dataset
- Submit my predictions to the House Prices competition
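
The submission prompt above maps onto a three-step flow: generate a pre-signed upload URL, upload the file, then finalize with the returned token (the `KAGGLE_GENERATE_COMPETITION_SUBMISSION_URL` and `KAGGLE_COMPETITION_SUBMIT` tools listed below). A minimal orchestration sketch with the three network operations injected as callables; the `create_url` and `token` field names are assumptions about your wrappers' return shape, not the tools' exact schema:

```python
def submit_to_competition(competition_id, file_bytes, *, generate_url, upload, submit):
    """Sketch of the three-step Kaggle submission flow.

    The callables are assumed wrappers: `generate_url` around
    KAGGLE_GENERATE_COMPETITION_SUBMISSION_URL, `upload` around an HTTP PUT
    to the pre-signed URL, and `submit` around KAGGLE_COMPETITION_SUBMIT.
    """
    grant = generate_url(competition_id, len(file_bytes))  # pre-signed URL + blob token
    upload(grant["create_url"], file_bytes)                # push the submission file
    return submit(competition_id, grant["token"])          # finalize with the blob token
```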

## Supported Tools

| Tool slug | Name | Description |
|---|---|---|
| `KAGGLE_COMPETITION_DOWNLOAD_FILES` | Download competition data files | Downloads all data files for a Kaggle competition as a single zip archive. Returns the local file path where the zip was saved. Note: You must have accepted the competition's rules on Kaggle's website before downloading (403 error if not accepted). |
| `KAGGLE_COMPETITION_SUBMIT` | Submit Competition Entry | Submit an entry to a Kaggle competition using a previously uploaded file. Prerequisites: 1. You must have accepted the competition rules on Kaggle's website 2. You must have uploaded your submission file and obtained a blob_file_tokens (use Kaggle's file upload API endpoint first) This action performs the final submission step after file upload. The blob token identifies your uploaded file and associates it with your competition submission. |
| `KAGGLE_CONFIG_DIR` | Get Kaggle Config Directory | Tool to retrieve the directory of the Kaggle API configuration file. Use when you need to locate the directory containing your kaggle.json credentials. |
| `KAGGLE_CONFIG_INIT` | Initialize Kaggle Configuration | Initialize Kaggle API client configuration. This action sets up the necessary configuration file for Kaggle API access by first attempting to use the Kaggle CLI's 'kaggle config init' command. If the CLI is unavailable, it falls back to creating a kaggle.json file at ~/.kaggle/kaggle.json (or $KAGGLE_CONFIG_DIR/kaggle.json if that environment variable is set). The action is idempotent - if configuration already exists, it will not overwrite it. No parameters are required; the action uses environment variables and metadata when available. Run this before other Kaggle actions when credentials are missing or when KAGGLE_CONFIG_VIEW returns empty/error output. |
| `KAGGLE_CONFIG_KEYS` | List Kaggle Configuration Keys | Tool to list local Kaggle API configuration keys. Use when you need to see which configuration options are set without revealing values. |
| `KAGGLE_CONFIG_PATH` | Get Kaggle Config Path | Tool to retrieve local Kaggle API configuration file path. Use when you need to know the location of the Kaggle config before operations. |
| `KAGGLE_CONFIG_RESET` | Reset Kaggle Configuration | Tool to reset local Kaggle CLI configuration to defaults. Clears CLI-managed keys ('competition', 'path', 'proxy'). |
| `KAGGLE_CONFIG_SET` | Set Kaggle Configuration | Tool to set a Kaggle CLI configuration parameter. Use when updating local CLI settings such as default download path or proxy. Ensure Kaggle CLI is installed. |
| `KAGGLE_CONFIG_UNSET` | Unset Kaggle Configuration | Tool to unset a Kaggle CLI configuration parameter. Use when removing local CLI settings such as default download path or proxy. Ensure Kaggle CLI is installed. |
| `KAGGLE_CONFIG_VIEW` | View Kaggle Configuration | View local Kaggle API credentials and configuration settings. This action reads Kaggle configuration from local sources (does NOT make API calls to Kaggle). Configuration is retrieved in the following precedence order: 1. kaggle.json file (from KAGGLE_CONFIG_DIR env var, ~/.config/kaggle/, or ~/.kaggle/) 2. 'kaggle config view' CLI output (for proxy/path settings) 3. Environment variables (KAGGLE_USERNAME, KAGGLE_KEY) 4. Authorization header from metadata Use this action to: - Verify Kaggle credentials are configured before making API calls - Check current proxy settings - Debug authentication issues Returns empty strings for username/key if no credentials are found; use KAGGLE_CONFIG_INIT to set up credentials first. Note: username and key are independent — an empty username field does not indicate missing or invalid credentials. WARNING: This action returns sensitive API key data in plain text. |
| `KAGGLE_DATASET_CREATE` | Dataset Create | Create a new Kaggle dataset with metadata. IMPORTANT: Dataset creation requires at least one data file. Ensure files are uploaded before calling this action. The 'id' field must use your authenticated Kaggle username as the owner. Returns the creation status and any message from the Kaggle API. |
| `KAGGLE_DATASET_INIT` | Kaggle Dataset Init | Tool to initialize a dataset-metadata.json file in a local folder. Use when preparing a dataset folder before uploading to Kaggle. |
| `KAGGLE_DATASET_LIST_FILES` | List Kaggle Dataset Files | Tool to list files in a Kaggle dataset. Use when you need to retrieve paginated file listings by owner and dataset slugs, with optional version and paging controls. |
| `KAGGLE_DATASET_STATUS` | Get Dataset Status | Check the processing status of a Kaggle dataset after creation or version update. This endpoint is used to monitor datasets that are currently being processed by Kaggle's servers. It returns status information for datasets that are actively uploading, processing, or experiencing errors. For already-published datasets, this endpoint typically returns 404 (Not Found), which is expected behavior. Use this tool immediately after creating a new dataset (KAGGLE_DATASET_CREATE) or updating an existing dataset version (KAGGLE_DATASET_VERSION) to check when the dataset becomes ready. Poll this endpoint periodically until the status indicates completion or error. |
| `KAGGLE_DATASET_VERSION` | Create Dataset Version | Create a new version of an existing Kaggle dataset. Prerequisites: - You must own the dataset or have edit permissions - Files must be uploaded first to obtain upload tokens (required for the 'files' parameter) Use this when you have updated files or metadata and need to publish a new version of an existing dataset. |
| `KAGGLE_DOWNLOAD_COMPETITION_FILE` | Download competition file | Tool to download a specific data file from a Kaggle competition. Use when you need to retrieve a single file from a competition by specifying the competition slug and filename. Note: You must have accepted the competition's rules on Kaggle's website before downloading. |
| `KAGGLE_DOWNLOAD_COMPETITION_LEADERBOARD` | Download competition leaderboard | Tool to download the entire competition leaderboard as a CSV file packaged in a ZIP archive. Use when you need to analyze or review competition standings and scores. |
| `KAGGLE_DOWNLOAD_DATASET` | Download Kaggle Dataset | Tool to download all files from a Kaggle dataset as a zip archive. Supports downloading specific versions by providing the dataset_version_number parameter. |
| `KAGGLE_DOWNLOAD_DATASET_FILE` | Download Kaggle Dataset File | Tool to download a specific file from a Kaggle dataset. Use when you need to retrieve a single file from a dataset by specifying the owner, dataset, and filename. |
| `KAGGLE_GENERATE_COMPETITION_SUBMISSION_URL` | Generate Competition Submission URL | Tool to generate a pre-signed URL for uploading competition submission files. Use this before uploading your submission file to Kaggle. This action generates a temporary upload URL and token for submitting to a competition. You must provide the competition ID, file size, and last modified timestamp. After obtaining the URL, upload your submission file to the createUrl, then use the token to finalize the submission. |
| `KAGGLE_GET_DATASET_METADATA` | Get Dataset Metadata | Tool to get comprehensive metadata for a Kaggle dataset including title, description, licenses, and tags. Use when you need detailed information about a dataset's structure, schema, or properties. |
| `KAGGLE_GET_MODEL` | Get Model Details | Tool to get a Kaggle model's details including metadata and description. Use when you need information about a specific model on Kaggle. |
| `KAGGLE_GET_MODEL_INSTANCE` | Get Model Instance Details | Tool to get details for a specific Kaggle model instance (variation). Returns metadata including overview, usage instructions, download URL, version information, and license details. Use when you need to inspect or retrieve information about a specific model variation before downloading or using it. |
| `KAGGLE_KERNEL_INIT` | Kaggle Kernel Init | Initialize a kernel-metadata.json template file in a specified folder. This file is required before pushing/uploading a kernel to Kaggle. The template includes default values for kernel configuration (language, kernel_type, GPU settings, etc.) that can be customized before pushing. Use this when setting up a new Kaggle kernel locally. |
| `KAGGLE_KERNEL_OUTPUT` | Download kernel output | Tool to download the output of a Kaggle kernel. Use when needing the latest kernel results locally. |
| `KAGGLE_KERNELS_STATUS` | Get Kernel Status | Get the execution status of a Kaggle kernel (notebook). Returns current status (running, complete, error), timestamps, and output URL. Use this to monitor kernel execution after pushing/submitting a kernel. Note: You need permission to access the kernel - typically only your own kernels or public kernels you have access to. |
| `KAGGLE_LIST_COMPETITION_FILES` | List competition data files | Tool to list all data files available for a Kaggle competition. Use when you need to retrieve file names, sizes, and metadata for competition datasets before downloading. |
| `KAGGLE_LIST_COMPETITIONS` | List Kaggle Competitions | Tool to list available Kaggle competitions with filters and pagination. Use when you need to discover competitions, search by keywords, or filter by category, group, and sorting options. |
| `KAGGLE_LIST_DATASETS` | List Kaggle Datasets | Tool to list Kaggle datasets with filters and pagination. Use after authenticating with Kaggle API key. |
| `KAGGLE_LIST_KERNEL_OUTPUT_FILES` | List Kernel Output Files | Tool to list output files for a specific kernel run. Use when you need to retrieve paginated file listings by kernel owner and slug. |
| `KAGGLE_LIST_KERNELS` | List Kaggle Kernels | Tool to list Kaggle kernels (notebooks and scripts) with filters and pagination. Use to discover kernels by search terms, user, language, type, competition, or dataset. |
| `KAGGLE_LIST_MODEL_INSTANCE_VERSION_FILES` | List Model Instance Version Files | Tool to list files for a specific version of a model variation. Use when you need to retrieve files for a particular model framework instance version by owner, model, framework, variation, and version. |
| `KAGGLE_LIST_MODELS` | List Kaggle Models | Tool to list Kaggle models with optional filters for owner, sorting, search, and pagination. Use to discover available models on Kaggle's platform. |
| `KAGGLE_PULL_KERNEL` | Pull Kernel Code | Tool to pull (download) the source code of a Kaggle kernel to local storage. Use when you need to retrieve a kernel's notebook, script, or metadata files. Optionally include metadata JSON file with kernel configuration details. |
| `KAGGLE_VIEW_COMPETITION_LEADERBOARD` | View competition leaderboard | Tool to view competition leaderboard information showing rankings and scores of participants. Use when you need to check competition standings, team scores, or analyze leaderboard positions. |
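
As the `KAGGLE_DATASET_STATUS` description notes, dataset processing should be polled after `KAGGLE_DATASET_CREATE` or `KAGGLE_DATASET_VERSION` until it completes or errors. A small generic polling helper; the terminal status values (`"ready"`, `"error"`) are assumptions to match against what the tool actually returns:

```python
import time

def poll_until(check, *, interval=5.0, timeout=300.0):
    """Poll `check()` until it reports a terminal status or the timeout expires.

    `check` is any callable returning a dict with a 'status' field, e.g. a
    wrapper around the KAGGLE_DATASET_STATUS tool.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        result = check()
        if result.get("status") in ("ready", "error"):
            return result
        time.sleep(interval)  # avoid hammering the status endpoint
    raise TimeoutError("dataset did not finish processing in time")
```

Remember that a 404 from the status endpoint on an already-published dataset is expected, so treat that case as success rather than retrying.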

## Supported Triggers

None listed.

## Installation and MCP Setup

### Path 1: SDK Installation

#### Path 1, Step 1: Install Composio

Install the Composio SDK
```shell
pip install composio_openai
```

```shell
npm install @composio/openai
```

#### Path 1, Step 2: Initialize Composio and Create Tool Router Session

Import and initialize Composio client, then create a Tool Router session
```python
from openai import OpenAI
from composio import Composio
from composio_openai import OpenAIResponsesProvider

composio = Composio(provider=OpenAIResponsesProvider())
openai = OpenAI()
session = composio.create(user_id='your-user-id')
```

```typescript
import OpenAI from 'openai';
import { Composio } from '@composio/core';
import { OpenAIResponsesProvider } from '@composio/openai';

const composio = new Composio({
  provider: new OpenAIResponsesProvider(),
});
const openai = new OpenAI({});
const session = await composio.create('your-user-id');
```

#### Path 1, Step 3: Execute Kaggle Tools via Tool Router with Your Agent

Get tools from Tool Router session and execute Kaggle actions with your Agent
```python
tools = session.tools
response = openai.responses.create(
  model='gpt-4.1',
  tools=tools,
  input=[{
    'role': 'user',
    'content': 'Download the latest competition data for "titanic"'
  }]
)
result = composio.provider.handle_tool_calls(
  response=response,
  user_id='your-user-id'
)
print(result)
```

```typescript
const tools = session.tools;
const response = await openai.responses.create({
  model: 'gpt-4.1',
  tools: tools,
  input: [{
    role: 'user',
    content: 'Download the latest competition data for "titanic"'
  }],
});
const result = await composio.provider.handleToolCalls(
  'your-user-id',
  response.output
);
console.log(result);
```

### Path 2: MCP Server Setup

#### Path 2, Step 1: Install Composio

Install the Composio SDK together with your agent framework's packages
```shell
pip install composio claude-agent-sdk
```

```shell
npm install @composio/core ai @ai-sdk/openai @ai-sdk/mcp
```

#### Path 2, Step 2: Create Tool Router Session

Initialize the Composio client and create a Tool Router session
```python
from composio import Composio
from claude_agent_sdk import ClaudeSDKClient, ClaudeAgentOptions

composio = Composio(api_key='your-composio-api-key')
session = composio.create(user_id='your-user-id')
url = session.mcp.url
```

```typescript
import { Composio } from '@composio/core';

const composio = new Composio({ apiKey: 'your-api-key' });

console.log("Creating Tool Router session...");
const { mcp } = await composio.create('your-user-id');
console.log(`Tool Router session created: ${mcp.url}`);
```

#### Path 2, Step 3: Connect to AI Agent

Use the MCP server with your AI agent
```python
import asyncio

options = ClaudeAgentOptions(
    permission_mode='bypassPermissions',
    mcp_servers={
        'tool_router': {
            'type': 'http',
            'url': url,
            'headers': {
                'x-api-key': 'your-composio-api-key'
            }
        }
    },
    system_prompt='You are a helpful assistant with access to Kaggle tools.',
    max_turns=10
)

async def main():
    async with ClaudeSDKClient(options=options) as client:
        await client.query('Download competition data files for Titanic competition')
        async for message in client.receive_response():
            if hasattr(message, 'content'):
                for block in message.content:
                    if hasattr(block, 'text'):
                        print(block.text)

asyncio.run(main())
```

```typescript
import { openai } from '@ai-sdk/openai';
import { experimental_createMCPClient as createMCPClient } from '@ai-sdk/mcp';
import { generateText, stepCountIs } from 'ai';

const client = await createMCPClient({
  transport: {
    type: 'http',
    url: mcp.url,
    headers: { 'x-api-key': 'your-composio-api-key' }
  }
});

const tools = await client.tools();

const { text } = await generateText({
  model: openai('gpt-4o'),
  tools,
  messages: [{ role: 'user', content: 'Download competition data files for Titanic competition' }],
  stopWhen: stepCountIs(5)
});

console.log(`Agent: ${text}`);
```

## Why Use Composio?

### 1. AI Native Kaggle Integration

- Supports both Kaggle MCP and direct API based integrations
- Structured, LLM-friendly schemas for reliable tool execution
- Rich coverage for reading, writing, and querying your Kaggle data

### 2. Managed Auth

- Built-in API Key handling for seamless access
- Central place to manage, scope, and revoke Kaggle credentials
- Per user and per environment credentials instead of hard-coded keys

### 3. Agent Optimized Design

- Tools are tuned using real error and success rates to improve reliability over time
- Comprehensive execution logs so you always know what ran, when, and on whose behalf

### 4. Enterprise Grade Security

- Fine-grained RBAC so you control which agents and users can access Kaggle
- Scoped, least privilege access to Kaggle resources
- Full audit trail of agent actions to support review and compliance

## Use Kaggle with any AI Agent Framework

Choose a framework you want to connect Kaggle with:

- [ChatGPT](https://composio.dev/toolkits/kaggle/framework/chatgpt)
- [OpenAI Agents SDK](https://composio.dev/toolkits/kaggle/framework/open-ai-agents-sdk)
- [Claude Agent SDK](https://composio.dev/toolkits/kaggle/framework/claude-agents-sdk)
- [Claude Code](https://composio.dev/toolkits/kaggle/framework/claude-code)
- [Claude Cowork](https://composio.dev/toolkits/kaggle/framework/claude-cowork)
- [Codex](https://composio.dev/toolkits/kaggle/framework/codex)
- [Cursor](https://composio.dev/toolkits/kaggle/framework/cursor)
- [VS Code](https://composio.dev/toolkits/kaggle/framework/vscode)
- [OpenCode](https://composio.dev/toolkits/kaggle/framework/opencode)
- [OpenClaw](https://composio.dev/toolkits/kaggle/framework/openclaw)
- [Hermes](https://composio.dev/toolkits/kaggle/framework/hermes-agent)
- [Google ADK](https://composio.dev/toolkits/kaggle/framework/google-adk)
- [LangChain](https://composio.dev/toolkits/kaggle/framework/langchain)
- [Vercel AI SDK](https://composio.dev/toolkits/kaggle/framework/ai-sdk)
- [Mastra AI](https://composio.dev/toolkits/kaggle/framework/mastra-ai)
- [LlamaIndex](https://composio.dev/toolkits/kaggle/framework/llama-index)
- [CrewAI](https://composio.dev/toolkits/kaggle/framework/crew-ai)
- [Pydantic AI](https://composio.dev/toolkits/kaggle/framework/pydantic-ai)
- [AutoGen](https://composio.dev/toolkits/kaggle/framework/autogen)

## Related Toolkits

- [Composio](https://composio.dev/toolkits/composio) - Composio is an integration platform that connects AI agents with hundreds of business tools. It streamlines authentication and lets you trigger actions across services—no custom code needed.
- [Composio search](https://composio.dev/toolkits/composio_search) - Composio search is a unified web search toolkit spanning travel, e-commerce, news, financial markets, images, and more. It lets you and your apps tap into up-to-date web data from a single, easy-to-integrate service.
- [Perplexityai](https://composio.dev/toolkits/perplexityai) - Perplexityai delivers natural, conversational AI models for generating human-like text. Instantly get context-aware, high-quality responses for chat, search, or complex workflows.
- [Browser tool](https://composio.dev/toolkits/browser_tool) - Browser tool is a virtual browser integration that lets AI agents interact with the web programmatically. It enables automated browsing, scraping, and action-taking from any AI workflow.
- [Ai ml api](https://composio.dev/toolkits/ai_ml_api) - Ai ml api is a suite of AI/ML models for natural language and image tasks. It provides fast, scalable access to advanced AI capabilities for your apps and workflows.
- [Aivoov](https://composio.dev/toolkits/aivoov) - Aivoov is an AI-powered text-to-speech platform offering 1,000+ voices in over 150 languages. Instantly turn written content into natural, human-like audio for any application.
- [All images ai](https://composio.dev/toolkits/all_images_ai) - All-Images.ai is an AI-powered image generation and management platform. It helps you create, search, and organize images effortlessly with advanced AI capabilities.
- [Anthropic administrator](https://composio.dev/toolkits/anthropic_administrator) - Anthropic administrator is an API for managing Anthropic organizational resources like members, workspaces, and API keys. It helps you automate admin tasks and streamline resource management across your Anthropic organization.
- [Api labz](https://composio.dev/toolkits/api_labz) - Api labz is a platform offering a suite of AI-driven APIs and workflow tools. It helps developers automate tasks and build smarter, more efficient applications.
- [Apipie ai](https://composio.dev/toolkits/apipie_ai) - Apipie ai is an AI model aggregator offering a single API for accessing top AI models from multiple providers. It helps developers build cost-efficient, latency-optimized AI solutions without juggling multiple integrations.
- [Astica ai](https://composio.dev/toolkits/astica_ai) - Astica ai provides APIs for computer vision, NLP, and voice synthesis. Integrate advanced AI features into your app with a single API key.
- [Bigml](https://composio.dev/toolkits/bigml) - BigML is a machine learning platform that lets you build, train, and deploy predictive models from your data. Its intuitive interface and robust API make machine learning accessible and efficient.
- [Botbaba](https://composio.dev/toolkits/botbaba) - Botbaba is a platform for building, managing, and deploying conversational AI chatbots across messaging channels. It streamlines chatbot automation, making it easier to integrate AI into customer interactions.
- [Botpress](https://composio.dev/toolkits/botpress) - Botpress is an open-source platform for building, deploying, and managing chatbots. It helps teams automate conversations and deliver rich, interactive messaging experiences.
- [Chatbotkit](https://composio.dev/toolkits/chatbotkit) - Chatbotkit is a platform for building and managing AI-powered chatbots using robust APIs and SDKs. It lets you easily add conversational AI to your apps for better user engagement.
- [Cody](https://composio.dev/toolkits/cody) - Cody is an AI assistant built for businesses, trained on your company's knowledge and data. It delivers instant answers and insights, tailored for your team.
- [Context7 MCP](https://composio.dev/toolkits/context7_mcp) - Context7 MCP delivers live, version-specific code docs and examples right from the source. It helps developers and AI agents instantly retrieve authoritative programming info—no more out-of-date docs.
- [Customgpt](https://composio.dev/toolkits/customgpt) - CustomGPT.ai lets you build and deploy chatbots tailored to your own data and business needs. Get precise and context-aware AI conversations without writing code.
- [Datarobot](https://composio.dev/toolkits/datarobot) - Datarobot is a machine learning platform that automates model development, deployment, and monitoring. It empowers organizations to quickly gain predictive insights from large datasets.
- [Deepgram](https://composio.dev/toolkits/deepgram) - Deepgram is an AI-powered speech recognition platform for accurate audio transcription and understanding. It enables fast, scalable speech-to-text with advanced audio intelligence features.

## Frequently Asked Questions

### Do I need my own developer credentials to use Kaggle with Composio?

Yes, Kaggle requires you to configure your own API key credentials. Once set up, Composio handles secure credential storage and API request handling for you.

### Can I use multiple toolkits together?

Yes! Composio's Tool Router enables agents to use multiple toolkits. [Learn more](https://docs.composio.dev/tool-router/overview).

### Is Composio secure?

Composio is SOC 2 and ISO 27001 compliant with all data encrypted in transit and at rest. [Learn more](https://trust.composio.dev).

### What if the API changes?

Composio maintains and updates all toolkit integrations automatically, so your agents always work with the latest API versions.

---
[See all toolkits](https://composio.dev/toolkits) · [Composio docs](https://docs.composio.dev/llms.txt)
