# How to connect Databricks MCP with VS Code

```json
{
  "title": "How to connect Databricks MCP with VS Code",
  "toolkit": "Databricks",
  "toolkit_slug": "databricks",
  "framework": "VS Code",
  "framework_slug": "vscode",
  "url": "https://composio.dev/toolkits/databricks/framework/vscode",
  "markdown_url": "https://composio.dev/toolkits/databricks/framework/vscode.md",
  "updated_at": "2026-03-29T06:29:42.676Z"
}
```

## Introduction

### How to connect Databricks MCP with VS Code
VS Code is the most popular code editor out there. With its recent AI makeover, it can do more than help you write code: you can connect your applications to it and let LLMs automate many of the mundane tasks in your workflow.
In this guide, I will explain how to connect Databricks with VS Code in the most secure and robust way possible, via Composio.

## Also integrate Databricks with

- [ChatGPT](https://composio.dev/toolkits/databricks/framework/chatgpt)
- [OpenAI Agents SDK](https://composio.dev/toolkits/databricks/framework/open-ai-agents-sdk)
- [Claude Agent SDK](https://composio.dev/toolkits/databricks/framework/claude-agents-sdk)
- [Claude Code](https://composio.dev/toolkits/databricks/framework/claude-code)
- [Claude Cowork](https://composio.dev/toolkits/databricks/framework/claude-cowork)
- [Codex](https://composio.dev/toolkits/databricks/framework/codex)
- [Cursor](https://composio.dev/toolkits/databricks/framework/cursor)
- [OpenCode](https://composio.dev/toolkits/databricks/framework/opencode)
- [OpenClaw](https://composio.dev/toolkits/databricks/framework/openclaw)
- [Hermes](https://composio.dev/toolkits/databricks/framework/hermes-agent)
- [CLI](https://composio.dev/toolkits/databricks/framework/cli)
- [Google ADK](https://composio.dev/toolkits/databricks/framework/google-adk)
- [LangChain](https://composio.dev/toolkits/databricks/framework/langchain)
- [Vercel AI SDK](https://composio.dev/toolkits/databricks/framework/ai-sdk)
- [Mastra AI](https://composio.dev/toolkits/databricks/framework/mastra-ai)
- [LlamaIndex](https://composio.dev/toolkits/databricks/framework/llama-index)
- [CrewAI](https://composio.dev/toolkits/databricks/framework/crew-ai)

## TL;DR

### Why use Composio?
Composio provides:
- Access to 1,000+ managed apps from a single MCP endpoint. This makes it convenient for agents to run cross-app workflows.
- Programmatic tool calling. LLMs can write code in a remote workbench to handle complex tool chaining, reducing the back-and-forth with the LLM when tools are called frequently.
- Large tool response handling outside the LLM context. This minimizes context bloat from large tool responses.
- Dynamic just-in-time access to thousands of tools across hundreds of apps. Composio loads the tools your agent needs, so LLMs are not overwhelmed by tools they do not need.

## Connect Databricks to VS Code

### Integrate Databricks MCP with VS Code
### 1. Install with one click
Click the button below to add Composio to VS Code. You will be prompted to authorize. This requires VS Code 1.99+ with GitHub Copilot.
[+Install in VS Code](vscode:mcp/install?%7B%22name%22%3A%22composio%22%2C%22type%22%3A%22http%22%2C%22url%22%3A%22https%3A%2F%2Fconnect.composio.dev%2Fmcp%22%7D)
### 2. Or add manually
Open or create `.vscode/mcp.json` in your project root and add the following configuration:

```json
{
  "servers": {
    "composio": {
      "type": "http",
      "url": "https://connect.composio.dev/mcp"
    }
  }
}
```
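Under the hood, VS Code talks to this server over MCP's streamable HTTP transport using JSON-RPC 2.0, starting with an `initialize` handshake defined by the MCP specification. As a sketch of what that first request looks like (the `clientInfo` values and protocol version shown are illustrative; VS Code fills in its own):

```python
import json

def build_initialize_request(request_id: int = 1) -> str:
    """Build the JSON-RPC 2.0 'initialize' request an MCP client sends
    when it first connects to a server such as the Composio endpoint
    configured in .vscode/mcp.json."""
    payload = {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "initialize",
        "params": {
            # Protocol version string per the MCP spec; clients and
            # servers negotiate the version they both support.
            "protocolVersion": "2025-03-26",
            "capabilities": {},
            # Illustrative client identity; the real values come from VS Code.
            "clientInfo": {"name": "vscode", "version": "1.99.0"},
        },
    }
    return json.dumps(payload)

print(build_initialize_request())
```

You never send this by hand; VS Code performs the handshake automatically once the config above is in place.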

## What is the Databricks MCP server, and what's possible with it?

The Databricks MCP server is an implementation of the Model Context Protocol that connects AI agents and assistants, such as Claude and Cursor, directly to your Databricks account. It provides structured and secure access so your agent can perform Databricks operations on your behalf.
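Each operation is exposed as an MCP tool with a stable slug (listed in the table below), which agents invoke via the protocol's `tools/call` method. A minimal sketch of such a request, using one slug from the table (the argument name `"name"` is illustrative, not a verified parameter of that tool):

```python
import json

def build_tool_call(tool: str, arguments: dict, request_id: int = 2) -> str:
    """Build a JSON-RPC 2.0 'tools/call' request, the MCP method an
    agent uses to invoke a named tool with structured arguments."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": tool, "arguments": arguments},
    })

# Hypothetical invocation of a tool slug from the table below.
print(build_tool_call("DATABRICKS_GET_CATALOG_DETAILS", {"name": "main"}))
```

In practice the agent constructs these calls for you; the slugs matter mainly when you want to scope which tools an agent may use.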

## Supported Tools

| Tool slug | Name | Description |
|---|---|---|
| `DATABRICKS_ADD_MEMBER_TO_SECURITY_GROUP` | Add Member to Security Group | Tool to add a user or group as a member to a Databricks security group. Use when you need to grant group membership for access control. |
| `DATABRICKS_AGENTBRICKS_AGENT_BRICKS_DELETE_CUSTOM_LLM` | Delete Custom LLM Agent | Tool to delete a Custom LLM agent created through Agent Bricks. Use when you need to remove a custom LLM and all associated data. This operation is irreversible and deletes all data including temporary transformations, model checkpoints, and internal metadata. |
| `DATABRICKS_APPS_APPS_CREATE` | Create Databricks App | Tool to create a new Databricks app with specified configuration. Use when you need to create apps hosted on Databricks serverless platform to deploy secure data and AI applications. The app name must be unique within the workspace, contain only lowercase alphanumeric characters and hyphens, and cannot be changed after creation. |
| `DATABRICKS_APPS_APPS_DELETE` | Delete Databricks App | Tool to delete a Databricks app from the workspace. Use when you need to remove an app and its associated service principal. When an app is deleted, Databricks automatically deletes the provisioned service principal. |
| `DATABRICKS_DEPLOY_DATABRICKS_APP` | Deploy Databricks App | Tool to create a deployment for a Databricks app. Use when you need to deploy an app with source code from a workspace path. The deployment process provisions compute resources and uploads the source code. Deployments can be in states: IN_PROGRESS, SUCCEEDED, FAILED, or CANCELLED. |
| `DATABRICKS_GET_DATABRICKS_APP_DETAILS` | Get Databricks App Details | Tool to retrieve details about a specific Databricks app by name. Use when you need to get comprehensive information about an app including configuration, deployment status, compute resources, and metadata. |
| `DATABRICKS_GET_DATABRICKS_APP_PERMISSION_LEVELS` | Get Databricks App Permission Levels | Tool to retrieve available permission levels for a Databricks app. Use when you need to understand what permission levels can be assigned to users or groups for a specific app. Returns permission levels like CAN_USE and CAN_MANAGE with their descriptions. |
| `DATABRICKS_GET_DATABRICKS_APP_PERMISSIONS` | Get Databricks App Permissions | Tool to retrieve permissions for a Databricks app. Use when you need to check who has access to an app and their permission levels. Returns the access control list including inherited permissions from parent or root objects. |
| `DATABRICKS_GET_APP_DEPLOYMENT_UPDATE` | Get App Deployment Update | Tool to retrieve information about a specific app deployment update. Use when you need to track the status and details of app deployment updates, including whether the update succeeded, failed, is in progress, or was not updated. |
| `DATABRICKS_SET_DATABRICKS_APP_PERMISSIONS` | Set Databricks App Permissions | Tool to set permissions for a Databricks app, replacing all existing permissions. Use when you need to configure access control for an app. This operation replaces ALL existing permissions; for incremental updates, use the update permissions endpoint instead. Admin permissions cannot be removed. |
| `DATABRICKS_START_DATABRICKS_APP` | Start Databricks App | Tool to start the last active deployment of a Databricks app. Use when you need to start a stopped app, which transitions it to the ACTIVE state. The start operation is asynchronous and the app will transition to the ACTIVE state after the operation completes. |
| `DATABRICKS_STOP_DATABRICKS_APP` | Stop Databricks App | Tool to stop the active deployment of a Databricks app. Use when you need to stop a running app, which transitions it to the STOPPED state. The stop operation is asynchronous and the app will transition to the STOPPED state after the operation completes. |
| `DATABRICKS_UPDATE_DATABRICKS_APP` | Update Databricks App | Tool to update an existing Databricks app configuration. Use when you need to modify app settings such as description, resources, compute size, budget policy, or API scopes. This is a partial update operation - only fields provided in the request will be updated, other fields retain their current values. |
| `DATABRICKS_UPDATE_DATABRICKS_APP_PERMISSIONS` | Update Databricks App Permissions | Tool to incrementally update permissions for a Databricks app. Use when you need to modify specific permissions without replacing the entire permission set. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request. For replacing all permissions, use SetPermissions instead. |
| `DATABRICKS_CANCEL_DATABRICKS_JOB_RUN` | Cancel Databricks Job Run | Tool to cancel a Databricks job run asynchronously. Use when you need to terminate a running job. The run will be terminated shortly after the request completes. If the run is already in a terminal state (TERMINATED, SKIPPED, or INTERNAL_ERROR), this is a no-op. |
| `DATABRICKS_GET_CATALOG_ARTIFACT_ALLOWLIST` | Get Catalog Artifact Allowlist | Tool to retrieve artifact allowlist configuration for a specified artifact type in Unity Catalog. Use when you need to check which artifacts are permitted for use in your Databricks environment. Requires metastore admin privileges or MANAGE ALLOWLIST privilege on the metastore. |
| `DATABRICKS_DELETE_CATALOG` | Delete Catalog | Tool to delete a catalog from Unity Catalog metastore. Use when you need to permanently remove a catalog and optionally its contents. By default, the catalog must be empty (except for information_schema). Use force=true to delete non-empty catalogs. Do not delete the main catalog as it can break existing data operations. |
| `DATABRICKS_GET_CATALOG_DETAILS` | Get Catalog Details | Tool to retrieve details of a specific catalog in Unity Catalog. Use when you need to get information about a catalog including its metadata, owner, properties, and configuration. Requires metastore admin privileges, catalog ownership, or USE_CATALOG privilege. |
| `DATABRICKS_CREATE_CATALOG_CONNECTION` | Create Catalog Connection | Tool to create a new Unity Catalog connection to external data sources. Use when you need to establish connections to databases and services such as MySQL, PostgreSQL, Snowflake, etc. Requires metastore admin privileges or CREATE CONNECTION privilege on the metastore. |
| `DATABRICKS_DELETE_CATALOG_CONNECTION` | Delete Catalog Connection | Tool to delete a Unity Catalog connection to external data sources. Use when you need to remove connections to databases and services. Deleting a connection removes the abstraction used to connect from Databricks Compute to external data sources. |
| `DATABRICKS_GET_CATALOG_CONNECTION` | Get Catalog Connection | Tool to retrieve detailed information about a specific Unity Catalog connection. Use when you need to get connection metadata, configuration, and properties for external data source connections. |
| `DATABRICKS_UPDATE_CATALOG_CONNECTION` | Update Catalog Connection | Tool to update an existing Unity Catalog connection configuration. Use when you need to modify connection properties, credentials, ownership, or metadata for external data sources. |
| `DATABRICKS_CREATE_CATALOG_CREDENTIAL` | Create Catalog Credential | Tool to create a new credential for Unity Catalog access to cloud services. Use when you need to establish authentication for STORAGE (cloud storage) or SERVICE (external services like AWS Glue) purposes. Requires metastore admin or CREATE_STORAGE_CREDENTIAL/CREATE_SERVICE_CREDENTIAL privileges. Exactly one cloud credential type must be provided. |
| `DATABRICKS_DELETE_CATALOG_CREDENTIAL` | Delete Catalog Credential | Tool to delete a Unity Catalog credential for cloud storage or service access. Use when you need to remove credentials that authenticate access to cloud resources. By default, deletion will fail if the credential has dependent resources. Use force=true to delete credentials with dependencies. |
| `DATABRICKS_GENERATE_TEMPORARY_SERVICE_CREDENTIAL` | Generate Temporary Service Credential | Tool to generate temporary credentials from a service credential with admin access. Use when you need short-lived, scoped credentials for accessing cloud resources. The caller must be a metastore admin or have the ACCESS privilege on the service credential. |
| `DATABRICKS_GET_CATALOG_CREDENTIAL` | Get Catalog Credential | Tool to retrieve detailed information about a specific Unity Catalog credential. Use when you need to get credential metadata, configuration, and cloud provider details for storage or service credentials. |
| `DATABRICKS_UPDATE_CATALOG_CREDENTIAL` | Update Catalog Credential | Tool to update an existing Unity Catalog credential with new properties. Use when you need to modify credential configuration, ownership, or cloud provider settings. The caller must be the owner of the credential, a metastore admin, or have MANAGE permission on the credential. If the caller is a metastore admin, only the owner field can be changed. |
| `DATABRICKS_VALIDATE_CATALOG_CREDENTIAL` | Validate Catalog Credential | Tool to validate a Unity Catalog credential for external access. Use when you need to verify that a credential can successfully perform its intended operations. For SERVICE credentials, validates cloud service access. For STORAGE credentials, tests READ, WRITE, DELETE, LIST operations on the specified location. |
| `DATABRICKS_GET_ENTITY_TAG_ASSIGNMENT` | Get Entity Tag Assignment | Tool to retrieve a specific tag assignment for a Unity Catalog entity by tag key. Use when you need to get details about a tag assigned to catalogs, schemas, tables, columns, or volumes. Requires USE CATALOG and USE SCHEMA permissions on parent resources, and ASSIGN or MANAGE permissions on the tag policy for governed tags. |
| `DATABRICKS_CREATE_EXTERNAL_LOCATION` | Create External Location | Tool to create a new Unity Catalog external location combining a cloud storage path with a storage credential. Use when you need to establish access to cloud storage in Azure Data Lake Storage, AWS S3, or Cloudflare R2. Requires metastore admin or CREATE_EXTERNAL_LOCATION privilege on both the metastore and the associated storage credential. |
| `DATABRICKS_DELETE_EXTERNAL_LOCATION` | Delete External Location | Tool to delete an external location from Unity Catalog metastore. Use when you need to remove an external location that combines a cloud storage path with a storage credential. The caller must be the owner of the external location. Use force=true to delete even if there are dependent external tables or mounts. |
| `DATABRICKS_GET_EXTERNAL_LOCATION_DETAILS` | Get External Location Details | Tool to retrieve details of a specific Unity Catalog external location. Use when you need to get information about an external location including its URL, storage credential, and configuration. Requires metastore admin privileges, external location ownership, or appropriate privileges on the external location. |
| `DATABRICKS_UPDATE_EXTERNAL_LOCATION` | Update External Location | Tool to update an existing Unity Catalog external location properties. Use when you need to modify the cloud storage path, credentials, ownership, or configuration of an external location. The caller must be the owner of the external location or a metastore admin. Use force parameter to update even if URL changes invalidate dependencies. |
| `DATABRICKS_UPDATE_EXTERNAL_METADATA` | Update External Metadata | Tool to update an external metadata object in Unity Catalog. Use when you need to modify metadata about external systems registered within Unity Catalog. The user must have metastore admin status, own the object, or possess the MODIFY privilege. Note that changing ownership requires the MANAGE privilege, and callers cannot update both the owner and other metadata in a single request. |
| `DATABRICKS_UPDATE_CATALOG_FUNCTION` | Update Catalog Function | Tool to update function owner in Unity Catalog. Use when you need to change the ownership of a catalog function. Only the owner of the function can be updated via this endpoint. The caller must be a metastore admin, the owner of the function's parent catalog, the owner of the parent schema with USE_CATALOG privilege, or the owner of the function with both USE_CATALOG and USE_SCHEMA privileges. |
| `DATABRICKS_GET_CATALOG_GRANTS` | Get Catalog Grants | Tool to get permissions (grants) for a securable in Unity Catalog without inherited permissions. Use when you need to see direct privilege assignments on a catalog or other securable object. Returns only privileges directly assigned to principals, excluding inherited permissions from parent securables. For inherited permissions, use the get-effective endpoint instead. |
| `DATABRICKS_GET_EFFECTIVE_CATALOG_PERMISSIONS` | Get Effective Catalog Permissions | Tool to get effective permissions for a securable in Unity Catalog, including inherited permissions from parent securables. Use when you need to understand what privileges are granted to principals through direct assignments or inheritance. Returns privileges conveyed to each principal through the Unity Catalog hierarchy (metastore → catalog → schema → table/view/volume). |
| `DATABRICKS_UPDATE_CATALOG_GRANTS` | Update Catalog Grants | Tool to update permissions for Unity Catalog securables by adding or removing privileges for principals. Use when you need to grant or revoke permissions on catalogs, schemas, tables, or other Unity Catalog objects. Only metastore admins, object owners, users with MANAGE privilege, or parent catalog/schema owners can update permissions. |
| `DATABRICKS_ASSIGN_METASTORE_TO_WORKSPACE` | Assign Metastore to Workspace | Tool to assign a Unity Catalog metastore to a workspace. Use when you need to link a workspace to a Unity Catalog metastore, enabling shared data access with consistent governance policies. Requires account admin privileges. If an assignment for the same workspace_id exists, it will be overwritten by the new metastore_id and default_catalog_name. |
| `DATABRICKS_CREATE_METASTORE` | Create Metastore | Tool to create a new Unity Catalog metastore. Use when you need to establish a top-level container for data in Unity Catalog, registering metadata about securable objects (tables, volumes, external locations, shares) and access permissions. Requires account admin privileges. By default, the owner is the user calling the API; setting owner to empty string assigns ownership to System User. |
| `DATABRICKS_GET_CURRENT_METASTORE_ASSIGNMENT` | Get Current Metastore Assignment | Tool to retrieve the current metastore assignment for the workspace being accessed. Use when you need to determine which metastore is assigned to the current workspace context. |
| `DATABRICKS_DELETE_METASTORE` | Delete Metastore | Tool to delete a Unity Catalog metastore. Use when you need to permanently remove a metastore and its managed data. Before deletion, you must delete or unlink any workspaces using the metastore. All objects managed by the metastore will become inaccessible. Requires metastore admin privileges. |
| `DATABRICKS_GET_METASTORE_DETAILS` | Get Metastore Details | Tool to retrieve detailed information about a specific Unity Catalog metastore by its ID. Use when you need to get metastore configuration, ownership, storage settings, and Delta Sharing details. Requires metastore admin permissions. |
| `DATABRICKS_GET_METASTORE_SUMMARY` | Get Metastore Summary | Tool to retrieve summary information about the metastore associated with the current workspace. Use when you need metastore configuration overview including cloud vendor, region, storage, and Delta Sharing details. |
| `DATABRICKS_UNASSIGN_METASTORE_FROM_WORKSPACE` | Unassign Metastore from Workspace | Tool to unassign a Unity Catalog metastore from a workspace. Use when you need to remove the association between a workspace and its assigned metastore, leaving the workspace with no metastore. The metastore itself is not deleted, only the workspace assignment is removed. Requires account admin privileges. |
| `DATABRICKS_UPDATE_METASTORE` | Update Metastore | Tool to update configuration settings for an existing Unity Catalog metastore. Use when you need to modify metastore properties like name, owner, Delta Sharing settings, or storage credentials. Requires metastore admin permissions. |
| `DATABRICKS_UPDATE_METASTORE_ASSIGNMENT` | Update Metastore Assignment | Tool to update a metastore assignment for a workspace. Use when you need to update the metastore_id or default_catalog_name for a workspace that already has a metastore assigned. Account admin privileges are required to update metastore_id, while workspace admin can update default_catalog_name. |
| `DATABRICKS_GET_MODEL_VERSION` | Get Model Version | Tool to retrieve detailed information about a specific version of a registered model in Unity Catalog. Use when you need to get metadata, status, source location, and configuration of a model version. Requires metastore admin privileges, model ownership, or EXECUTE privilege on the registered model with appropriate catalog and schema privileges. |
| `DATABRICKS_UPDATE_MODEL_VERSION` | Update Model Version | Tool to update a Unity Catalog model version. Use when you need to modify the comment of a specific model version. Currently only the comment field can be updated. The caller must be a metastore admin or owner of the parent registered model with appropriate catalog and schema privileges. |
| `DATABRICKS_DELETE_ONLINE_TABLE` | Delete Online Table | Tool to delete an online table by name. Use when you need to permanently remove an online table and stop data synchronization. This operation deletes all data in the online table permanently and releases all resources. Note: online tables are deprecated and will not be accessible after January 15, 2026. |
| `DATABRICKS_GET_QUALITY_MONITOR` | Get Quality Monitor | Tool to retrieve quality monitor configuration for a Unity Catalog table. Use when you need to get monitor status, metrics tables, custom metrics, notifications, scheduling, and monitoring configuration details. Requires catalog and schema privileges plus SELECT on the table. |
| `DATABRICKS_LIST_QUALITY_MONITOR_REFRESHES` | List Quality Monitor Refreshes | Tool to retrieve the refresh history for a quality monitor on a Unity Catalog table. Use when you need to check the status and history of monitor refresh operations. Returns up to 25 most recent refreshes including their state, timing, and status messages. |
| `DATABRICKS_GET_REGISTERED_MODEL` | Get Registered Model | Tool to retrieve detailed information about a registered model in Unity Catalog. Use when you need to get metadata, owner, storage location, and configuration of a registered model. Requires metastore admin privileges, model ownership, or EXECUTE privilege on the registered model with appropriate catalog and schema privileges. |
| `DATABRICKS_GET_RESOURCE_QUOTA_INFORMATION` | Get Resource Quota Information | Tool to retrieve usage information for a Unity Catalog resource quota defined by a child-parent pair. Use when you need to check quota usage for a specific resource type (tables per metastore, schemas per catalog, etc.). The API also triggers an asynchronous refresh if the count is out of date. Requires account admin authentication with OAuth. |
| `DATABRICKS_BATCH_CREATE_ACCESS_REQUESTS` | Batch Create Access Requests | Tool to batch create access requests for Unity Catalog permissions. Use when you need to request access to catalogs, schemas, tables, or other Unity Catalog securables. Maximum 30 requests per API call, and maximum 30 securables per principal per call. |
| `DATABRICKS_GET_ACCESS_REQUEST_DESTINATIONS` | Get Access Request Destinations | Tool to retrieve access request destinations for a Unity Catalog securable. Use when you need to find where notifications are sent when users request access to catalogs, schemas, tables, or other securables. Any caller can see URL destinations or destinations on the metastore. For other securables, only those with BROWSE permissions can see destinations. |
| `DATABRICKS_UPDATE_ACCESS_REQUEST_DESTINATIONS` | Update Access Request Destinations | Tool to update access request notification destinations for Unity Catalog securables. Use when you need to configure where access request notifications are sent for catalogs, schemas, or external locations. Requires metastore admin, owner privileges, or MANAGE permission on the securable. Maximum 5 emails and 5 external destinations allowed per securable. |
| `DATABRICKS_GET_CATALOG_SCHEMA` | Get Catalog Schema | Tool to retrieve details of a specific schema from Unity Catalog metastore. Use when you need to get schema metadata, ownership, storage configuration, and properties. Requires metastore admin privileges, schema ownership, or USE_SCHEMA privilege. |
| `DATABRICKS_CREATE_STORAGE_CREDENTIAL` | Create Storage Credential | Tool to create a new storage credential in Unity Catalog for cloud data access. Use when you need to establish authentication for accessing cloud storage paths. Requires metastore admin or CREATE_STORAGE_CREDENTIAL privilege on the metastore. Exactly one cloud credential type must be provided. |
| `DATABRICKS_DELETE_STORAGE_CREDENTIAL` | Delete Storage Credential | Tool to delete a storage credential from the Unity Catalog metastore. Use when you need to remove storage credentials that provide authentication to cloud storage. The caller must be the owner of the storage credential. Use force=true to delete even if there are dependent external locations, tables, or services. |
| `DATABRICKS_GET_STORAGE_CREDENTIAL` | Get Storage Credential | Tool to retrieve storage credential details from Unity Catalog metastore by name. Use when you need to get information about a storage credential's configuration and properties. Requires metastore admin privileges, credential ownership, or appropriate permissions on the storage credential. |
| `DATABRICKS_UPDATE_STORAGE_CREDENTIAL` | Update Storage Credential | Tool to update an existing storage credential in Unity Catalog. Use when you need to modify credential properties, cloud provider configuration, or ownership. The caller must be the owner of the storage credential or a metastore admin. Metastore admins can only modify the owner field. |
| `DATABRICKS_VALIDATE_STORAGE_CREDENTIAL` | Validate Storage Credential | Tool to validate a storage credential configuration for Unity Catalog. Use when you need to verify that a storage credential can successfully access a cloud storage location. Requires metastore admin, storage credential owner, or CREATE_EXTERNAL_LOCATION privilege. |
| `DATABRICKS_DISABLE_SYSTEM_SCHEMA` | Disable System Schema | Tool to disable a system schema in Unity Catalog metastore. Use when you need to remove a system schema from the system catalog. System schemas store information about customer usage patterns such as audit logs, billing information, and lineage data. Requires account admin or metastore admin privileges. |
| `DATABRICKS_ENABLE_SYSTEM_SCHEMA` | Enable System Schema | Tool to enable a system schema in Unity Catalog metastore. Use when you need to activate a system schema to track customer usage patterns. System schemas store information about audit logs, billing, compute usage, storage, lineage, and marketplace data. Requires account admin or metastore admin privileges. |
| `DATABRICKS_DELETE_CATALOG_TABLE` | Delete Catalog Table | Tool to delete a table from Unity Catalog. Use when you need to permanently remove a table from its parent catalog and schema. The operation requires appropriate permissions on the parent catalog, schema, and table. |
| `DATABRICKS_CHECK_TABLE_EXISTS` | Check Table Exists | Tool to check if a table exists in Unity Catalog metastore. Use when you need to verify table existence before performing operations. Requires metastore admin privileges, table ownership with SELECT privilege, or USE_CATALOG and USE_SCHEMA privileges on parent objects. |
| `DATABRICKS_GET_CATALOG_TABLE_DETAILS` | Get Catalog Table Details | Tool to retrieve comprehensive metadata about a table from Unity Catalog metastore. Use when you need detailed table information including columns, type, storage, constraints, and governance metadata. Requires metastore admin privileges, table ownership, or SELECT privilege on the table, plus USE_CATALOG and USE_SCHEMA privileges on parent objects. |
| `DATABRICKS_UPDATE_CATALOG_TABLE` | Update Catalog Table | Tool to update Unity Catalog table properties. Use when you need to change the owner or comment of a table. The caller must be the owner of the parent catalog, have the USE_CATALOG privilege on the parent catalog and be the owner of the parent schema, or be the owner of the table and have the USE_CATALOG privilege on the parent catalog and the USE_SCHEMA privilege on the parent schema. |
| `DATABRICKS_GENERATE_TEMPORARY_PATH_CREDENTIALS` | Generate Temporary Path Credentials | Tool to generate short-lived, scoped temporary credentials for accessing external storage locations registered in Unity Catalog. Use when you need temporary access to cloud storage paths with specific read/write permissions. The credentials inherit the privileges of the requesting principal and are valid for a limited time. The requesting principal must have EXTERNAL USE LOCATION privilege on the external location. |
| `DATABRICKS_GET_CATALOG_VOLUME_DETAILS` | Get Catalog Volume Details | Tool to retrieve detailed information about a specific Unity Catalog volume. Use when you need to get volume metadata including type, storage location, owner, and timestamps. Requires metastore admin privileges or volume ownership with appropriate USE_CATALOG and USE_SCHEMA privileges on parent objects. |
| `DATABRICKS_UPDATE_CATALOG_WORKSPACE_BINDINGS` | Update Catalog Workspace Bindings | Tool to update workspace bindings for a Unity Catalog securable (catalog). Use when you need to control which workspaces can access a catalog. Allows adding or removing workspace bindings with read-write or read-only access. Caller must be a metastore admin or owner of the catalog. |
| `DATABRICKS_GET_CLEAN_ROOM_ASSET` | Get Clean Room Asset | Tool to retrieve detailed information about a specific asset within a Databricks Clean Room. Use when you need to get metadata and configuration for clean room assets such as tables, views, notebooks, volumes, or foreign tables. |
| `DATABRICKS_CREATE_CLEAN_ROOM_AUTO_APPROVAL_RULE` | Create Clean Room Auto-Approval Rule | Tool to create a new auto-approval rule for a Databricks Clean Room. Use when you need to automatically approve notebooks shared by other collaborators that meet specific criteria. In 2-person clean rooms, auto-approve notebooks from the other collaborator using author_collaborator_alias. In multi-collaborator clean rooms, use author_scope=ANY_AUTHOR to auto-approve from any author. |
| `DATABRICKS_CREATE_CLEAN_ROOM` | Create Clean Room | Tool to create a new Databricks Clean Room for secure data collaboration with specified collaborators. Use when you need to establish a collaborative environment for multi-party data analysis. This is an asynchronous operation; the clean room starts in PROVISIONING state and becomes ACTIVE when ready. Requires metastore admin privileges or CREATE_CLEAN_ROOM privilege on the metastore. |
| `DATABRICKS_CREATE_COMPUTE_CLUSTER_POLICY` | Create Compute Cluster Policy | Tool to create a new cluster policy with prescribed settings for controlling cluster creation. Use when you need to establish policies that govern cluster configurations. Only admin users can create cluster policies. |
| `DATABRICKS_DELETE_COMPUTE_CLUSTER_POLICY` | Delete Compute Cluster Policy | Tool to delete a cluster policy. Use when you need to remove a cluster policy from the workspace. Clusters governed by this policy can still run, but cannot be edited. Only workspace admin users can delete policies. This operation is permanent and cannot be undone. |
| `DATABRICKS_EDIT_COMPUTE_CLUSTER_POLICY` | Edit Compute Cluster Policy | Tool to update an existing Databricks cluster policy. Use when you need to modify policy settings like name, definition, or restrictions. Note that this operation may make some clusters governed by the previous policy invalid. |
| `DATABRICKS_GET_COMPUTE_CLUSTER_POLICY` | Get Compute Cluster Policy | Tool to retrieve detailed information about a specific cluster policy by its ID. Use when you need to view the configuration and settings of an existing cluster policy. |
| `DATABRICKS_GET_COMPUTE_CLUSTER_POLICY_PERMISSION_LEVELS` | Get Compute Cluster Policy Permission Levels | Tool to retrieve available permission levels for a Databricks cluster policy. Use when you need to understand what permission levels can be assigned to users, groups, or service principals for a specific cluster policy. Returns permission levels like CAN_USE with their descriptions. |
| `DATABRICKS_GET_COMPUTE_CLUSTER_POLICY_PERMISSIONS` | Get Compute Cluster Policy Permissions | Tool to retrieve permissions for a Databricks cluster policy. Use when you need to check who has access to a specific cluster policy and their permission levels. Returns the access control list with user, group, and service principal permissions, including inherited permissions from parent objects. |
| `DATABRICKS_SET_COMPUTE_CLUSTER_POLICY_PERMISSIONS` | Set Compute Cluster Policy Permissions | Tool to set permissions for a Databricks cluster policy, replacing all existing permissions. Use when you need to configure access control for a cluster policy. This operation replaces ALL existing permissions; non-admin users must be granted permissions to access the policy. Workspace admins always have permissions on all policies. |
| `DATABRICKS_UPDATE_CLUSTER_POLICY_PERMISSIONS` | Update Cluster Policy Permissions | Tool to incrementally update permissions on a Databricks cluster policy. Use when you need to modify permissions without replacing the entire permission set. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request. |
| `DATABRICKS_DELETE_COMPUTE_CLUSTER` | Delete Compute Cluster | Tool to terminate a Databricks compute cluster. Use when you need to stop and delete a cluster. The cluster configuration is retained for 30 days after termination, after which it is permanently deleted. |
| `DATABRICKS_EDIT_COMPUTE_CLUSTER` | Edit Compute Cluster | Tool to update the configuration of a Databricks cluster. Use when you need to modify cluster settings like size, Spark version, or cloud-specific attributes. The cluster must be in RUNNING or TERMINATED state. Running clusters will restart to apply changes. |
| `DATABRICKS_GET_COMPUTE_CLUSTER_PERMISSION_LEVELS` | Get Compute Cluster Permission Levels | Tool to retrieve available permission levels for a Databricks compute cluster. Use when you need to understand what permission levels can be assigned to users, groups, or service principals for a specific cluster. Returns permission levels like CAN_ATTACH_TO, CAN_RESTART, and CAN_MANAGE with their descriptions. |
| `DATABRICKS_LIST_COMPUTE_CLUSTER_NODE_TYPES` | List Compute Cluster Node Types | Tool to list all supported Spark node types available for cluster launch in the workspace region. Use when you need to determine which instance types are available for creating or configuring clusters. Returns detailed specifications including compute resources, storage capabilities, and cloud-specific attributes for each node type. |
| `DATABRICKS_LIST_COMPUTE_CLUSTER_AVAILABILITY_ZONES` | List Compute Cluster Availability Zones | Tool to list availability zones where Databricks clusters can be created. Use when you need to determine available zones for cluster deployment or planning redundancy. Returns the default zone and a list of all zones available in the workspace's cloud region. This endpoint is available for AWS workspaces. |
| `DATABRICKS_PERMANENTLY_DELETE_COMPUTE_CLUSTER` | Permanently Delete Compute Cluster | Tool to permanently delete a Databricks compute cluster. Use when you need to irreversibly remove a cluster and its resources. After permanent deletion, the cluster will no longer appear in the cluster list and cannot be recovered. |
| `DATABRICKS_PIN_COMPUTE_CLUSTER` | Pin Compute Cluster | Tool to pin a Databricks compute cluster configuration. Use when you need to preserve a cluster's configuration beyond the standard 30-day retention period. This operation is idempotent - pinning an already-pinned cluster has no effect. Requires workspace administrator privileges. |
| `DATABRICKS_LIST_COMPUTE_CLUSTER_SPARK_VERSIONS` | List Compute Cluster Spark Versions | Tool to list all available Databricks Runtime Spark versions for cluster creation. Use when you need to determine which Spark versions are available for creating or configuring clusters. The 'key' field from the response should be used as the 'spark_version' parameter when creating clusters. |
| `DATABRICKS_START_COMPUTE_CLUSTER` | Start Compute Cluster | Tool to start a terminated Databricks compute cluster asynchronously. Use when you need to restart a stopped cluster. The cluster transitions through PENDING state before reaching RUNNING. Poll cluster status to verify when fully started. |
| `DATABRICKS_UNPIN_COMPUTE_CLUSTER` | Unpin Compute Cluster | Tool to unpin a Databricks compute cluster configuration. Use when you need to allow a cluster's configuration to be removed after termination. This operation is idempotent - unpinning an already-unpinned cluster has no effect. Requires workspace administrator privileges. |
| `DATABRICKS_UPDATE_COMPUTE_CLUSTER` | Update Compute Cluster | Tool to partially update a Databricks compute cluster configuration using field masks. Use when you need to update specific cluster attributes without providing a full configuration. The update_mask specifies which fields to modify. Running clusters restart to apply changes; terminated clusters apply changes on next startup. |
| `DATABRICKS_CREATE_GLOBAL_INIT_SCRIPT` | Create Global Init Script | Tool to create a new global initialization script in Databricks workspace. Use when you need to run scripts on every node in every cluster. Global init scripts run on all cluster nodes and only workspace admins can create them. Scripts execute in position order and clusters must restart to apply changes. The script cannot exceed 64KB when decoded. |
| `DATABRICKS_DELETE_GLOBAL_INIT_SCRIPT` | Delete Global Init Script | Tool to delete a global initialization script from Databricks workspace. Use when you need to remove a script that runs on every cluster node. Requires workspace administrator privileges. Clusters must restart to reflect the removal of the script. |
| `DATABRICKS_GET_GLOBAL_INIT_SCRIPT` | Get Global Init Script | Tool to retrieve complete details of a global initialization script in Databricks workspace. Use when you need to view script configuration, Base64-encoded content, or metadata. Returns all script details including creation/update timestamps and whether the script is enabled. |
| `DATABRICKS_UPDATE_GLOBAL_INIT_SCRIPT` | Update Global Init Script | Tool to update a global initialization script in Databricks workspace. Use when you need to modify script content, name, enabled status, or execution order. All fields are optional; unspecified fields retain their current value. Existing clusters must be restarted to pick up changes. |
| `DATABRICKS_CREATE_COMPUTE_INSTANCE_POOL` | Create Compute Instance Pool | Tool to create a new Databricks instance pool with specified configuration. Use when you need to set up a pool that reduces cluster start and auto-scaling times by maintaining idle, ready-to-use cloud instances. When attached to a pool, a cluster allocates driver and worker nodes from the pool. |
| `DATABRICKS_DELETE_COMPUTE_INSTANCE_POOL` | Delete Compute Instance Pool | Tool to delete a Databricks compute instance pool. Use when you need to permanently remove an instance pool. The idle instances in the pool are terminated asynchronously after deletion. |
| `DATABRICKS_EDIT_COMPUTE_INSTANCE_POOL` | Edit Compute Instance Pool | Tool to modify the configuration of an existing Databricks instance pool. Use when you need to update pool settings like capacity, termination minutes, or preloaded images. Note that the pool's node type cannot be changed after creation, though it must still be provided with the same value. |
| `DATABRICKS_GET_INSTANCE_POOL_DETAILS` | Get Instance Pool Details | Tool to retrieve detailed information about a Databricks instance pool by its ID. Use when you need to get instance pool configuration, capacity settings, preloaded images, and usage statistics. Instance pools reduce cluster start and auto-scaling times by maintaining idle, ready-to-use cloud instances. |
| `DATABRICKS_GET_INSTANCE_POOL_PERMISSION_LEVELS` | Get Instance Pool Permission Levels | Tool to retrieve available permission levels for a Databricks instance pool. Use when you need to understand what permission levels can be assigned to users, groups, or service principals for a specific instance pool. Returns permission levels like CAN_ATTACH_TO and CAN_MANAGE with their descriptions. |
| `DATABRICKS_GET_INSTANCE_POOL_PERMISSIONS` | Get Instance Pool Permissions | Tool to retrieve permissions for a Databricks instance pool. Use when you need to check who has access to a specific instance pool and their permission levels. Returns the access control list with user, group, and service principal permissions, including inherited permissions from parent objects. |
| `DATABRICKS_SET_COMPUTE_INSTANCE_POOL_PERMISSIONS` | Set Compute Instance Pool Permissions | Tool to set permissions for a Databricks instance pool, replacing all existing permissions. Use when you need to configure access control for an instance pool. This operation replaces ALL existing permissions. You must have CAN_MANAGE permission on a pool to configure its permissions. |
| `DATABRICKS_UPDATE_INSTANCE_POOL_PERMISSIONS` | Update Instance Pool Permissions | Tool to incrementally update permissions on a Databricks instance pool. Use when you need to modify permissions without replacing the entire permission set. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request. |
| `DATABRICKS_ADD_COMPUTE_INSTANCE_PROFILE` | Add Compute Instance Profile | Tool to register an instance profile in Databricks for cluster launches. Use when administrators need to grant users permission to launch clusters using that profile. Requires admin access. Successfully registered profiles enable clusters to use the associated IAM role. |
| `DATABRICKS_EDIT_COMPUTE_INSTANCE_PROFILE` | Edit Compute Instance Profile | Tool to modify an existing AWS EC2 instance profile registered with Databricks. Use when you need to update the IAM role ARN associated with an instance profile. This operation is only available to admin users. The IAM role ARN is required if both of the following are true: your role name and instance profile name do not match, and you want to use the instance profile with Databricks SQL Serverless. |
| `DATABRICKS_REMOVE_COMPUTE_INSTANCE_PROFILE` | Remove Compute Instance Profile | Tool to remove an instance profile from Databricks. Use when you need to unregister an AWS instance profile ARN from Databricks. This operation is only accessible to admin users. Existing clusters with this instance profile will continue to function normally after removal. |
| `DATABRICKS_ENFORCE_CLUSTER_POLICY_COMPLIANCE` | Enforce Cluster Policy Compliance | Tool to update a cluster to be compliant with the current version of its policy. Use when you need to enforce policy compliance on a cluster. The cluster can be updated if it is in a RUNNING or TERMINATED state. Note: Clusters created by Databricks Jobs, DLT, or Models cannot be enforced by this API. |
| `DATABRICKS_GET_CLUSTER_POLICY_COMPLIANCE` | Get Cluster Policy Compliance | Tool to retrieve policy compliance status for a specific cluster. Use when you need to check whether a cluster meets the requirements of its assigned policy and identify any policy violations. Clusters could be out of compliance if their policy was updated after the cluster was last edited. |
| `DATABRICKS_GET_COMPUTE_POLICY_FAMILIES` | Get Compute Policy Families | Tool to retrieve information for a policy family by identifier and optional version. Use when you need to view Databricks-provided templates for configuring clusters for a particular use case. Policy families cannot be created, edited, or deleted by users. |
| `DATABRICKS_CREATE_DATABRICKS_CLUSTER` | Create Databricks Cluster | Tool to create a new Databricks Spark cluster with specified configuration. Use when you need to provision compute resources for data processing. This is an asynchronous operation that returns a cluster_id immediately with the cluster in PENDING state. The cluster transitions through states until reaching RUNNING. |
| `DATABRICKS_CREATE_GENIE_MESSAGE` | Create Genie Message | Tool to create a message in a Genie conversation and get AI-generated responses. Use when you need to ask questions or send messages to Genie for data analysis. The response initially has status 'IN_PROGRESS' and should be polled every 1-5 seconds until reaching COMPLETED, FAILED, or CANCELLED status. Subject to 5 queries-per-minute rate limit during Public Preview. |
| `DATABRICKS_CREATE_GENIE_SPACE` | Create Genie Space | Tool to create a new Genie space from a serialized payload for programmatic space management. Use when you need to create a Genie workspace for AI-powered data analysis with predefined sample questions and data sources. The space requires a SQL warehouse ID and a serialized configuration that includes sample questions, instructions, and data source tables. |
| `DATABRICKS_DELETE_GENIE_CONVERSATION` | Delete Genie Conversation | Tool to delete a conversation from a Genie space programmatically. Use when you need to remove conversations to manage the Genie space limits (10,000 conversations per space). Useful for deleting older or test conversations that are no longer needed. |
| `DATABRICKS_DELETE_GENIE_CONVERSATION_MESSAGE` | Delete Genie Conversation Message | Tool to delete a specific message from a Genie conversation. Use when you need to remove individual messages from conversations. This operation permanently deletes the message and cannot be undone. |
| `DATABRICKS_EXECUTE_MESSAGE_ATTACHMENT_QUERY` | Execute Message Attachment Query | Tool to execute SQL query for an expired message attachment in a Genie space. Use when a query attachment has expired and needs to be re-executed to retrieve fresh results. Returns SQL statement execution results with schema, metadata, and data. |
| `DATABRICKS_EXECUTE_GENIE_MESSAGE_QUERY` | Execute Genie Message Query | Tool to execute the SQL query associated with a Genie message. Use when you need to run the query generated by Genie and retrieve result data. Note: This endpoint is deprecated in favor of ExecuteMessageAttachmentQuery. |
| `DATABRICKS_GET_GENIE_MESSAGE` | Get Genie Message | Tool to retrieve details of a specific message from a Genie conversation. Use when you need to get message content, status, attachments, or check processing status of a previously created message. |
| `DATABRICKS_GET_GENIE_MESSAGE_ATTACHMENT_QUERY_RESULT` | Get Genie Message Attachment Query Result | Tool to retrieve SQL query results from a Genie message attachment. Use when the message status is EXECUTING_QUERY or COMPLETED and you need to fetch the actual query execution results. Returns statement execution details including query data, schema, and metadata with a maximum of 5000 rows. |
| `DATABRICKS_GET_GENIE_MESSAGE_QUERY_RESULT` | Get Genie Message Query Result | Tool to retrieve SQL query execution results for a Genie message attachment (up to 5,000 rows). Use when the message status is EXECUTING_QUERY or COMPLETED and the message has a query attachment. Returns query results with schema, metadata, and data in inline or external link format. Note: This endpoint is deprecated; consider using GetMessageAttachmentQueryResult instead. |
| `DATABRICKS_GET_GENIE_SPACE_DETAILS` | Get Genie Space Details | Tool to retrieve detailed information about a specific Databricks Genie space by ID. Use when you need to get configuration details, metadata, and optionally the serialized space content for backup or promotion across workspaces. Requires at least CAN EDIT permission to retrieve the serialized space content. |
| `DATABRICKS_LIST_GENIE_CONVERSATION_MESSAGES` | List Genie Conversation Messages | Tool to retrieve all messages from a specific conversation thread in a Genie space. Use when you need to view the complete message history of a conversation including user queries and AI responses. Supports pagination for conversations with many messages. |
| `DATABRICKS_LIST_GENIE_CONVERSATIONS` | List Genie Conversations | Tool to retrieve all existing conversation threads within a Genie space. Use when you need to view conversations in a Genie space, either for the current user or all users if you have CAN MANAGE permission. Supports pagination for spaces with many conversations. |
| `DATABRICKS_LIST_GENIE_SPACES` | List Genie Spaces | Tool to retrieve all Genie spaces in the workspace that the authenticated user has access to. Use when you need to list available Genie spaces, their metadata, and warehouse associations. Supports pagination for workspaces with many spaces. |
| `DATABRICKS_SEND_GENIE_MESSAGE_FEEDBACK` | Send Genie Message Feedback | Tool to send feedback for a Genie message. Use when you need to provide positive, negative, or no feedback rating for AI-generated messages in Genie conversations. Positive feedback on responses that join tables or use SQL expressions can prompt Genie to suggest new SQL snippets to space managers for review and approval. |
| `DATABRICKS_START_GENIE_CONVERSATION` | Start Genie Conversation | Tool to start a new Genie conversation in a Databricks space for natural language data queries. Use when you need to ask questions about data using natural language. The message processes asynchronously, so initial status will be IN_PROGRESS. Poll the message status to get the completed response with query results. |
| `DATABRICKS_TRASH_GENIE_SPACE` | Trash Genie Space | Tool to move a Genie space to trash instead of permanently deleting it. Use when you need to remove a Genie space while retaining recovery options. Trashed spaces follow standard Databricks trash behavior with 30-day retention before permanent deletion. Requires CAN MANAGE permission on the space. |
| `DATABRICKS_UPDATE_GENIE_SPACE` | Update Genie Space | Tool to update an existing Genie space configuration. Use when you need to modify a Genie space's title, description, warehouse assignment, or complete serialized configuration. Supports partial updates (only provide fields you want to change) or full replacement via serialized_space. Useful for CI/CD pipelines, version control, and automated space management. |
| `DATABRICKS_CREATE_LAKEVIEW_DASHBOARD` | Create Lakeview Dashboard | Tool to create a new Lakeview dashboard in Databricks. Use when you need to create AI/BI dashboards for data visualization and analytics. Only the display_name parameter is required to create a blank dashboard, or you can provide serialized_dashboard to duplicate an existing dashboard. |
| `DATABRICKS_DELETE_LAKEVIEW_DASHBOARD_SCHEDULE` | Delete Lakeview Dashboard Schedule | Tool to delete a dashboard schedule from a Lakeview dashboard. Use when you need to remove scheduled refreshes or updates for a dashboard. Provide the etag parameter to ensure the schedule hasn't been modified since last retrieval (optimistic concurrency control). |
| `DATABRICKS_GET_PUBLISHED_DASHBOARD_TOKEN_INFO` | Get Published Dashboard Token Info | Tool to retrieve authorization info for generating downscoped tokens to access published Lakeview dashboards. Use when you need to generate OAuth tokens for dashboard embedding for external users, ensuring tokens are properly scoped to prevent leaking privileged access. |
| `DATABRICKS_GET_LAKEVIEW_DASHBOARD_DETAILS` | Get Lakeview Dashboard Details | Tool to retrieve details about a draft AI/BI Lakeview dashboard from the workspace. Use when you need to get comprehensive information about a dashboard including metadata, configuration, state, and serialized dashboard content. |
| `DATABRICKS_GET_PUBLISHED_LAKEVIEW_DASHBOARD` | Get Published Lakeview Dashboard | Tool to retrieve the current published version of a Lakeview dashboard. Use when you need to get information about the published dashboard including its display name, embedded credentials status, warehouse configuration, and last revision timestamp. |
| `DATABRICKS_GET_LAKEVIEW_DASHBOARD_SCHEDULE` | Get Lakeview Dashboard Schedule | Tool to retrieve a specific schedule for a Databricks AI/BI Lakeview dashboard. Use when you need to get schedule details including cron expressions, pause status, warehouse configuration, and subscription information. Each dashboard can have up to 10 schedules, with each schedule supporting up to 100 subscriptions. |
| `DATABRICKS_PUBLISH_LAKEVIEW_DASHBOARD` | Publish Lakeview Dashboard | Tool to publish an AI/BI Lakeview dashboard making it accessible via public link. Use when you need to publish a draft dashboard with embedded credentials and assign a warehouse for query execution. After successful publication, the dashboard becomes accessible at `https://<workspace-url>/dashboardsv3/<dashboard-id>/published`. |
| `DATABRICKS_TRASH_LAKEVIEW_DASHBOARD` | Trash Lakeview Dashboard | Tool to move a Lakeview dashboard to trash instead of permanently deleting it. Use when you need to remove a dashboard while retaining recovery options. Trashed dashboards can be recovered within 30 days before permanent deletion. |
| `DATABRICKS_UNPUBLISH_LAKEVIEW_DASHBOARD` | Unpublish Lakeview Dashboard | Tool to unpublish an AI/BI Lakeview dashboard while preserving its draft version. Use when you need to remove the published version of a dashboard. The draft version remains available and can be republished later if needed. |
| `DATABRICKS_UPDATE_LAKEVIEW_DASHBOARD` | Update Lakeview Dashboard | Tool to update a draft Lakeview dashboard configuration and metadata. Use when you need to modify dashboard properties such as display name, warehouse, location, or content. This is a partial update operation - only provided fields will be updated. The etag field can be used for optimistic concurrency control to prevent conflicts from concurrent modifications. |
| `DATABRICKS_CREATE_DATABASE_CATALOG` | Create Database Catalog | Tool to create a new database catalog in Databricks. Use when you need to establish a catalog for organizing database objects within a specific database instance. Requires appropriate database permissions. |
| `DATABRICKS_CREATE_DATABASE_INSTANCE` | Create Database Instance | Tool to create a Lakebase database instance with specified configuration. Use when you need to provision a new database instance in Databricks with database owner and superuser role. The creator receives full administrative capabilities on the instance. |
| `DATABRICKS_DELETE_DATABASE_INSTANCE` | Delete Database Instance | Tool to delete a Lakebase Postgres database instance. Use when you need to permanently remove a database instance and all associated data. The instance should be stopped before deletion, and users must have CAN MANAGE permissions. This operation cannot be undone. |
| `DATABRICKS_DELETE_SYNCED_DATABASE_TABLE` | Delete Synced Database Table | Tool to delete a synced table from Unity Catalog and stop data refreshes. Use when you need to deregister a synced table connection between Unity Catalog and a database instance. Note: The underlying Postgres table remains and must be manually dropped to free space. |
| `DATABRICKS_FIND_DATABASE_INSTANCE_BY_UID` | Find Database Instance By UID | Tool to find a database instance by its unique identifier (UID). Use when you need to retrieve instance details using the immutable UUID instead of the instance name. |
| `DATABRICKS_GENERATE_DATABASE_CREDENTIAL` | Generate Database Credential | Tool to generate OAuth token for database instance authentication. Use when you need to authenticate to Databricks database instances. The generated token is workspace-scoped and expires after one hour, though open connections remain active past expiration. |
| `DATABRICKS_GET_DATABASE_INSTANCE` | Get Database Instance | Tool to retrieve detailed information about a specific database instance by its name identifier. Use when you need to get comprehensive configuration details including capacity, state, retention settings, and connection endpoints for a PostgreSQL database instance managed by Databricks Lakebase. |
| `DATABRICKS_CREATE_DATA_QUALITY_MONITOR` | Create Data Quality Monitor | Tool to create a data quality monitor for a Unity Catalog Delta table. Use when you need to set up monitoring for table quality, track data drift, or monitor ML model inference logs. Supports snapshot, time series, and inference log monitoring types. Only one monitor can be created per table. |
| `DATABRICKS_LIST_DBFS_DIRECTORY_CONTENTS` | List DBFS Directory Contents | Tool to list the contents of a directory or get details of a file in DBFS. Use when you need to browse DBFS directories or check file details. Note: Recommended for directories with less than 10,000 files due to ~60 second timeout limitation. Throws RESOURCE_DOES_NOT_EXIST error if path doesn't exist. |
| `DATABRICKS_DELETE_DATABRICKS_CLUSTER` | Delete Databricks Cluster | Tool to terminate a Databricks Spark cluster asynchronously. Use when you need to stop and remove a cluster. The cluster is terminated asynchronously and removed after completion. Cluster configuration is retained for 30 days after termination. |
| `DATABRICKS_DELETE_DATABRICKS_JOB_RUN` | Delete Databricks Job Run | Tool to delete a non-active Databricks job run from the system. Use when you need to manually remove completed runs before the 60-day auto-deletion. Returns an error if the run is still active. Only non-active runs can be deleted. |
| `DATABRICKS_DELETE_USER_BY_ID` | Delete User by ID | Tool to delete a user from the Databricks workspace by their ID. Use when you need to remove a user resource from the workspace. A user that does not own or belong to a workspace is automatically purged after 30 days. Only workspace admins can deactivate users at the workspace level. |
| `DATABRICKS_EDIT_DATABRICKS_CLUSTER` | Edit Databricks Cluster | Tool to edit an existing Databricks cluster configuration. Use when you need to modify cluster settings such as size, Spark version, node types, or cloud-specific attributes. The cluster must be in RUNNING or TERMINATED state. If updated while RUNNING, it will restart to apply changes. |
| `DATABRICKS_ADD_BLOCK_TO_DBFS_STREAM` | Add Block to DBFS Stream | Tool to append a block of data to an open DBFS stream. Use when uploading large files in chunks as part of the DBFS streaming upload workflow: 1) create a stream handle, 2) add blocks, 3) close the stream. Each block is limited to 1 MB of base64-encoded data. |
| `DATABRICKS_CREATE_DBFS_FILE_STREAM` | Create DBFS File Stream | Tool to open a stream for writing to a DBFS file, returning a handle. Use when uploading files to DBFS using the streaming workflow: 1) create a stream handle, 2) add blocks of data, 3) close the stream. The returned handle has a 10-minute idle timeout and must be used within that period. |
| `DATABRICKS_DELETE_DBFS_FILE_OR_DIRECTORY` | Delete DBFS File or Directory | Tool to delete a file or directory from DBFS. Use when you need to remove files or directories from the Databricks File System. For large deletions (>10K files), use dbutils.fs in a cluster context instead of the REST API. Operation may return 503 PARTIAL_DELETE for large deletions and should be re-invoked until completion. |
| `DATABRICKS_GET_DBFS_FILE_STATUS` | Get DBFS File Status | Tool to get the information of a file or directory in DBFS. Use when you need to check if a file or directory exists, retrieve its size, type, or last modification time. Throws RESOURCE_DOES_NOT_EXIST exception if the file or directory does not exist. |
| `DATABRICKS_MOVE_DBFS_FILE_OR_DIRECTORY` | Move DBFS File or Directory | Tool to move a file or directory from one location to another within DBFS. Use when you need to relocate files or directories in Databricks File System. Recursively moves all files if source is a directory. Not recommended for large-scale operations (>10k files) as it may timeout after ~60 seconds. |
| `DATABRICKS_READ_DBFS_FILE_CONTENTS` | Read DBFS File Contents | Tool to read the contents of a file from DBFS. Returns base64-encoded file data with maximum read size of 1 MB per request. Use when you need to retrieve file contents from Databricks File System. Throws RESOURCE_DOES_NOT_EXIST if file does not exist, INVALID_PARAMETER_VALUE if path is a directory, MAX_READ_SIZE_EXCEEDED if read length exceeds 1 MB. |
| `DATABRICKS_GET_ALL_LIBRARY_STATUSES` | Get All Library Statuses | Tool to retrieve status of all libraries across all Databricks clusters. Use when you need to check library installation status on all clusters, including libraries set to be installed on all clusters via the API or libraries UI. Returns detailed status information for each library on each cluster. |
| `DATABRICKS_GET_CLUSTER_INFORMATION` | Get Cluster Information | Tool to retrieve comprehensive metadata and configuration details for a Databricks cluster by its unique identifier. Use when you need to check cluster state, configuration, resources, or operational details. Returns cluster information including state, compute configuration, cloud-specific settings, and resource allocations. |
| `DATABRICKS_GET_GROUP_BY_ID` | Get Group by ID | Tool to retrieve information for a specific group in Databricks workspace by its ID. Use when you need to get complete group details including members, roles, entitlements, and metadata. Implements the SCIM 2.0 protocol standard for retrieving Group resources. |
| `DATABRICKS_GET_USER_BY_ID` | Get User by ID | Tool to retrieve information for a specific user in Databricks workspace by their ID. Use when you need to get complete user details including identity, contact information, group memberships, roles, and entitlements. Implements the SCIM 2.0 protocol standard for retrieving User resources. |
| `DATABRICKS_UPDATE_IAM_ACCOUNT_ACCESS_CONTROL_RULE_SET` | Update IAM Account Access Control Rule Set | Tool to update account-level access control rule set for service principals, groups, or budget policies. Use when you need to replace the entire set of access control rules for a resource. This is a PUT operation that replaces all existing roles - to preserve existing roles, they must be included in the grant_rules array. |
| `DATABRICKS_GET_IAM_ACCOUNT_GROUP_V2` | Get IAM Account Group V2 | Tool to retrieve a specific group resource by its unique identifier from a Databricks account using SCIM v2 protocol. Use when you need to get complete group details including members, roles, and entitlements. |
| `DATABRICKS_GET_CURRENT_USER_INFORMATION` | Get Current User Information | Tool to retrieve details about the currently authenticated user or service principal making the API request. Use when you need to get information about the current user's identity, groups, roles, and entitlements within the Databricks workspace. |
| `DATABRICKS_CREATE_IAM_GROUP_V2` | Create IAM Group V2 | Tool to create a new group in Databricks workspace using SCIM v2 protocol. Use when you need to create a new security group with a unique display name, optionally with initial members, entitlements, and roles. |
| `DATABRICKS_DELETE_IAM_GROUP_V2` | Delete IAM Group V2 | Tool to delete a group from Databricks workspace using SCIM v2 protocol. Use when you need to permanently remove a security group. Requires appropriate permissions to delete the group. |
| `DATABRICKS_GET_WORKSPACE_IAM_GROUP_V2` | Get Workspace IAM Group V2 | Tool to retrieve details of a specific group by ID from Databricks workspace using SCIM v2 protocol. Use when you need to get complete group information including members, roles, entitlements, and metadata. |
| `DATABRICKS_PATCH_IAM_GROUP_V2` | Patch IAM Group V2 | Tool to partially update a Databricks workspace group using SCIM 2.0 PATCH operations. Use when you need to modify group attributes like displayName, add/remove members, or update entitlements/roles. All operations in a single request are atomic. |
| `DATABRICKS_UPDATE_IAM_GROUP_V2` | Update IAM Group V2 | Tool to update an existing group in Databricks workspace using SCIM v2 protocol. This performs a complete replacement of the group resource. Use when you need to update group properties, members, entitlements, or roles. For partial updates, consider using PATCH instead. |
| `DATABRICKS_MIGRATE_PERMISSIONS` | Migrate Permissions | Tool to migrate ACL permissions from workspace groups to account groups. Use when adopting Unity Catalog and migrating permissions from workspace-level groups to account-level groups. Primarily used by the Unity Catalog Migration (UCX) tool. Supports batch processing with configurable size limits. |
| `DATABRICKS_GET_IAM_PERMISSIONS` | Get IAM Permissions | Tool to retrieve IAM permissions for a Databricks workspace object. Use when you need to check who has access to a specific resource and their permission levels. Returns the access control list (ACL) including user, group, and service principal permissions with inheritance information. |
| `DATABRICKS_GET_IAM_PERMISSION_LEVELS` | Get IAM Permission Levels | Tool to retrieve available permission levels for a Databricks workspace object. Use when you need to understand what permission levels can be assigned to users or groups for a specific object type. Returns permission levels like CAN_READ, CAN_RUN, CAN_EDIT, CAN_MANAGE with their descriptions. Available levels vary by object type. |
| `DATABRICKS_SET_IAM_PERMISSIONS` | Set IAM Permissions | Tool to set IAM permissions for a Databricks workspace object, replacing all existing permissions. Use when you need to configure complete access control for a resource. This operation replaces the entire access control list - existing permissions are overwritten. Admin permissions on the admins group cannot be removed. |
| `DATABRICKS_UPDATE_IAM_PERMISSIONS` | Update IAM Permissions | Tool to incrementally update permissions on Databricks workspace objects including dashboards, jobs, clusters, warehouses, notebooks, and more. Use when you need to modify specific permissions without replacing the entire permission set. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request. |
| `DATABRICKS_CREATE_IAM_SERVICE_PRINCIPAL_V2` | Create IAM Service Principal V2 | Tool to create a new service principal in Databricks workspace using SCIM v2 protocol. Use when you need to add a service principal that already exists in the Databricks account to the workspace. Required for identity-federated workspaces, where you must specify a valid UUID applicationId. |
| `DATABRICKS_DELETE_IAM_SERVICE_PRINCIPAL_V2` | Delete IAM Service Principal V2 | Tool to delete a service principal from Databricks workspace using SCIM v2 protocol. Use when you need to permanently remove a service principal and revoke its access to the workspace. The operation is idempotent - subsequent DELETE requests to the same ID will return 404 Not Found. |
| `DATABRICKS_GET_IAM_SERVICE_PRINCIPAL_V2` | Get IAM Service Principal V2 | Tool to retrieve details of a specific service principal by ID from Databricks workspace using SCIM v2 protocol. Use when you need to get complete service principal information including groups, roles, entitlements, and metadata. |
| `DATABRICKS_PATCH_IAM_SERVICE_PRINCIPAL_V2` | Patch IAM Service Principal V2 | Tool to partially update a service principal using SCIM 2.0 PATCH operations. Use when you need to modify service principal attributes like active status, displayName, groups, entitlements, or roles without replacing the entire resource. All operations in a single request are atomic. |
| `DATABRICKS_UPDATE_IAM_SERVICE_PRINCIPAL_V2` | Update IAM Service Principal V2 | Tool to update an existing service principal in Databricks workspace using SCIM v2 protocol. This performs a complete replacement of the service principal resource (PUT operation). Use when you need to update service principal properties, group memberships, entitlements, or roles. Note: applicationId and id are immutable fields. |
| `DATABRICKS_CREATE_IAM_USER_V2` | Create IAM User V2 | Tool to create a new user in Databricks workspace using SCIM v2 protocol. Use when you need to provision a new user account with a unique userName (email), optionally with display name, activation status, group memberships, entitlements, and roles. |
| `DATABRICKS_DELETE_IAM_USER_V2` | Delete IAM User V2 | Tool to delete a user from Databricks workspace using SCIM v2 protocol. Use when you need to deactivate a user and revoke their access to the workspace. Note that users are automatically purged 30 days after deletion if they do not own or belong to any workspace. Applications or scripts using tokens generated by the deleted user will no longer be able to access Databricks APIs. |
| `DATABRICKS_GET_IAM_USER_V2` | Get IAM User V2 | Tool to retrieve detailed information for a specific user by ID from Databricks workspace using SCIM v2 protocol. Use when you need to get complete user information including name, email, groups, roles, entitlements, and metadata. |
| `DATABRICKS_GET_IAM_USERS_V2_PERMISSIONS` | Get IAM Users V2 Permissions | Tool to retrieve permissions for password-based authentication. Use when you need to check who has access to password authentication and their permission levels. Note: Password authentication was deprecated July 10, 2024 and is no longer supported. |
| `DATABRICKS_PATCH_IAM_USER_V2` | Patch IAM User V2 | Tool to partially update a user using SCIM 2.0 PATCH operations. Use when you need to modify user attributes like active status, displayName, userName, name fields, emails, groups, entitlements, or roles without replacing the entire resource. All operations in a single request are atomic. |
| `DATABRICKS_UPDATE_IAM_USER_V2` | Update IAM User V2 | Tool to update a user in Databricks workspace using SCIM v2 protocol. This performs a complete replacement of the user resource. Use when you need to update user properties including userName, displayName, active status, groups, entitlements, or roles. |
| `DATABRICKS_GET_WORKSPACE_ACCESS_DETAIL_LOCAL` | Get Workspace Access Detail Local | Tool to retrieve detailed workspace access information for a specific identity in Databricks. Use when you need to check workspace access details including permissions, principal information, and access metadata. |
| `DATABRICKS_LIST_JOB_COMPLIANCE_FOR_POLICY` | List Job Compliance for Policy | Tool to retrieve policy compliance status of all jobs using a given cluster policy. Use when you need to identify jobs that are out of compliance because the policy was updated after the job was last edited. Jobs are non-compliant when their job clusters no longer meet the requirements of the updated policy. |
| `DATABRICKS_GET_JOB_PERMISSION_LEVELS` | Get Job Permission Levels | Tool to retrieve available permission levels for a Databricks job. Use when you need to understand what permission levels can be assigned to users, groups, or service principals for a specific job. Returns permission levels like CAN_VIEW, CAN_MANAGE_RUN, CAN_MANAGE, and IS_OWNER with their descriptions. |
| `DATABRICKS_UPDATE_JOB_PERMISSIONS` | Update Job Permissions | Tool to incrementally update permissions for a Databricks job. Use when you need to modify specific job permissions without replacing the entire permission set. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request. |
| `DATABRICKS_GET_JOB_RUN_BY_ID` | Get Job Run By ID | Tool to retrieve metadata of a single Databricks job run by ID. Use when you need to get detailed information about a specific job run including state, timing, and cluster configuration. Runs are automatically removed after 60 days. |
| `DATABRICKS_LIST_DATABRICKS_JOB_RUNS` | List Databricks Job Runs | Tool to list Databricks job runs in descending order by start time. Use when you need to retrieve a paginated list of job runs with optional filtering by job ID, run status, time range, and other criteria. Supports pagination via offset/limit or page_token. All runs are automatically removed after 60 days. |
| `DATABRICKS_CANCEL_ALL_DATABRICKS_JOB_RUNS` | Cancel All Databricks Job Runs | Tool to cancel all active runs of a Databricks job asynchronously. Use when you need to terminate all running instances of a job. The cancellation happens asynchronously without preventing new runs. When all_queued_runs=true without a job_id, it cancels all queued runs across the workspace. |
| `DATABRICKS_CANCEL_DATABRICKS_JOB_RUN` | Cancel Databricks Job Run | Tool to cancel a Databricks job run asynchronously. Use when you need to terminate a running job. The run will be terminated shortly after the request completes. If the run is already in a terminal state, this is a no-op. |
| `DATABRICKS_DELETE_DATABRICKS_JOB_RUN` | Delete Databricks Job Run | Tool to delete a non-active Databricks job run. Use when you need to remove a job run from the workspace. The run must be in a non-active state; attempting to delete an active run will return an error. Runs are automatically removed after 60 days. |
| `DATABRICKS_GET_DATABRICKS_JOB_DETAILS` | Get Databricks Job Details | Tool to retrieve detailed information about a single Databricks job. Use when you need to get comprehensive job configuration including tasks, schedules, notifications, and cluster settings. For jobs with more than 100 tasks or job clusters, use the page_token parameter to paginate through results. |
| `DATABRICKS_GET_DATABRICKS_JOB_RUN_DETAILS` | Get Databricks Job Run Details | Tool to retrieve complete metadata for a single Databricks job run. Use when you need to get detailed information about a specific job run including its state, timing, cluster configuration, and task details. Note that runs are automatically removed after 60 days. This endpoint does not return the run's output; use the getRunOutput method separately to retrieve output. |
| `DATABRICKS_GET_DATABRICKS_JOB_RUN_OUTPUT` | Get Databricks Job Run Output | Tool to retrieve output and metadata of a single Databricks task run. Use when you need to get the output value from dbutils.notebook.exit() or check task execution results. IMPORTANT: This only works on task-level run IDs, not top-level job run IDs for multi-task jobs. API returns first 5 MB of output; for larger results use cloud storage. Runs are auto-removed after 60 days. |
| `DATABRICKS_SET_DATABRICKS_JOB_PERMISSIONS` | Set Databricks Job Permissions | Tool to set permissions for a Databricks job, completely replacing all existing permissions. Use when you need to configure access control for a job. This operation replaces ALL existing permissions; if no access_control_list is provided, all direct permissions are deleted. The job must have exactly one owner (cannot be a group). |
| `DATABRICKS_SUBMIT_DATABRICKS_JOB_RUN` | Submit Databricks Job Run | Tool to submit a one-time Databricks job run without creating a persistent job. Use when you need to execute a workload directly without defining a reusable job. The job run is submitted immediately and executes the specified tasks. You can track the run using the returned run_id with the jobs/runs/get endpoint. |
| `DATABRICKS_LIST_DATABRICKS_JOBS` | List Databricks Jobs | Tool to retrieve a paginated list of all jobs in the Databricks workspace. Use when you need to discover available jobs, filter by name, or iterate through all jobs. Returns jobs in descending order by start time and supports task expansion for detailed task information. |
| `DATABRICKS_GET_JOB_POLICY_COMPLIANCE` | Get Job Policy Compliance | Tool to retrieve policy compliance status for a specific job. Use when you need to check whether a job meets the requirements of its assigned policies and identify any policy violations. Jobs could be out of compliance if a policy they use was updated after the job was last edited and some of its job clusters no longer comply with their updated policies. |
| `DATABRICKS_LIST_UNITY_CATALOGS` | List Unity Catalogs | Tool to retrieve a list of all catalogs in the Unity Catalog metastore. Use when you need to discover available catalogs based on user permissions. If the caller is the metastore admin, all catalogs will be retrieved. Otherwise, only catalogs owned by the caller or for which the caller has the USE_CATALOG privilege will be retrieved. |
| `DATABRICKS_LIST_CLUSTERS` | List Clusters | Tool to list all pinned, active, and recently terminated Databricks clusters. Use when you need to retrieve cluster information, monitor cluster status, or get an overview of available compute resources. Returns clusters terminated within the last 30 days along with currently active clusters. Supports filtering by state, source, and policy, with pagination for large result sets. |
| `DATABRICKS_LIST_WORKSPACE_GROUPS` | List Workspace Groups | Tool to list all groups in the Databricks workspace using SCIM v2 protocol. Use when you need to retrieve all groups or search for specific groups using filters and pagination. |
| `DATABRICKS_LIST_INSTANCE_POOLS` | List Instance Pools | Tool to retrieve a list of all active instance pools in the Databricks workspace with their statistics and configuration. Use when you need to get an overview of all available instance pools. |
| `DATABRICKS_LIST_ALL_DATABRICKS_JOBS_API_2_0` | List All Databricks Jobs (API 2.0) | Tool to list all jobs in the Databricks workspace using API 2.0. Use when you need to retrieve all jobs without pagination. Note: API 2.0 does not support pagination or filtering. For pagination support, use the API 2.2 endpoint instead. |
| `DATABRICKS_LIST_MEMBERS_OF_A_SECURITY_GROUP` | List Members of a Security Group | Tool to retrieve all members (users and nested groups) of a Databricks security group. Use when you need to see who belongs to a specific group for access control auditing or management. This method is non-recursive and does not expand nested group memberships. |
| `DATABRICKS_LIST_MODEL_SERVING_ENDPOINTS` | List Model Serving Endpoints | Tool to retrieve all serving endpoints for model serving in the workspace. Use when you need to list all available model serving endpoints and their configurations. Returns information about each endpoint including its state, configuration, served models, and traffic routing. |
| `DATABRICKS_LIST_NODE_TYPES` | List Node Types | Tool to list all supported node types available for cluster launch in the workspace. Use when you need to determine which instance types are available for creating or configuring Databricks clusters. |
| `DATABRICKS_LIST_DELTA_LIVE_TABLES_PIPELINES` | List Delta Live Tables Pipelines | Tool to list Delta Live Tables pipelines in the workspace. Use when you need to retrieve a paginated list of pipelines with summary information. The pipeline specification field is not returned by this endpoint - only summary information is provided. For complete pipeline details, use the get pipeline endpoint. |
| `DATABRICKS_LIST_REPOS` | List Repos | Tool to list Git repos that the calling user has Manage permissions on. Use when you need to retrieve all available repos in the workspace. Supports pagination and filtering by path prefix. |
| `DATABRICKS_LIST_CATALOG_SCHEMAS` | List Catalog Schemas | Tool to retrieve all schemas in a specified catalog from Unity Catalog. Use when you need to discover available schemas within a catalog based on user permissions. If the caller is the metastore admin or owner of the parent catalog, all schemas will be retrieved. Otherwise, only schemas owned by the caller or for which the caller has the USE_SCHEMA privilege will be retrieved. |
| `DATABRICKS_LIST_SECRETS` | List Secrets | Tool to list all secret keys stored in a Databricks secret scope. Use when you need to retrieve metadata about secrets in a scope (does not return secret values). Requires READ permission on the scope. |
| `DATABRICKS_LIST_SECRET_SCOPES` | List Secret Scopes | Tool to list all secret scopes available in the Databricks workspace. Use when you need to retrieve all secret scopes including their names, backend types (DATABRICKS or AZURE_KEYVAULT), and Key Vault metadata for Azure-backed scopes. |
| `DATABRICKS_LIST_SECURITY_GROUPS` | List Security Groups | Tool to list all security groups in the Databricks workspace using SCIM v2 protocol. Use when you need to retrieve all groups with their identifiers and display names for access control management. |
| `DATABRICKS_LIST_SQL_WAREHOUSES` | List SQL Warehouses | Tool to list all SQL warehouses in the Databricks workspace. Use when you need to retrieve information about available SQL compute resources for running SQL commands. Returns the full list of SQL warehouses the user has access to, including their configuration, state, and connection details. |
| `DATABRICKS_LIST_CATALOG_TABLES` | List Catalog Tables | Tool to list all tables in a Unity Catalog schema with pagination support. Use when you need to retrieve tables from a specific catalog and schema combination. The API is paginated by default - continue reading pages using next_page_token until it's absent to ensure all results are retrieved. |
| `DATABRICKS_LIST_TOKENS` | List Tokens | Tool to list all valid personal access tokens (PATs) for a user-workspace pair. Use when you need to retrieve all tokens associated with the authenticated user in the current workspace. Note that each PAT is valid for only one workspace, and Databricks automatically revokes PATs that haven't been used for 90 days. |
| `DATABRICKS_LIST_USERS` | List Users | Tool to list all users in a Databricks workspace using SCIM 2.0 protocol. Use when you need to retrieve user identities and their attributes. Supports filtering, pagination, and sorting. |
| `DATABRICKS_LIST_VECTOR_SEARCH_ENDPOINTS` | List Vector Search Endpoints | Tool to list all vector search endpoints in the Databricks workspace. Use when you need to retrieve information about vector search endpoints which represent compute resources hosting vector search indexes. Supports pagination for handling large result sets. |
| `DATABRICKS_CREATE_MARKETPLACE_CONSUMER_INSTALLATION` | Create Marketplace Consumer Installation | Tool to create a marketplace consumer installation for Databricks Marketplace listings. Use when you need to install data products, datasets, notebooks, models, or other marketplace offerings into a workspace. Requires acceptance of consumer terms and the listing ID to proceed with installation. |
| `DATABRICKS_DELETE_MARKETPLACE_CONSUMER_INSTALLATION` | Delete Marketplace Consumer Installation | Tool to uninstall a Databricks Marketplace installation. Use when you need to remove an installed data product from your workspace. When an installation is deleted, the shared catalog is removed from the workspace. Requires CREATE CATALOG and USE PROVIDER permissions on the Unity Catalog metastore, or metastore admin role. |
| `DATABRICKS_UPDATE_MARKETPLACE_CONSUMER_INSTALLATION` | Update Marketplace Consumer Installation | Tool to update marketplace consumer installation fields and rotate tokens for marketplace listings. Use when you need to modify installation attributes or refresh access credentials. The token will be rotated if the rotate_token flag is true. |
| `DATABRICKS_BATCH_GET_MARKETPLACE_CONSUMER_LISTINGS` | Batch Get Marketplace Consumer Listings | Tool to batch get published listings from the Databricks Marketplace. Use when you need to retrieve multiple listing details in a single API call. Maximum limit of 50 listing IDs per request. |
| `DATABRICKS_GET_MARKETPLACE_CONSUMER_LISTING` | Get Marketplace Consumer Listing | Tool to retrieve a published listing from Databricks Marketplace that the consumer has access to. Use when you need to get detailed information about a specific marketplace listing by its ID. Requires Unity Catalog permissions to access marketplace assets. |
| `DATABRICKS_GET_MARKETPLACE_CONSUMER_PERSONALIZATION_REQUESTS` | Get Marketplace Consumer Personalization Requests | Tool to retrieve personalization requests for a specific marketplace listing. Use when you need to check the status of customization or commercial transaction requests for a listing. Each consumer can make at most one personalization request per listing. |
| `DATABRICKS_BATCH_GET_MARKETPLACE_CONSUMER_PROVIDERS` | Batch Get Marketplace Consumer Providers | Tool to batch get providers from the Databricks Marketplace with visible listings. Use when you need to retrieve multiple provider details in a single API call. Maximum limit of 50 provider IDs per request. |
| `DATABRICKS_GET_MARKETPLACE_CONSUMER_PROVIDER` | Get Marketplace Consumer Provider | Tool to retrieve information about a specific provider in the Databricks Marketplace with visible listings. Use when you need to get provider details including contact information, description, and metadata. |
| `DATABRICKS_DELETE_LISTING_FROM_EXCHANGE` | Delete Listing From Exchange | Tool to remove the association between a marketplace exchange and a listing. Use when you need to disassociate an exchange from a provider listing. This removes the listing from the private exchange, and it will no longer be shared with the curated set of customers in that exchange. |
| `DATABRICKS_CREATE_MARKETPLACE_PROVIDER_LISTING` | Create Marketplace Provider Listing | Tool to create a new listing in Databricks Marketplace for data providers. Use when you need to publish data products, datasets, models, or notebooks to the marketplace. Requires a listing object with summary information (name and listing_type). For free and instantly available data products, a share must be included during creation. |
| `DATABRICKS_GET_MARKETPLACE_PROVIDER_LISTING` | Get Marketplace Provider Listing | Tool to retrieve a specific marketplace provider listing by its identifier. Use when you need to get detailed information about a published or draft listing including metadata, configuration, and assets. |
| `DATABRICKS_CREATE_PROVIDER_ANALYTICS_DASHBOARD` | Create Provider Analytics Dashboard | Tool to create a provider analytics dashboard for monitoring Databricks Marketplace listing metrics. Use when you need to establish analytics tracking for listing views, requests, installs, conversion rates, and consumer information. Requires Marketplace admin role and system tables to be enabled in the metastore. |
| `DATABRICKS_GET_PROVIDER_ANALYTICS_DASHBOARD` | Get Provider Analytics Dashboard | Tool to retrieve provider analytics dashboard information for monitoring consumer usage metrics. Use when you need to access the dashboard ID to view marketplace listing performance including views, requests, installs, and conversion rates. |
| `DATABRICKS_GET_LATEST_PROVIDER_ANALYTICS_DASHBOARD_VERSION` | Get Latest Provider Analytics Dashboard Version | Tool to retrieve the latest logical version of the provider analytics dashboard template. Use when you need to get the current dashboard template version for monitoring consumer usage metrics including listing views, requests, and installs. |
| `DATABRICKS_CREATE_ML_EXPERIMENT` | Create ML Experiment | Tool to create a new MLflow experiment for tracking machine learning runs and models. Use when you need to organize and track ML experiments within Databricks. Returns RESOURCE_ALREADY_EXISTS error if an experiment with the same name already exists. |
| `DATABRICKS_CREATE_LOGGED_MODEL` | Create Logged Model | Tool to create a new logged model in MLflow that ties together model metadata, parameters, metrics, and artifacts. Use when you need to create a LoggedModel object as part of the unified 'log + register' workflow introduced in MLflow 2.8. LoggedModel objects persist throughout a model's lifecycle and provide a centralized way to track model information. |
| `DATABRICKS_CREATE_MLFLOW_EXPERIMENT_RUN` | Create MLflow Experiment Run | Tool to create a new MLflow run within an experiment for tracking machine learning execution. Use when starting a new ML training run, experiment execution, or data pipeline that needs parameter and metric tracking. Returns the created run with a unique run_id for subsequent metric and parameter logging. |
| `DATABRICKS_DELETE_ML_EXPERIMENT` | Delete ML Experiment | Tool to delete an MLflow experiment and associated metadata, runs, metrics, params, and tags. Use when you need to remove an experiment from Databricks. If the experiment uses FileStore, artifacts associated with the experiment are also deleted. |
| `DATABRICKS_DELETE_LOGGED_MODEL` | Delete Logged Model | Tool to delete a logged model from MLflow tracking. Use when you need to permanently remove a LoggedModel from the tracking server. The deletion is permanent and cannot be undone. LoggedModels track a model's lifecycle across different training and evaluation runs. |
| `DATABRICKS_DELETE_LOGGED_MODEL_TAG` | Delete Logged Model Tag | Tool to delete a tag from a logged model in MLflow. Use when you need to remove metadata from a LoggedModel object. This operation is irreversible and permanently removes the tag from the logged model. Part of MLflow 3's logged model management capabilities. |
| `DATABRICKS_DELETE_ML_EXPERIMENT_RUN` | Delete ML Experiment Run | Tool to mark an MLflow run for deletion in ML experiments. Use when you need to remove a specific run from Databricks. This is a soft delete operation - the run is marked for deletion rather than immediately removed and can be restored unless permanently deleted. |
| `DATABRICKS_DELETE_ML_EXPERIMENT_RUNS` | Delete ML Experiment Runs | Tool to bulk delete runs in an ML experiment created before a specified timestamp. Use when you need to clean up old experiment runs. Only runs created prior to or at the specified timestamp are deleted. The maximum number of runs that can be deleted in one operation is 10000. |
| `DATABRICKS_DELETE_ML_EXPERIMENT_RUN_TAG` | Delete ML Experiment Run Tag | Tool to delete a tag from an MLflow experiment run. Use when you need to remove run metadata. This operation is irreversible and permanently removes the tag from the run. |
| `DATABRICKS_FINALIZE_LOGGED_MODEL` | Finalize Logged Model | Tool to finalize a logged model in MLflow by updating its status to READY or FAILED. Use when custom model preparation logic is complete and you need to mark the model as ready for use or indicate that upload failed. This is part of the experimental logged models feature introduced in MLflow 2.8+. |
| `DATABRICKS_GET_ML_EXPERIMENT_BY_NAME` | Get ML Experiment By Name | Tool to retrieve MLflow experiment metadata by name. Use when you need to get experiment details using the experiment name. Returns deleted experiments but prefers active ones if both exist with the same name. Throws RESOURCE_DOES_NOT_EXIST if no matching experiment exists. |
| `DATABRICKS_GET_ML_EXPERIMENT` | Get ML Experiment | Tool to retrieve metadata for an MLflow experiment by ID. Use when you need to get experiment details including name, artifact location, lifecycle stage, and tags. Works on both active and deleted experiments. |
| `DATABRICKS_GET_LOGGED_MODEL` | Get Logged Model | Tool to fetch logged model metadata by unique ID. Use when you need to retrieve a LoggedModel object representing a model logged to an MLflow Experiment. Returns comprehensive model information including metrics, parameters, tags, and artifact details. |
| `DATABRICKS_GET_ML_EXPERIMENT_PERMISSION_LEVELS` | Get ML Experiment Permission Levels | Tool to retrieve available permission levels for a Databricks ML experiment. Use when you need to understand what permission levels can be assigned to users, groups, or service principals for a specific experiment. Returns permission levels like CAN_READ, CAN_EDIT, CAN_MANAGE, and IS_OWNER with their descriptions. |
| `DATABRICKS_GET_ML_EXPERIMENT_PERMISSIONS` | Get ML Experiment Permissions | Tool to retrieve permissions for an MLflow experiment. Use when you need to check who has access to an experiment and their permission levels. Note that notebook experiments inherit permissions from their corresponding notebook, while workspace experiments have independent permissions. |
| `DATABRICKS_GET_MLFLOW_RUN` | Get MLflow Run | Tool to retrieve complete information about a specific MLflow run including metadata, metrics, parameters, tags, inputs, and outputs. Use when you need to get details of a run by its run_id. Returns the most recent metric values when multiple metrics with the same key exist. |
| `DATABRICKS_LOG_BATCH_MLFLOW_DATA` | Log Batch MLflow Data | Tool to log a batch of metrics, parameters, and tags for an MLflow run in a single request. Use when you need to efficiently log multiple metrics, params, or tags simultaneously. Items within each type are processed sequentially in the order specified. The combined total of all items across metrics, params, and tags cannot exceed 1000. |
| `DATABRICKS_LOG_MLFLOW_DATASET_INPUTS` | Log MLflow Dataset Inputs | Tool to log dataset inputs to an MLflow run for tracking data sources used during model development. Use when you need to track metadata about datasets used in ML experiment runs, including information about the dataset source, schema, and tags. Enables logging of dataset inputs to a run, allowing you to track data sources throughout the ML lifecycle. |
| `DATABRICKS_LOG_LOGGED_MODEL_PARAMETERS` | Log Logged Model Parameters | Tool to log parameters for a logged model in MLflow. Use when you need to attach hyperparameters or metadata to a LoggedModel object. A param can be logged only once for a logged model, and attempting to overwrite an existing param will result in an error. Available in MLflow 2.8+. |
| `DATABRICKS_LOG_MLFLOW_METRIC` | Log MLflow Metric | Tool to log a metric for an MLflow run with timestamp. Use when you need to record ML model performance metrics like accuracy, loss, or custom evaluation metrics. Metrics can be logged multiple times with different timestamps and values are never overwritten - each log appends to the metric history for that key. |
| `DATABRICKS_LOG_MLFLOW_MODEL` | Log MLflow Model | Tool to log a model artifact for an MLflow run (Experimental API). Use when you need to record model metadata including artifact paths, flavors, and versioning information for a training run. The model_json parameter should contain a complete MLmodel specification in JSON string format. |
| `DATABRICKS_LOG_MLFLOW_DATASET_OUTPUTS` | Log MLflow Dataset Outputs | Tool to log dataset outputs from an MLflow run for tracking data generated during model development. Use when you need to track metadata about datasets produced by ML experiment runs, including information about predictions, model outputs, or generated data. Enables logging of dataset outputs to a run, allowing you to track generated data throughout the ML lifecycle. |
| `DATABRICKS_LOG_MLFLOW_PARAMETER` | Log MLflow Parameter | Tool to log a parameter for an MLflow run as a key-value pair. Use when you need to record hyperparameters or constant values for ML model training or ETL pipelines. Parameters can only be logged once per run and cannot be changed after logging. Logging identical parameters is idempotent. |
| `DATABRICKS_RESTORE_ML_EXPERIMENT` | Restore ML Experiment | Tool to restore a deleted MLflow experiment and its associated metadata, runs, metrics, params, and tags. Use when you need to recover a previously deleted experiment from Databricks. If the experiment uses FileStore, underlying artifacts are also restored. |
| `DATABRICKS_RESTORE_ML_EXPERIMENT_RUN` | Restore ML Experiment Run | Tool to restore a deleted MLflow run and its associated metadata, metrics, params, and tags. Use when you need to recover a previously deleted run from Databricks ML experiments. The operation cannot restore runs that were permanently deleted. |
| `DATABRICKS_RESTORE_ML_EXPERIMENT_RUNS` | Restore ML Experiment Runs | Tool to bulk restore runs in an ML experiment that were deleted at or after a specified timestamp. Use when you need to recover multiple deleted experiment runs. Only runs deleted at or after the specified timestamp are restored. The maximum number of runs that can be restored in one operation is 10000. |
| `DATABRICKS_SEARCH_LOGGED_MODELS` | Search Logged Models | Tool to search for logged models in MLflow experiments based on various criteria. Use when you need to find models that match specific metrics, parameters, tags, or attributes using SQL-like filter expressions. Supports pagination, ordering results, and filtering by datasets. |
| `DATABRICKS_SET_ML_EXPERIMENT_TAG` | Set ML Experiment Tag | Tool to set a tag on an MLflow experiment. Use when you need to add or update experiment metadata. Experiment tags are metadata that can be updated at any time. |
| `DATABRICKS_SET_LOGGED_MODEL_TAGS` | Set Logged Model Tags | Tool to set tags on a logged model in MLflow. Use when you need to add or update metadata tags on a LoggedModel object for organization and tracking. Tags are key-value pairs that can be used to search and filter logged models. Part of MLflow 3's logged model management capabilities. |
| `DATABRICKS_SET_ML_EXPERIMENT_PERMISSIONS` | Set ML Experiment Permissions | Tool to set permissions for an MLflow experiment, replacing all existing permissions. Use when you need to configure access control for an experiment. This operation replaces ALL existing permissions; for incremental updates, use the update permissions endpoint instead. |
| `DATABRICKS_SET_MLFLOW_RUN_TAG` | Set MLflow Run Tag | Tool to set a tag on an MLflow run. Use when you need to add custom metadata to runs for filtering, searching, and organizing experiments. Tags with the same key can be overwritten by successive writes. Logging the same tag (key, value) is idempotent. |
| `DATABRICKS_UPDATE_ML_EXPERIMENT` | Update ML Experiment | Tool to update MLflow experiment metadata, primarily for renaming experiments. Use when you need to rename an existing experiment. The new experiment name must be unique across all experiments in the workspace. |
| `DATABRICKS_UPDATE_ML_EXPERIMENT_PERMISSIONS` | Update ML Experiment Permissions | Tool to incrementally update permissions for an MLflow experiment. Use when you need to modify specific permissions without replacing the entire permission set. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request. |
| `DATABRICKS_UPDATE_ML_EXPERIMENT_RUN` | Update ML Experiment Run | Tool to update MLflow run metadata including status, end time, and run name. Use when a run's status changes outside normal execution flow or when you need to rename a run. This endpoint allows you to modify a run's metadata after it has been created. |
| `DATABRICKS_DELETE_ML_FEATURE_ENGINEERING_KAFKA_CONFIG` | Delete ML Feature Engineering Kafka Config | Tool to delete a Kafka configuration from ML Feature Engineering. Use when you need to remove Kafka streaming source configurations. The deletion is permanent and cannot be undone. Kafka configurations define how features are streamed from Kafka sources. |
| `DATABRICKS_CREATE_ML_FEATURE_STORE_ONLINE_STORE` | Create ML Feature Store Online Store | Tool to create a Databricks Online Feature Store for real-time feature serving. Use when you need to establish serverless infrastructure for low-latency access to feature data at scale. Requires Databricks Runtime 16.4 LTS ML or above, or serverless compute. |
| `DATABRICKS_DELETE_ML_FEATURE_STORE_ONLINE_STORE` | Delete ML Feature Store Online Store | Tool to delete an online store from ML Feature Store. Use when you need to remove online stores that provide low-latency feature serving infrastructure. The deletion is permanent and cannot be undone. Online stores are used for real-time feature retrieval in production ML serving. |
| `DATABRICKS_DELETE_ML_FEATURE_STORE_ONLINE_TABLE` | Delete ML Feature Store Online Table | Tool to delete an online table from ML Feature Store. Use when you need to permanently remove an online table and stop data synchronization. This operation deletes all data in the online table permanently and releases all resources. |
| `DATABRICKS_CREATE_ML_FORECASTING_EXPERIMENT` | Create ML Forecasting Experiment | Tool to create a new AutoML forecasting experiment for time series prediction. Use when you need to automatically train and optimize forecasting models on time series data. The experiment will train multiple models and select the best one based on the primary metric. |
| `DATABRICKS_DELETE_ML_FEATURE_TAG` | Delete ML Feature Tag | Tool to delete a tag from a feature in a feature table in ML Feature Store. Use when you need to remove metadata tags from specific features. This operation removes the tag association from the feature but does not affect the feature data itself. |
| `DATABRICKS_GET_ML_FEATURE_TAG` | Get ML Feature Tag | Tool to retrieve a specific tag from a feature in a feature table in ML Feature Store. Use when you need to get metadata tag details from specific features. This operation returns the tag name and value associated with the feature. |
| `DATABRICKS_SET_OR_UPDATE_ML_FEATURE_TAG` | Set or Update ML Feature Tag | Tool to set or update a tag on a feature in a feature table in ML Feature Store. Use when you need to add or modify metadata tags on specific features. If the tag already exists, it will be updated with the new value. If the tag doesn't exist, it will be created automatically. This operation is idempotent and can be used to ensure a tag has a specific value. |
| `DATABRICKS_GET_ML_MODEL_REGISTRY_PERMISSION_LEVELS` | Get ML Model Registry Permission Levels | Tool to retrieve available permission levels for a Databricks ML registered model. Use when you need to understand what permission levels can be assigned to users or groups for a specific model. Returns permission levels like CAN_READ, CAN_EDIT, CAN_MANAGE, CAN_MANAGE_PRODUCTION_VERSIONS, CAN_MANAGE_STAGING_VERSIONS, and IS_OWNER with their descriptions. |
| `DATABRICKS_DELETE_OAUTH2_SERVICE_PRINCIPAL_SECRET` | Delete OAuth2 Service Principal Secret | Tool to delete an OAuth secret from a service principal at the account level. Use when you need to revoke OAuth credentials for service principal authentication. Once deleted, applications or scripts using tokens generated from that secret will no longer be able to access Databricks APIs. |
| `DATABRICKS_CREATE_OAUTH_SERVICE_PRINCIPAL_SECRET` | Create OAuth Service Principal Secret | Tool to create an OAuth secret for service principal authentication. Use when you need to obtain OAuth access tokens for accessing Databricks Accounts and Workspace APIs. A service principal can have up to five OAuth secrets, each valid for up to two years (730 days). The secret value is only shown once upon creation. |
| `DATABRICKS_DELETE_OAUTH2_SERVICE_PRINCIPAL_SECRET_PROXY` | Delete OAuth2 Service Principal Secret Proxy | Tool to delete an OAuth secret from a service principal. Use when you need to revoke OAuth credentials for service principal authentication. Once deleted, applications using tokens generated from that secret will no longer be able to access Databricks APIs. |
| `DATABRICKS_DELETE_DATABRICKS_PIPELINE` | Delete Databricks Pipeline | Tool to delete a Databricks Delta Live Tables pipeline permanently and stop any active updates. Use when you need to remove a pipeline completely. If the pipeline publishes to Unity Catalog, deletion will cascade to all pipeline tables. This action cannot be easily undone without Databricks support assistance. |
| `DATABRICKS_GET_PIPELINE_PERMISSION_LEVELS` | Get Pipeline Permission Levels | Tool to retrieve available permission levels for a Databricks Delta Live Tables pipeline. Use when you need to understand what permission levels can be assigned to users or groups for a specific pipeline. Returns permission levels like CAN_VIEW, CAN_RUN, CAN_MANAGE, and IS_OWNER with their descriptions. |
| `DATABRICKS_GET_PIPELINE_PERMISSIONS` | Get Pipeline Permissions | Tool to retrieve permissions for a Databricks Delta Live Tables pipeline. Use when you need to check who has access to a pipeline and their permission levels. Returns the complete permissions information including access control lists with user, group, and service principal permissions. |
| `DATABRICKS_LIST_PIPELINE_UPDATES` | List Pipeline Updates | Tool to retrieve a paginated list of updates for a Databricks Delta Live Tables pipeline. Use when you need to view the update history for a specific pipeline. Returns information about each update including state, creation time, and configuration details such as full refresh and table selection. |
| `DATABRICKS_UPDATE_PIPELINE_PERMISSIONS` | Update Pipeline Permissions | Tool to incrementally update permissions on a Databricks pipeline. Use when you need to modify specific permissions without replacing the entire permission set. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request. Pipelines can inherit permissions from their root object. |
| `DATABRICKS_CREATE_QUALITY_MONITOR_V2` | Create Quality Monitor V2 | Tool to create a quality monitor for a Unity Catalog table. Use when you need to set up monitoring for data quality metrics, track drift over time, or monitor ML inference logs. Monitor creation is asynchronous; dashboard and metric tables take 8-20 minutes to complete. Exactly one monitor type (snapshot, time_series, or inference_log) must be specified. |
| `DATABRICKS_GET_DATABRICKS_JOB_RUN_OUTPUT` | Get Databricks Job Run Output | Tool to retrieve output and metadata of a single Databricks task run. Use when you need to get the output value from dbutils.notebook.exit() or check task execution results. IMPORTANT: This only works on task-level run IDs, not top-level job run IDs for multi-task jobs. API returns first 5 MB of output; for larger results use cloud storage. Runs are auto-removed after 60 days. |
| `DATABRICKS_SEARCH_MLFLOW_EXPERIMENTS` | Search MLflow Experiments | Tool to search for MLflow experiments with filtering, ordering, and pagination support. Use when you need to find experiments based on name patterns, tags, or other criteria. Supports SQL-like filtering expressions and ordering by experiment attributes. |
| `DATABRICKS_SEARCH_MLFLOW_RUNS` | Search MLflow Runs | Tool to search for MLflow runs with filtering, ordering, and pagination support. Use when you need to find runs based on metrics, parameters, tags, or other criteria. Supports complex filter expressions with operators like =, !=, >, >=, <, <= for metrics, params, and tags. |
| `DATABRICKS_CREATE_PROVISIONED_THROUGHPUT_ENDPOINT` | Create Provisioned Throughput Endpoint | Tool to create a provisioned throughput serving endpoint for AI models in Databricks. Use when you need to provision model units for production GenAI applications with guaranteed throughput. The endpoint name must be unique across the workspace and can consist of alphanumeric characters, dashes, and underscores. Returns a long-running operation that completes when the endpoint is ready. |
| `DATABRICKS_DELETE_SERVING_ENDPOINT` | Delete Serving Endpoint | Tool to delete a model serving endpoint and all associated data. Use when you need to permanently remove an endpoint. Deletion is permanent and cannot be undone. This operation disables usage and deletes all data associated with the endpoint. |
| `DATABRICKS_GET_SERVING_ENDPOINT_DETAILS` | Get Serving Endpoint Details | Tool to retrieve detailed information about a specific serving endpoint by name. Use when you need to get comprehensive information about a serving endpoint including its configuration, state, served entities, traffic routing, and metadata. |
| `DATABRICKS_GET_SERVING_ENDPOINT_OPENAPI_SPEC` | Get Serving Endpoint OpenAPI Spec | Tool to retrieve the OpenAPI 3.1.0 specification for a serving endpoint. Use when you need to understand the endpoint's schema, generate client code, or visualize the API structure. The endpoint must be in a READY state and the served model must have a model signature logged. |
| `DATABRICKS_GET_SERVING_ENDPOINT_PERMISSION_LEVELS` | Get Serving Endpoint Permission Levels | Tool to retrieve available permission levels for a Databricks serving endpoint. Use when you need to understand what permission levels can be assigned to users or groups for access control. Returns permission levels like CAN_MANAGE, CAN_QUERY, and CAN_VIEW with their descriptions. |
| `DATABRICKS_UPDATE_SERVING_ENDPOINT_RATE_LIMITS` | Update Serving Endpoint Rate Limits | Tool to update rate limits for a Databricks serving endpoint. Use when you need to control the number of API calls allowed within a time period. Note: This endpoint is deprecated; consider using AI Gateway for rate limit management instead. |
| `DATABRICKS_UPDATE_SERVING_ENDPOINT_AI_GATEWAY` | Update Serving Endpoint AI Gateway | Tool to update AI Gateway configuration of a Databricks serving endpoint. Use when you need to configure traffic fallback, AI guardrails, payload logging, rate limits, or usage tracking. Supports external model, provisioned throughput, and pay-per-token endpoints; agent endpoints currently only support inference tables. |
| `DATABRICKS_DELETE_AI_BI_DASHBOARD_EMBEDDING_ACCESS_POLICY` | Delete AI/BI Dashboard Embedding Access Policy | Tool to delete AI/BI dashboard embedding access policy, reverting to default. Use when you need to remove the workspace-level policy for AI/BI published dashboard embedding. Upon deletion, the workspace reverts to the default setting (ALLOW_APPROVED_DOMAINS), conditionally permitting AI/BI dashboards to be embedded on approved domains. |
| `DATABRICKS_GET_AI_BI_DASHBOARD_EMBEDDING_ACCESS_POLICY` | Get AI/BI Dashboard Embedding Access Policy | Tool to retrieve workspace AI/BI dashboard embedding access policy setting. Use when you need to check whether AI/BI published dashboard embedding is enabled, conditionally enabled, or disabled. The default setting is ALLOW_APPROVED_DOMAINS which permits AI/BI dashboards to be embedded on approved domains. |
| `DATABRICKS_UPDATE_AI_BI_DASHBOARD_EMBEDDING_ACCESS_POLICY` | Update AI/BI Dashboard Embedding Access Policy | Tool to update AI/BI dashboard embedding workspace access policy at the workspace level. Use when you need to control whether AI/BI published dashboard embedding is enabled, conditionally enabled, or disabled. Follows read-modify-write workflow with etag-based optimistic concurrency control to prevent race conditions. |
| `DATABRICKS_DELETE_AI_BI_DASHBOARD_EMBEDDING_APPROVED_DOMAINS` | Delete AI/BI Dashboard Embedding Approved Domains | Tool to delete the list of approved domains for AI/BI dashboard embedding, reverting to default. Use when you need to remove the workspace-level approved domains list for hosting embedded AI/BI dashboards. Upon deletion, the workspace reverts to an empty approved domains list. The approved domains list cannot be modified when the current access policy is not configured to ALLOW_APPROVED_DOMAINS. |
| `DATABRICKS_GET_AI_BI_DASHBOARD_EMBEDDING_APPROVED_DOMAINS` | Get AI/BI Dashboard Embedding Approved Domains | Tool to retrieve the list of domains approved to host embedded AI/BI dashboards. Use when you need to check which external domains are permitted to embed AI/BI dashboards. The approved domains list cannot be modified unless the workspace access policy is set to ALLOW_APPROVED_DOMAINS. |
| `DATABRICKS_UPDATE_AI_BI_DASHBOARD_EMBEDDING_APPROVED_DOMAINS` | Update AI/BI Dashboard Embedding Approved Domains | Tool to update the list of domains approved to host embedded AI/BI dashboards at the workspace level. Use when you need to modify the approved domains list. The approved domains list can only be modified when the current access policy is set to ALLOW_APPROVED_DOMAINS. |
| `DATABRICKS_GET_AUTOMATIC_CLUSTER_UPDATE_SETTING` | Get Automatic Cluster Update Setting | Tool to retrieve automatic cluster update setting for the workspace. Use when you need to check whether automatic cluster updates are enabled, view maintenance window configuration, or get restart behavior settings. This setting controls whether clusters automatically update during maintenance windows. Currently in Public Preview. |
| `DATABRICKS_UPDATE_AUTOMATIC_CLUSTER_UPDATE_SETTING` | Update Automatic Cluster Update Setting | Tool to update workspace automatic cluster update configuration with etag-based concurrency control. Use when you need to enable/disable automatic cluster updates, configure maintenance windows, or adjust restart behavior. Requires Premium pricing tier and admin access. If the setting is updated concurrently, the PATCH request fails with HTTP 409 requiring retry with fresh etag. |
| `DATABRICKS_GET_COMPLIANCE_SECURITY_PROFILE_SETTING` | Get Compliance Security Profile Setting | Tool to retrieve workspace compliance security profile setting. Use when you need to check whether CSP is enabled or view configured compliance standards. The CSP enables additional monitoring, enforced instance types for inter-node encryption, hardened compute images, and other security controls. Once enabled, this setting represents a permanent workspace change that cannot be disabled. |
| `DATABRICKS_DELETE_DASHBOARD_EMAIL_SUBSCRIPTIONS_SETTING` | Delete Dashboard Email Subscriptions Setting | Tool to delete the dashboard email subscriptions setting, reverting to default value. Use when you need to revert the workspace setting that controls whether schedules or workload tasks for refreshing AI/BI Dashboards can send subscription emails. Upon deletion, the setting reverts to its default value (enabled/true). This is a workspace-level setting. |
| `DATABRICKS_GET_DASHBOARD_EMAIL_SUBSCRIPTIONS_SETTING` | Get Dashboard Email Subscriptions Setting | Tool to retrieve dashboard email subscriptions setting for the workspace. Use when you need to check whether schedules or workload tasks for refreshing AI/BI Dashboards can send subscription emails. By default, this setting is enabled. |
| `DATABRICKS_UPDATE_DASHBOARD_EMAIL_SUBSCRIPTIONS_SETTING` | Update Dashboard Email Subscriptions Setting | Tool to update the Dashboard Email Subscriptions setting for the workspace with etag-based concurrency control. Use when you need to enable or disable whether dashboard schedules can send subscription emails. If the setting is updated concurrently, the PATCH request fails with HTTP 409 requiring retry with fresh etag. |
| `DATABRICKS_DELETE_DEFAULT_NAMESPACE_SETTING` | Delete Default Namespace Setting | Tool to delete the default namespace setting for the workspace, removing the default catalog configuration. Use when you need to remove the default catalog used for queries without fully qualified names. Strongly recommended to use etag in a read-delete pattern to prevent concurrent modification conflicts (HTTP 409). |
| `DATABRICKS_GET_DEFAULT_NAMESPACE_SETTING` | Get Default Namespace Setting | Tool to retrieve the default catalog namespace setting for the workspace. Use when you need to check which catalog is used for unqualified table references in Unity Catalog-enabled compute. Changes to this setting require restart of clusters and SQL warehouses to take effect. |
| `DATABRICKS_UPDATE_DEFAULT_NAMESPACE_SETTING` | Update Default Namespace Setting | Tool to update the default catalog namespace configuration for workspace queries with etag-based concurrency control. Use when you need to configure the default catalog used for queries without fully qualified three-level names. Requires a restart of clusters and SQL warehouses to take effect. Only applies to Unity Catalog-enabled compute. If concurrent updates occur, the request fails with 409 status requiring retry with fresh etag. |
| `DATABRICKS_DELETE_DEFAULT_WAREHOUSE_ID_SETTING` | Delete Default Warehouse ID Setting | Tool to delete the default warehouse ID setting for the workspace, reverting to default state. Use when you need to remove the default SQL warehouse configuration. Strongly recommended to use etag in a read-delete pattern to prevent concurrent modification conflicts (HTTP 409). |
| `DATABRICKS_GET_DEFAULT_WAREHOUSE_ID_SETTING` | Get Default Warehouse ID Setting | Tool to retrieve the default SQL warehouse ID setting for the workspace. Use when you need to check which warehouse is configured as the default for SQL authoring surfaces, AI/BI dashboards, Genie, Alerts, and Catalog Explorer. |
| `DATABRICKS_UPDATE_DEFAULT_WAREHOUSE_ID_SETTING` | Update Default Warehouse ID Setting | Tool to update the default SQL warehouse configuration for the workspace with etag-based concurrency control. Use when you need to configure which warehouse is used as the default for SQL operations and queries in the workspace. If concurrent updates occur, the request fails with 409 status requiring retry with fresh etag. |
| `DATABRICKS_DELETE_DISABLE_LEGACY_ACCESS_SETTING` | Delete Disable Legacy Access Setting | Tool to delete the disable legacy access workspace setting, re-enabling legacy features. Use when you need to revert to allowing legacy Databricks features. Strongly recommended to use etag in a read-delete pattern to prevent concurrent modification conflicts (HTTP 409). Changes take up to 5 minutes and require cluster/warehouse restart. |
| `DATABRICKS_GET_DISABLE_LEGACY_ACCESS_SETTING` | Get Disable Legacy Access Setting | Tool to retrieve the disable legacy access workspace setting. Use when you need to check whether legacy feature access is disabled, including direct Hive Metastore access, Fallback Mode on external locations, and Databricks Runtime versions prior to 13.3 LTS. |
| `DATABRICKS_UPDATE_DISABLE_LEGACY_ACCESS_SETTING` | Update Disable Legacy Access Setting | Tool to update workspace disable legacy access setting with etag-based concurrency control. Use when you need to enable or disable legacy access features including direct Hive Metastore access, external location fallback mode, and Databricks Runtime versions prior to 13.3 LTS. If concurrent updates occur, the request fails with HTTP 409 requiring retry with fresh etag from the error response. |
| `DATABRICKS_DELETE_DISABLE_LEGACY_DBFS_SETTING` | Delete Disable Legacy DBFS Setting | Tool to delete the disable legacy DBFS workspace setting, reverting to default DBFS access behavior. Use when you need to re-enable access to legacy DBFS root and mounts. Strongly recommended to use etag in a read-delete pattern to prevent concurrent modification conflicts (HTTP 409). |
| `DATABRICKS_GET_DISABLE_LEGACY_DBFS_SETTING` | Get Disable Legacy DBFS Setting | Tool to retrieve the disable legacy DBFS workspace setting. Use when you need to check whether legacy DBFS root and mount access is disabled across all interfaces (UI, APIs, CLI, FUSE). When enabled, this setting also disables Databricks Runtime versions prior to 13.3 LTS and requires manual restart of compute clusters and SQL warehouses to take effect. |
| `DATABRICKS_UPDATE_DISABLE_LEGACY_DBFS_SETTING` | Update Disable Legacy DBFS Setting | Tool to update workspace disable legacy DBFS setting with etag-based concurrency control. Use when you need to enable or disable legacy DBFS features including DBFS root access, mounts, and legacy Databricks Runtime versions prior to 13.3 LTS. Changes take up to 20 minutes to take effect and require manual restart of compute clusters and SQL warehouses. |
| `DATABRICKS_GET_ENABLE_EXPORT_NOTEBOOK_SETTING` | Get Enable Export Notebook Setting | Tool to retrieve workspace setting controlling notebook export functionality. Use when you need to check whether users can export notebooks and files from the Workspace UI. Administrators use this setting to manage data exfiltration controls. |
| `DATABRICKS_UPDATE_ENABLE_EXPORT_NOTEBOOK` | Update Enable Export Notebook | Tool to update workspace notebook and file export setting. Use when you need to enable or disable users' ability to export notebooks and files from the Workspace UI. Requires admin access. |
| `DATABRICKS_GET_ENABLE_NOTEBOOK_TABLE_CLIPBOARD_SETTING` | Get Enable Notebook Table Clipboard Setting | Tool to retrieve notebook table clipboard setting for the workspace. Use when you need to check whether notebook table clipboard functionality is enabled. This setting controls whether users can copy data from tables in notebooks to their clipboard. |
| `DATABRICKS_UPDATE_ENABLE_NOTEBOOK_TABLE_CLIPBOARD` | Update Enable Notebook Table Clipboard | Tool to update workspace setting for notebook table clipboard. Use when you need to enable or disable users' ability to copy tabular data from notebook result tables to clipboard. Requires workspace admin privileges. |
| `DATABRICKS_GET_ENABLE_RESULTS_DOWNLOADING_SETTING` | Get Enable Results Downloading Setting | Tool to retrieve workspace setting controlling notebook results download functionality. Use when you need to check whether users can download notebook query results. Requires workspace administrator privileges to access. |
| `DATABRICKS_UPDATE_ENABLE_RESULTS_DOWNLOADING` | Update Enable Results Downloading | Tool to update workspace notebook results download setting. Use when you need to enable or disable users' ability to download notebook results. Requires admin access. |
| `DATABRICKS_GET_ENHANCED_SECURITY_MONITORING_SETTING` | Get Enhanced Security Monitoring Setting | Tool to retrieve enhanced security monitoring workspace setting. Use when you need to check whether Enhanced Security Monitoring is enabled for the workspace. Enhanced Security Monitoring provides a hardened disk image and additional security monitoring agents. It is automatically enabled when compliance security profile is active, and can be manually toggled when compliance security profile is disabled. |
| `DATABRICKS_UPDATE_ENHANCED_SECURITY_MONITORING` | Update Enhanced Security Monitoring | Tool to update enhanced security monitoring workspace settings. Use when you need to enable or disable Enhanced Security Monitoring (ESM) for the workspace. Requires the etag from a previous GET request for optimistic concurrency control. |
| `DATABRICKS_CREATE_IP_ACCESS_LIST` | Create IP Access List | Tool to create a new IP access list for workspace access control. Use when you need to allow or block specific IP addresses and CIDR ranges from accessing the Databricks workspace. The API will reject creation if the resulting list would block the caller's current IP address. Changes may take a few minutes to take effect. |
| `DATABRICKS_GET_IP_ACCESS_LIST` | Get IP Access List | Tool to retrieve details of a specific IP access list by its ID. Use when you need to view the configuration of allowed or blocked IP addresses and subnets for accessing the workspace or workspace-level APIs. Requires workspace admin privileges. |
| `DATABRICKS_DELETE_LLM_PROXY_PARTNER_POWERED_SETTING` | Delete LLM Proxy Partner Powered Setting | Tool to delete (revert to default) the partner-powered AI features workspace setting. Use when you need to revert the workspace to default configuration for AI features powered by partner providers. By default, this setting is enabled for workspaces without a compliance security profile. Strongly recommended to use etag in a read-delete pattern to prevent concurrent modification conflicts (HTTP 409). |
| `DATABRICKS_GET_LLM_PROXY_PARTNER_POWERED_SETTING` | Get LLM Proxy Partner Powered Setting | Tool to retrieve workspace-level setting that controls whether partner-powered AI features are enabled. Use when you need to check if features like Databricks Assistant, Genie, and Data Science Agent can use models hosted by partner providers (Azure OpenAI or Anthropic). By default, this setting is enabled for non-CSP workspaces. |
| `DATABRICKS_UPDATE_LLM_PROXY_PARTNER_POWERED_SETTING` | Update LLM Proxy Partner Powered Setting | Tool to update workspace-level setting controlling whether AI features are powered by partner-hosted models with etag-based concurrency control. Use when you need to enable or disable partner-powered AI features (Azure OpenAI or Anthropic). When disabled, Databricks-hosted models are used. If concurrent updates occur, the request fails with 409 status requiring retry with fresh etag. |
| `DATABRICKS_CREATE_NOTIFICATION_DESTINATION` | Create Notification Destination | Tool to create a notification destination for alerts and jobs. Use when you need to set up destinations for sending notifications outside of Databricks (email, Slack, PagerDuty, Microsoft Teams, or webhooks). Only workspace admins can create notification destinations. Requires HTTPS for webhooks with SSL certificates signed by a trusted certificate authority. |
| `DATABRICKS_DELETE_NOTIFICATION_DESTINATION` | Delete Notification Destination | Tool to delete a notification destination from the Databricks workspace. Use when you need to permanently remove a notification destination. Only workspace administrators have permission to perform this delete operation. |
| `DATABRICKS_GET_NOTIFICATION_DESTINATION` | Get Notification Destination | Tool to retrieve details of a notification destination by its UUID identifier. Use when you need to get configuration details, display name, and type information for a specific notification destination. Only users with workspace admin permissions will see the full configuration details. |
| `DATABRICKS_UPDATE_NOTIFICATION_DESTINATION` | Update Notification Destination | Tool to update an existing notification destination configuration. Use when you need to modify display name or configuration settings for email, Slack, PagerDuty, Microsoft Teams, or webhook destinations. Requires workspace admin permissions. At least one field (display_name or config) must be provided. |
| `DATABRICKS_DELETE_RESTRICT_WORKSPACE_ADMINS_SETTING` | Delete Restrict Workspace Admins Setting | Tool to delete/revert the restrict workspace admins setting to its default state. Use when you need to restore default workspace administrator capabilities for service principal token creation and job ownership settings. Strongly recommended to use etag in a read-delete pattern to prevent concurrent modification conflicts (HTTP 409). |
| `DATABRICKS_GET_RESTRICT_WORKSPACE_ADMINS_SETTING` | Get Restrict Workspace Admins Setting | Tool to retrieve the restrict workspace admins setting for the workspace. Use when you need to check whether workspace administrators are restricted in their ability to create service principal tokens, change job owners, or modify job run_as settings. This setting controls security boundaries for admin privileges. |
| `DATABRICKS_UPDATE_RESTRICT_WORKSPACE_ADMINS_SETTING` | Update Restrict Workspace Admins Setting | Tool to update the restrict workspace admins setting with etag-based concurrency control. Use when you need to modify workspace administrator capabilities for service principal token creation and job ownership/run-as settings. Requires account admin permissions and workspace membership. If concurrent updates occur, the request fails with HTTP 409 requiring retry with fresh etag. |
| `DATABRICKS_DELETE_SQL_RESULTS_DOWNLOAD_SETTING` | Delete SQL Results Download Setting | Tool to delete SQL results download workspace setting, reverting to default state where users are permitted to download results. Use when you need to restore the factory default configuration. Strongly recommended to use etag in a read-delete pattern to prevent concurrent modification conflicts (HTTP 409). |
| `DATABRICKS_GET_SQL_RESULTS_DOWNLOAD_SETTING` | Get SQL Results Download Setting | Tool to retrieve SQL results download workspace setting. Use when you need to check whether users within the workspace are allowed to download results from the SQL Editor and AI/BI Dashboards UIs. By default, this setting is enabled (set to true). Returns etag for use in subsequent update/delete operations. |
| `DATABRICKS_UPDATE_SQL_RESULTS_DOWNLOAD_SETTING` | Update SQL Results Download Setting | Tool to update workspace SQL results download setting controlling whether users can download results from SQL Editor and AI/BI Dashboards. Use when you need to enable or disable SQL query results download capability. Requires workspace admin access and uses etag-based optimistic concurrency control to prevent conflicting updates. |
| `DATABRICKS_DELETE_TOKEN_VIA_TOKEN_MANAGEMENT` | Delete Token via Token Management | Tool to delete a token specified by ID via token management. Use when you need to revoke or remove access tokens. Admins can delete tokens for any user. |
| `DATABRICKS_GET_TOKEN_INFORMATION` | Get Token Information | Tool to retrieve detailed information about a specific token by its ID from the token management system. Use when you need to get token metadata including creation time, expiry, owner, and usage information. Requires appropriate permissions to access token information. |
| `DATABRICKS_GET_TOKEN_MANAGEMENT_PERMISSION_LEVELS` | Get Token Management Permission Levels | Tool to retrieve available permission levels for personal access token management. Use when you need to understand what permission levels can be assigned for managing tokens in the workspace. Returns permission levels like CAN_USE and CAN_MANAGE with their descriptions. |
| `DATABRICKS_GET_TOKEN_MANAGEMENT_PERMISSIONS` | Get Token Management Permissions | Tool to retrieve permissions for workspace token management. Use when you need to check which users, groups, and service principals have permissions to create and manage personal access tokens. Requires workspace admin privileges and is available only in Databricks Premium plan. |
| `DATABRICKS_SET_TOKEN_MANAGEMENT_PERMISSIONS` | Set Token Management Permissions | Tool to set permissions for personal access token management, replacing all existing permissions. Use when configuring which users, groups, and service principals can create and use tokens. This operation replaces ALL existing permissions; if you need to add or modify permissions without replacing existing ones, use the update_permissions method instead. Workspace admins always retain CAN_MANAGE permissions. |
| `DATABRICKS_UPDATE_TOKEN_MANAGEMENT_PERMISSIONS` | Update Token Management Permissions | Tool to incrementally update permissions for personal access token management. Use when you need to modify who can create and use personal access tokens. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request. |
| `DATABRICKS_CREATE_PERSONAL_ACCESS_TOKEN` | Create Personal Access Token | Tool to create a personal access token (PAT) for Databricks API authentication. Use when you need to generate a new token for REST API requests. Each PAT is valid for only one workspace. Users can create up to 600 PATs per workspace. Databricks automatically revokes PATs that haven't been used for 90 days. |
| `DATABRICKS_GET_PUBLIC_WORKSPACE_SETTING` | Get Public Workspace Setting | Tool to retrieve workspace-level settings by setting ID. Use when you need to get the current value of a specific workspace setting along with its version (etag) for subsequent updates. Returns setting value with etag for optimistic concurrency control. |
| `DATABRICKS_SET_WORKSPACE_CONFIGURATION_STATUS` | Set Workspace Configuration Status | Tool to set workspace configuration settings for a Databricks workspace. Use when you need to enable/disable workspace features or update configuration values. Requires workspace admin permissions. Invalid configuration keys will cause the entire request to fail. |
| `DATABRICKS_CREATE_SHARING_PROVIDER` | Create Sharing Provider | Tool to create a new authentication provider in Unity Catalog for Delta Sharing. Use when establishing a provider object for receiving data from external sources that aren't Unity Catalog-enabled. Requires metastore admin privileges or CREATE_PROVIDER permission on the metastore. Most recipients should not need to create provider objects manually as they are typically auto-created during Delta Sharing. |
| `DATABRICKS_GET_SHARING_PROVIDER` | Get Sharing Provider | Tool to retrieve information about a specific Delta Sharing provider in Unity Catalog. Use when you need to get provider details including authentication type, ownership, and connection information. Requires metastore admin privileges or provider ownership. |
| `DATABRICKS_UPDATE_SHARING_PROVIDER` | Update Sharing Provider | Tool to update an existing Delta Sharing authentication provider in Unity Catalog. Use when you need to modify provider properties like comment, owner, or name. The caller must be either a metastore admin or the owner of the provider. To rename the provider, the caller must be BOTH a metastore admin AND the owner. |
| `DATABRICKS_CREATE_SHARING_RECIPIENT` | Create Sharing Recipient | Tool to create a Delta Sharing recipient in Unity Catalog metastore. Use when you need to create a recipient object representing an identity who will consume shared data. Recipients can be configured for Databricks-to-Databricks sharing or open sharing with token authentication. Requires metastore admin or CREATE_RECIPIENT privilege. |
| `DATABRICKS_DELETE_SHARING_RECIPIENT` | Delete Sharing Recipient | Tool to delete a Delta Sharing recipient from Unity Catalog metastore. Use when you need to permanently remove a recipient object. Deletion invalidates all access tokens and immediately revokes access to shared data for users represented by the recipient. Requires recipient owner privileges. |
| `DATABRICKS_GET_SHARING_RECIPIENT` | Get Sharing Recipient | Tool to retrieve a Delta Sharing recipient from Unity Catalog metastore by name. Use when you need to get information about a recipient object representing an entity that receives shared data. Requires recipient ownership or metastore admin privileges. |
| `DATABRICKS_CREATE_SHARE` | Create Share | Tool to create a new share for data objects in Unity Catalog. Use when you need to establish a share for distributing data assets via Delta Sharing protocol. Data objects can be added after creation with update. Requires metastore admin or CREATE_SHARE privilege on the metastore. |
| `DATABRICKS_DELETE_SHARE` | Delete Share | Tool to delete a Unity Catalog share from the metastore. Use when you need to permanently remove a share object. Deletion immediately revokes recipient access to the shared data. This operation is permanent and requires share owner privileges. |
| `DATABRICKS_GET_SHARE_DETAILS` | Get Share Details | Tool to retrieve details of a specific share from Unity Catalog. Use when you need to get information about a share including its metadata, owner, and optionally the list of shared data objects. Requires metastore admin privileges or share ownership. |
| `DATABRICKS_GET_SHARE_PERMISSIONS` | Get Share Permissions | Tool to retrieve permissions for a Delta Sharing share from Unity Catalog. Use when you need to check which principals have been granted privileges on a share. Requires metastore admin privileges or share ownership. |
| `DATABRICKS_UPDATE_SHARE` | Update Share | Tool to update an existing share in Unity Catalog with changes to metadata or data objects. Use when you need to modify share properties (comment, owner, name) or manage shared data objects (add, remove, or update tables/views/volumes). The caller must be a metastore admin or the owner of the share. For table additions, the owner must have SELECT privilege on the table. |
| `DATABRICKS_GET_SPARK_VERSIONS` | Get Spark Versions | Tool to retrieve all available Databricks Runtime and Spark versions for cluster creation. Use when you need to determine which Spark versions are available for creating or configuring clusters. |
| `DATABRICKS_CREATE_SQL_ALERT` | Create SQL Alert | Tool to create a new Databricks SQL alert for query monitoring. Use when you need to set up alerts that monitor query results and trigger notifications when specified conditions are met. The alert will evaluate the query results and send notifications when the condition threshold is crossed. |
| `DATABRICKS_DELETE_SQL_ALERT` | Delete SQL Alert | Tool to delete a Databricks SQL alert (soft delete to trash). Use when you need to remove an alert from active monitoring. The alert is moved to trash and can be restored through the UI within 30 days, after which it is permanently deleted. |
| `DATABRICKS_GET_SQL_ALERT_DETAILS` | Get SQL Alert Details | Tool to retrieve details of a specific Databricks SQL alert by its UUID. Use when you need to get information about an alert including its configuration, trigger conditions, state, and notification settings. |
| `DATABRICKS_CREATE_LEGACY_SQL_ALERT` | Create Legacy SQL Alert | Tool to create a legacy SQL alert that periodically runs a query and notifies when conditions are met. Use when you need to create alerts using the legacy API endpoint. Note: This is a legacy endpoint that has been replaced by /api/2.0/sql/alerts and is deprecated. |
| `DATABRICKS_DELETE_LEGACY_SQL_ALERT` | Delete Legacy SQL Alert | Tool to permanently delete a legacy SQL alert. Use when you need to remove an alert using the legacy API endpoint. Note: Unlike the newer /api/2.0/sql/alerts endpoint, this legacy endpoint deletes alerts permanently; they cannot be restored from trash. |
| `DATABRICKS_GET_LEGACY_SQL_ALERT` | Get Legacy SQL Alert | Tool to retrieve details of a specific legacy SQL alert by its ID. Use when you need to get information about a legacy alert including its configuration, state, query details, and notification settings. Note: This is a legacy endpoint (/api/2.0/preview/sql/alerts) that is deprecated and being replaced by /api/2.0/sql/alerts. |
| `DATABRICKS_LIST_LEGACY_SQL_ALERTS` | List Legacy SQL Alerts | Tool to list all legacy SQL alerts accessible to the authenticated user. Use when you need to retrieve a list of all legacy alerts in the workspace. Note: This is a legacy endpoint (/api/2.0/preview/sql/alerts) that is deprecated and being replaced by /api/2.0/sql/alerts. |
| `DATABRICKS_UPDATE_LEGACY_SQL_ALERT` | Update Legacy SQL Alert | Tool to update a legacy SQL alert configuration including name, query reference, trigger conditions, and notification settings. Use when you need to modify existing alerts using the legacy API endpoint. Note: This is a legacy endpoint that has been replaced by /api/2.0/sql/alerts and is deprecated. |
| `DATABRICKS_UPDATE_SQL_ALERT` | Update SQL Alert | Tool to update an existing Databricks SQL alert using partial update with field mask. Use when you need to modify alert properties including display name, query reference, trigger conditions, notification settings, or ownership. |
| `DATABRICKS_DELETE_SQL_DASHBOARD` | Delete SQL Dashboard | Tool to delete a legacy Databricks SQL dashboard by moving it to trash (soft delete). Use when you need to remove a dashboard from active use. The dashboard is moved to trash and can be restored later through the UI. Trashed dashboards do not appear in searches and cannot be shared. |
| `DATABRICKS_GET_SQL_DASHBOARD` | Get SQL Dashboard | Tool to retrieve complete legacy dashboard definition with metadata, widgets, and queries. Use when you need to get detailed information about a SQL dashboard. Note: Legacy dashboards API deprecated as of January 12, 2026. Databricks recommends using AI/BI dashboards (Lakeview API) for new implementations. |
| `DATABRICKS_UPDATE_SQL_DASHBOARD` | Update SQL Dashboard | Tool to update legacy Databricks SQL dashboard attributes (name, run_as_role, tags). Use when you need to modify dashboard metadata. Note: This operation only affects dashboard object attributes and does NOT add, modify, or remove widgets. |
| `DATABRICKS_GET_SQL_OBJECT_PERMISSIONS` | Get SQL Object Permissions | Tool to retrieve the access control list for a specified SQL object (alerts, dashboards, queries, or data_sources). Use when you need to check who has access to a SQL object and their permission levels. Note: This API is deprecated; use the Workspace API for new implementations. |
| `DATABRICKS_SET_SQL_OBJECT_PERMISSIONS` | Set SQL Object Permissions | Tool to set access control list for SQL objects (alerts, dashboards, queries, or data_sources). Use when you need to configure permissions for a SQL object. IMPORTANT: This operation REPLACES ALL existing permissions. To retain existing permissions, include them in the access_control_list. Note: This is a legacy/deprecated API; Databricks recommends using the Workspace API instead. |
| `DATABRICKS_CREATE_SQL_QUERY` | Create SQL Query | Tool to create a saved SQL query object in Databricks. Use when you need to create a new saved query definition that includes the target SQL warehouse, query text, name, description, tags, and parameters. Note: This creates a saved query object, not an immediate execution. Use Statement Execution API for immediate query execution. |
| `DATABRICKS_DELETE_SQL_QUERY` | Delete SQL Query | Tool to delete a Databricks SQL query (soft delete to trash). Use when you need to remove a query from searches and list views. The query is moved to trash and can be restored through the UI within 30 days, after which it is permanently deleted. |
| `DATABRICKS_GET_SQL_QUERY_DETAILS` | Get SQL Query Details | Tool to retrieve detailed information about a specific SQL query by its UUID. Use when you need to get query configuration including SQL text, warehouse ID, parameters, ownership, and metadata. |
| `DATABRICKS_CREATE_LEGACY_SQL_QUERY` | Create Legacy SQL Query | Tool to create a new SQL query definition using the legacy API. Use when you need to create queries with the legacy /preview/sql/queries endpoint that uses data_source_id. Note: This is a legacy endpoint. The API has been replaced by /api/2.0/sql/queries which uses warehouse_id instead of data_source_id. |
| `DATABRICKS_DELETE_LEGACY_SQL_QUERY` | Delete Legacy SQL Query | Tool to delete a legacy SQL query (soft delete to trash). Use when you need to remove a legacy query from searches and list views. The query is moved to trash and permanently deleted after 30 days. Note: This is a deprecated legacy API that will be phased out; use the non-legacy endpoint instead. |
| `DATABRICKS_GET_LEGACY_SQL_QUERY` | Get Legacy SQL Query | Tool to retrieve details of a specific legacy SQL query by its UUID. Use when you need to get information about a legacy query including its SQL text, parameters, configuration, and metadata. Note: This is a legacy endpoint (/api/2.0/preview/sql/queries) that has been replaced by /api/2.0/sql/queries and will be supported for six months to allow migration time. |
| `DATABRICKS_RESTORE_SQL_QUERY_LEGACY` | Restore SQL Query (Legacy) | Tool to restore a trashed SQL query to active state. Use when you need to recover a deleted query within 30 days of deletion. Once restored, the query reappears in list views and searches and can be used for alerts again. This is a legacy/deprecated API endpoint. |
| `DATABRICKS_UPDATE_LEGACY_SQL_QUERY` | Update Legacy SQL Query | Tool to update an existing SQL query definition using the legacy API. Use when you need to modify queries with the legacy /preview/sql/queries endpoint. Note: This is a legacy/deprecated endpoint. The newer API uses PATCH /api/2.0/sql/queries/{id} instead. |
| `DATABRICKS_UPDATE_SQL_QUERY` | Update SQL Query | Tool to update a saved SQL query object in Databricks using partial field updates. Use when you need to modify specific fields of an existing query without replacing the entire object. Requires update_mask parameter to specify which fields to update. Supports updating query text, configuration, parameters, and metadata. |
| `DATABRICKS_LIST_SQL_QUERY_HISTORY` | List SQL Query History | Tool to retrieve the history of SQL queries executed against SQL warehouses and serverless compute. Use when you need to list queries by time range, status, user, or warehouse. Returns most recently started queries first (up to max_results). Supports filtering and pagination. |
| `DATABRICKS_CREATE_SQL_QUERY_VISUALIZATION` | Create SQL Query Visualization | Tool to create a new visualization for a Databricks SQL query. Use when you need to add a visual representation (table, chart, counter, funnel, or pivot table) to an existing saved query. The visualization will be attached to the specified query and can be added to dashboards. |
| `DATABRICKS_CREATE_LEGACY_SQL_QUERY_VISUALIZATION` | Create Legacy SQL Query Visualization | Tool to create a visualization in a SQL query using the legacy API. Use when you need to add a visual representation (table, chart, counter, pivot, etc.) to an existing saved query. Note: This is a deprecated endpoint; users should migrate to the current /api/2.0/sql/visualizations API. Databricks does not recommend modifying visualization settings in JSON. |
| `DATABRICKS_DELETE_LEGACY_SQL_QUERY_VISUALIZATION` | Delete Legacy SQL Query Visualization | Tool to permanently delete a legacy SQL query visualization. Use when you need to remove a visualization from a SQL query using the legacy API endpoint. Note: This is a deprecated legacy endpoint. Databricks recommends migrating to /api/2.0/sql/visualizations/{id} instead. |
| `DATABRICKS_UPDATE_LEGACY_SQL_QUERY_VISUALIZATION` | Update Legacy SQL Query Visualization | Tool to update a visualization in a SQL query using the legacy API. Use when you need to modify visualization properties such as name, description, type, and options. Note: This is a deprecated endpoint; users should migrate to the current queryvisualizations/update method. Databricks does not recommend modifying visualization settings in JSON. |
| `DATABRICKS_UPDATE_SQL_QUERY_VISUALIZATION` | Update SQL Query Visualization | Tool to update an existing Databricks SQL query visualization using partial update with field mask. Use when you need to modify visualization properties such as display name, description, type, or query attachment. |
| `DATABRICKS_GET_REDASH_V2_CONFIG` | Get Redash V2 Config | Tool to retrieve workspace configuration for Redash V2 in Databricks SQL. Use when you need to get Redash configuration settings for the current workspace. |
| `DATABRICKS_CANCEL_SQL_STATEMENT_EXECUTION` | Cancel SQL Statement Execution | Tool to cancel an executing SQL statement on a Databricks warehouse. Use when you need to terminate a running SQL query. The response indicates successful receipt of the cancel request, but does not guarantee cancellation. Callers must poll the statement status to confirm the terminal state (CANCELED, SUCCEEDED, FAILED, or CLOSED). |
| `DATABRICKS_DELETE_SQL_WAREHOUSE` | Delete SQL Warehouse | Tool to delete a SQL warehouse from the Databricks workspace. Use when you need to permanently remove a SQL compute resource. Deleted warehouses may be restored within 14 days by contacting Databricks support. |
| `DATABRICKS_EDIT_SQL_WAREHOUSE` | Edit SQL Warehouse | Tool to update the configuration of an existing SQL warehouse. Use when you need to modify warehouse settings like cluster size, scaling parameters, auto-stop behavior, or enable features like Photon acceleration and serverless compute. The warehouse is identified by its ID, and you can update various properties including resource allocation and performance optimizations. |
| `DATABRICKS_GET_SQL_WAREHOUSE_DETAILS` | Get SQL Warehouse Details | Tool to retrieve detailed information about a specific SQL warehouse by its ID. Use when you need to get configuration, state, connection details, and resource allocation for a SQL warehouse. Returns comprehensive warehouse information including cluster settings, JDBC/ODBC connection strings, and health status. |
| `DATABRICKS_GET_SQL_WAREHOUSE_PERMISSION_LEVELS` | Get SQL Warehouse Permission Levels | Tool to retrieve available permission levels for a Databricks SQL warehouse. Use when you need to understand what permission levels can be assigned to users, groups, or service principals for a specific SQL warehouse. Returns permission levels like CAN_USE, CAN_MANAGE, IS_OWNER, CAN_VIEW, and CAN_MONITOR with their descriptions. |
| `DATABRICKS_GET_SQL_WAREHOUSE_PERMISSIONS` | Get SQL Warehouse Permissions | Tool to retrieve permissions for a Databricks SQL warehouse. Use when you need to check who has access to a specific SQL warehouse and their permission levels. Returns the access control list with user, group, and service principal permissions, including inherited permissions from parent objects. |
| `DATABRICKS_GET_WORKSPACE_WAREHOUSE_CONFIG` | Get Workspace Warehouse Config | Tool to retrieve workspace-level SQL warehouse configuration settings. Use when you need to check security policies, serverless compute settings, channel versions, or warehouse type restrictions that apply to all SQL warehouses in the workspace. |
| `DATABRICKS_SET_SQL_WAREHOUSE_PERMISSIONS` | Set SQL Warehouse Permissions | Tool to set permissions for a Databricks SQL warehouse, replacing all existing permissions. Use when you need to configure access control for a SQL warehouse. This operation is authoritative and overwrites all existing permissions. Exactly one IS_OWNER must be specified. Groups cannot have IS_OWNER permission. |
| `DATABRICKS_SET_WORKSPACE_WAREHOUSE_CONFIG` | Set Workspace Warehouse Config | Tool to configure workspace-level SQL warehouse settings shared by all SQL warehouses. Use when you need to set security policies, enable serverless compute, configure channel versions, or manage warehouse type restrictions across the workspace. |
| `DATABRICKS_START_SQL_WAREHOUSE` | Start SQL Warehouse | Tool to start a stopped Databricks SQL warehouse asynchronously. Use when you need to restart a stopped warehouse. The warehouse transitions through STARTING state before reaching RUNNING. Requires CAN MONITOR permissions or higher. |
| `DATABRICKS_UPDATE_SQL_WAREHOUSE_PERMISSIONS` | Update SQL Warehouse Permissions | Tool to incrementally update permissions for a Databricks SQL warehouse. Use when you need to modify specific permissions without replacing the entire permission set. This PATCH operation updates only the specified permissions, preserving existing permissions not included in the request. For replacing all permissions, use SetPermissions instead. |
| `DATABRICKS_SUBMIT_ONE_TIME_RUN` | Submit One-Time Run | Tool to submit a one-time run without creating a job. Use when you need to execute a task directly without saving it as a job definition. After submission, use the jobs/runs/get API with the returned run_id to check the run state and monitor progress. |
| `DATABRICKS_CREATE_TAG_POLICY` | Create Tag Policy | Tool to create a new tag policy (governed tag) in Databricks with built-in rules for consistency and control. Use when you need to establish governed tags with restricted values and define who can assign them. Maximum of 1,000 governed tags per account. Each governed tag can have up to 50 allowed values. Requires appropriate account-level permissions. |
| `DATABRICKS_DELETE_TAG_POLICY` | Delete Tag Policy | Tool to delete a tag policy by its key, making the tag ungoverned. Use when you need to remove governance from a tag without deleting the tag itself. Requires MANAGE permission on the governed tag. System governed tags cannot be deleted. |
| `DATABRICKS_GET_TAG_POLICY` | Get Tag Policy | Tool to retrieve a specific tag policy by its associated governed tag's key. Use when you need to get details about tag governance policies including allowed values and metadata. |
| `DATABRICKS_UPDATE_TAG_POLICY` | Update Tag Policy | Tool to update an existing tag policy (governed tag) with specified fields. Use when you need to modify tag policy properties like description, tag key, or allowed values. Users must have MANAGE permission on the governed tag to edit it. |
| `DATABRICKS_UPDATE_GROUP` | Update Group | Tool to update a Databricks group using SCIM 2.0 PATCH operations. Use when you need to modify group properties like displayName, add/remove members, or update roles. |
| `DATABRICKS_DELETE_GROUP_COPY` | Delete Group Copy | Tool to delete a group from Databricks workspace using SCIM v2 protocol. Use when you need to remove a group resource. Users in the group are not removed when the group is deleted. |
| `DATABRICKS_UPDATE_DATABRICKS_JOB_BY_ID` | Update Databricks Job By ID | Tool to completely reset all settings for a Databricks job. Use when you need to overwrite all job configuration at once. Changes to timeout_seconds apply immediately to active runs; other changes apply to future runs only. Consider using the update endpoint for partial updates instead of reset to minimize disruption. |
| `DATABRICKS_UPDATE_USER_BY_ID_PATCH` | Update User by ID (PATCH) | Tool to update a Databricks user by applying SCIM 2.0 PATCH operations on specific user attributes. Use when you need to modify user properties like active status, displayName, entitlements, roles, or other user attributes. |
| `DATABRICKS_CREATE_VECTOR_SEARCH_ENDPOINT` | Create Vector Search Endpoint | Tool to create a new vector search endpoint to host indexes in Databricks Mosaic AI Vector Search. Use when you need to provision compute resources for hosting vector search indexes. The endpoint will be in PROVISIONING state initially and transition to ONLINE when ready. |
| `DATABRICKS_DELETE_VECTOR_SEARCH_INDEX` | Delete Vector Search Index | Tool to delete a vector search index from Databricks workspace. Use when you need to remove unused or obsolete vector search indexes. When an index is deleted, any associated writeback tables are automatically removed. This operation is irreversible. |
| `DATABRICKS_QUERY_VECTOR_SEARCH_INDEX` | Query Vector Search Index | Tool to query vector search index to find similar vectors and return associated documents. Use when performing similarity search, hybrid keyword-similarity search, or full-text search on Databricks Vector Search indexes. Supports filtering, reranking, and returns configurable columns from matched documents with similarity scores. Must provide either query_vector or query_text. |
| `DATABRICKS_UPSERT_DATA_VECTOR_INDEX` | Upsert Data Vector Index | Tool to upsert (insert or update) data into a Direct Vector Access Index. Use when you need to manually add or update vectors in a Databricks vector search index. The index must be a Direct Vector Access Index type where updates are managed via REST API or SDK calls. Data structure must match the schema defined when the index was created, including the primary key field. |
| `DATABRICKS_CREATE_WORKSPACE_GIT_CREDENTIALS` | Create Workspace Git Credentials | Tool to create Git credentials for authenticating with remote Git repositories in Databricks. Use when you need to set up Git integration for version control operations. Only one credential per user is supported - attempts to create when one already exists will fail. |
| `DATABRICKS_DELETE_WORKSPACE_GIT_CREDENTIALS` | Delete Workspace Git Credentials | Tool to delete Git credentials for remote repository authentication in Databricks. Use when you need to remove a Git credential entry from the workspace. Only one Git credential per user is supported in Databricks, making this useful for credential lifecycle management when credentials need to be revoked or replaced. |
| `DATABRICKS_GET_WORKSPACE_GIT_CREDENTIALS` | Get Workspace Git Credentials | Tool to retrieve Git credentials for authenticating with remote Git repositories in Databricks. Use when you need to get details of existing Git integration credentials by credential ID. |
| `DATABRICKS_UPDATE_WORKSPACE_GIT_CREDENTIALS` | Update Workspace Git Credentials | Tool to update existing Git credentials for authenticating with remote Git repositories in Databricks. Use when you need to modify Git provider credentials, email, username, or access tokens. Note that the git_provider field cannot be changed after initial creation. For azureDevOpsServicesAad provider, do not specify personal_access_token or git_username. |
| `DATABRICKS_LIST_WORKSPACE_DIRECTORY` | List Workspace Directory | Tool to list the contents of a directory in Databricks workspace. Use when you need to view notebooks, files, directories, libraries, or repos at a specific path. Returns object information including paths, types, and metadata. Use object_id for setting permissions via the Permissions API. |
| `DATABRICKS_CREATE_WORKSPACE_REPO` | Create Workspace Repo | Tool to create and optionally checkout a Databricks Repo linking a Git repository to the workspace. Use when you need to connect a Git repository to Databricks for collaborative development. Can optionally specify branch or tag to checkout after creation and configure sparse checkout for performance. |
| `DATABRICKS_DELETE_WORKSPACE_REPO` | Delete Workspace Repo | Tool to delete a Git repository from Databricks workspace. Use when you need to permanently remove a repository. The repository cannot be recovered after deletion completes successfully. |
| `DATABRICKS_GET_WORKSPACE_REPO_PERMISSION_LEVELS` | Get Workspace Repo Permission Levels | Tool to retrieve available permission levels for a Databricks workspace repository. Use when you need to understand what permission levels can be assigned to users, groups, or service principals for a specific Git repository. Returns permission levels like CAN_READ, CAN_RUN, CAN_EDIT, and CAN_MANAGE with their descriptions. |
| `DATABRICKS_SET_WORKSPACE_REPO_PERMISSIONS` | Set Workspace Repo Permissions | Tool to set permissions for a workspace repository, replacing all existing permissions. Use when you need to configure access control for a workspace repo. This operation replaces ALL existing permissions; admin users cannot have their permissions lowered. Repos can inherit permissions from their root object. |
| `DATABRICKS_UPDATE_WORKSPACE_REPO` | Update Workspace Repo | Tool to update a workspace repo to a different branch or tag. Use when you need to switch branches, pull latest changes, or update sparse checkout settings. When updating to a tag, the repo enters a detached HEAD state and must be switched back to a branch before committing. |
| `DATABRICKS_UPDATE_WORKSPACE_REPO_PERMISSIONS` | Update Workspace Repo Permissions | Tool to incrementally update permissions on a Databricks workspace repository. Use when you need to modify specific permissions for users, groups, or service principals without replacing the entire permission set. This PATCH operation only modifies the permissions specified while leaving other existing permissions intact. Repos can inherit permissions from their root object. |
| `DATABRICKS_CREATE_SECRET_SCOPE` | Create Secret Scope | Tool to create a new secret scope in Databricks workspace. Use when you need to establish a secure location to store credentials and sensitive information. Scope names must be unique, case-insensitive, and cannot exceed 128 characters. By default, the scope is Databricks-backed with MANAGE permission for the creator. |
| `DATABRICKS_DELETE_SECRETS_ACL` | Delete Secrets ACL | Tool to delete an access control list from a Databricks secret scope. Use when you need to revoke permissions for a principal on a secret scope. Requires MANAGE permission on the scope. Fails if the ACL does not exist. |
| `DATABRICKS_DELETE_SECRET_SCOPE` | Delete Secret Scope | Tool to delete a secret scope and all associated secrets and ACLs. Use when you need to permanently remove a secret scope. This operation cannot be undone. The API throws errors if the scope does not exist or the user lacks authorization. |
| `DATABRICKS_DELETE_WORKSPACE_SECRET` | Delete Workspace Secret | Tool to delete a secret from a Databricks secret scope. Use when you need to remove a secret stored in a scope. Requires WRITE or MANAGE permission on the scope. Not supported for Azure KeyVault-backed scopes. |
| `DATABRICKS_GET_SECRETS_ACL` | Get Secrets ACL | Tool to retrieve ACL details for a principal on a Databricks secret scope. Use when you need to check the permission level granted to a specific user, service principal, or group. Requires MANAGE permission on the scope. Each permission level is hierarchical - WRITE includes READ, and MANAGE includes both WRITE and READ. |
| `DATABRICKS_GET_SECRET_VALUE` | Get Secret Value | Tool to get a secret value from a Databricks secret scope. Use when you need to retrieve the actual value of a secret stored in a scope. Important: This API can only be called from the DBUtils interface (from within a cluster/notebook). There is no API to read the actual secret value outside of a cluster. Requires READ permission on the scope. |
| `DATABRICKS_PUT_SECRETS_ACL` | Put Secrets ACL | Tool to create or overwrite access control list for a principal on a Databricks secret scope. Use when you need to grant or modify permissions for a user, group, or service principal on a secret scope. Requires MANAGE permission on the scope. Overwrites existing permission level for the principal if one already exists. |
| `DATABRICKS_PUT_SECRET_IN_SCOPE` | Put Secret in Scope | Tool to insert or update a secret in a Databricks secret scope. Use when you need to store sensitive information like passwords, API keys, or credentials. Overwrites existing secrets with the same key. Requires WRITE or MANAGE permission on the scope. Maximum 1,000 secrets per scope with 128 KB limit per secret. |
| `DATABRICKS_DELETE_WORKSPACE_OBJECT` | Delete Workspace Object | Tool to permanently delete a workspace object or directory. Use when you need to remove notebooks, files, or directories from the workspace. This is a hard delete operation that cannot be undone. Recursive deletion of non-empty directories is not atomic and may partially complete if it fails. |
| `DATABRICKS_EXPORT_WORKSPACE_OBJECT` | Export Workspace Object | Tool to export a workspace object (notebook, dashboard, or file) as file content or base64-encoded string. Use when you need to retrieve the content of workspace objects for backup, migration, or analysis. By default, returns base64-encoded content with file type information. Set direct_download=true to get raw file content directly. |
| `DATABRICKS_GET_WORKSPACE_OBJECT_STATUS` | Get Workspace Object Status | Tool to retrieve status and metadata for any workspace object including notebooks, directories, dashboards, and files. Use when you need to get object type, path, identifier, and additional metadata fields. Returns error with code RESOURCE_DOES_NOT_EXIST if the specified path does not exist. |
| `DATABRICKS_IMPORT_WORKSPACE_OBJECT` | Import Workspace Object | Tool to import a notebook or file into the Databricks workspace. Use when you need to import base64-encoded content as notebooks, files, or directories. The content must be base64-encoded and can be imported in various formats including SOURCE, HTML, JUPYTER, DBC, and R_MARKDOWN. Maximum content size is 10 MB. |
| `DATABRICKS_CREATE_WORKSPACE_DIRECTORY` | Create Workspace Directory | Tool to create a directory and necessary parent directories in the workspace. Use when you need to create new directories. The operation is idempotent - if the directory already exists, the command succeeds without action. Returns error RESOURCE_ALREADY_EXISTS if a file (not directory) exists at any prefix of the path. |
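Two of the limits in the table above are easy to trip over: `DATABRICKS_IMPORT_WORKSPACE_OBJECT` expects base64-encoded content and caps payloads at 10 MB. The helper below is a hypothetical sketch of preparing such a payload client-side; the field names mirror the Databricks Workspace Import REST API, but check them against your actual tool schema:

```python
import base64

# 10 MB limit noted in the DATABRICKS_IMPORT_WORKSPACE_OBJECT description.
MAX_IMPORT_BYTES = 10 * 1024 * 1024


def build_import_payload(path: str, source: str, fmt: str = "SOURCE",
                         language: str = "PYTHON",
                         overwrite: bool = False) -> dict:
    """Base64-encode notebook source for a workspace import call.

    Raises ValueError if the raw content exceeds the 10 MB limit,
    so the request fails fast instead of being rejected server-side.
    """
    raw = source.encode("utf-8")
    if len(raw) > MAX_IMPORT_BYTES:
        raise ValueError(
            f"content is {len(raw)} bytes; limit is {MAX_IMPORT_BYTES}"
        )
    return {
        "path": path,
        "format": fmt,        # SOURCE, HTML, JUPYTER, DBC, or R_MARKDOWN
        "language": language,
        "overwrite": overwrite,
        "content": base64.b64encode(raw).decode("ascii"),
    }


payload = build_import_payload("/Workspace/Users/me/demo", "print('hello')")
print(payload["path"], len(payload["content"]))
```

The same pattern applies to `DATABRICKS_PUT_SECRET_IN_SCOPE`, which enforces a 128 KB limit per secret; swap the constant accordingly.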

## Supported Triggers

None listed.

## Creating an MCP Server: Stand-alone vs Composio SDK

You can connect VS Code to Databricks either through a stand-alone Databricks MCP server or through Composio's Tool Router. A stand-alone server exposes a fixed set of Databricks tools, while the Tool Router serves Databricks tools (and tools from your other connected apps) dynamically through a single MCP endpoint. Once connected, VS Code can run the Databricks actions you authorize directly from your coding workflow.
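As a rough sketch, an MCP endpoint can be registered in VS Code's MCP configuration file (`.vscode/mcp.json` for a workspace, or the user-level `mcp.json`). The server name and URL below are placeholders; substitute the endpoint Composio generates for your Databricks connection:

```json
{
  "servers": {
    "composio-databricks": {
      "type": "http",
      "url": "https://<your-composio-mcp-endpoint>"
    }
  }
}
```

After saving the file, VS Code lists the server under its MCP servers view, and the Databricks tools become available to the agent in chat.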

## Complete Code

None listed.

## Conclusion

### Way Forward
Now that Databricks is connected, extend your setup by connecting the other apps you already use every day, so your agent can run true cross-app workflows end to end.
- Connect Calendar to turn threads into scheduled meetings automatically.
- Connect Slack or Teams to post summaries, approvals, and alerts where your team works.
- Connect Notion, Linear, Jira, or Asana to convert requests into tickets, tasks, and docs.
- Connect Drive, Dropbox, or OneDrive to fetch, file, and share attachments without manual steps.
- Connect HubSpot or Salesforce to log customer context, update records, and draft follow-ups.
Start with one workflow you do repeatedly, then keep adding apps as you find new handoffs. With everything behind a single MCP endpoint, your agent can coordinate multiple tools safely and reliably in one conversation.

## How to build Databricks MCP Agent with another framework

- [ChatGPT](https://composio.dev/toolkits/databricks/framework/chatgpt)
- [OpenAI Agents SDK](https://composio.dev/toolkits/databricks/framework/open-ai-agents-sdk)
- [Claude Agent SDK](https://composio.dev/toolkits/databricks/framework/claude-agents-sdk)
- [Claude Code](https://composio.dev/toolkits/databricks/framework/claude-code)
- [Claude Cowork](https://composio.dev/toolkits/databricks/framework/claude-cowork)
- [Codex](https://composio.dev/toolkits/databricks/framework/codex)
- [Cursor](https://composio.dev/toolkits/databricks/framework/cursor)
- [OpenCode](https://composio.dev/toolkits/databricks/framework/opencode)
- [OpenClaw](https://composio.dev/toolkits/databricks/framework/openclaw)
- [Hermes](https://composio.dev/toolkits/databricks/framework/hermes-agent)
- [CLI](https://composio.dev/toolkits/databricks/framework/cli)
- [Google ADK](https://composio.dev/toolkits/databricks/framework/google-adk)
- [LangChain](https://composio.dev/toolkits/databricks/framework/langchain)
- [Vercel AI SDK](https://composio.dev/toolkits/databricks/framework/ai-sdk)
- [Mastra AI](https://composio.dev/toolkits/databricks/framework/mastra-ai)
- [LlamaIndex](https://composio.dev/toolkits/databricks/framework/llama-index)
- [CrewAI](https://composio.dev/toolkits/databricks/framework/crew-ai)

## Related Toolkits

- [Firecrawl](https://composio.dev/toolkits/firecrawl) - Firecrawl automates large-scale web crawling and data extraction. It helps organizations efficiently gather, index, and analyze content from online sources.
- [Tavily](https://composio.dev/toolkits/tavily) - Tavily offers powerful search and data retrieval from documents, databases, and the web. It helps teams locate and filter information instantly, saving hours on research.
- [Exa](https://composio.dev/toolkits/exa) - Exa is a data extraction and search platform for gathering and analyzing information from websites, APIs, or databases. It helps teams quickly surface insights and automate data-driven workflows.
- [Serpapi](https://composio.dev/toolkits/serpapi) - SerpApi is a real-time API for structured search engine results. It lets you automate SERP data collection, parsing, and analysis for SEO and research.
- [Peopledatalabs](https://composio.dev/toolkits/peopledatalabs) - Peopledatalabs delivers B2B data enrichment and identity resolution APIs. Supercharge your apps with accurate, up-to-date business and contact data.
- [Snowflake](https://composio.dev/toolkits/snowflake) - Snowflake is a cloud data warehouse built for elastic scaling, secure data sharing, and fast SQL analytics across major clouds.
- [Posthog](https://composio.dev/toolkits/posthog) - PostHog is an open-source analytics platform for tracking user interactions and product metrics. It helps teams refine features, analyze funnels, and reduce churn with actionable insights.
- [Amplitude](https://composio.dev/toolkits/amplitude) - Amplitude is a digital analytics platform for product and behavioral data insights. It helps teams analyze user journeys and make data-driven decisions quickly.
- [Bright Data MCP](https://composio.dev/toolkits/brightdata_mcp) - Bright Data MCP is an AI-powered web scraping and data collection platform. Instantly access public web data in real time with advanced scraping tools.
- [Browseai](https://composio.dev/toolkits/browseai) - Browseai is a web automation and data extraction platform that turns any website into an API. It's perfect for monitoring websites and retrieving structured data without manual scraping.
- [ClickHouse](https://composio.dev/toolkits/clickhouse) - ClickHouse is an open-source, column-oriented database for real-time analytics and big data processing using SQL. Its lightning-fast query performance makes it ideal for handling large datasets and delivering instant insights.
- [Coinmarketcal](https://composio.dev/toolkits/coinmarketcal) - CoinMarketCal is a community-powered crypto calendar for upcoming events, announcements, and releases. It helps traders track market-moving developments and stay ahead in the crypto space.
- [Control d](https://composio.dev/toolkits/control_d) - Control d is a customizable DNS filtering and traffic redirection platform. It helps you manage internet access, enforce policies, and monitor usage across devices and networks.
- [Databox](https://composio.dev/toolkits/databox) - Databox is a business analytics platform that connects your data from any tool and device. It helps you track KPIs, build dashboards, and discover actionable insights.
- [Datagma](https://composio.dev/toolkits/datagma) - Datagma delivers data intelligence and analytics for business growth and market discovery. Get actionable market insights and track competitors to inform your strategy.
- [Delighted](https://composio.dev/toolkits/delighted) - Delighted is a customer feedback platform based on the Net Promoter System®. It helps you quickly gather, track, and act on customer sentiment.
- [Dovetail](https://composio.dev/toolkits/dovetail) - Dovetail is a research analysis platform for transcript review and insight generation. It helps teams code interviews, analyze feedback, and create actionable research summaries.
- [Dub](https://composio.dev/toolkits/dub) - Dub is a short link management platform with analytics and API access. Use it to easily create, manage, and track branded short links for your business.
- [Elasticsearch](https://composio.dev/toolkits/elasticsearch) - Elasticsearch is a distributed, RESTful search and analytics engine for all types of data. It delivers fast, scalable search and powerful analytics across massive datasets.
- [Fireflies](https://composio.dev/toolkits/fireflies) - Fireflies.ai is an AI-powered meeting assistant that records, transcribes, and analyzes voice conversations. It helps teams capture call notes automatically and search or summarize meetings effortlessly.

## Frequently Asked Questions

### What are the differences between Tool Router MCP and Databricks MCP?

With a standalone Databricks MCP server, the agents and LLMs can only access a fixed set of Databricks tools tied to that server. However, with the Composio Tool Router, agents can dynamically load tools from Databricks and many other apps based on the task at hand, all through a single MCP endpoint.

### Can I use Tool Router MCP with VS Code?

Yes, you can. VS Code fully supports MCP integration. You get structured tool calling, message history handling, and model orchestration while Tool Router takes care of discovering and serving the right Databricks tools.

### Can I manage the permissions and scopes for Databricks while using Tool Router?

Yes, absolutely. You can configure which Databricks scopes and actions are allowed when connecting your account to Composio. You can also bring your own OAuth credentials or API configuration so you keep full control over what the agent can do.

### How safe is my data with Composio Tool Router?

All sensitive data such as tokens, keys, and configuration is fully encrypted at rest and in transit. Composio is SOC 2 Type 2 compliant and follows strict security practices so your Databricks data and credentials are handled as safely as possible.

---
[See all toolkits](https://composio.dev/toolkits) · [Composio docs](https://docs.composio.dev/llms.txt)
