Ollama CLI for AI Agents


Introduction

CLIs are eating MCPs, and the industry is converging on the same idea. MCPs, for all their merit, can be token-hungry, slow, and unreliable for complex tool chaining. Coding agents, meanwhile, have become incredibly good at working with CLIs; in fact, they are far more comfortable with CLI tools than with MCP.

With Composio's Universal CLI, your coding agents can talk to more than 850 SaaS applications. With Ollama, agents can run a custom prompt with llama2, get model info for all installed models, generate a text completion using mistral, and more, all without worrying about authentication.

This guide walks you through the Composio Universal CLI and explains how to connect it with coding agents such as Claude Code, Codex, and OpenCode for end-to-end Ollama automation.


What is Universal CLI and why use it?

The idea behind building the universal CLI is to give agents a single command interface to interact with all your external applications. Here's what you'll get with it:

  • Agent-friendly: Coding agents like Claude Code, Codex, and OpenCode can use CLI tools natively — no MCP setup required.
  • Authentication handled: Connect once via OAuth or API Key, and all CLI commands work with your credentials automatically.
  • Tool discovery: Search, inspect, and execute 20,000+ tools across 850+ apps from one interface.
  • Trigger support: Use triggers to listen for events across your apps, powered by real-time webhooks or polling under the hood.
  • Type generation: Generate typed schemas for autocomplete and type safety in your projects.

Prerequisites

Install the Composio CLI, authenticate, and initialize your project:

bash
# Install the Composio CLI
curl -fsSL https://composio.dev/install | bash

# Authenticate with Composio
composio login

During login you'll be redirected to a sign-in page; complete the flow and you're all set.

Composio CLI authentication flow

Connecting Ollama to Coding Agents via Universal CLI

Once the CLI is installed, you're essentially done: Claude Code, Codex, OpenCode, OpenClaw, or any other agent can access it. It takes only a few steps to give agents access to your apps:

  1. Launch your Coding Agent — Claude Code, Codex, OpenCode, anything you prefer.
  2. Prompt it to "Authenticate with Ollama".
  3. Complete the authentication and authorization flow and your Ollama integration is all set.
  4. Start asking anything you want.

Supported Tools & Triggers

Tools
  • Chat with Ollama model: send a chat message with conversation history to Ollama.
  • Generate Text with Ollama: generate text responses from Ollama models, with optional raw mode.
  • List Models: list all available Ollama models and their details.
  • OpenAI-Compatible Chat Completion: create OpenAI-compatible chat completions using Ollama models.
  • OpenAI-Compatible Text Completion: create OpenAI-compatible text completions using Ollama models.
  • List Models (OpenAI Compatible): list available models using the OpenAI-compatible API format.
  • Show Model Information: show comprehensive information about an Ollama model.
  • Get Ollama Version: get the version of Ollama running locally.

Universal CLI Commands for Ollama

You can also execute CLI commands manually to interact with Ollama.

Connect your Ollama account

Link your Ollama account and verify the connection:

bash
# Connect your Ollama account (opens the authentication flow)
composio connected-accounts link ollama

# Verify the connection
composio connected-accounts list --toolkits ollama

Discover Ollama tools

Search and inspect available Ollama tools:

bash
# List all available Ollama tools
composio tools list --toolkit ollama

# Search for Ollama tools by action
composio tools search "ollama"

# Inspect a tool's input schema
composio tools info OLLAMA_CHAT

Common Ollama Actions

Chat with Ollama model: send a chat message with conversation history to Ollama.

bash
composio tools execute OLLAMA_CHAT \
  --model "gemma3:4b" \
  --messages "<array>"
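
The `--messages "<array>"` placeholder expects a JSON array of chat messages. Here is a minimal sketch of building that payload in the shell, assuming the `-d` JSON flag mentioned in the Tips section accepts the whole input object; the field names mirror the flags above, so confirm the exact schema with `composio tools info OLLAMA_CHAT` first:

```shell
# Build the chat input as JSON. "model" and "messages" mirror the flags
# shown above; verify the exact schema before relying on it.
MODEL="gemma3:4b"
PAYLOAD="{\"model\": \"$MODEL\", \"messages\": [{\"role\": \"user\", \"content\": \"Summarize this repo in two sentences.\"}]}"
echo "$PAYLOAD"

# Then pass the whole object with the -d flag from the Tips section:
# composio tools execute OLLAMA_CHAT -d "$PAYLOAD"
```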

Generate Text with Ollama: generate text responses from Ollama models, with optional raw mode.

bash
composio tools execute OLLAMA_GENERATE \
  --model "gemma3:4b"
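
The tool description mentions an optional raw mode. If the input mirrors Ollama's own generate format, a full payload might look like the sketch below; the "prompt" and "raw" field names are assumptions taken from Ollama's generate API, so check them against `composio tools info OLLAMA_GENERATE`:

```shell
# "prompt" and "raw" mirror Ollama's generate API, where raw=true skips
# the model's prompt template. These field names are assumptions here.
PAYLOAD='{"model": "gemma3:4b", "prompt": "List three uses of a CLI.", "raw": false}'
echo "$PAYLOAD"

# composio tools execute OLLAMA_GENERATE -d "$PAYLOAD"
```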

List Models: list all available Ollama models and their details.

bash
composio tools execute OLLAMA_LIST_MODELS
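
The guide doesn't pin down the output shape of OLLAMA_LIST_MODELS. If it resembles Ollama's own model list, you could pull out just the model names with the jq pattern from the Tips section; the sample below is hypothetical, so pipe the real CLI output through the same filter:

```shell
# Hypothetical sample shaped like Ollama's model list; replace the echo
# with real `composio tools execute OLLAMA_LIST_MODELS` output.
SAMPLE='{"models": [{"name": "gemma3:4b"}, {"name": "mistral:latest"}]}'
echo "$SAMPLE" | jq -r '.models[].name'
```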

OpenAI-Compatible Chat Completion: create OpenAI-compatible chat completions using Ollama models.

bash
composio tools execute OLLAMA_OPEN_AI_CHAT_COMPLETIONS \
  --model "gemma3:4b" \
  --messages "<array>"

Generate Type Definitions

Generate typed schemas for Ollama tools to get autocomplete and type safety in your project:

bash
# Auto-detect language
composio generate --toolkits ollama

# TypeScript
composio ts generate --toolkits ollama

# Python
composio py generate --toolkits ollama

Tips & Tricks

  • Always inspect a tool's input schema before executing: composio tools info <TOOL_NAME>
  • Pipe output with jq for better readability: composio tools execute TOOL_NAME -d '{}' | jq
  • Set COMPOSIO_API_KEY as an environment variable for CI/CD pipelines
  • Use composio dev logs tools to inspect execution logs and debug issues
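
For the CI tip above, a minimal sketch; the variable name comes from the tip itself, and the value here is a placeholder:

```shell
# Set the API key non-interactively for CI instead of `composio login`.
# The value below is a placeholder; use your real key as a CI secret.
export COMPOSIO_API_KEY="your-api-key-here"
echo "COMPOSIO_API_KEY ${COMPOSIO_API_KEY:+is set}"
```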

Next Steps

  • Try asking your coding agent to perform various Ollama operations
  • Explore cross-app workflows by connecting more toolkits
  • Set up triggers for real-time automation
  • Use composio generate for typed schemas in your projects

How to build an Ollama MCP Agent with another framework

FAQ

What is the Composio Universal CLI?

The Composio Universal CLI is a single command-line interface that lets coding agents and developers interact with 850+ SaaS applications. It handles authentication, tool discovery, action execution, and trigger setup — all from the terminal, without needing to configure MCP servers.

Which coding agents work with the Composio CLI?

Any coding agent that can run shell commands works with the Composio CLI — including Claude Code, Codex, OpenCode, OpenClaw, and others. Once the CLI is installed, agents automatically discover and use the composio commands to interact with Ollama and other connected apps.

How is the CLI different from using an MCP server for Ollama?

MCP servers require configuration and can be token-heavy for complex workflows. The CLI gives agents a direct, lightweight interface — no server setup needed. Agents simply call composio commands like any other shell tool. It's faster to set up, more reliable for multi-step tool chaining, and works natively with how coding agents already operate.

How safe is my Ollama data when using the Composio CLI?

All sensitive data such as tokens, keys, and configuration is fully encrypted at rest and in transit. Composio is SOC 2 Type 2 compliant and follows strict security practices so your Ollama data and credentials are handled as safely as possible. You can also bring your own OAuth credentials for full control.

Used by agents from

Context
Letta
glean
HubSpot
Agent.ai
Altera
DataStax
Entelligence
Rolai

Never worry about agent reliability

We handle tool reliability, observability, and security so you never have to second-guess an agent action.