How to integrate Gitlab MCP with LlamaIndex


Introduction

This guide walks you through connecting Gitlab to LlamaIndex using the Composio Tool Router. By the end, you'll have a working Gitlab agent that can create a new Gitlab group for a QA team, open a bug issue in a frontend project, create a branch from the latest main commit, or archive a completed API migration project, all through natural language commands.

This guide will help you understand how to give your LlamaIndex agent real control over a Gitlab account through Composio's Gitlab MCP server.

Before we dive in, let's take a quick look at the key ideas and tools involved.

TL;DR

Here's what you'll learn:
  • Set your OpenAI and Composio API keys
  • Install LlamaIndex and Composio packages
  • Create a Composio Tool Router session for Gitlab
  • Connect LlamaIndex to the Gitlab MCP server
  • Build a Gitlab-powered agent using LlamaIndex
  • Interact with Gitlab through natural language

What is LlamaIndex?

LlamaIndex is a data framework for building LLM applications. It provides tools for connecting LLMs to external data sources and services through agents and tools.

Key features include:

  • ReAct Agent: Reasoning and acting pattern for tool-using agents
  • MCP Tools: Native support for Model Context Protocol
  • Context Management: Maintain conversation context across interactions
  • Async Support: Built for async/await patterns

What is the Gitlab MCP server, and what's possible with it?

The Gitlab MCP server is an implementation of the Model Context Protocol that connects your AI agents and assistants (such as Claude or Cursor) directly to your Gitlab account. It provides structured, secure access to your repositories, projects, and issues, so your agent can perform actions like creating projects, managing issues, handling branches, and automating DevOps workflows on your behalf.

  • Project and group automation: Instantly create new Gitlab projects or organize your workspaces by setting up project groups—all without manual clicks.
  • Issue creation and tracking: Have your agent report bugs, request features, or open new issues in specific projects to keep your team on top of tasks.
  • Branch management: Let your agent create repository branches from any commit or base branch, making it easy to streamline your development process.
  • Project lifecycle management: Archive completed projects or delete unneeded ones, keeping your workspace clean and up to date with minimal effort.
  • Commit and job insights: Retrieve commit references, determine commit sequence in project history, or erase job artifacts and logs for deeper CI/CD control.

Supported Tools & Triggers

Tools
  • Archive Project: Tool to archive a project.
  • Create GitLab Group: Tool to create a new group in Gitlab.
  • Create Project: Tool to create a new project in Gitlab.
  • Create Project Issue: Tool to create a new issue in a Gitlab project.
  • Create Repository Branch: Tool to create a new branch in a project.
  • Delete Project: Tool to delete a Gitlab project by its ID.
  • Download Project Avatar: Tool to download a project's avatar image.
  • Erase Job: Tool to erase the content of a specified job within a project.
  • Get Commit References: Tool to get all references (branches or tags) a commit is pushed to.
  • Get Commit Sequence: Tool to get the sequence number of a commit in a project by following parent links from the given commit.
  • Get Group Details: Tool to retrieve information about a specific group by its ID.
  • Get Group Member: Tool to retrieve details for a specific group member.
  • Get Groups: Tool to get groups.
  • Get Job Details: Tool to retrieve details of a single job by its ID within a specified project.
  • Get Merge Request Notes: Tool to fetch comments on a merge request.
  • Get Project: Tool to get a single project by ID or URL-encoded path.
  • Get Project Languages: Tool to list programming languages used in a project with percentages.
  • Get Project Member: Tool to retrieve details for a specific project member.
  • Get Project Member All: Tool to retrieve details for a specific project member (including inherited and invited members).
  • Get Merge Request Commits: Tool to get commits of a merge request.
  • Get Project Merge Requests: Tool to retrieve a list of merge requests for a specific project.
  • Get Projects: Tool to list all projects accessible to the authenticated user.
  • List Merge Request Diffs: Tool to list all diff versions of a merge request.
  • Get Repository Branch: Tool to retrieve information about a specific branch in a project.
  • Get Repository Branches: Tool to retrieve a list of repository branches for a project.
  • Get Single Commit: Tool to get a specific commit identified by the commit hash or the name of a branch or tag.
  • Get Single Pipeline: Tool to retrieve details of a single pipeline by its ID within a specified project.
  • Get User: Tool to retrieve information about a specific user by their ID.
  • Get User Preferences: Tool to get the current user's preferences.
  • Get Users: Tool to retrieve a list of users from Gitlab.
  • Get User Status: Tool to get a user's status by ID.
  • Get User Status: Tool to get the current user's status.
  • Get User Support PIN: Tool to get details of the current user's support PIN.
  • Import project members: Tool to import members from one project to another.
  • List All Group Members: Tool to list all members of a group, including direct, inherited, and invited members.
  • List All Project Members: Tool to list all members of a project (direct, inherited, invited).
  • List Billable Group Members: Tool to list billable members of a top-level group (including its subgroups and projects).
  • List Group Members: Tool to list direct members of a group.
  • List Pending Group Members: Tool to list pending members of a group and its subgroups and projects.
  • List Pipeline Jobs: Tool to retrieve a list of jobs for a specified pipeline within a project.
  • List Project Groups: Tool to list ancestor groups of a project.
  • List Project Invited Groups: Tool to list groups invited to a project.
  • List Project Pipelines: Tool to retrieve a list of pipelines for a specified project.
  • List Project Shareable Groups: Tool to list groups that can be shared with a project.
  • List Project Repository Tags: Tool to retrieve a list of repository tags for a specified project.
  • List Project Transfer Locations: Tool to list namespaces available for project transfer.
  • List project users: Tool to list users of a project.
  • List Repository Commits: Tool to get a list of repository commits in a project.
  • List User Projects: Tool to list projects owned by a specific user.
  • Create Support PIN: Tool to create a support PIN for your authenticated user.
  • Update User Preferences: Tool to update the current user's preferences.
  • Set User Status: Tool to set the current user's status.
  • Share Project With Group: Tool to share a project with a group.
  • Start Housekeeping Task: Tool to start the housekeeping task for a project.

What is the Composio tool router, and how does it fit here?

What is Tool Router?

Composio's Tool Router helps agents find the right tools for a task at runtime. You can plug in multiple toolkits (like Gmail, HubSpot, and GitHub), and the agent will identify the relevant app and action to complete multi-step workflows. This can reduce token usage and improve the reliability of tool calls. Read more here: Getting started with Tool Router

The tool router generates a secure MCP URL that your agents can access to perform actions.
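
Concretely, a Tool Router session scoped to Gitlab returns that URL. Here is a minimal sketch, assuming composio_client is a Composio client configured with your API key (the full setup is covered in the step-by-step guide below):

# Sketch: create a Tool Router session for one user and read its MCP URL
session = composio_client.create(
    user_id="your-user-id",
    toolkits=["gitlab"],
)
print(session.mcp.url)  # the secure MCP endpoint your agent connects to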

How the Tool Router works

The Tool Router follows a three-phase workflow:

  1. Discovery: Searches for tools matching your task and returns relevant toolkits with their details.
  2. Authentication: Checks for active connections. If missing, creates an auth config and returns a connection URL via Auth Link.
  3. Execution: Executes the action using the authenticated connection.

Step-by-step Guide

Prerequisites

Before you begin, make sure you have:
  • Python 3.8 or higher installed
  • A Composio account with an API key
  • An OpenAI API key
  • A Gitlab account and project
  • Basic familiarity with async Python

Getting API Keys for OpenAI, Composio, and Gitlab

OpenAI API key (OPENAI_API_KEY)
  • Go to the OpenAI dashboard
  • Create an API key if you don't have one
  • Assign it to OPENAI_API_KEY in .env
Composio API key and user ID
  • Log into the Composio dashboard
  • Copy your API key from Settings
    • Use this as COMPOSIO_API_KEY
  • Pick a stable user identifier (email or ID)
    • Use this as COMPOSIO_USER_ID

Installing dependencies

Create a new Python project and install the necessary dependencies:

pip install composio-llamaindex llama-index llama-index-llms-openai llama-index-tools-mcp python-dotenv

  • composio-llamaindex: Composio's LlamaIndex integration
  • llama-index: Core LlamaIndex framework
  • llama-index-llms-openai: OpenAI LLM integration
  • llama-index-tools-mcp: MCP client for LlamaIndex
  • python-dotenv: Environment variable management

Set environment variables

Create a .env file in your project root:

OPENAI_API_KEY=your-openai-api-key
COMPOSIO_API_KEY=your-composio-api-key
COMPOSIO_USER_ID=your-user-id

These credentials will be used to:

  • Authenticate with OpenAI's GPT-5 model
  • Connect to Composio's Tool Router
  • Identify your Composio user session for Gitlab access

Import modules

Create a new file called gitlab_llamaindex_agent.py and import the required modules:

import asyncio
import os
import signal
import dotenv

from composio import Composio
from composio_llamaindex import LlamaIndexProvider
from llama_index.core.agent.workflow import ReActAgent
from llama_index.core.workflow import Context
from llama_index.llms.openai import OpenAI
from llama_index.tools.mcp import BasicMCPClient, McpToolSpec

dotenv.load_dotenv()

Key imports:

  • asyncio: For async/await support
  • signal: Used to handle Ctrl+C so the program exits cleanly
  • Composio: Main client for Composio services
  • LlamaIndexProvider: Adapts Composio tools for LlamaIndex
  • ReActAgent: LlamaIndex's reasoning and action agent
  • BasicMCPClient: Connects to MCP endpoints
  • McpToolSpec: Converts MCP tools to LlamaIndex format

Load and validate environment variables

OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")
COMPOSIO_API_KEY = os.getenv("COMPOSIO_API_KEY")
COMPOSIO_USER_ID = os.getenv("COMPOSIO_USER_ID")

if not OPENAI_API_KEY:
    raise ValueError("OPENAI_API_KEY is not set in the environment")
if not COMPOSIO_API_KEY:
    raise ValueError("COMPOSIO_API_KEY is not set in the environment")
if not COMPOSIO_USER_ID:
    raise ValueError("COMPOSIO_USER_ID is not set in the environment")

What's happening:

This ensures that missing credentials cause early, clear errors before the agent attempts to initialize.

Create a Tool Router session and build the agent function

async def build_agent() -> ReActAgent:
    composio_client = Composio(
        api_key=COMPOSIO_API_KEY,
        provider=LlamaIndexProvider(),
    )

    session = composio_client.create(
        user_id=COMPOSIO_USER_ID,
        toolkits=["gitlab"],
    )

    mcp_url = session.mcp.url
    print(f"Composio MCP URL: {mcp_url}")

    mcp_client = BasicMCPClient(mcp_url, headers={"x-api-key": COMPOSIO_API_KEY})
    mcp_tool_spec = McpToolSpec(client=mcp_client)
    tools = await mcp_tool_spec.to_tool_list_async()

    llm = OpenAI(model="gpt-5")

    description = "An agent that uses Composio Tool Router MCP tools to perform Gitlab actions."
    system_prompt = """
    You are a helpful assistant connected to Composio Tool Router.
    Use the available tools to answer user queries and perform Gitlab actions.
    """
    return ReActAgent(tools=tools, llm=llm, description=description, system_prompt=system_prompt, verbose=True)

What's happening here:

  • We create a Composio client using your API key and configure it with the LlamaIndex provider
  • We then create a Tool Router MCP session for your user, specifying the toolkits we want to use (in this case, gitlab)
  • The session returns an MCP HTTP endpoint URL that acts as a gateway to all your configured tools
  • LlamaIndex connects to this endpoint to dynamically discover the available Gitlab tools (you can print them to verify, as shown below)
  • The MCP tools are converted to LlamaIndex-compatible tools and plugged into the agent
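
If you want to confirm which Gitlab tools the session exposes, you can print them inside build_agent right after they are loaded. This is an optional sketch; it assumes the LlamaIndex tool objects expose the usual metadata attributes (name and description):

    # Optional: list the Gitlab tools discovered through the Tool Router session
    for tool in tools:
        print(f"- {tool.metadata.name}: {tool.metadata.description}")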

Create an interactive chat loop

async def chat_loop(agent: ReActAgent) -> None:
    ctx = Context(agent)
    print("Type 'quit', 'exit', or Ctrl+C to stop.")

    while True:
        try:
            user_input = input("\nYou: ").strip()
        except (KeyboardInterrupt, EOFError):
            print("\nBye!")
            break

        if not user_input or user_input.lower() in {"quit", "exit"}:
            print("Bye!")
            break

        try:
            print("Agent: ", end="", flush=True)
            handler = agent.run(user_input, ctx=ctx)

            async for event in handler.stream_events():
                # Stream token-by-token from LLM responses
                if hasattr(event, "delta") and event.delta:
                    print(event.delta, end="", flush=True)
                # Show tool calls as they happen
                elif hasattr(event, "tool_name"):
                    print(f"\n[Using tool: {event.tool_name}]", flush=True)

            # Get final response
            response = await handler
            print()  # Newline after streaming
        except KeyboardInterrupt:
            print("\n[Interrupted]")
            continue
        except Exception as e:
            print(f"\nError: {e}")

What's happening here:

  • We're creating a direct terminal interface to chat with your Gitlab account (a non-interactive alternative is sketched below)
  • The LLM's responses are streamed to the CLI for faster feedback
  • The agent uses context to maintain conversation history
  • You can type 'quit' or 'exit' to stop the chat loop gracefully
  • Agent responses and any errors are displayed in a clear, readable format
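
If you prefer a one-off query over an interactive session, the same agent can be awaited directly. Here is a minimal sketch built on the agent.run call already used in the loop:

async def ask_once(agent: ReActAgent, question: str) -> str:
    # Run a single query against the agent and return its final response as text
    ctx = Context(agent)
    response = await agent.run(question, ctx=ctx)
    return str(response)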

Define the main entry point

async def main() -> None:
    agent = await build_agent()
    await chat_loop(agent)

if __name__ == "__main__":
    # Handle Ctrl+C gracefully
    signal.signal(signal.SIGINT, lambda s, f: (print("\nBye!"), exit(0)))
    try:
        asyncio.run(main())
    except KeyboardInterrupt:
        print("\nBye!")

What's happening here:

  • We're orchestrating the entire application flow
  • The agent gets built with proper error handling
  • Then we kick off the interactive chat loop so you can start talking to Gitlab

Run the agent

python gitlab_llamaindex_agent.py

When prompted, authenticate and authorize your agent with Gitlab, then start asking questions (a few example prompts are listed below).
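
Example prompts to try once the connection is authorized (group, project, and branch names are placeholders):

  • "Create a new Gitlab group for the QA team"
  • "Open a bug issue in the frontend project"
  • "Create a branch from the latest commit on main"
  • "Archive the completed API migration project"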

Complete Code

Here's the complete code to get you started with Gitlab and LlamaIndex:

import asyncio
import os
import signal
import dotenv

from composio import Composio
from composio_llamaindex import LlamaIndexProvider
from llama_index.core.agent.workflow import ReActAgent
from llama_index.core.workflow import Context
from llama_index.llms.openai import OpenAI
from llama_index.tools.mcp import BasicMCPClient, McpToolSpec

dotenv.load_dotenv()

OPENAI_API_KEY = os.getenv("OPENAI_API_KEY")
COMPOSIO_API_KEY = os.getenv("COMPOSIO_API_KEY")
COMPOSIO_USER_ID = os.getenv("COMPOSIO_USER_ID")

if not OPENAI_API_KEY:
    raise ValueError("OPENAI_API_KEY is not set")
if not COMPOSIO_API_KEY:
    raise ValueError("COMPOSIO_API_KEY is not set")
if not COMPOSIO_USER_ID:
    raise ValueError("COMPOSIO_USER_ID is not set")

async def build_agent() -> ReActAgent:
    composio_client = Composio(
        api_key=COMPOSIO_API_KEY,
        provider=LlamaIndexProvider(),
    )

    session = composio_client.create(
        user_id=COMPOSIO_USER_ID,
        toolkits=["gitlab"],
    )

    mcp_url = session.mcp.url
    print(f"Composio MCP URL: {mcp_url}")

    mcp_client = BasicMCPClient(mcp_url, headers={"x-api-key": COMPOSIO_API_KEY})
    mcp_tool_spec = McpToolSpec(client=mcp_client)
    tools = await mcp_tool_spec.to_tool_list_async()

    llm = OpenAI(model="gpt-5")
    description = "An agent that uses Composio Tool Router MCP tools to perform Gitlab actions."
    system_prompt = """
    You are a helpful assistant connected to Composio Tool Router.
    Use the available tools to answer user queries and perform Gitlab actions.
    """
    return ReActAgent(
        tools=tools,
        llm=llm,
        description=description,
        system_prompt=system_prompt,
        verbose=True,
    )

async def chat_loop(agent: ReActAgent) -> None:
    ctx = Context(agent)
    print("Type 'quit', 'exit', or Ctrl+C to stop.")

    while True:
        try:
            user_input = input("\nYou: ").strip()
        except (KeyboardInterrupt, EOFError):
            print("\nBye!")
            break

        if not user_input or user_input.lower() in {"quit", "exit"}:
            print("Bye!")
            break

        try:
            print("Agent: ", end="", flush=True)
            handler = agent.run(user_input, ctx=ctx)

            async for event in handler.stream_events():
                # Stream token-by-token from LLM responses
                if hasattr(event, "delta") and event.delta:
                    print(event.delta, end="", flush=True)
                # Show tool calls as they happen
                elif hasattr(event, "tool_name"):
                    print(f"\n[Using tool: {event.tool_name}]", flush=True)

            # Get final response
            response = await handler
            print()  # Newline after streaming
        except KeyboardInterrupt:
            print("\n[Interrupted]")
            continue
        except Exception as e:
            print(f"\nError: {e}")

async def main() -> None:
    agent = await build_agent()
    await chat_loop(agent)

if __name__ == "__main__":
    # Handle Ctrl+C gracefully
    signal.signal(signal.SIGINT, lambda s, f: (print("\nBye!"), exit(0)))
    try:
        asyncio.run(main())
    except KeyboardInterrupt:
        print("\nBye!")

Conclusion

You've successfully connected Gitlab to LlamaIndex through Composio's Tool Router MCP layer. Key takeaways:
  • Tool Router dynamically exposes Gitlab tools through an MCP endpoint
  • LlamaIndex's ReActAgent handles reasoning and orchestration; Composio handles integrations
  • The agent becomes more capable without increasing prompt size
  • Async Python provides clean, efficient execution of agent workflows
You can easily extend this to other toolkits like Gmail, Notion, Stripe, GitHub, and more by adding them to the toolkits parameter.
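
For example, exposing additional toolkits through the same MCP endpoint is just a change to that parameter. A sketch reusing the session call from build_agent (the extra toolkit slugs are illustrative):

session = composio_client.create(
    user_id=COMPOSIO_USER_ID,
    toolkits=["gitlab", "github", "notion"],  # add whichever toolkits your agent needs
)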

FAQ

What is the difference between Tool Router MCP and Gitlab MCP?

With a standalone Gitlab MCP server, agents and LLMs can only access a fixed set of Gitlab tools tied to that server. With the Composio Tool Router, agents can dynamically load tools from Gitlab and many other apps based on the task at hand, all through a single MCP endpoint.

Can I use Tool Router MCP with LlamaIndex?

Yes, you can. LlamaIndex fully supports MCP integration. You get structured tool calling, message history handling, and model orchestration while Tool Router takes care of discovering and serving the right Gitlab tools.

Can I manage the permissions and scopes for Gitlab while using Tool Router?

Yes, absolutely. You can configure which Gitlab scopes and actions are allowed when connecting your account to Composio. You can also bring your own OAuth credentials or API configuration so you keep full control over what the agent can do.

How safe is my data with Composio Tool Router?

All sensitive data such as tokens, keys, and configuration is fully encrypted at rest and in transit. Composio is SOC 2 Type 2 compliant and follows strict security practices so your Gitlab data and credentials are handled as safely as possible.
