Opik

Model Context Protocol (MCP) implementation for Opik enabling seamless IDE integration and unified access to prompts, projects, traces, and metrics.

About

Opik MCP Server


Characteristics

Docker Image: mcp/opik
Author: comet-ml
Repository: https://github.com/comet-ml/opik-mcp
Dockerfile: https://github.com/comet-ml/opik-mcp/blob/main/Dockerfile
Docker Image built by: Docker Inc.
Verify Signature: COSIGN_REPOSITORY=mcp/signatures cosign verify mcp/opik --key https://raw.githubusercontent.com/docker/keyring/refs/heads/main/public/mcp/latest.pub
License: Apache License 2.0

Available Tools (13)

Tools provided by this server:

add-trace-feedback: Add feedback scores to a trace for quality evaluation and monitoring.
create-project: Create a new project.
create-prompt: Create a new prompt.
get-prompt-version: Retrieve a specific version of a prompt.
get-prompts: Get a list of prompts with optional filtering.
get-trace-by-id: Get detailed information about a specific trace, including input, output, metadata, and timing information.
get-trace-stats: Get aggregated statistics for traces, including counts, costs, token usage, and performance metrics over time.
get-trace-threads: Get trace threads (conversation groupings) to view related traces that belong to the same conversation or session.
list-projects: Get a list of projects with optional filtering.
list-traces: Get a list of traces from a project.
opik-integration-docs: Provides detailed documentation on how to integrate Opik with your LLM application.
save-prompt-version: Save a new version of a prompt.
search-traces: Advanced search for traces with complex filtering and query capabilities.

Tools Details

Tool: add-trace-feedback

Add feedback scores to a trace for quality evaluation and monitoring. Useful for rating trace quality, relevance, or custom metrics.

scores (array): Array of feedback scores to add. Each score should have a name and a value between 0 and 1.
traceId (string): ID of the trace to add feedback to.
workspaceName (string, optional): Workspace name to use instead of the default.
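
For illustration, a call to add-trace-feedback might pass arguments like the following (the trace ID and score names here are hypothetical placeholders):

```json
{
  "traceId": "123e4567-e89b-12d3-a456-426614174000",
  "scores": [
    { "name": "relevance", "value": 0.9 },
    { "name": "helpfulness", "value": 0.75 }
  ]
}
```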

Tool: create-project

Create a new project

name (string): Name of the project.
description (string, optional): Description of the project.
workspaceName (string, optional): Workspace name to use instead of the default.

Tool: create-prompt

Create a new prompt

name (string): Name of the prompt.
description (string, optional): Description of the prompt.
tags (array, optional): List of tags for the prompt.

Tool: get-prompt-version

Retrieve a specific version of a prompt

name (string): Name of the prompt.
commit (string, optional): Specific commit/version to retrieve.

Tool: get-prompts

Get a list of prompts with optional filtering

name (string, optional): Filter by prompt name.
page (number, optional): Page number for pagination.
size (number, optional): Number of items per page.

Tool: get-trace-by-id

Get detailed information about a specific trace including input, output, metadata, and timing information

traceId (string): ID of the trace to fetch (UUID format, e.g. "123e4567-e89b-12d3-a456-426614174000").
workspaceName (string, optional): Workspace name to use instead of the default workspace.

Tool: get-trace-stats

Get aggregated statistics for traces including counts, costs, token usage, and performance metrics over time

endDate (string, optional): End date in ISO format (YYYY-MM-DD). Example: "2024-01-31".
projectId (string, optional): Project ID to filter traces. If not provided, the first available project is used.
projectName (string, optional): Project name to filter traces (alternative to projectId).
startDate (string, optional): Start date in ISO format (YYYY-MM-DD). Example: "2024-01-01".
workspaceName (string, optional): Workspace name to use instead of the default workspace.
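
As an illustration, a get-trace-stats call scoped to one project and one month could pass arguments like this (the project name is a hypothetical example):

```json
{
  "projectName": "My AI Assistant",
  "startDate": "2024-01-01",
  "endDate": "2024-01-31"
}
```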

Tool: get-trace-threads

Get trace threads (conversation groupings) to view related traces that belong to the same conversation or session

page (number, optional): Page number for pagination.
projectId (string, optional): Project ID to filter threads.
projectName (string, optional): Project name to filter threads.
size (number, optional): Number of threads per page.
threadId (string, optional): Specific thread ID to retrieve (useful for getting all traces in a conversation).
workspaceName (string, optional): Workspace name to use instead of the default.

Tool: list-projects

Get a list of projects with optional filtering

page (number, optional): Page number for pagination.
size (number, optional): Number of items per page.
workspaceName (string, optional): Workspace name to use instead of the default.

Tool: list-traces

Get a list of traces from a project. Use this for basic trace retrieval and overview.

page (number, optional): Page number for pagination (starts at 1).
projectId (string, optional): Project ID to filter traces. If not provided, the first available project is used.
projectName (string, optional): Project name to filter traces (alternative to projectId). Example: "My AI Assistant".
size (number, optional): Number of traces per page (1-100, default 10).
workspaceName (string, optional): Workspace name to use instead of the default workspace.

Tool: opik-integration-docs

Provides detailed documentation on how to integrate Opik with your LLM application

Tool: save-prompt-version

Save a new version of a prompt

name (string): Name of the prompt.
template (string): Template content for the prompt version.
change_description (string, optional): Description of changes in this version.
metadata (object, optional): Additional metadata for the prompt version.
type (string, optional): Template type.
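
A save-prompt-version call might look like the sketch below. The prompt name, template text, and placeholder syntax are illustrative assumptions; the placeholder style that is actually interpolated depends on the template type you use.

```json
{
  "name": "support-agent-system-prompt",
  "template": "You are a helpful support agent. Answer the user's question: {{question}}",
  "change_description": "Clarified the agent's role"
}
```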

Tool: search-traces

Advanced search for traces with complex filtering and query capabilities

filters (object, optional): Advanced filters as key-value pairs. Examples: {"status": "error"}, {"model": "gpt-4"}, {"duration_ms": {"$gt": 1000}}.
page (number, optional): Page number for pagination.
projectId (string, optional): Project ID to search within.
projectName (string, optional): Project name to search within.
query (string, optional): Text query to search in trace names, inputs, outputs, and metadata. Example: "error" or "user_query:hello".
size (number, optional): Number of traces per page (max 100).
sortBy (string, optional): Field to sort by. Options: "created_at", "duration", "name", "status".
sortOrder (string, optional): Sort order: ascending or descending.
workspaceName (string, optional): Workspace name to use instead of the default.
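
Putting the parameters together, a search for recent slow traces matching "error" could pass arguments like this sketch (the project name is a hypothetical example; the filter and sort values come from the examples above):

```json
{
  "projectName": "My AI Assistant",
  "query": "error",
  "filters": { "duration_ms": { "$gt": 1000 } },
  "sortBy": "created_at",
  "sortOrder": "descending",
  "size": 20
}
```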

Use this MCP Server

Add the following to your MCP client configuration, replacing your_api_key with your Opik API key:

{
  "mcpServers": {
    "opik": {
      "command": "docker",
      "args": [
        "run",
        "-i",
        "--rm",
        "-e",
        "OPIK_API_BASE_URL",
        "-e",
        "OPIK_WORKSPACE_NAME",
        "-e",
        "OPIK_API_KEY",
        "mcp/opik"
      ],
      "env": {
        "OPIK_API_BASE_URL": "https://www.comet.com/opik/api",
        "OPIK_WORKSPACE_NAME": "default",
        "OPIK_API_KEY": "your_api_key"
      }
    }
  }
}
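
For reference, the configuration above amounts to the following container invocation, which your MCP client runs for you (shown here only to make the environment variables explicit; do not commit a real API key to a script):

```shell
export OPIK_API_BASE_URL="https://www.comet.com/opik/api"
export OPIK_WORKSPACE_NAME="default"
export OPIK_API_KEY="your_api_key"

docker run -i --rm \
  -e OPIK_API_BASE_URL \
  -e OPIK_WORKSPACE_NAME \
  -e OPIK_API_KEY \
  mcp/opik
```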
