CircleCI

A specialized server implementation for the Model Context Protocol (MCP) designed to integrate with CircleCI's development workflow. This project serves as a bridge between CircleCI's infrastructure and the Model Context Protocol, enabling enhanced AI-powered development experiences.

12 Tools

Requires Secrets
Docker Desktop version 4.43 or later is required to add the server automatically.

Tools

create_prompt_template

About this tool:
- This tool is part of a toolchain that generates and provides test cases for a prompt template.
- It helps an AI assistant generate a prompt template based on one of the following:
  1. Feature requirements defined by a user, in which case the tool generates a new prompt template from those requirements.
  2. A pre-existing prompt or prompt template that a user wants to test, evaluate, or modify, in which case the tool converts it into a more structured and testable prompt template while leaving the original prompt language relatively unchanged.
- The tool returns a structured prompt template (`template`) along with a context schema (`contextSchema`) that defines the expected input parameters for the prompt template.
- In some cases, a user will want to add test coverage for ALL of the prompts in a given application. In these cases, the AI agent should use this tool to generate a prompt template for each prompt in the application, and should check the entire application for AI prompts that are not already covered by a prompt template in the `./prompts` directory.

When should this tool be triggered?
- Whenever the user provides requirements for a new AI-enabled application or a new AI-enabled feature of an existing application (i.e. one that requires a prompt request to an LLM or any AI model).
- When the user provides a pre-existing prompt or prompt template from their codebase that they want to test, evaluate, or modify.
- Even if there are pre-existing files in the `./prompts` directory following the `<relevant-name>.prompt.yml` convention (e.g. `bedtime-story-generator.prompt.yml`, `plant-care-assistant.prompt.yml`, `customer-support-chatbot.prompt.yml`). Such files should NEVER be generated directly by the AI agent; instead, the agent should use this tool to first generate a new prompt template.

Parameters:
- `params`: object
  - `prompt`: string. The feature requirements or pre-existing prompt/prompt template used to generate the prompt template. Can be a multi-line string.
  - `promptOrigin`: `"codebase" | "requirements"`. Indicates whether the prompt comes from an existing codebase or from new requirements.
  - `model`: string. The model the prompt template will be tested against. Specify the model explicitly if it can be inferred from the codebase; otherwise it defaults to `gpt-4.1-mini`.
  - `temperature`: number. The temperature of the prompt template. Specify the temperature explicitly if it can be inferred from the codebase; otherwise it defaults to 1.

Example usage (from new requirements):

    {
      "params": {
        "prompt": "Create an app that takes any topic and an age (in years), then renders a 1-minute bedtime story for a person of that age.",
        "promptOrigin": "requirements",
        "model": "gpt-4.1-mini",
        "temperature": 1.0
      }
    }

Example usage (from a pre-existing prompt/prompt template in the codebase):

    {
      "params": {
        "prompt": "The user wants a bedtime story about {{topic}} for a person of age {{age}} years old. Please craft a captivating tale that captivates their imagination and provides a delightful bedtime experience.",
        "promptOrigin": "codebase",
        "model": "claude-3-5-sonnet-latest",
        "temperature": 0.7
      }
    }

Tool output instructions:
- The tool returns a `template` that reformulates the user's prompt into a more structured format.
- It returns a `contextSchema` that defines the expected input parameters for the template.
- It returns a `promptOrigin` that indicates whether the prompt comes from an existing prompt or prompt template in the user's codebase or from new requirements.
- The tool output (the `template`, `contextSchema`, and `promptOrigin`) is also used as input to the `recommend_prompt_template_tests` tool to generate a list of recommended tests for the prompt template.
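MCP tools are invoked over JSON-RPC 2.0 using the `tools/call` method, so a request to this tool can be sketched roughly as below. This is a minimal illustration of the request shape only; a real MCP client library handles the transport and envelope, and the request `id` here is arbitrary.

```python
import json

# Build a JSON-RPC 2.0 "tools/call" request for the create_prompt_template tool.
# The "params" wrapper inside "arguments" mirrors the example usage above.
request = {
    "jsonrpc": "2.0",
    "id": 1,  # arbitrary request id
    "method": "tools/call",
    "params": {
        "name": "create_prompt_template",
        "arguments": {
            "params": {
                "prompt": "Create an app that takes any topic and an age "
                          "(in years), then renders a 1-minute bedtime story "
                          "for a person of that age.",
                "promptOrigin": "requirements",
                "model": "gpt-4.1-mini",
                "temperature": 1.0,
            }
        },
    },
}

# Serialize for sending over an MCP transport (stdio, SSE, etc.).
payload = json.dumps(request)
```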
recommend_prompt_template_tests

About this tool:
- This tool is part of a toolchain that generates and provides test cases for a prompt template.
- It generates an array of recommended tests for a given prompt template.

Parameters:
- `params`: object
  - `promptTemplate`: string. The prompt template to be tested.
  - `contextSchema`: object. The context schema that defines the expected input parameters for the prompt template.
  - `promptOrigin`: `"codebase" | "requirements"`. Indicates whether the prompt comes from an existing codebase or from new requirements.
  - `model`: string. The model the prompt template will be tested against.

Example usage:

    {
      "params": {
        "promptTemplate": "The user wants a bedtime story about {{topic}} for a person of age {{age}} years old. Please craft a captivating tale that captivates their imagination and provides a delightful bedtime experience.",
        "contextSchema": { "topic": "string", "age": "number" },
        "promptOrigin": "codebase"
      }
    }

Tool output instructions:
- The tool returns a `recommendedTests` array that can be used to test the prompt template.
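Because the output of `create_prompt_template` feeds directly into `recommend_prompt_template_tests`, the chaining step can be sketched as follows. The `tool_result` dict here is a hypothetical response used for illustration; a real one would come from an actual MCP tool call, and the `model` value is assumed.

```python
# Hypothetical output of create_prompt_template (for illustration only).
tool_result = {
    "template": "The user wants a bedtime story about {{topic}} for a person "
                "of age {{age}} years old. Please craft a captivating tale.",
    "contextSchema": {"topic": "string", "age": "number"},
    "promptOrigin": "codebase",
}

# Map the first tool's output onto the second tool's parameters.
recommend_args = {
    "params": {
        "promptTemplate": tool_result["template"],
        "contextSchema": tool_result["contextSchema"],
        "promptOrigin": tool_result["promptOrigin"],
        "model": "claude-3-5-sonnet-latest",  # assumed model, for illustration
    }
}
```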

Manual installation

You can also install the MCP server manually for the MCP client of your choice.

