Mistral Model - ai/mistral

Overview

The Mistral model Docker image, powered by vLLM, is optimized for demanding conversational AI applications. Built with CUDA 12.6 support, it performs well in real-time scenarios such as customer service automation and extended-context dialog systems.

Key Features and Use Cases

  • High-Performance Conversational AI: Designed for chatbots, virtual assistants, and customer support systems requiring quick responses.
  • Extended Context Processing: Supports handling longer dialogues, making it ideal for detailed conversational applications.
  • CUDA-Optimized: Ensures efficient use of GPU resources, even with complex and extended conversational inputs.

Getting Started

To use the Mistral model, run the following:

docker run -it --rm --gpus=all -p 8000:8000 --name vllm ai/mistral:7B-Instruct-v0.2-cuda-12.6 --cpu-offload-gb 5 --max-model-len 30576
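
Before sending prompts, you can confirm that the server is up and serving the expected model. A minimal check, assuming vLLM's standard OpenAI-compatible /v1/models listing endpoint is enabled (this endpoint is part of vLLM's OpenAI-compatible API rather than anything specific to this image):

curl -s http://localhost:8000/v1/models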

Once started, the server accepts OpenAI-compatible requests, for example a text completion:

curl -s http://localhost:8000/v1/completions \
-H "Content-Type: application/json" \
-d '{
  "model": "mistral",
  "prompt": "Enter your prompt here",
  "max_tokens": 10,
  "temperature": 0.5
}'
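
For multi-turn conversational use, vLLM's OpenAI-compatible server typically also exposes the chat completions route. A minimal sketch, assuming the /v1/chat/completions endpoint is enabled and the served model name is "mistral" as in the example above:

curl -s http://localhost:8000/v1/chat/completions \
-H "Content-Type: application/json" \
-d '{
  "model": "mistral",
  "messages": [
    {"role": "user", "content": "Enter your prompt here"}
  ],
  "max_tokens": 50,
  "temperature": 0.5
}'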

Image Variants and Tags

  • 7B-Instruct-v0.2, 7B-Instruct-v0.2-cuda-12.6: Designed for high-performance conversational AI tasks.
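
To pin a specific variant rather than the default tag, append one of the tags listed above to the pull command; for example, for the CUDA 12.6 build:

docker pull ai/mistral:7B-Instruct-v0.2-cuda-12.6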

License

Mistral AI offers its models under two primary licenses:

  • Apache 2.0 License: This permissive open-source license allows users to freely use, modify, and distribute the models for any purpose, including commercial applications. Models like Mistral 7B and Mistral NeMo are available under this license.

  • Mistral AI Non-Production License (MNPL): Introduced in May 2024, the MNPL permits usage of certain models for non-commercial purposes, such as research and testing. Commercial use under this license is prohibited. For commercial applications, a separate commercial license must be obtained. The Codestral model, for instance, is released under the MNPL.

For detailed information on these licenses and to determine which applies to a specific model, please refer to Mistral AI's official license documentation.

Docker Pull Command

docker pull ai/mistral