openvino/model_server

By openvino

Updated 6 days ago

Intel® Distribution of OpenVINO™ Model Server Docker images

Supported tags

Tag        | Supported devices  | Base OS
-----------|--------------------|------------
2025.0-gpu | CPU, iGPU and dGPU | ubuntu24.04
2025.0     | CPU                | ubuntu24.04
2024.5-gpu | CPU, iGPU and dGPU | ubuntu22.04
2024.5     | CPU                | ubuntu22.04
2024.4-gpu | CPU, iGPU and dGPU | ubuntu22.04
2024.4     | CPU                | ubuntu22.04
2024.3-gpu | CPU, iGPU and dGPU | ubuntu22.04
2024.3     | CPU                | ubuntu22.04
2024.2-gpu | CPU, iGPU and dGPU | ubuntu22.04
2024.2     | CPU                | ubuntu22.04
2024.1-gpu | CPU, iGPU and dGPU | ubuntu22.04
2024.1     | CPU                | ubuntu22.04
2024.0-gpu | CPU, iGPU and dGPU | ubuntu22.04
2024.0     | CPU                | ubuntu22.04
2023.3-gpu | CPU, iGPU and dGPU | ubuntu22.04
2023.3     | CPU                | ubuntu22.04
2023.2-gpu | CPU, iGPU and dGPU | ubuntu22.04
2023.2     | CPU                | ubuntu20.04
2023.1-gpu | CPU, iGPU and dGPU | ubuntu22.04
2023.1     | CPU                | ubuntu20.04
2023.0-gpu | CPU, iGPU and dGPU | ubuntu22.04
2023.0     | CPU                | ubuntu20.04

Intel® Distribution of OpenVINO™ Model Server Docker image

OpenVINO™ Model Server is a scalable, high-performance solution for serving machine learning models optimized for Intel® architectures. The server provides an inference service via gRPC, REST API or C API -- making it easy to deploy new algorithms and AI experiments.
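As a minimal sketch of the REST interface, the server accepts TensorFlow Serving-style predict requests on its --rest_port. The model name ("my_model"), port (8001), and input values below are placeholders taken from the run command later in this page, not a real deployment:

```shell
# Build a TFS-style "row format" request body: one instance per row.
cat > /tmp/request.json <<'EOF'
{"instances": [[0.1, 0.2, 0.3]]}
EOF

# With a server running (see the docker run command below), the request
# would be sent like this:
#   curl -d @/tmp/request.json \
#        http://localhost:8001/v1/models/my_model:predict
cat /tmp/request.json
```

The response carries a matching "predictions" array; the exact tensor shapes depend on the model being served.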

The Intel® Distribution of OpenVINO™ toolkit quickly deploys applications and solutions that emulate human vision. Based on Convolutional Neural Networks (CNN), the toolkit extends computer vision (CV) workloads across Intel® hardware, maximizing performance.

To run the image, use the following command:

docker run -it --rm openvino/model_server:latest --help

To start the Docker container with the OpenVINO™ Model Server serving a single model, no extra configuration file is needed; the whole process takes one command:

docker run --rm -d  -v /models/:/opt/ml:ro -p 9001:9001 -p 8001:8001 openvino/model_server:latest --model_path /opt/ml/model1 --model_name my_model --port 9001 --rest_port 8001
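The directory passed to --model_path is expected to follow the model server's versioned repository layout: each numbered subdirectory holds one version of the model (for OpenVINO IR, a .xml/.bin pair). A sketch of that layout, using /tmp/models and placeholder file names rather than a real model:

```shell
# Create a placeholder repository for one model with a single version "1".
# The .xml/.bin files here are empty stand-ins for real IR files.
mkdir -p /tmp/models/model1/1
touch /tmp/models/model1/1/model.xml /tmp/models/model1/1/model.bin

# Resulting tree, mountable as -v /tmp/models/:/opt/ml:ro
ls -R /tmp/models
```

New versions are added as further numbered subdirectories (2, 3, ...), which lets the server pick up updated models without changing the mount.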

Check the Quickstart Guide

Licenses

LEGAL NOTICE: By accessing, downloading or using this software and any required dependent software (the “Software Package”), you agree to the terms and conditions of the software license agreements for the Software Package, which may also include notices, disclaimers, or license terms for third party software included with the Software Package. Please refer to the “third-party-programs.txt” or other similarly-named text file for additional details.

By downloading and using this container and the included software, you agree to the terms and conditions of the software license agreements located here.

As with any pre-built image, it is the image user's responsibility to ensure that use of this image complies with the relevant licenses and any potential fees for all software contained within. We will have no indemnity or warranty coverage from suppliers.

Components:


More containers for running HPC, AI, ML, and other workloads can be found at the Intel® oneContainer Portal.

Docker Pull Command

docker pull openvino/model_server