
eisai/chenyme-aavt

By eisai

Updated 8 months ago

Chenyme/Chenyme-AAVT with CUDA 12.1.


Only Windows build 20348 or newer is supported (Windows 11, Server 2022)
Neither Hyper-V nor the CUDA toolkit is required
Windows images are huge; expect 5+ minutes for the pull.

Preparation

Create the following folder structure (docker-compose.yaml is a file, created in the next step):

chenyme-aavt
├───conf
├───model
├───output
└───docker-compose.yaml
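The layout above can be created in one step from a shell; a minimal sketch (PowerShell's mkdir accepts the same nested form, with backslash paths):

```shell
# Create the host-side folders that the compose file bind-mounts,
# plus an empty docker-compose.yaml to hold the stack definition.
mkdir -p chenyme-aavt/conf chenyme-aavt/model chenyme-aavt/output
touch chenyme-aavt/docker-compose.yaml
```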

Docker Compose

Standalone

networks:
  chenyme-aavt:

services:
  chenyme-aavt:
    container_name: chenyme-aavt
    image: eisai/chenyme-aavt:latest
    restart: unless-stopped
    isolation: process
    networks:
      - chenyme-aavt
    ports:
      - "8501:8501"        # Web GUI
    volumes:
      - '.\conf:C:\app\project\config'
      - '.\model:C:\app\model'
      - '.\output:C:\app\project\cache'
    devices:
      - class/5B45201D-F2F2-4F3B-85BB-30FF1F953599     # GPU passthrough, do not change. Delete this block to run without a GPU
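With docker-compose.yaml saved, the stack is started the usual way; a sketch, guarded so it degrades gracefully on a machine without a Docker engine:

```shell
# Start the stack defined above (run from the chenyme-aavt folder).
# Sketch only: requires a Docker engine in Windows-container mode (see Reference).
if command -v docker >/dev/null 2>&1; then
  docker compose up -d   # web GUI becomes available on http://localhost:8501
  status=started
else
  status=docker-missing  # no engine on this machine; nothing to start
fi
echo "$status"
```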

Use with Ollama

Create an additional folder ollama_model first.

networks:
  chenyme-aavt:

services:
  chenyme-aavt:
    container_name: chenyme-aavt
    image: eisai/chenyme-aavt:latest
    restart: unless-stopped
    isolation: process
    networks:
      - chenyme-aavt
    ports:
      - "8501:8501"        # Web GUI
    volumes:
      - '.\conf:C:\app\project\config'
      - '.\model:C:\app\model'
      - '.\output:C:\app\project\cache'
    devices:
      - class/5B45201D-F2F2-4F3B-85BB-30FF1F953599     # GPU passthrough, do not change. Delete this block to run without a GPU

  ollama:
    container_name: ollama
    image: eisai/ollama:latest
    restart: unless-stopped
    isolation: process
    networks:
      - chenyme-aavt
    cpu_count: 8
    volumes:
      - '.\ollama_model:C:\models'
    devices:
      - class/5B45201D-F2F2-4F3B-85BB-30FF1F953599     # GPU passthrough

Once the containers are running, pull an LLM model for Ollama before first use, for example:

docker exec ollama ollama pull aya:35b-23-q6_K

Configuration example in chenyme-aavt:

LOCAL-API-KEY: ollama
LOCAL-API-BASE: http://ollama:11434/v1/
LOCAL-MODEL-NAME: aya:35b-23-q6_K
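LOCAL-API-BASE reaches the ollama service by its name on the shared chenyme-aavt compose network. If Ollama runs on the Docker host instead of in the same compose file, the base URL would change accordingly (a sketch; host.docker.internal resolution depends on the engine setup):

```
LOCAL-API-KEY: ollama
LOCAL-API-BASE: http://host.docker.internal:11434/v1/
LOCAL-MODEL-NAME: aya:35b-23-q6_K
```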

Reference

GitHub Repo
https://github.com/Chenyme/Chenyme-AAVT

Install an ultralight docker engine on Windows
https://eisaichen.com/?p=76

Docker Pull Command

docker pull eisai/chenyme-aavt