Ollama

This service deploys Ollama for running local LLM models.

Usage

  • Pull the DeepSeek R1 7B model:

    docker exec -it ollama ollama pull deepseek-r1:7b
    
  • List all local models:

    docker exec -it ollama ollama list
    
  • List all local models via the HTTP API:

    curl http://localhost:11434/api/tags 2> /dev/null | jq
    
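The /api/tags endpoint returns a JSON object with a models array, which jq can filter. A minimal sketch using a canned sample response (the field values below are illustrative, not real output; a live server returns the same shape):

```shell
# Extract model names from a sample /api/tags response.
# Against a running instance, replace the here-doc with:
#   curl http://localhost:11434/api/tags 2> /dev/null
cat <<'EOF' | jq -r '.models[].name'
{"models":[{"name":"deepseek-r1:7b","size":4683075271}]}
EOF
```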

Services

  • ollama: The Ollama service.

Configuration

  • OLLAMA_VERSION: The tag of the Ollama image to run; defaults to 0.12.0.
  • OLLAMA_PORT_OVERRIDE: The host port mapped to Ollama; defaults to 11434.
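
Both variables would typically be set in a .env file next to the compose file. A hypothetical example (values are illustrative; pin the version you have tested):

```shell
# .env
OLLAMA_VERSION=0.12.0
OLLAMA_PORT_OVERRIDE=11435
```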

Volumes

  • ollama_models: A volume for storing Ollama models.
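
Ollama keeps its model store under /root/.ollama inside the container, so the volume would be mounted there. A sketch of how the service and volume might be wired together, using the variables documented above (the actual compose file in this repo may differ):

```yaml
services:
  ollama:
    image: ollama/ollama:${OLLAMA_VERSION:-0.12.0}
    ports:
      - "${OLLAMA_PORT_OVERRIDE:-11434}:11434"
    volumes:
      - ollama_models:/root/.ollama

volumes:
  ollama_models:
```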