Ollama

English | 中文

This service deploys Ollama for running large language models (LLMs) locally.

Usage

  • Pull DeepSeek R1 7B model:

    docker exec -it ollama-ollama-1 ollama pull deepseek-r1:7b
    
  • List all local models:

    docker exec -it ollama-ollama-1 ollama list
    
  • Get all local models via API:

    curl http://localhost:11434/api/tags 2> /dev/null | jq
    
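The same REST API can also generate completions. A minimal sketch with curl (assumes the stack is up on the default port and the deepseek-r1:7b model has already been pulled; jq is only used to extract the answer text):

```shell
# Ask for a single, non-streamed completion and print just the answer text
curl -s http://localhost:11434/api/generate \
  -d '{"model": "deepseek-r1:7b", "prompt": "Why is the sky blue?", "stream": false}' \
  | jq -r '.response'
```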

Services

  • ollama: The Ollama service.

Configuration

  • OLLAMA_VERSION: The version of the Ollama image, default is 0.14.3.
  • OLLAMA_PORT_OVERRIDE: The host port for Ollama, default is 11434.
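Both variables can be set in an .env file next to the compose file before bringing the stack up. A sketch with illustrative values (not the defaults):

```shell
# .env — pin the image tag and move the host port (illustrative values)
OLLAMA_VERSION=0.14.3
OLLAMA_PORT_OVERRIDE=21434
```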

Volumes

  • ollama_models: A volume for storing Ollama models.
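The volume maps to Ollama's model directory inside the container. A sketch of the relevant compose wiring (assumes the official ollama/ollama image, which stores models under /root/.ollama):

```yaml
# Excerpt sketch — how the named volume is typically attached
services:
  ollama:
    image: ollama/ollama:${OLLAMA_VERSION}
    volumes:
      - ollama_models:/root/.ollama
volumes:
  ollama_models:
```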

Troubleshooting

GPU Becomes Unavailable After Long Run (Linux Docker)

If Ollama initially runs on the GPU inside a Docker container but falls back to the CPU after some time, with server-log errors reporting GPU discovery failures, the issue can be resolved by disabling systemd cgroup management in Docker.

Edit /etc/docker/daemon.json on the host and add "exec-opts": ["native.cgroupdriver=cgroupfs"] to the Docker configuration:

{
  "exec-opts": ["native.cgroupdriver=cgroupfs"]
}
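If daemon.json already contains other settings, merge the new key rather than replacing the whole file. A hedged sketch using jq, shown on an inline stand-in for the real file (apply the merged result to /etc/docker/daemon.json with sudo):

```shell
# Merge the cgroupfs exec-opt into an existing daemon.json without
# clobbering other keys. The variable stands in for the current file.
existing='{"log-driver": "json-file"}'
printf '%s' "$existing" \
  | jq '. + {"exec-opts": ["native.cgroupdriver=cgroupfs"]}'
```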

Then restart Docker:

sudo systemctl restart docker

For more details, see Ollama Troubleshooting - Linux Docker.