feat: add README

Sun-ZhenXing
2025-09-24 14:16:10 +08:00
parent 232517b58f
commit 70f39867cf
65 changed files with 1695 additions and 103 deletions

0  src/ollama/.env.example  Normal file

src/ollama/README.md
@@ -1,19 +1,38 @@
 # Ollama

-Pull the DeepSeek R1 7B model:
-
-```bash
-docker exec -it ollama ollama pull deepseek-r1:7b
-```
-
-List all local models:
-
-```bash
-docker exec -it ollama ollama list
-```
-
-Get all local models with an API request:
-
-```bash
-curl http://localhost:11434/api/tags 2> /dev/null | jq
-```
+[English](./README.md) | [中文](./README.zh.md)
+
+This service deploys Ollama for running local LLM models.
+
+## Usage
+
+- Pull the DeepSeek R1 7B model:
+
+```bash
+docker exec -it ollama ollama pull deepseek-r1:7b
+```
+
+- List all local models:
+
+```bash
+docker exec -it ollama ollama list
+```
+
+- Get all local models via the API:
+
+```bash
+curl http://localhost:11434/api/tags 2> /dev/null | jq
+```
+
+## Services
+
+- `ollama`: The Ollama service.
+
+## Configuration
+
+- `OLLAMA_VERSION`: The version of the Ollama image, default is `0.12.0`.
+- `OLLAMA_PORT_OVERRIDE`: The host port for Ollama, default is `11434`.
+
+## Volumes
+
+- `ollama_models`: A volume for storing Ollama models.
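The new Configuration and Volumes sections document `OLLAMA_VERSION` and `OLLAMA_PORT_OVERRIDE`, and the commit also adds an empty `src/ollama/.env.example`. The compose file itself is not part of this diff, so the exact bring-up flow is an assumption; a minimal sketch using the documented defaults might look like:

```bash
# Hypothetical bring-up; assumes a docker compose definition for the
# `ollama` service sits next to the committed (empty) .env.example.
cd src/ollama
cp .env.example .env
# Values below are the defaults documented in the new README.
echo 'OLLAMA_VERSION=0.12.0' >> .env
echo 'OLLAMA_PORT_OVERRIDE=11434' >> .env
docker compose up -d                                 # start the service
docker exec -it ollama ollama pull deepseek-r1:7b    # then pull a model
```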

38  src/ollama/README.zh.md  Normal file

@@ -0,0 +1,38 @@
# Ollama

[English](./README.md) | [中文](./README.zh.md)

This service deploys Ollama to run large language models locally.

## Usage

- Pull the DeepSeek R1 7B model:

```bash
docker exec -it ollama ollama pull deepseek-r1:7b
```

- List all local models:

```bash
docker exec -it ollama ollama list
```

- Get all local models via the API:

```bash
curl http://localhost:11434/api/tags 2> /dev/null | jq
```

## Services

- `ollama`: The Ollama service.

## Configuration

- `OLLAMA_VERSION`: The version of the Ollama image, default is `0.12.0`.
- `OLLAMA_PORT_OVERRIDE`: The host port for Ollama, default is `11434`.

## Volumes

- `ollama_models`: A volume for storing Ollama models.
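Both READMEs pipe `/api/tags` through `jq` only for pretty-printing. The same endpoint can also drive simple scripting; the sketch below is illustrative and assumes the response keeps Ollama's `models[].name` layout and the default port of `11434`:

```bash
#!/usr/bin/env bash
# List installed model names and pull deepseek-r1:7b only if it is missing.
set -euo pipefail

port="${OLLAMA_PORT_OVERRIDE:-11434}"
models="$(curl -sf "http://localhost:${port}/api/tags" | jq -r '.models[].name')"

echo "Installed models:"
echo "${models}"

if ! grep -qx 'deepseek-r1:7b' <<< "${models}"; then
  docker exec -it ollama ollama pull deepseek-r1:7b
fi
```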