feat: add more Agent services & easytier

Summer Shen
2026-04-19 12:26:54 +08:00
parent 0e948befac
commit 0b5ba69cb0
30 changed files with 1775 additions and 0 deletions
@@ -0,0 +1,23 @@
# Global Registry Prefix (optional)
# GLOBAL_REGISTRY=
# Letta Image Version
LETTA_VERSION=0.16.7
# Timezone
TZ=UTC
# Host port for the Letta REST API server
LETTA_PORT_OVERRIDE=8283
# LLM Provider API Keys (optional; at least one is required for agent functionality)
# OPENAI_API_KEY=sk-...
# ANTHROPIC_API_KEY=sk-ant-...
# GROQ_API_KEY=gsk_...
# OLLAMA_BASE_URL=http://host.docker.internal:11434
# Resource Limits
LETTA_CPU_LIMIT=1
LETTA_MEMORY_LIMIT=1G
LETTA_CPU_RESERVATION=0.25
LETTA_MEMORY_RESERVATION=256M
@@ -0,0 +1,49 @@
# Letta
[English](./README.md) | [中文](./README.zh.md)
Quick start: <https://docs.letta.com>.
This service deploys Letta (formerly MemGPT), a framework for building stateful AI agents with long-term memory, persistent state, and tool use. Letta exposes a REST API for creating and managing agents programmatically.
## Services
- `letta`: The Letta agent server.
## Quick Start
```bash
docker compose up -d
```
The Letta REST API will be available at `http://localhost:8283`. You can interact with it via the [Letta Python SDK](https://github.com/letta-ai/letta) or the [ADE web interface](https://app.letta.com).
To connect a local LLM (Ollama), set `OLLAMA_BASE_URL` in your `.env` file before starting.
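The port in that URL comes from compose's `${LETTA_PORT_OVERRIDE:-8283}` interpolation, where the default applies when the variable is unset or empty. A minimal sketch of the same resolution in Python (illustrative only; the variable name is the one from `.env` above):

```python
import os

# Mirror compose's ${LETTA_PORT_OVERRIDE:-8283}: fall back to the
# default when the variable is unset or set to an empty string.
port = os.environ.get("LETTA_PORT_OVERRIDE") or "8283"
base_url = f"http://localhost:{port}"
print(base_url)
```

With no override set, this prints `http://localhost:8283`, the address used throughout this README.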
## Configuration

| Variable                   | Description                                                | Default   |
| -------------------------- | ---------------------------------------------------------- | --------- |
| `GLOBAL_REGISTRY`          | Registry prefix for the image (optional)                   | *(empty)* |
| `LETTA_VERSION`            | Image version                                              | `0.16.7`  |
| `TZ`                       | Container timezone                                         | `UTC`     |
| `LETTA_PORT_OVERRIDE`      | Host port for the REST API                                 | `8283`    |
| `OPENAI_API_KEY`           | OpenAI API key (optional)                                  | *(empty)* |
| `ANTHROPIC_API_KEY`        | Anthropic API key (optional)                               | *(empty)* |
| `GROQ_API_KEY`             | Groq API key (optional)                                    | *(empty)* |
| `OLLAMA_BASE_URL`          | Ollama base URL, e.g. `http://host.docker.internal:11434`  | *(empty)* |
| `LETTA_CPU_LIMIT`          | CPU limit                                                  | `1`       |
| `LETTA_MEMORY_LIMIT`       | Memory limit                                               | `1G`      |
| `LETTA_CPU_RESERVATION`    | CPU reservation                                            | `0.25`    |
| `LETTA_MEMORY_RESERVATION` | Memory reservation                                         | `256M`    |
## Volumes
- `letta_data`: Persists agent state, memory, and configuration at `/root/.letta`.
## Ports
- **8283**: REST API
## Notes
- At least one LLM provider API key (or `OLLAMA_BASE_URL`) is required to create functioning agents.
- The health check uses the `/health` endpoint.
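The container's health check is a single HTTP GET against `/health`. A hedged sketch of running the same probe from the host, using only the Python standard library (the function name is ours, not part of Letta):

```python
import urllib.request

def check_health(base_url: str, timeout: float = 5.0) -> bool:
    """Probe the /health endpoint, as the compose healthcheck does."""
    try:
        with urllib.request.urlopen(f"{base_url}/health", timeout=timeout) as resp:
            return resp.status == 200
    except OSError:  # covers URLError: connection refused, timeout, DNS failure
        return False
```

For example, `check_health("http://localhost:8283")` returns `True` once the service is up, and `False` while the container is still starting.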
@@ -0,0 +1,49 @@
# Letta

[English](./README.md) | [中文](./README.zh.md)

Quick start: <https://docs.letta.com>.

This service deploys Letta (formerly MemGPT), a framework for building stateful AI agents with long-term memory, persistent state, and tool use. Letta exposes a REST API for creating and managing agents programmatically.

## Services

- `letta`: The Letta agent server.

## Quick Start

```bash
docker compose up -d
```

The Letta REST API will be available at `http://localhost:8283`. You can interact with it via the [Letta Python SDK](https://github.com/letta-ai/letta) or the [ADE web interface](https://app.letta.com).

To connect a local LLM (Ollama), set `OLLAMA_BASE_URL` in your `.env` file before starting.

## Configuration

| Variable                   | Description                                                | Default   |
| -------------------------- | ---------------------------------------------------------- | --------- |
| `GLOBAL_REGISTRY`          | Registry prefix for the image (optional)                   | *(empty)* |
| `LETTA_VERSION`            | Image version                                              | `0.16.7`  |
| `TZ`                       | Container timezone                                         | `UTC`     |
| `LETTA_PORT_OVERRIDE`      | Host port for the REST API                                 | `8283`    |
| `OPENAI_API_KEY`           | OpenAI API key (optional)                                  | *(empty)* |
| `ANTHROPIC_API_KEY`        | Anthropic API key (optional)                               | *(empty)* |
| `GROQ_API_KEY`             | Groq API key (optional)                                    | *(empty)* |
| `OLLAMA_BASE_URL`          | Ollama base URL, e.g. `http://host.docker.internal:11434`  | *(empty)* |
| `LETTA_CPU_LIMIT`          | CPU limit                                                  | `1`       |
| `LETTA_MEMORY_LIMIT`       | Memory limit                                               | `1G`      |
| `LETTA_CPU_RESERVATION`    | CPU reservation                                            | `0.25`    |
| `LETTA_MEMORY_RESERVATION` | Memory reservation                                         | `256M`    |

## Volumes

- `letta_data`: Persists agent state, memory, and configuration at `/root/.letta`.

## Ports

- **8283**: REST API

## Notes

- At least one LLM provider API key (or `OLLAMA_BASE_URL`) is required to create functioning agents.
- The health check uses the `/health` endpoint.
@@ -0,0 +1,43 @@
x-defaults: &defaults
  restart: unless-stopped
  logging:
    driver: json-file
    options:
      max-size: 100m
      max-file: '3'

services:
  letta:
    <<: *defaults
    image: ${GLOBAL_REGISTRY:-}letta/letta:${LETTA_VERSION:-0.16.7}
    ports:
      - '${LETTA_PORT_OVERRIDE:-8283}:8283'
    volumes:
      - letta_data:/root/.letta
    environment:
      - TZ=${TZ:-UTC}
      - OPENAI_API_KEY=${OPENAI_API_KEY:-}
      - ANTHROPIC_API_KEY=${ANTHROPIC_API_KEY:-}
      - GROQ_API_KEY=${GROQ_API_KEY:-}
      - OLLAMA_BASE_URL=${OLLAMA_BASE_URL:-}
    healthcheck:
      test:
        - CMD
        - python3
        - -c
        - "import urllib.request; urllib.request.urlopen('http://localhost:8283/health')"
      interval: 30s
      timeout: 10s
      retries: 5
      start_period: 20s
    deploy:
      resources:
        limits:
          cpus: ${LETTA_CPU_LIMIT:-1}
          memory: ${LETTA_MEMORY_LIMIT:-1G}
        reservations:
          cpus: ${LETTA_CPU_RESERVATION:-0.25}
          memory: ${LETTA_MEMORY_RESERVATION:-256M}

volumes:
  letta_data:
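The image reference above chains two interpolations, `${GLOBAL_REGISTRY:-}` and `${LETTA_VERSION:-0.16.7}`. A small sketch of the resolution logic compose applies with the `:-` operator, which falls back to the default when the variable is unset *or* empty (the helper and the mirror hostname are illustrative, not part of the stack):

```python
def resolve_image(env: dict) -> str:
    """Mimic compose ${VAR:-default}: default applies when unset or empty."""
    registry = env.get("GLOBAL_REGISTRY") or ""
    version = env.get("LETTA_VERSION") or "0.16.7"
    return f"{registry}letta/letta:{version}"

print(resolve_image({}))                                      # letta/letta:0.16.7
print(resolve_image({"GLOBAL_REGISTRY": "mirror.example.com/"}))
```

Note that the registry prefix must include its trailing `/`, since the template concatenates it directly onto `letta/letta`.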