- Introduced Convex, an open-source reactive database, with README and environment variable configurations.
- Added Chinese translation for Convex documentation.
- Created docker-compose configuration for Convex services.
- Introduced llama-swap, a model swapping proxy for OpenAI/Anthropic compatible servers, with comprehensive README and example configuration.
- Added Chinese translation for llama-swap documentation.
- Included example environment file and docker-compose setup for llama-swap.
- Configured health checks and resource limits for both Convex and llama-swap services.
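The health-check and resource-limit pattern described above can be sketched in compose terms as follows. This is a hedged illustration only: the service name, image, port, endpoint path, and thresholds are assumptions, not the repository's actual values.

```yaml
# Hypothetical sketch of the healthcheck + resource-limit pattern.
# Image tag, port, /health endpoint, and limits are illustrative assumptions.
services:
  llama-swap:
    image: ghcr.io/mostlygeek/llama-swap:latest   # assumed image; pin a tag in practice
    ports:
      - "8080:8080"
    healthcheck:
      test: ["CMD", "wget", "-qO-", "http://localhost:8080/health"]  # assumes wget exists in the image
      interval: 30s
      timeout: 5s
      retries: 3
      start_period: 15s
    deploy:
      resources:
        limits:
          cpus: "2.0"
          memory: 4G
```

With a healthcheck defined, dependent services can gate startup on it via `depends_on: { llama-swap: { condition: service_healthy } }`.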
feat(elasticsearch): upgrade Elasticsearch version to 9.3.0 in environment and docker-compose files
feat(gitlab): update GitLab version to 18.8.3-ce.0 in environment and docker-compose files
feat(grafana): bump Grafana version to 12.3.2 in environment and docker-compose files
feat(jenkins): upgrade Jenkins version to 2.541-lts-jdk17 in environment and docker-compose files
fix(minio): remove unnecessary newline in docker-compose file
fix(nginx): downgrade Nginx version to 1.28.2-alpine3.22 in environment and docker-compose files
feat(ollama): update Ollama version to 0.14.3 in environment and docker-compose files
feat(prometheus): upgrade Prometheus version to 3.5.1 in environment and docker-compose files
feat(rabbitmq): update RabbitMQ version to 4.2.3-management-alpine in environment and docker-compose files
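The version bumps above follow one pattern: the image tag lives in an environment file and the compose file interpolates it, so an upgrade touches only the `.env` entry. A minimal sketch, assuming the variable name (the Grafana version itself is from the changelog):

```yaml
# .env (assumed variable name)
#   GRAFANA_VERSION=12.3.2

# docker-compose.yaml
services:
  grafana:
    image: grafana/grafana:${GRAFANA_VERSION:-12.3.2}  # falls back to the pinned default if unset
```

The `${VAR:-default}` form keeps `docker compose config` reproducible even when the `.env` file is missing.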
feat(openclaw): introduce OpenClaw personal AI assistant with multi-channel support and CLI
fix(mineru): update MinerU version to 2.7.6 in Dockerfile and documentation
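A version bump that touches both the Dockerfile and the docs suggests the version is parameterized as a build argument. A hypothetical sketch of that pattern, assuming a pip-installable `mineru` package and a Python base image (neither is confirmed by the changelog):

```dockerfile
# Hypothetical sketch: pinning MinerU via a build arg.
# Base image and package/extra names are assumptions.
ARG MINERU_VERSION=2.7.6
FROM python:3.11-slim
# ARGs declared before FROM must be redeclared to be visible in this stage
ARG MINERU_VERSION
RUN pip install --no-cache-dir "mineru==${MINERU_VERSION}"
```

A future bump then becomes `docker build --build-arg MINERU_VERSION=<next> .` with no Dockerfile edit.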
- Created README.zh.md for DeepTutor covering features, installation steps, and usage instructions in Chinese.
- Added docker-compose.yaml for DeepTutor to define services, environment variables, and resource limits.
- Introduced .env.example for llama.cpp with configuration options for server settings and resource management.
- Added README.md and README.zh.md for llama.cpp detailing features, prerequisites, quick start guides, and API documentation.
- Implemented docker-compose.yaml for llama.cpp to support various server configurations (CPU, CUDA, ROCm) and CLI usage.
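One way to express the CPU/CUDA/ROCm variants mentioned above in a single compose file is Compose profiles. A hedged sketch, assuming the upstream llama.cpp server image tags and an arbitrary host port; the actual file may be structured differently:

```yaml
# Hypothetical sketch of per-backend variants via Compose profiles.
# Image tags and ports are assumptions.
services:
  llama-cpp-cpu:
    image: ghcr.io/ggml-org/llama.cpp:server
    profiles: ["cpu"]
    ports:
      - "8000:8080"
  llama-cpp-cuda:
    image: ghcr.io/ggml-org/llama.cpp:server-cuda
    profiles: ["cuda"]
    ports:
      - "8000:8080"
    deploy:
      resources:
        reservations:
          devices:
            - driver: nvidia
              count: all
              capabilities: [gpu]
```

Usage would then be `docker compose --profile cuda up -d` (or `--profile cpu`), so only the selected backend starts.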