feat: add more services

Author: Sun-ZhenXing
Date: 2026-02-19 23:04:16 +08:00
parent be71b96317
commit 990b40d730
77 changed files with 2085 additions and 1 deletions

src/litellm/Makefile (new file)

@@ -0,0 +1,16 @@
HELM_RELEASE_NAME ?= litellm
HELM_APPLICATION_NAME ?= litellm
HELM_NAMESPACE ?= litellm
HELM_DIR ?= ./helm
HELM_CHART_VERSION ?=
HELM_VALUES_FILE ?= ./values.yaml
HELM_OCI_REGISTRY ?=
HELM_OCI_NAMESPACE ?=
HELM_OCI_USERNAME ?=
HELM_OCI_PASSWORD ?=
HELM_REPO_NAME ?= litellm
HELM_REPO_URL ?= https://berriai.github.io/litellm-helm
HELM_CHART_REPO ?= $(HELM_REPO_NAME)/litellm-helm
HELM_LANE ?=
include ../_template/base.mk
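Every variable above uses make's `?=` (assign only if not already set), so each can be overridden from the environment or the command line; the actual targets come from the shared `../_template/base.mk` (not shown in this commit). A minimal sketch of the `?=` semantics, using the shell's equivalent `${VAR:-default}` expansion:

```shell
# `HELM_NAMESPACE ?= litellm` in make behaves like this shell default:
# the value is kept if already set, otherwise "litellm" is used.
HELM_NAMESPACE="${HELM_NAMESPACE:-litellm}"
echo "deploying to namespace: ${HELM_NAMESPACE}"

# With a hypothetical `install` target from base.mk, the same override is:
#   make install HELM_NAMESPACE=litellm-staging
```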

src/litellm/README.md (new file)

@@ -0,0 +1,28 @@
# LiteLLM
## Introduction
LiteLLM is a unified API for calling 100+ LLM APIs (OpenAI, Anthropic, Azure, VertexAI, Bedrock, Cohere, Mistral, Ollama, etc.) using the OpenAI format.
## Installation
To install LiteLLM, run:
```bash
make install
```
## Usage
After installation, verify the deployment:
```bash
kubectl get pods -n litellm
```
To configure LiteLLM, set your LLM provider API keys in the `values.yaml` file.
## Documentation
- [Official LiteLLM Documentation](https://docs.litellm.ai/)
- [Helm Chart Source](https://github.com/BerriAI/litellm/tree/main/deploy/charts/litellm-helm)
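Once the pods are ready, the proxy exposes an OpenAI-compatible API. A smoke test could look like the sketch below; the service name `litellm`, port `4000`, and the `sk-change-me` key are assumptions — check what `kubectl get svc -n litellm` reports and the master key configured for your release:

```shell
# Forward the proxy service locally (service name and port are assumptions).
kubectl port-forward -n litellm svc/litellm 4000:4000 &

# Call the OpenAI-compatible chat endpoint with a model_name from values.yaml.
curl http://localhost:4000/v1/chat/completions \
  -H "Authorization: Bearer sk-change-me" \
  -H "Content-Type: application/json" \
  -d '{"model": "openai-gpt-4", "messages": [{"role": "user", "content": "Hello"}]}'
```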

src/litellm/README.zh.md (new file)

@@ -0,0 +1,28 @@
# LiteLLM
## Introduction
LiteLLM is a unified API for calling 100+ LLM APIs (OpenAI, Anthropic, Azure, VertexAI, Bedrock, Cohere, Mistral, Ollama, etc.) using the OpenAI format.
## Installation
To install LiteLLM, run:
```bash
make install
```
## Usage
After installation, verify the deployment:
```bash
kubectl get pods -n litellm
```
To configure LiteLLM, set your LLM provider API keys in the `values.yaml` file.
## Documentation
- [Official LiteLLM Documentation](https://docs.litellm.ai/)
- [Helm Chart Source](https://github.com/BerriAI/litellm/tree/main/deploy/charts/litellm-helm)

src/litellm/values.yaml (new file)

@@ -0,0 +1,22 @@
# LiteLLM Helm Chart Values
# https://github.com/BerriAI/litellm/blob/main/deploy/charts/litellm-helm/values.yaml
# Default values for litellm
# This is a YAML-formatted file
image:
  repository: ghcr.io/berriai/litellm-database
  tag: latest
  pullPolicy: IfNotPresent
replicaCount: 1
config:
  # LiteLLM proxy configuration
  # See: https://docs.litellm.ai/docs/proxy/configs
  config.yaml: |
    model_list:
      - model_name: openai-gpt-4
        litellm_params:
          model: openai/gpt-4
          api_key: os.environ/OPENAI_API_KEY
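The `os.environ/OPENAI_API_KEY` reference tells LiteLLM to read the key from the proxy pod's environment rather than from the config file itself. One way to supply it — assuming the chart's `environmentSecrets` value, which the upstream litellm-helm chart uses to inject a Secret's keys as environment variables — is a sketch like:

```shell
# Create a Secret holding the provider key (the name `litellm-env` is arbitrary).
kubectl create secret generic litellm-env -n litellm \
  --from-literal=OPENAI_API_KEY=sk-change-me

# Then reference it in values.yaml (assuming the chart's environmentSecrets):
#   environmentSecrets:
#     - litellm-env
```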