One-api is an OpenAI-compatible API management and distribution system. It supports Azure, Anthropic Claude, Google PaLM 2, Zhipu ChatGLM, Baidu Wenxin Yiyan, iFlytek Xinghuo (Spark), Alibaba Tongyi Qianwen, 360 Zhinao, and Tencent Hunyuan, and can be used to re-distribute and manage API keys.
- Besides load balancing, it keeps the upstream keys on the server and hands out its own tokens instead, which effectively avoids the risk of key leakage.
## Deploy One-api

- Reverse proxy: expose the service through your reverse proxy.
- Default account: `root`
- Default password: `123456`

```shell
mkdir -p ~/app/one-api && cd ~/app/one-api && nano docker-compose.yml
sudo docker-compose up -d
```
```yaml
version: '3'
services:
  one-api:
    image: justsong/one-api:latest
    environment:
      - TZ=Asia/Shanghai
    volumes:
      - ./data:/data
    restart: unless-stopped
networks:
  default:
    external: true
    name: ngpm
```
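Because no ports are published in this compose file, the container is only reachable over the external `ngpm` network (i.e. via the reverse proxy). A quick sketch for checking that the deployment came up, assuming the compose service name `one-api` and one-api's status endpoint:

```shell
# Run from ~/app/one-api: confirm the container is up and inspect its logs.
sudo docker-compose ps
sudo docker-compose logs --tail 20 one-api

# one-api listens on port 3000 inside the container. Since no ports are
# published, probe it from another container attached to the ngpm network;
# the compose service name "one-api" resolves on that network.
sudo docker run --rm --network ngpm curlimages/curl -s http://one-api:3000/api/status
```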
## Client
- ChatGPT-Next-Web
- Immersive Translation
- Create a token under Tokens, and use the reverse proxy address as the API endpoint in the client.
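Clients talk to one-api exactly as they would to the OpenAI API. A minimal sketch, where the base URL and token are placeholders you replace with your own reverse proxy address and the token created above:

```shell
# Hypothetical values: substitute your own reverse proxy address and
# the token generated under "Tokens" in the one-api console.
ONE_API_BASE="https://one-api.example.com"
ONE_API_KEY="sk-xxxx"

# one-api exposes the OpenAI-compatible chat completions endpoint,
# so any OpenAI client works by pointing its base URL at one-api.
curl "$ONE_API_BASE/v1/chat/completions" \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer $ONE_API_KEY" \
  -d '{"model": "gpt-3.5-turbo", "messages": [{"role": "user", "content": "Hello"}]}'
```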
## Create a new channel
- For Azure, make sure the deployed model is named `gpt-35-turbo`.
- You can nest one-api instances by adding one one-api deployment as a channel in another.
- Load balancing across the different channels can be observed in the Logs.
## Additional Deployment of Next-Web

```shell
mkdir -p ~/app/next-web && cd ~/app/next-web && nano docker-compose.yml
sudo docker-compose up -d
```
```yaml
version: '3'
services:
  next-web:
    image: yidadaa/chatgpt-next-web:latest
    environment:
      - TZ=Asia/Shanghai
      - OPENAI_API_KEY=<token added to one-api>
      - BASE_URL=<reverse proxy address of one-api>
      - HIDE_USER_API_KEY=1
      - DISABLE_GPT4=1
    restart: unless-stopped
networks:
  default:
    external: true
    name: ngpm
```