---
title: Uni API
emoji: π
colorFrom: gray
colorTo: yellow
sdk: docker
app_port: 8000
pinned: false
license: gpl-3.0
---

# Uni API Deployment on Hugging Face Spaces

This Space deploys the uni-api service, which provides unified management of multiple LLM API backends behind a single OpenAI-compatible endpoint.

## Features

- Unified API interface for multiple LLM providers
- Load balancing across different API endpoints
- Support for OpenAI, Anthropic, Gemini, and other providers
- Automatic retry and failover mechanisms

## Usage

Once deployed, you can call this endpoint like any OpenAI-compatible API:

```bash
curl -X POST https://your-space-name.hf.space/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer YOUR_API_KEY" \
  -d '{
    "model": "gpt-4o",
    "messages": [{"role": "user", "content": "Hello!"}]
  }'
```
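
Because the endpoint speaks the OpenAI chat-completions protocol, any standard HTTP client works. Here is a minimal Python sketch using only the standard library; the base URL and API key are placeholders matching the curl example:

```python
import json
import urllib.request

# Placeholder values -- substitute your Space URL and uni-api key.
BASE_URL = "https://your-space-name.hf.space"
API_KEY = "YOUR_API_KEY"


def build_request(messages, model="gpt-4o"):
    """Build an OpenAI-style POST request for the /v1/chat/completions route."""
    return urllib.request.Request(
        f"{BASE_URL}/v1/chat/completions",
        data=json.dumps({"model": model, "messages": messages}).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST",
    )


def chat(messages, model="gpt-4o"):
    """Send the request and return the assistant's reply text."""
    with urllib.request.urlopen(build_request(messages, model)) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]


# Example call (requires the Space to be up and a valid key):
# print(chat([{"role": "user", "content": "Hello!"}]))
```

The official `openai` client library also works: point its `base_url` at `https://your-space-name.hf.space/v1` and pass your uni-api key as the API key.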

## Configuration

The API configuration is managed through the `API_YAML_CONTENT` secret in the Space settings, which should hold the full contents of a uni-api `api.yaml` configuration file.
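
As an illustrative sketch only (field names follow the upstream uni-api README; consult that repository for the authoritative schema, since the exact keys may differ by version), the secret might contain something like:

```yaml
providers:
  - provider: openai
    base_url: https://api.openai.com/v1/chat/completions
    api: sk-your-openai-key        # upstream provider key (placeholder)
    model:
      - gpt-4o

api_keys:
  - api: sk-your-client-key        # key clients send as the Bearer token (placeholder)
```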

For more information, visit: https://github.com/yym68686/uni-api