---
title: Free Coding API
emoji: π
colorFrom: blue
colorTo: purple
sdk: docker
pinned: false
license: mit
---
# π Free Coding API
**OpenAI & Anthropic Compatible API for Coding Tasks**
Built with skills, not money! This is a free, open-source API endpoint that runs on HuggingFace Spaces, providing coding assistance similar to OpenAI Codex and Claude Code.
## Features
- ✅ **OpenAI API Compatible** (`/v1/chat/completions`)
- ✅ **Anthropic API Compatible** (`/v1/messages`)
- ✅ **Streaming Support** (SSE)
- ✅ **Coding Optimized** (Qwen2.5-Coder model)
- ✅ **100% Free** (runs on the HF Spaces free tier)
## Quick Start
### Using OpenAI SDK
```python
from openai import OpenAI

client = OpenAI(
    base_url="https://YOUR-SPACE.hf.space/v1",
    api_key="sk-free-coding-api",
)

response = client.chat.completions.create(
    model="gpt-4",  # mapped to Qwen2.5-Coder
    messages=[
        {"role": "system", "content": "You are an expert Python developer."},
        {"role": "user", "content": "Write a function to find prime numbers"},
    ],
    stream=True,
)

for chunk in response:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
```
### Using Anthropic SDK
```python
import anthropic

client = anthropic.Anthropic(
    base_url="https://YOUR-SPACE.hf.space",
    api_key="sk-free-coding-api",
)

response = client.messages.create(
    model="claude-3-sonnet",  # mapped to Qwen2.5-Coder
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Write a REST API in FastAPI"}
    ],
)

print(response.content[0].text)
```
### Using cURL
```bash
# OpenAI format
curl -X POST https://YOUR-SPACE.hf.space/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-free-coding-api" \
  -d '{
    "model": "gpt-4",
    "messages": [{"role": "user", "content": "Hello, write Python code"}]
  }'

# Anthropic format
curl -X POST https://YOUR-SPACE.hf.space/v1/messages \
  -H "Content-Type: application/json" \
  -H "x-api-key: sk-free-coding-api" \
  -d '{
    "model": "claude-3-sonnet",
    "max_tokens": 1024,
    "messages": [{"role": "user", "content": "Hello, write Python code"}]
  }'
```
## API Endpoints
| Endpoint | Method | Description |
|----------|--------|-------------|
| `/v1/chat/completions` | POST | OpenAI-compatible chat |
| `/v1/messages` | POST | Anthropic-compatible messages |
| `/v1/models` | GET | List available models |
| `/health` | GET | Health check |
| `/docs` | GET | Swagger UI documentation |
## Supported Models
All model names are aliases mapped to `Qwen2.5-Coder-1.5B-Instruct`:
**OpenAI aliases:** `gpt-4`, `gpt-4-turbo`, `gpt-3.5-turbo`, `codex`, `code-davinci-002`
**Anthropic aliases:** `claude-3-opus`, `claude-3-sonnet`, `claude-3-haiku`, `claude-3-5-sonnet`, `claude-code`
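Since every alias resolves to the same backend, the routing can be pictured as a simple lookup. This is a hypothetical sketch of that mapping, not the Space's actual code:

```python
# Default backend model (from the table above).
QWEN = "Qwen/Qwen2.5-Coder-1.5B-Instruct"

# Every accepted alias maps to the same backend model.
ALIASES = {name: QWEN for name in (
    "gpt-4", "gpt-4-turbo", "gpt-3.5-turbo", "codex", "code-davinci-002",
    "claude-3-opus", "claude-3-sonnet", "claude-3-haiku",
    "claude-3-5-sonnet", "claude-code",
)}

def resolve_model(requested: str) -> str:
    # Unknown names fall back to the default rather than erroring
    # (an assumption; the server may instead return a 404).
    return ALIASES.get(requested, QWEN)
```

In practice this means the `model` field in your request only affects the response metadata, not which weights answer you.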
## Environment Variables
| Variable | Default | Description |
|----------|---------|-------------|
| `MODEL_ID` | `Qwen/Qwen2.5-Coder-1.5B-Instruct` | HuggingFace model to use |
| `API_KEY` | `sk-free-coding-api` | API key for authentication |
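A sketch of how a server might consume these variables, including accepting the key from either auth header style shown in the cURL examples (assumed behavior; the actual app code may differ):

```python
import os

# Fall back to the documented defaults when the variables are unset.
MODEL_ID = os.environ.get("MODEL_ID", "Qwen/Qwen2.5-Coder-1.5B-Instruct")
API_KEY = os.environ.get("API_KEY", "sk-free-coding-api")

def check_auth(header_value: str) -> bool:
    """Accept 'Bearer <key>' (OpenAI style) or a bare key (x-api-key style)."""
    token = header_value.removeprefix("Bearer ").strip()
    return token == API_KEY
```

Setting `API_KEY` in your Space's settings is recommended if you deploy your own copy, since the default key is public.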
## Limitations
Running on the HF Spaces free tier:
- **CPU only** (2 vCPU, 16GB RAM)
- **Response time**: 10-30 seconds for typical requests
- **Max context**: ~4K tokens
- **Best for**: Code generation, debugging, explanations
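Because CPU inference can take 10-30 seconds, default client timeouts are often too tight. A stdlib-only sketch with a generous timeout (the URL is a placeholder for your own Space):

```python
import json
import urllib.request

API_URL = "https://YOUR-SPACE.hf.space/v1/chat/completions"  # placeholder

def build_request(prompt: str, url: str = API_URL) -> urllib.request.Request:
    """Build the chat-completion POST without sending it."""
    body = json.dumps({
        "model": "gpt-4",
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return urllib.request.Request(url, data=body, headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer sk-free-coding-api",
    })

def complete(prompt: str, timeout: float = 120.0) -> str:
    # 120 s leaves headroom over the 10-30 s typical latency.
    with urllib.request.urlopen(build_request(prompt), timeout=timeout) as resp:
        return json.loads(resp.read())["choices"][0]["message"]["content"]
```

The OpenAI and Anthropic SDKs also accept a `timeout` argument in their client constructors if you prefer the examples above.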
## Deploy Your Own
1. Fork this Space
2. (Optional) Set environment variables in Space Settings
3. Your API is ready at `https://YOUR-USERNAME-YOUR-SPACE.hf.space`
## License
MIT License - Built with skills, not money! π