---
title: Free Coding API
emoji: πŸš€
colorFrom: blue
colorTo: purple
sdk: docker
pinned: false
license: mit
---
# πŸš€ Free Coding API
**OpenAI & Anthropic Compatible API for Coding Tasks**
Built with skills, not money! This is a free, open-source API endpoint that runs on HuggingFace Spaces, providing coding assistance similar to OpenAI Codex and Claude Code.
## Features
- βœ… **OpenAI API Compatible** (`/v1/chat/completions`)
- βœ… **Anthropic API Compatible** (`/v1/messages`)
- βœ… **Streaming Support** (SSE)
- βœ… **Coding Optimized** (Qwen2.5-Coder model)
- βœ… **100% Free** (Runs on HF Spaces free tier)
## Quick Start
### Using OpenAI SDK
```python
from openai import OpenAI

client = OpenAI(
    base_url="https://YOUR-SPACE.hf.space/v1",
    api_key="sk-free-coding-api",
)

response = client.chat.completions.create(
    model="gpt-4",  # Mapped to Qwen2.5-Coder
    messages=[
        {"role": "system", "content": "You are an expert Python developer."},
        {"role": "user", "content": "Write a function to find prime numbers"},
    ],
    stream=True,
)

for chunk in response:
    if chunk.choices[0].delta.content:
        print(chunk.choices[0].delta.content, end="")
```
### Using Anthropic SDK
```python
import anthropic

client = anthropic.Anthropic(
    base_url="https://YOUR-SPACE.hf.space",
    api_key="sk-free-coding-api",
)

response = client.messages.create(
    model="claude-3-sonnet",  # Mapped to Qwen2.5-Coder
    max_tokens=1024,
    messages=[
        {"role": "user", "content": "Write a REST API in FastAPI"},
    ],
)

print(response.content[0].text)
```
### Using cURL
```bash
# OpenAI format
curl -X POST https://YOUR-SPACE.hf.space/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer sk-free-coding-api" \
  -d '{
    "model": "gpt-4",
    "messages": [{"role": "user", "content": "Hello, write Python code"}]
  }'

# Anthropic format
curl -X POST https://YOUR-SPACE.hf.space/v1/messages \
  -H "Content-Type: application/json" \
  -H "x-api-key: sk-free-coding-api" \
  -d '{
    "model": "claude-3-sonnet",
    "max_tokens": 1024,
    "messages": [{"role": "user", "content": "Hello, write Python code"}]
  }'
```
## API Endpoints
| Endpoint | Method | Description |
|----------|--------|-------------|
| `/v1/chat/completions` | POST | OpenAI-compatible chat |
| `/v1/messages` | POST | Anthropic-compatible messages |
| `/v1/models` | GET | List available models |
| `/health` | GET | Health check |
| `/docs` | GET | Swagger UI documentation |
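Before wiring up a full SDK, you can probe the utility endpoints directly. The sketch below uses only the Python standard library; `BASE_URL` is a placeholder you must replace with your Space URL, and the header names simply follow the cURL examples above:

```python
import json
import urllib.request

BASE_URL = "https://YOUR-SPACE.hf.space"  # replace with your Space URL
API_KEY = "sk-free-coding-api"

def build_request(endpoint, payload=None):
    """Build an authenticated request: GET when there is no payload, POST otherwise."""
    data = json.dumps(payload).encode() if payload is not None else None
    return urllib.request.Request(
        BASE_URL + endpoint,
        data=data,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
        method="POST" if payload is not None else "GET",
    )

# Probe the service before sending real traffic:
# with urllib.request.urlopen(build_request("/health")) as resp:
#     print(json.load(resp))
```

The same helper works for `/v1/models` (GET) and for POSTing to `/v1/chat/completions` or `/v1/messages` by passing a `payload` dict.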
## Supported Models
All model names are aliases mapped to `Qwen2.5-Coder-1.5B-Instruct`:
**OpenAI aliases:** `gpt-4`, `gpt-4-turbo`, `gpt-3.5-turbo`, `codex`, `code-davinci-002`
**Anthropic aliases:** `claude-3-opus`, `claude-3-sonnet`, `claude-3-haiku`, `claude-3-5-sonnet`, `claude-code`
## Environment Variables
| Variable | Default | Description |
|----------|---------|-------------|
| `MODEL_ID` | `Qwen/Qwen2.5-Coder-1.5B-Instruct` | HuggingFace model to use |
| `API_KEY` | `sk-free-coding-api` | API key for authentication |
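A hypothetical sketch of how a server like this would typically read these variables, falling back to the documented defaults when they are unset (variable names are from the table; the rest is an assumption, not the Space's actual code):

```python
import os

# Fall back to the documented defaults when a variable is not set
# in the Space's settings.
MODEL_ID = os.environ.get("MODEL_ID", "Qwen/Qwen2.5-Coder-1.5B-Instruct")
API_KEY = os.environ.get("API_KEY", "sk-free-coding-api")
```

If you override `API_KEY` in your Space settings, remember to pass the same value as the `api_key` in the client examples above.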
## Limitations
Running on HF Spaces free tier:
- **CPU only** (2 vCPU, 16GB RAM)
- **Response time**: 10-30 seconds for typical requests
- **Max context**: ~4K tokens
- **Best for**: Code generation, debugging, explanations
## Deploy Your Own
1. Fork this Space
2. (Optional) Set environment variables in Space Settings
3. Your API is ready at `https://YOUR-USERNAME-YOUR-SPACE.hf.space`
## License
MIT License - Build with skills, not money! πŸš€