# cogvideox-2b-endpoint

CogVideoX-2b with a custom handler for Hugging Face Inference Endpoints.
## Usage

This repository contains a custom inference handler for Hugging Face Inference Endpoints.
### Deploy

1. Go to Hugging Face Inference Endpoints
2. Click "New Endpoint"
3. Select this repository: `ainativestudio/cogvideox-2b-endpoint`
4. Choose a GPU instance (NVIDIA L4 or A100)
5. Deploy
## API Usage

```python
import requests
import base64

# Call the deployed endpoint with a text prompt and generation settings.
response = requests.post(
    "YOUR_ENDPOINT_URL",
    headers={"Authorization": "Bearer YOUR_TOKEN"},
    json={
        "inputs": "A beautiful sunset over the ocean",
        "num_inference_steps": 50,
        "guidance_scale": 6.0,
    },
)

# The handler returns the generated video as a base64-encoded MP4.
video_base64 = response.json()["video"]
video_bytes = base64.b64decode(video_base64)
with open("output.mp4", "wb") as f:
    f.write(video_bytes)
```
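The decoding step above can be wrapped in a small helper. This is a minimal sketch: the `{"video": <base64 string>}` payload shape comes from the example above, while the helper name `save_video` is illustrative:

```python
import base64


def save_video(payload: dict, path: str) -> None:
    """Decode the base64-encoded MP4 from an endpoint response payload
    and write it to `path`. Raises KeyError if no "video" key is present."""
    video_bytes = base64.b64decode(payload["video"])
    with open(path, "wb") as f:
        f.write(video_bytes)
```

Using it on the response from the snippet above would be `save_video(response.json(), "output.mp4")`.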
## Custom Handler

The `handler.py` file implements the `EndpointHandler` class required for custom inference. See the Hugging Face documentation for more details.
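For reference, here is a simplified skeleton of the interface Inference Endpoints expects from `handler.py`: an `__init__` that receives the model directory and a `__call__` that receives the request payload. This is an illustrative sketch, not this repository's actual handler — a real `__call__` would run the CogVideoX pipeline and return `{"video": "<base64 MP4>"}`, while this stub only echoes the parsed parameters:

```python
class EndpointHandler:
    def __init__(self, path: str = ""):
        # A real handler would load the CogVideoX-2b pipeline from `path`
        # here (e.g. with diffusers) so it is ready before requests arrive.
        self.model_dir = path

    def __call__(self, data: dict) -> dict:
        # Parse the request payload, using the same defaults as the
        # API Usage example above.
        prompt = data.get("inputs", "")
        steps = data.get("num_inference_steps", 50)
        guidance = data.get("guidance_scale", 6.0)
        # A real handler would generate frames here and return
        # {"video": "<base64-encoded MP4>"}; this stub echoes the inputs.
        return {
            "prompt": prompt,
            "num_inference_steps": steps,
            "guidance_scale": guidance,
        }
```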