README.md CHANGED
@@ -1,20 +1,24 @@
 ---
-title: veureu-
+title: veureu-schat
-emoji:
+emoji: 💬
 colorFrom: purple
-colorTo:
+colorTo: red
 sdk: gradio
 sdk_version: "4.44.1"
 app_file: app.py
 pinned: false
 ---
 
-#
+# 💬 veureu-schat (Salamandra-7B-Instruct · ZeroGPU)
 
 ## Endpoints
-- **`/api/predict`** (Gradio):
-
-- **`/api/
+- **`/api/predict`** (Gradio): input `["<prompt>"]` → output `"<text>"`.
+  ➜ This is the endpoint that the **engine** Space uses.
+- **`/api/generate`** (Gradio): input `[prompt, system, max_new_tokens, temperature, top_p]` → output `"<text>"`.
 
-
-
+### Environment variables
+- `MODEL_ID` (optional): defaults to `BSC-LT/salamandra-7b-instruct`.
+
+### Notes
+- The model uses its `chat_template` when one exists; otherwise a classic prompt with a `system` block is composed.
+- GPU: activated automatically via `@spaces.GPU` (ZeroGPU).
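The two endpoints in the new README take Gradio-style positional inputs. The sketch below shows how a client might build the request bodies, assuming Gradio's usual `{"data": [...]}` JSON wrapping; the base URL and helper names here are hypothetical, not part of the Space:

```python
import json

# Hypothetical Space URL; substitute the real one for this Space.
BASE_URL = "https://your-space.hf.space"

def predict_payload(prompt: str) -> dict:
    # /api/predict takes a single positional input: ["<prompt>"]
    return {"data": [prompt]}

def generate_payload(prompt: str, system: str = "",
                     max_new_tokens: int = 256,
                     temperature: float = 0.7,
                     top_p: float = 0.95) -> dict:
    # /api/generate takes [prompt, system, max_new_tokens, temperature, top_p]
    return {"data": [prompt, system, max_new_tokens, temperature, top_p]}

# Serialized body for a POST to f"{BASE_URL}/api/predict".
body = json.dumps(predict_payload("Hola"))
```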
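The `chat_template` fallback described in the Notes can be sketched as follows; `build_prompt` and the classic prompt format are illustrative assumptions, not the Space's actual code:

```python
import os

# Default model ID from the README; the MODEL_ID env var overrides it.
MODEL_ID = os.environ.get("MODEL_ID", "BSC-LT/salamandra-7b-instruct")

def build_prompt(tokenizer, system: str, user: str) -> str:
    """Prefer the tokenizer's chat template; otherwise compose a
    classic prompt with an explicit system block (hypothetical format)."""
    if getattr(tokenizer, "chat_template", None):
        messages = [
            {"role": "system", "content": system},
            {"role": "user", "content": user},
        ]
        return tokenizer.apply_chat_template(
            messages, tokenize=False, add_generation_prompt=True
        )
    # Fallback: plain system block followed by a single user turn.
    return f"{system}\n\nUser: {user}\nAssistant:"
```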