---
title: Visualisable AI Backend
emoji: 🧠
colorFrom: blue
colorTo: green
sdk: docker
pinned: false
short_description: LLM code generation with real-time trace visualization
---

# Visualisable.ai Backend Service

This is the backend service for Visualisable.ai, providing:

- Real-time model inference with trace extraction
- WebSocket streaming for live visualization
- REST API for model information and generation

## API Endpoints

- `GET /` - Health check
- `GET /health` - Detailed health status
- `GET /model/info` - Model architecture details
- `POST /generate` - Generate text with traces
- `WebSocket /ws` - Real-time trace streaming
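The REST endpoints above can be exercised with a small client. The sketch below is illustrative only: the endpoint paths come from this README, but the port, the request payload shape (the `prompt` field), and the `Authorization: Bearer` header scheme are assumptions, not a documented schema.

```python
"""Minimal client sketch for the backend REST API (paths from the README;
payload fields and auth header format are assumptions)."""
import json
import urllib.request

BASE_URL = "http://localhost:7860"  # assumed port; adjust to your deployment


def build_request(path, payload=None, api_key=None):
    """Build a urllib Request for a backend endpoint.

    GET when payload is None, POST with a JSON body otherwise.
    """
    headers = {"Content-Type": "application/json"}
    if api_key:  # the API_KEY secret is optional per the README
        headers["Authorization"] = f"Bearer {api_key}"
    data = json.dumps(payload).encode() if payload is not None else None
    return urllib.request.Request(BASE_URL + path, data=data, headers=headers)


if __name__ == "__main__":
    # Health check, then a generation request with traces.
    with urllib.request.urlopen(build_request("/health")) as resp:
        print(json.load(resp))
    req = build_request("/generate", {"prompt": "def add(a, b):"})
    with urllib.request.urlopen(req) as resp:
        print(json.load(resp))
```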

## Configuration

Set the following secrets in your Space settings:

- `API_KEY` (optional) - API key for authentication
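Since `API_KEY` is optional, the server-side check has to allow unauthenticated access when the secret is unset. One way that logic could look (a hypothetical `check_api_key` helper, not the service's actual implementation) is:

```python
"""Sketch of an optional-API-key check (hypothetical; the backend's real
auth logic may differ)."""
import hmac
import os


def check_api_key(provided):
    """Accept the request if API_KEY is unset, or if the provided key matches.

    Uses hmac.compare_digest for a constant-time comparison.
    """
    expected = os.environ.get("API_KEY")
    if expected is None:
        return True  # API_KEY is optional per the README
    return provided is not None and hmac.compare_digest(provided, expected)
```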

## Frontend

The frontend is deployed separately on Vercel. Connect it by setting the backend URL in your frontend environment variables.