AI_Conversations / docker-compose.yml
NeonClary
Initial commit: LLMChats3 - LLM conversation app with React + FastAPI, persona config, orchestrated conversation lifecycle, collapsible sidebar accordions, stop button, and export capabilities
085db90
services:
  app:
    build: .
    ports:
      - "8000:8000"
    env_file:
      - .env
    environment:
      CORS_ORIGINS: "http://localhost:8000,http://localhost:3000"
    restart: unless-stopped
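
The compose file loads secrets from a `.env` file in the project root via `env_file`. A minimal sketch of what that file might contain, assuming the FastAPI backend needs an LLM provider key (the variable names below are illustrative assumptions, not taken from the repository):

```env
# .env — loaded into the app container by env_file (sketch; names are assumptions)
# OPENAI_API_KEY=sk-...
# LOG_LEVEL=info
```

Note that `CORS_ORIGINS` is set under `environment`, which takes precedence over any value of the same name in `.env`; the two localhost origins cover the containerized app on port 8000 and a local React dev server on port 3000.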