Strip thinking content from all LLM responses and skip redundant role generation 6d068d4 NeonClary committed on Mar 26
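The commit above strips model "thinking" content before display. A minimal sketch of one common approach, assuming the reasoning is delimited by `<think>...</think>` tags (the tag name and `strip_thinking` helper are assumptions; models vary in how they mark reasoning):

```python
import re

# Assumed delimiter: reasoning wrapped in <think>...</think>.
# DOTALL lets the pattern span multi-line thinking blocks;
# the non-greedy .*? stops at the first closing tag.
THINK_RE = re.compile(r"<think>.*?</think>", re.DOTALL)

def strip_thinking(text: str) -> str:
    """Remove thinking blocks and trim leftover whitespace."""
    return THINK_RE.sub("", text).strip()

print(strip_thinking("<think>plan the reply</think>Hello!"))  # Hello!
```

In a real pipeline this would run on each response before it is appended to the chat history, so downstream turns never see the reasoning text either.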
Add response timing, chat stats, and role prompt viewer to settings 03151b9 NeonClary committed on Mar 26
Rebrand UI with Neon.ai header, sidebar styling, expert personas, and UX improvements 0f3e9c6 NeonClary committed on Mar 26
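The response-timing stats in that commit can be captured with a small wall-clock wrapper around each model call. A sketch under that assumption (the `timed` helper and `stats` dict are hypothetical names, not the app's actual API):

```python
import time
from contextlib import contextmanager

@contextmanager
def timed(stats: dict, key: str):
    """Record elapsed wall-clock seconds for the wrapped block."""
    start = time.perf_counter()
    try:
        yield
    finally:
        stats[key] = time.perf_counter() - start

stats: dict = {}
with timed(stats, "last_response_s"):
    time.sleep(0.05)  # stand-in for an LLM request
print(f"{stats['last_response_s']:.2f}s")
```

`perf_counter` is monotonic, so the measurement is unaffected by system clock adjustments, which matters when timings are aggregated into per-chat stats.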
Add speed-priority setting with 5s racing fallback for faster responses 2845211 NeonClary committed on Mar 25
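One way to read "5s racing fallback": give the primary model a 5-second head start, and if it has not answered, start a faster fallback model and take whichever finishes first. A sketch of that interpretation with `asyncio` (the `primary`/`fallback` coroutines are placeholders, and the demo uses shortened delays so it runs quickly):

```python
import asyncio

async def primary(prompt: str) -> str:   # stand-in for the slow, preferred model
    await asyncio.sleep(1.0)
    return "primary: " + prompt

async def fallback(prompt: str) -> str:  # stand-in for the fast fallback model
    await asyncio.sleep(0.1)
    return "fallback: " + prompt

async def respond(prompt: str, head_start: float = 5.0) -> str:
    task = asyncio.ensure_future(primary(prompt))
    try:
        # shield() keeps the primary running even if the timeout fires.
        return await asyncio.wait_for(asyncio.shield(task), head_start)
    except asyncio.TimeoutError:
        # Head start expired: race the still-running primary against the fallback.
        fb = asyncio.ensure_future(fallback(prompt))
        done, pending = await asyncio.wait(
            {task, fb}, return_when=asyncio.FIRST_COMPLETED
        )
        for p in pending:
            p.cancel()
        return done.pop().result()

print(asyncio.run(respond("hi", head_start=0.5)))  # fallback: hi
```

Shielding the primary task means a late-but-first primary answer can still win the race instead of being discarded at the timeout.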
Keep height:100vh scroll containment at all breakpoints to fix HF iframe 5d098fa NeonClary committed on Mar 23
Add freeform persona input mode with file upload and Developer menu toggle 1a97647 NeonClary committed on Mar 23
Add Developer menu with orchestrator picker, download buttons, and LLMChats2-style dropdown 3e23f7a NeonClary committed on Mar 23
Initial commit: LLMChats3 - LLM conversation app with React + FastAPI, persona config, orchestrated conversation lifecycle, collapsible sidebar accordions, stop button, and export capabilities 085db90 NeonClary committed on Mar 23