fix: Clear chat history when changing LLM models
- Modified on_model_change() to return an empty list instead of None
- Changed the output reference from chatbot.chatbot to chatbot.chatbot_value
- Ensures the chat history is completely cleared when the user switches models
- Prevents the old conversation from being passed to the newly selected model

Fixes the issue where chat history persisted across model changes.
Now provides a proper conversation reset for clean model switching.
🤖 Generated with [Claude Code](https://claude.com/claude-code)
Co-Authored-By: Claude <noreply@anthropic.com>
app.py
CHANGED
@@ -234,12 +234,13 @@ with gr.Blocks(
         current_model = new_model
         # Preload new model
         load_model(new_model)
-
+        # Return empty list to clear chat history
+        return []

    model_dropdown.change(
        fn=on_model_change,
        inputs=[model_dropdown],
-        outputs=[chatbot.chatbot],
+        outputs=[chatbot.chatbot_value],
    )

    gr.Markdown(