akseljoonas (HF Staff) committed
Commit 56e52d8 · 1 Parent(s): 9ba4cd4

fix: merge consecutive assistant messages on page refresh


The live streaming transport groups all text and tool parts into one
assistant UIMessage per turn (single start/finish pair). The message
converter was creating separate UIMessages for each backend assistant
message, splitting turns into multiple bubbles after refresh.
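The merging behavior described above can be sketched in isolation. This is a minimal, self-contained TypeScript illustration, not the project's actual code: the `Part`/`UIMessage` shapes and the `pushAssistantParts` helper are simplified stand-ins for the real types in `frontend/src/lib/convert-llm-messages.ts`.

```typescript
// Simplified stand-ins for the SDK's message types (hypothetical shapes).
type Part = { type: string; text?: string };
type UIMessage = { id: string; role: 'user' | 'assistant'; parts: Part[] };

let counter = 0;
const nextId = (): string => `msg-${counter++}`;

// Append an assistant message's parts to the UI message list, merging into
// the previous assistant bubble when one exists. This mirrors the live
// streaming transport, which emits one assistant UIMessage per turn even
// when the backend stored several assistant messages (one per LLM call).
function pushAssistantParts(uiMessages: UIMessage[], parts: Part[]): void {
  const prev = uiMessages[uiMessages.length - 1];
  if (prev && prev.role === 'assistant') {
    // Consecutive assistant messages collapse into one bubble.
    prev.parts.push(...parts);
  } else {
    uiMessages.push({ id: nextId(), role: 'assistant', parts });
  }
}
```

With this helper, two backend assistant messages in a row produce a single assistant `UIMessage` holding both sets of parts, matching what the user saw during live streaming.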

frontend/src/lib/convert-llm-messages.ts CHANGED
@@ -101,11 +101,20 @@ export function llmMessagesToUIMessages(
       }
     }
 
-    uiMessages.push({
-      id: nextId(),
-      role: 'assistant',
-      parts,
-    });
+    // During live streaming the SDK groups all text + tool parts between
+    // user messages into one assistant UIMessage (one start/finish pair per
+    // turn). The backend stores multiple assistant messages per turn (one
+    // per LLM API call), so merge consecutive assistant messages to match.
+    const prev = uiMessages[uiMessages.length - 1];
+    if (prev && prev.role === 'assistant') {
+      prev.parts.push(...parts);
+    } else {
+      uiMessages.push({
+        id: nextId(),
+        role: 'assistant',
+        parts,
+      });
+    }
   }
 }