YuPeng0214 committed commit 697a71f (verified · parent: e85acc5)

Update README.md

Files changed (1): README.md (+2 −2)
````diff
@@ -186,14 +186,14 @@ documents = [
 input_texts = queries + documents
 model = LLM(model="Kingsoft-LLM/QZhou-Embedding")
 outputs = model.embed(input_texts)
-scores = [F.normalize(torch.tensor(x.outputs.embedding), p=2, dim=0) for x in outputs]
+outputs = [F.normalize(torch.tensor(x.outputs.embedding), p=2, dim=0) for x in outputs]
 ```
 
 ### FAQs
 **1. Does the model support MRL?**<br>
 The model currently does not support MRL in this release due to observed performance degradation.<br>
 **2. Why not build upon the Qwen3 series models?**<br>
-Our initial research experiments commenced prior to the release of Qwen3. To maintain our experimental consistency, we retained the original base model throughout the study😊😊😊. While we subsequently conducted first-stage (retrieval) training with Qwen3, the performance after 32k steps showed no significant improvement over Qwen2.5, leading to discontinuation of further development with this architecture.
+Our initial research experiments commenced prior to the release of Qwen3. We retained the original base model throughout the study to maintain our experimental consistency. While we subsequently conducted first-stage (retrieval) training with Qwen3, the performance after 32k steps showed no significant improvement over Qwen2.5, leading to discontinuation of further development with this architecture.
 
 ### Citation
 If you find our work worth citing, please use the following citation:<br>
````
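The changed line above L2-normalizes each raw embedding returned by vLLM. As a minimal sketch of what `F.normalize(torch.tensor(...), p=2, dim=0)` computes (the helper name `l2_normalize` is ours, for illustration only):

```python
import math

def l2_normalize(vec):
    # Plain-Python equivalent of F.normalize(torch.tensor(vec), p=2, dim=0):
    # divide every component by the vector's Euclidean (L2) norm, so the
    # result has unit length and dot products become cosine similarities.
    norm = math.sqrt(sum(x * x for x in vec))
    return [x / norm for x in vec]

# A 3-4-5 right triangle makes the result easy to check by hand.
print(l2_normalize([3.0, 4.0]))  # [0.6, 0.8]
```

Normalizing both query and document embeddings this way lets similarity scores be computed with a simple dot product.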