Update README.md

The overall training process was conducted with reference to snowflake-arctic-2.
**In V2, a three-stage training process was introduced as a key component of the overall learning strategy.**<br>
The three stages are Adaptation-training, Pre-training, and Post-training.

* In the adaptation-training stage, we observed through preliminary experiments that multi-vector retrieval consistently outperformed standard dense retrieval. To reflect this, we first trained the model using a multi-vector retrieval objective.
* In the pre-training stage, we introduced knowledge distillation, **where the multi-vector retrieval loss was distilled into the dense retrieval loss**. This allowed the model to capture fine-grained token-level similarity signals while being trained with in-batch negatives.
* In the post-training stage, we used the multilingual-e5-large model to mine hard negatives (specifically, the top 4 samples with a similarity score below a 99% threshold) and fine-tuned the model further on these examples.
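The pre-training distillation step can be sketched in PyTorch. This is a minimal illustration under stated assumptions, not the repository's actual code: the ColBERT-style late-interaction scorer, the KL-based distillation objective, and all function names are assumptions based on the description above.

```python
import torch
import torch.nn.functional as F


def multivector_scores(q_tok: torch.Tensor, d_tok: torch.Tensor) -> torch.Tensor:
    """Late-interaction (multi-vector) scores for all in-batch query/doc pairs.

    q_tok: (B, Lq, H) query token embeddings, L2-normalized.
    d_tok: (B, Ld, H) document token embeddings, L2-normalized.
    Returns a (B, B) score matrix: each query token takes its max similarity
    over the document's tokens, then scores are summed over query tokens.
    """
    sim = torch.einsum("qih,djh->qdij", q_tok, d_tok)  # (B, B, Lq, Ld)
    return sim.max(dim=-1).values.sum(dim=-1)          # (B, B)


def distillation_loss(q_dense, d_dense, q_tok, d_tok, tau: float = 1.0):
    """Distill multi-vector scores into the dense in-batch score distribution."""
    with torch.no_grad():  # teacher: multi-vector retrieval scores
        teacher = F.softmax(multivector_scores(q_tok, d_tok) / tau, dim=-1)
    student = q_dense @ d_dense.T / tau  # student: dense in-batch scores
    return F.kl_div(F.log_softmax(student, dim=-1), teacher,
                    reduction="batchmean")
```

With in-batch negatives, the diagonal of each score matrix corresponds to the positive pair, so the teacher distribution softly ranks every document in the batch instead of only labeling the positive, which is how token-level similarity signals reach the dense model.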
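The post-training mining step can be sketched as well. The exact filtering rule is not fully specified above, so the sketch below assumes a common positive-aware reading of the 99% threshold: keep the 4 most similar candidates whose score stays below 99% of the positive's score, treating higher-scoring candidates as likely false negatives. All names and the threshold semantics are illustrative assumptions.

```python
import numpy as np


def mine_hard_negatives(query_emb, corpus_emb, pos_idx: int,
                        k: int = 4, ratio: float = 0.99):
    """Return up to k hard-negative indices for a single query.

    Embeddings are assumed L2-normalized, so a dot product equals cosine
    similarity. Candidates scoring above ratio * positive_score are skipped
    as likely false negatives (assumed interpretation of the 99% threshold).
    """
    scores = corpus_emb @ query_emb        # similarity to every corpus doc
    ceiling = ratio * scores[pos_idx]      # positive-aware score ceiling
    ranked = np.argsort(-scores)           # most similar candidates first
    negatives = [int(i) for i in ranked
                 if i != pos_idx and scores[i] < ceiling]
    return negatives[:k]
```

In the description above the scoring model would be multilingual-e5-large; here any pair of normalized embedding matrices works, and the mined indices would then be attached to each query as negatives for the fine-tuning stage.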