Update README.md
README.md
library_name: transformers
---

[Tower-plus-pareto](https://huggingface.co/Unbabel/Tower-Plus-72B/blob/main/Tower-plus-pareto.png)

# Model Description:

**Tower+ 9B** is built on top of Gemma 2 9B. The model goes through Continuous Pretraining (CPT), Instruction Tuning (IT), and Weighted Preference Optimization (WPO). During all stages, we include parallel and multilingual data (covering 22 languages).