Update README.md
README.md CHANGED
@@ -27,6 +27,8 @@ language:
 library_name: transformers
 ---
 
+
+
 # Model Description:
 
 **Tower+ 2B** is built on top of Gemma 2 2B. The model goes through Continuous Pretraining (CPT), Instruction Tuning (IT), Weighted Preference Optimization (WPO), and GRPO with verifiable rewards. At every stage we include parallel and multilingual data covering 22 languages.
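Since the card declares `library_name: transformers`, a minimal usage sketch may help readers try the model. Note the assumptions: the repo id `Unbabel/Tower-Plus-2B` and the translation prompt wording below are guesses, not taken from this card; check the model page for the official ones.

```python
# Minimal sketch for prompting a Tower-style translation model.
# The prompt wording is an assumption, not an official template.

def make_translation_messages(src_lang: str, tgt_lang: str, text: str) -> list[dict]:
    """Build a chat-style message list asking the model to translate `text`."""
    prompt = (
        f"Translate the following text from {src_lang} into {tgt_lang}.\n"
        f"{src_lang}: {text}\n"
        f"{tgt_lang}:"
    )
    return [{"role": "user", "content": prompt}]

# Inference (commented out: downloads the full 2B checkpoint on first run).
# The repo id "Unbabel/Tower-Plus-2B" is an assumption; verify it on the Hub.
#
# from transformers import pipeline
# generator = pipeline("text-generation", model="Unbabel/Tower-Plus-2B")
# messages = make_translation_messages("English", "Portuguese", "Hello, world!")
# print(generator(messages, max_new_tokens=64)[0]["generated_text"])
```

The `transformers` text-generation pipeline accepts chat-style message lists for models that ship a chat template, which fits the instruction-tuned stages (IT, WPO, GRPO) described above.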