Mykes committed
Commit b8a722d · verified · 1 Parent(s): 01f119c

Update README.md

Files changed (1):
  1. README.md +2 −2
README.md CHANGED

```diff
@@ -26,7 +26,7 @@ The fine-tuning process utilized the [Continued Pretraining](https://docs.unslot
 
 ## 🔍 Model Details
 
-- **Base Model**: [Gemma-2b-it](https://huggingface.co/google/gemma-2b-it)
+- **Base Model**: [Gemma2-2b-it](https://huggingface.co/google/gemma-2b-it)
 - **Languages Supported**: English and Russian
 - **Training Method**: [Continued Pretraining](https://docs.unsloth.ai/basics/continued-pretraining)
 - **Epochs**: 10
@@ -192,7 +192,7 @@ For issues, questions, or suggestions, feel free to open an issue in the reposit
 
 ## 🔍 Детали модели
 
-- **Базовая модель**: [Gemma-2b-it](https://huggingface.co/google/gemma-2b-it)
+- **Базовая модель**: [Gemma2-2b-it](https://huggingface.co/google/gemma-2b-it)
 - **Поддерживаемые языки**: Русский и английский
 - **Метод обучения**: [Continued Pretraining](https://docs.unsloth.ai/basics/continued-pretraining)
 - **Количество эпох**: 10
```