---
library_name: transformers
tags: []
---

# gemma2-mitra-base

This model is based on gemma-2-9b and was continuously pretrained for 2 epochs on a total of 7B tokens from various Buddhist data collections preserved in Sanskrit, Tibetan, English, and Pāli.

A publication describing the dataset and training details will follow soon.

## Model Details

For details on how to run this model, please see the gemma-2-9b repository: https://huggingface.co/google/gemma-2-9b

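As a minimal, unofficial sketch, the model can be loaded and sampled with the same transformers recipe as gemma-2-9b. The repository id used below is an assumption based on this card's title, so replace it with this repository's actual id; `device_map="auto"` also assumes the accelerate package is installed.

```python
# Minimal usage sketch, following the standard gemma-2-9b loading recipe.
# NOTE: the repo id below is an assumption based on this card's title;
# replace it with the actual id of this repository if it differs.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "buddhist-nlp/gemma2-mitra-base"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # gemma-2 models are commonly run in bfloat16
    device_map="auto",           # requires the accelerate package
)

# Plain text completion: the base model simply continues the prompt.
inputs = tokenizer("The Four Noble Truths are", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
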
Please be aware that this is a base model without any instruction finetuning, so it will perform poorly on general tasks unless you provide at least a few in-context (few-shot) examples, as sketched below.

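Because the base model only continues text, a prompt that demonstrates the desired input/output pattern usually works better than a bare instruction. The following illustrative sketch reuses the `tokenizer` and `model` objects from the snippet above; the translation task and example sentence are chosen only for illustration, and in practice several demonstration pairs would be included.

```python
# Illustrative few-shot prompt: show the model the input/output pattern,
# then leave the final completion open for it to fill in.
few_shot_prompt = (
    "Sanskrit: sarve sattvāḥ sukhino bhavantu\n"
    "English: May all beings be happy.\n"
    "\n"
    "Sanskrit: <your Sanskrit sentence here>\n"
    "English:"
)

inputs = tokenizer(few_shot_prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
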
There is an instruction-finetuned version here: https://huggingface.co/buddhist-nlp/gemma-2-mitra-it |