Update README.md
README.md CHANGED

````diff
@@ -26,6 +26,7 @@ SambaLingo-Arabic-Chat is a human aligned chat model trained in Arabic and English
 - **Language(s):** Arabic, English
 - **Finetuned from model:** [Llama-2-7b](https://huggingface.co/meta-llama/Llama-2-7b-hf)
 - **Try This Model:** [SambaLingo-chat-space](https://huggingface.co/spaces/sambanovasystems/SambaLingo-chat-space)
+- **Paper:** [SambaLingo: Teaching Large Language Models New Languages](https://arxiv.org/abs/2404.05829)
 - **Blog Post**: [sambalingo-open-source-language-experts](https://sambanova.ai/blog/sambalingo-open-source-language-experts)
 
 ## Getting Started
@@ -82,6 +83,9 @@ The DPO phase was done on the [ultrafeedback](https://huggingface.co/datasets/Hu
 ## Tokenizer Details
 We extended the vocabulary of the base llama model from 32,000 tokens to 57,000 tokens by adding up to 25,000 non-overlapping tokens from the new language.
 
+## Evaluation
+For evaluation results see our paper: [SambaLingo: Teaching Large Language Models New Languages](https://arxiv.org/abs/2404.05829)
+
 ## Uses
 <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
 
@@ -124,12 +128,12 @@ We would like to give a special thanks to the following groups:
 
 ## Cite SambaLingo
 ```
-@
-
-
-
-
-
-
+@misc{csaki2024sambalingo,
+      title={SambaLingo: Teaching Large Language Models New Languages},
+      author={Zoltan Csaki and Bo Li and Jonathan Li and Qiantong Xu and Pian Pawakapan and Leon Zhang and Yun Du and Hengyu Zhao and Changran Hu and Urmish Thakker},
+      year={2024},
+      eprint={2404.05829},
+      archivePrefix={arXiv},
+      primaryClass={cs.CL}
 }
 ```
````
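The tokenizer detail in the diff above (32,000 base tokens extended to 57,000 by adding up to 25,000 non-overlapping new-language tokens) amounts to append-only vocabulary bookkeeping. A minimal sketch of that bookkeeping, assuming a token-to-id vocabulary map — the token names, the `extend_vocab` helper, and the cap are illustrative stand-ins, not SambaNova's actual code:

```python
def extend_vocab(base_vocab, candidate_tokens, max_new=25_000):
    """Append up to `max_new` candidate tokens that are not already
    in `base_vocab`; new ids continue after the existing ones, so the
    base token ids are left untouched."""
    vocab = dict(base_vocab)
    added = 0
    for tok in candidate_tokens:
        if added >= max_new:
            break
        if tok not in vocab:          # only non-overlapping tokens count
            vocab[tok] = len(vocab)   # next free id
            added += 1
    return vocab

# Stand-ins for the 32k Llama vocabulary and a list of new-language tokens.
base = {f"tok{i}": i for i in range(32_000)}
candidates = [f"ar_tok{i}" for i in range(30_000)]

extended = extend_vocab(base, candidates)
print(len(extended))  # -> 57000
```

Because ids are only appended, the base model's existing embedding rows stay valid; only the embedding matrix needs to grow to cover the new ids before continued pretraining.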