Instructions for using nluai/test_format2 with libraries, inference providers, notebooks, and local apps. Follow the links below to get started.
- Libraries
  - PEFT
How to use nluai/test_format2 with PEFT:

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

base_model = AutoModelForCausalLM.from_pretrained("vilm/vinallama-7b-chat")
model = PeftModel.from_pretrained(base_model, "nluai/test_format2")
```

- Notebooks
  - Google Colab
  - Kaggle
Upload tokenizer_config.json with huggingface_hub
tokenizer_config.json (+1 -1)

```diff
@@ -50,7 +50,7 @@
   "eos_token": "<|im_end|>",
   "legacy": false,
   "model_max_length": 1000000000000000019884624838656,
-  "pad_token": "<
+  "pad_token": "</s>",
   "sp_model_kwargs": {},
   "tokenizer_class": "LlamaTokenizer",
   "unk_token": "<unk>",
```