Missing tokenizer files (tokenizer.model, tokenizer.json)

#3
by ehsan-hightech - opened

Hi, thanks for sharing the Persian-Mistral-7B model.
I tried to load it with `AutoTokenizer.from_pretrained("aidal/Persian-Mistral-7B")`, but it fails because the tokenizer files are missing from the repository.
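For reference, here is a minimal reproduction of the failure (assuming `transformers` is installed; the repo name is the one from this post):

```python
# Minimal repro: loading the tokenizer fails because the repo
# has no tokenizer.model / tokenizer.json files.
from transformers import AutoTokenizer

repo = "aidal/Persian-Mistral-7B"
try:
    tokenizer = AutoTokenizer.from_pretrained(repo)
    print("tokenizer loaded:", type(tokenizer).__name__)
except Exception as err:  # typically raised as OSError when tokenizer files are missing
    print(f"failed to load tokenizer from {repo}: {err}")
```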

Could you please upload the tokenizer files (`tokenizer.model`, `tokenizer.json`, `special_tokens_map.json`, `tokenizer_config.json`)?
Since the model was trained with a new Persian tokenizer, falling back to the original Mistral tokenizer produces garbled output.

Thanks a lot! 🙏
