Where is tokenizer.model?
When I downloaded this model, I received an error message stating that tokenizer.model was missing. Could you check the hash values of the uploaded files just to be sure, and also upload tokenizer.model?
Hi, thanks for raising this.
This repo only includes tokenizer.json (the JSON-format tokenizer), which works fine with vLLM and newer versions of Hugging Face Transformers. That's why it runs without issues on my side.
If your environment is still looking for tokenizer.model, it's likely because you are using an older version of Transformers that expects the SentencePiece file. Updating to the latest transformers should solve the problem.
Alternatively, you can copy the tokenizer.model file from the official Gemma-3-27B repo (google/gemma-3-27b-it) into this folder if you need compatibility.
Hope this helps!
I've just uploaded a copy of the tokenizer.model file from the official Gemma-3-27B repo to this repository.
Thank you for uploading tokenizer.model.