{
"tokenizer_class": "CustomTokenizer",
"model_type": "custom",
"do_lower_case": false,
"vocab_size": 30000,
"note": "This model uses custom vocabulary-based tokenization. Tokenizer should be loaded from base model or use custom tokenization."
}