exp-29ft/tokenizer/understanding_yap.json
Upload tokenizer/understanding_yap.json with huggingface_hub (commit 5da072b)
{
  "name": "kaggle_testing",
  "vocab_size": 16384,
  "special_tokens": [
    "<|EOS|>",
    "<|UK|>"
  ],
  "eos_token": "<|EOS|>",
  "uk_token": "<|UK|>"
}
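The config above can be parsed with Python's standard `json` module. A minimal sketch, with the file contents embedded inline for illustration (in practice you would read `tokenizer/understanding_yap.json` from the repository checkout; the variable names here are illustrative, not part of any library API):

```python
import json

# Inline copy of understanding_yap.json, mirroring the file shown above.
CONFIG_TEXT = """
{
  "name": "kaggle_testing",
  "vocab_size": 16384,
  "special_tokens": ["<|EOS|>", "<|UK|>"],
  "eos_token": "<|EOS|>",
  "uk_token": "<|UK|>"
}
"""

config = json.loads(CONFIG_TEXT)

# Pull out the fields a tokenizer loader would need.
vocab_size = config["vocab_size"]        # 16384
eos_token = config["eos_token"]          # "<|EOS|>"
special_tokens = config["special_tokens"]

# Sanity check: the named eos/uk tokens should appear in the special-token list.
assert eos_token in special_tokens
assert config["uk_token"] in special_tokens
print(config["name"], vocab_size)
```

Reading from disk instead is a one-line change: `config = json.load(open("tokenizer/understanding_yap.json"))`.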