gpt2-librispeech960-tokenizer / special_tokens_map.json
klemenk
Upload tokenizer
50bf87d verified
{
  "eos_token": "<|endoftext|>",
  "pad_token": "<|pad|>",
  "unk_token": "<|unk|>"
}
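This file maps special-token roles to their literal token strings; libraries such as Hugging Face `transformers` read it automatically when loading a tokenizer. As a minimal sketch of what consuming it looks like with only the standard library (the inline string below is a copy of the file's contents for illustration):

```python
import json

# Contents of special_tokens_map.json, inlined here for illustration;
# in practice the file ships alongside the tokenizer and is read from disk.
SPECIAL_TOKENS_MAP = """{
  "eos_token": "<|endoftext|>",
  "pad_token": "<|pad|>",
  "unk_token": "<|unk|>"
}"""

# Parse the JSON into a dict mapping role name -> token string.
tokens = json.loads(SPECIAL_TOKENS_MAP)

print(tokens["eos_token"])  # <|endoftext|>
print(tokens["pad_token"])  # <|pad|>
```

Note that this tokenizer defines a dedicated `<|pad|>` token, whereas stock GPT-2 has no pad token and is commonly padded with `<|endoftext|>` instead.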