camembert-large / added_tokens.json

Commit History

Added Fast tokenizer files with model_max_length
757d2aa
unverified

wissamantoun committed on