not a CamembertTokenizer?

#3
by AngledLuffa - opened

This model does not appear to use a CamembertTokenizer; it uses a BPE tokenizer instead. As currently configured, it is not functional with transformers 5.3.0.

See https://github.com/huggingface/transformers/issues/44488#issuecomment-4097093131

Would you update the model's tokenizer JSON so that it declares the proper tokenizer type, possibly PreTrainedTokenizerFast?
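For illustration, assuming the tokenizer class is declared via the standard `tokenizer_class` field in `tokenizer_config.json`, the change might look like:

```json
{
  "tokenizer_class": "PreTrainedTokenizerFast"
}
```

With that field set, `AutoTokenizer.from_pretrained` should resolve to the fast tokenizer backed by `tokenizer.json` rather than attempting to instantiate a CamembertTokenizer.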
