fnet_model/tokenizer/vectorizer_config.json
{"max_tokens": 8192, "output_sequence_length": 40}