distilbert-gsa-eula-opp / tokenizer_config.json
{
  "do_lower_case": false,
  "model_max_length": 512,
  "special_tokens_map_file": "drive/My Drive/ai-ml-challenge-2020/submissions/distilbert/models/distilbert_gsa_eula_opp_pretrained/special_tokens_map.json",
  "full_tokenizer_file": null
}
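As a minimal sketch of how these fields behave, the snippet below parses the two settings that matter at load time: `do_lower_case: false` means input casing is preserved (a cased tokenizer), and `model_max_length: 512` caps sequences at DistilBERT's positional limit. The `special_tokens_map_file` path is a leftover Google Drive path from training and is ignored here.

```python
import json

# The load-time fields from tokenizer_config.json (reproduced verbatim).
config = json.loads(
    '{"do_lower_case": false, "model_max_length": 512, '
    '"full_tokenizer_file": null}'
)

# Casing is preserved, so "EULA" and "eula" tokenize differently.
assert config["do_lower_case"] is False

# Inputs longer than this are truncated when truncation is enabled.
assert config["model_max_length"] == 512
```

In practice these values are read automatically when the tokenizer is instantiated, e.g. via `transformers.AutoTokenizer.from_pretrained(...)` with this repo's Hub id (the exact namespaced id is not shown on this page, and loading from the Hub requires network access).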