bert_funting_test_ai10 / tokenizer_config.json
Commit eeeb761 by junzai: add model first
{"do_lower_case": false, "max_len": 512, "init_inputs": []}
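A minimal sketch of how these three fields are typically interpreted when the config is loaded (the `config_text` literal below just mirrors the file contents above; the interpretation comments reflect common Hugging Face tokenizer conventions, not anything stated in this repo):

```python
import json

# Raw contents of tokenizer_config.json, copied from the file above.
config_text = '{"do_lower_case": false, "max_len": 512, "init_inputs": []}'
config = json.loads(config_text)

# do_lower_case=False: input text keeps its casing, so this pairs with a
# cased BERT vocabulary.
# max_len=512: the maximum sequence length the tokenizer will allow,
# matching BERT's positional-embedding limit.
# init_inputs=[]: no extra positional arguments are passed to the
# tokenizer constructor.
print(config["do_lower_case"], config["max_len"], config["init_inputs"])
```

In practice this file is read automatically by `AutoTokenizer.from_pretrained(...)` rather than parsed by hand; the sketch only makes the stored values explicit.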