trained_lukeL_model / added_tokens.json
Commit ebedb98 ("Upload tokenizer")
{
"<ent2>": 32771,
"<ent>": 32770
}
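For context, `added_tokens.json` lists tokens appended on top of the tokenizer's base vocabulary; here `<ent>` and `<ent2>` are the entity span markers used by LUKE-style tokenizers. A minimal sketch of how a tokenizer resolves these entries after merging the file (the implied base-vocabulary boundary of 32770 is inferred from the ids, not stated in the file):

```python
import json

# The exact contents of this repo's added_tokens.json.
ADDED_TOKENS_JSON = '{"<ent2>": 32771, "<ent>": 32770}'

# token -> id, as a tokenizer would merge it into its vocabulary
added_tokens = json.loads(ADDED_TOKENS_JSON)


def token_to_id(token: str) -> int:
    """Look up an added token's id, mimicking the post-merge vocab lookup."""
    return added_tokens[token]


print(token_to_id("<ent>"))   # 32770
print(token_to_id("<ent2>"))  # 32771
```

With `transformers`, the same mapping is loaded automatically by `AutoTokenizer.from_pretrained(...)`, and the entries appear in `tokenizer.get_added_vocab()`.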