Dataset: batmanLovesAI/HeliumLM-tokenizer
License: mit
HeliumLM-tokenizer / tokenizers (740 kB, 1 contributor)

History: 1 commit
batmanLovesAI, commit 6af2b0c, about 1 month ago:
"Removed previous tokenizer because they were very large and replaced them with custom small tokenizer"
Files (all added in commit 6af2b0c, about 1 month ago):

merges.txt               68.5 kB
special_tokens_map.json  99 Bytes
tokenizer.json           554 kB
tokenizer_config.json    471 Bytes
vocab.json               117 kB
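The file set above follows the common GPT-2-style BPE layout: vocab.json maps token strings to integer ids, and merges.txt lists byte-pair merges in priority order (tokenizer.json bundles both for the fast tokenizer). As a rough illustration of what those two files encode, here is a minimal stdlib-only sketch of greedy BPE merging; the vocab and merges below are made-up toy values, not taken from this repository, and real tokenizers also handle byte-level pretokenization, special tokens, and caching:

```python
# Hypothetical toy vocab.json / merges.txt contents (NOT from this repo).
vocab = {"h": 0, "e": 1, "l": 2, "o": 3, "lo": 4, "llo": 5, "he": 6, "hello": 7}
merges = [("l", "o"), ("l", "lo"), ("h", "e"), ("he", "llo")]  # priority order

def bpe(word, merges):
    """Repeatedly apply the highest-priority adjacent merge until none apply."""
    symbols = list(word)
    ranks = {pair: i for i, pair in enumerate(merges)}
    while len(symbols) > 1:
        # Rank every adjacent pair; unknown pairs get infinite rank.
        pairs = [(ranks.get((a, b), float("inf")), i)
                 for i, (a, b) in enumerate(zip(symbols, symbols[1:]))]
        best_rank, i = min(pairs)
        if best_rank == float("inf"):
            break  # no mergeable pair left
        symbols[i:i + 2] = [symbols[i] + symbols[i + 1]]
    return symbols

tokens = bpe("hello", merges)
ids = [vocab[t] for t in tokens]
print(tokens, ids)  # → ['hello'] [7]
```

To use the actual files from this repository, one would typically load them through the `tokenizers` or `transformers` libraries rather than reimplementing the merge loop.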