Upload dataset
Commit 5de1018 (verified)
- 2.82 kB Update val.txt: humanities+wikitext corpus, vocab 4000
- 3.27 kB Upload dataset
- 3.8 kB Upload encode_corpus.py with huggingface_hub
- 13.8 kB Upload merges.txt with huggingface_hub
- 130 kB Upload tokenizer.json with huggingface_hub
- 268 kB Upload tokenizer_4k.json with huggingface_hub
- 3.18 GB Upload train.bin with huggingface_hub
- 549 MB Update train.txt: humanities+wikitext corpus, vocab 4000
- 34 MB Update train_enriched.jsonl: fixed chunker, regex, phase categorization
- 9.16 MB Update train_philosophy.txt: fixed chunker, regex, phase categorization
- 1.67 MB Update train_quadrivium.txt: fixed chunker, regex, phase categorization
- 11.5 MB Update train_trivium.txt: fixed chunker, regex, phase categorization
- 353 MB Upload val.bin with huggingface_hub
- 57.2 MB Update val.txt: humanities+wikitext corpus, vocab 4000
- 33.6 kB Upload vocab.json with huggingface_hub
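The listing shows `train.bin` (3.18 GB) and `val.bin` (353 MB) uploaded alongside a vocab-4000 tokenizer. These are plausibly flat binary dumps of token ids in the nanoGPT convention, where uint16 suffices because every id is below 4,000. A minimal sketch under that assumption; `write_bin` and `read_bin` are hypothetical helper names, not functions from this repo:

```python
import numpy as np

def write_bin(token_ids, path):
    # Assumption: .bin files are flat uint16 token-id arrays (nanoGPT-style).
    # With vocab size 4000, every id fits comfortably in uint16.
    arr = np.asarray(token_ids, dtype=np.uint16)
    arr.tofile(path)

def read_bin(path):
    # Memory-map the token stream so a multi-GB file is not loaded into RAM.
    return np.memmap(path, dtype=np.uint16, mode="r")

ids = [17, 4, 3999, 0, 42]
write_bin(ids, "sample.bin")
tokens = read_bin("sample.bin")
print(tokens.tolist())  # → [17, 4, 3999, 0, 42]
```

Memory-mapping is the usual reason for this format: a training loop can slice random windows out of the 3.18 GB file without ever materializing it.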