wikiqa_tokenized / tokenization_log.txt
Tokenization Log
Start time: 2025-09-29 13:59:13
End time: 2025-09-29 13:59:21
Duration: 00:00:08
Total rows tokenized: 20360
Total chunks saved: 1
Chunk size: 100000
Tokenizer file: tokenizer_output/tokenizer.json
Sequence length stats:
Shortest sequence: 18 tokens
Longest sequence: 449 tokens
Rows <= 1024 tokens: 20360
Rows <= 2048 tokens: 20360
Rows > 2048 tokens: 0
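
The length buckets above can be recomputed from per-row token counts. A minimal sketch of that summary step (the helper name and the toy lengths are illustrative, not taken from the log; in the real run the counts would come from encoding each row with tokenizer_output/tokenizer.json):

```python
def sequence_length_stats(lengths):
    """Summarize per-row token counts into the buckets used in the log."""
    return {
        "shortest": min(lengths),
        "longest": max(lengths),
        "rows_le_1024": sum(1 for n in lengths if n <= 1024),
        "rows_le_2048": sum(1 for n in lengths if n <= 2048),
        "rows_gt_2048": sum(1 for n in lengths if n > 2048),
    }

# Toy example (four hypothetical rows, not the real WikiQA data):
stats = sequence_length_stats([18, 120, 449, 700])
# stats == {"shortest": 18, "longest": 700,
#           "rows_le_1024": 4, "rows_le_2048": 4, "rows_gt_2048": 0}
```

Note that the buckets overlap: every row counted under "<= 1024" is also counted under "<= 2048", which is why both lines in the log report 20360.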