| Tokenization Log | |
|---|---|
| Start time | 2025-09-29 13:59:13 |
| End time | 2025-09-29 13:59:21 |
| Duration | 00:00:08 |
| Total rows tokenized | 20360 |
| Total chunks saved | 1 |
| Chunk size | 100000 |
| Tokenizer file | tokenizer_output/tokenizer.json |
| **Sequence length stats** | |
| Shortest sequence | 18 tokens |
| Longest sequence | 449 tokens |
| Rows <= 1024 tokens | 20360 |
| Rows <= 2048 tokens | 20360 |
| Rows > 2048 tokens | 0 |
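The summary stats above can be derived directly from per-row token counts. A minimal sketch of that computation (the `counts` list and the `sequence_length_stats` helper are illustrative, not the actual 20360-row dataset or logging code; the chunk-count formula assumes chunks are filled to `chunk_size` rows before a new one is started):

```python
import math

def sequence_length_stats(token_counts, thresholds=(1024, 2048)):
    """Summarize per-row token counts: min, max, and threshold buckets."""
    stats = {
        "shortest": min(token_counts),
        "longest": max(token_counts),
    }
    for t in thresholds:
        stats[f"rows_le_{t}"] = sum(1 for n in token_counts if n <= t)
    # Rows above the largest threshold (the "> 2048" line in the log)
    stats[f"rows_gt_{thresholds[-1]}"] = sum(
        1 for n in token_counts if n > thresholds[-1]
    )
    return stats

# Illustrative per-row token counts, not the real data
counts = [18, 120, 300, 449]
print(sequence_length_stats(counts))

# With 20360 rows and a chunk size of 100000, everything fits in one chunk:
print(math.ceil(20360 / 100000))  # → 1
```

Since the longest sequence is 449 tokens, every row falls under both the 1024- and 2048-token thresholds, which is why those two counts equal the total row count.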