This is a test sentence for the tokenizer.
It includes some punctuation: Hello, world!
And maybe a Greek word if the model supports it: κόσμος.
<BEGIN> Special tokens test <END>