KateAI / custom_tokenizer.json
Toasteror
Upload 2 files
d2d4009
{
  "word_to_index": {
    "[PAD]": 0,
    "[UNK]": 1,
    "[SOS]": 2,
    "[EOS]": 3,
    "a": 4,
    "sentence.": 5,
    "are": 6,
    "language": 7,
    "this": 8,
    "is": 9,
    "test": 10,
    "another": 11,
    "example": 12,
    "transformers": 13,
    "powerful": 14,
    "models.": 15,
    "let's": 16,
    "train": 17,
    "simple": 18,
    "model.": 19,
    "models": 20,
    "amazing": 21,
    "at": 22,
    "generating": 23,
    "text.": 24
  },
  "special_tokens": ["[PAD]", "[UNK]", "[SOS]", "[EOS]"]
}
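The file is a minimal word-level vocabulary: a `word_to_index` mapping plus a `special_tokens` list. A sketch of how such a file might be loaded and used to encode text, assuming lowercase whitespace-split tokens (the `load_vocab` and `encode` helpers are illustrative, not part of the repository):

```python
import json

def load_vocab(path):
    # Read the tokenizer file and return the vocabulary mapping
    # and the list of special tokens.
    with open(path) as f:
        data = json.load(f)
    return data["word_to_index"], data["special_tokens"]

def encode(text, word_to_index):
    # Word-level tokenization: lowercase, split on whitespace,
    # wrap the sequence in [SOS]/[EOS], map unknown words to [UNK].
    unk = word_to_index["[UNK]"]
    ids = [word_to_index["[SOS]"]]
    ids += [word_to_index.get(word, unk) for word in text.lower().split()]
    ids.append(word_to_index["[EOS]"])
    return ids
```

Note that punctuation is not split off (e.g. `"sentence."` is a single vocabulary entry), so any word whose trailing punctuation differs from the training text falls back to `[UNK]`.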