This CerebrasGPT-111M model employs a standard Byte-Pair Encoding (BPE) tokenizer for Danish text. It serves as a baseline for comparison against morphology-aware tokenizers.
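To illustrate what a standard BPE tokenizer does (this is a toy sketch, not this model's actual vocabulary or training code), the core loop repeatedly merges the most frequent adjacent symbol pair, so frequent Danish substrings become single tokens:

```python
from collections import Counter

def most_frequent_pair(tokens):
    # Count adjacent symbol pairs and return the most common one
    # (ties broken by first occurrence).
    pairs = Counter(zip(tokens, tokens[1:]))
    return pairs.most_common(1)[0][0]

def merge_pair(tokens, pair):
    # Replace every occurrence of `pair` with a single merged symbol.
    merged, i = [], 0
    while i < len(tokens):
        if i < len(tokens) - 1 and (tokens[i], tokens[i + 1]) == pair:
            merged.append(tokens[i] + tokens[i + 1])
            i += 2
        else:
            merged.append(tokens[i])
            i += 1
    return merged

# Toy input: start from characters and apply three BPE merges.
tokens = list("lillebille")
for _ in range(3):
    tokens = merge_pair(tokens, most_frequent_pair(tokens))
print(tokens)  # → ['l', 'ille', 'b', 'ille']
```

Unlike this frequency-driven procedure, a morphology-aware tokenizer would place token boundaries at linguistically motivated morpheme boundaries, which is the contrast the benchmark is meant to probe.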
Model size: 0.1B params · Tensor type: F16 (Safetensors)