A standard CerebrasGPT-111M model using a pretrained Byte-Pair Encoding (BPE) tokenizer. This model serves as a baseline for evaluating how pretrained tokenizers perform on Danish text.
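As background for the baseline, BPE builds its vocabulary by repeatedly merging the most frequent adjacent symbol pair in the training corpus. The sketch below is a toy illustration of that merge loop on a few Danish-flavoured words; it is not the actual pretrained tokenizer shipped with this model, and the corpus and merge count are invented for demonstration.

```python
from collections import Counter

def most_frequent_pair(words):
    # words: dict mapping a tuple of symbols to its corpus frequency
    pairs = Counter()
    for symbols, freq in words.items():
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs.most_common(1)[0][0] if pairs else None

def merge_pair(words, pair):
    # Replace every occurrence of the pair with a single merged symbol.
    a, b = pair
    merged = {}
    for symbols, freq in words.items():
        out, i = [], 0
        while i < len(symbols):
            if i + 1 < len(symbols) and symbols[i] == a and symbols[i + 1] == b:
                out.append(a + b)
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        key = tuple(out)
        merged[key] = merged.get(key, 0) + freq
    return merged

# Toy Danish-flavoured corpus (hypothetical): word split into characters -> frequency.
words = {
    tuple("smørrebrød"): 3,
    tuple("brød"): 5,
    tuple("rød"): 2,
    tuple("rør"): 1,
}
merges = []
for _ in range(3):  # learn three merges
    pair = most_frequent_pair(words)
    if pair is None:
        break
    merges.append(pair)
    words = merge_pair(words, pair)

print(merges)  # each learned merge, most frequent pair first
```

Running the loop learns "rø", then "rød", then "brød" as single tokens, which is the core mechanism a pretrained BPE vocabulary applies (frozen) when it encounters Danish text.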
Format: Safetensors
Model size: 0.1B params
Tensor type: F16
This model is included in a collection alongside OpenFacedSandwich/STD-BPE-CEREBRAS.