---
library_name: transformers
tags:
- Danish
- BPE Tokenization
- CerebrasGPT
---

### STD-BPE-CEREBRAS
A standard CerebrasGPT-111M model using a pretrained Byte-Pair-Encoding (BPE) tokenizer. This model serves as a baseline for evaluating how pretrained BPE tokenizers perform on Danish text.
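A minimal sketch of the baseline setup this card describes: loading the pretrained BPE tokenizer shipped with the public `cerebras/Cerebras-GPT-111M` checkpoint (assumed here; this card does not name the exact repo id) and applying it to a Danish sentence to inspect how the vocabulary splits the text.

```python
from transformers import AutoTokenizer

# Assumption: the baseline reuses the tokenizer from the public
# Cerebras-GPT-111M checkpoint, without any Danish adaptation.
tokenizer = AutoTokenizer.from_pretrained("cerebras/Cerebras-GPT-111M")

text = "Hvor mange tokens bruger en engelsk-trænet tokenizer på dansk?"
ids = tokenizer.encode(text)

# A pretrained, English-centric BPE vocabulary tends to split Danish
# words (especially those with æ/ø/å) into many sub-word pieces,
# inflating sequence length relative to an adapted tokenizer.
print(len(ids))
print(tokenizer.convert_ids_to_tokens(ids))
```

Because GPT-style byte-level BPE is lossless, decoding the ids recovers the original string exactly, which makes token counts a fair basis for comparing tokenizers.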