---
library_name: transformers
tags: [Danish, Mixed Tokenization, LLaMA]
---
```
 _______        ___      .___  ___.   ______   .______      .______    __    __ 
|       \      /   \     |   \/   |  /  __  \  |   _  \     |   _  \  |  |  |  |
|  .--.  |    /  ^  \    |  \  /  | |  |  |  | |  |_)  |    |  |_)  | |  |__|  |
|  |  |  |   /  /_\  \   |  |\/|  | |  |  |  | |      /     |   ___/  |   __   |
|  '--'  |  /  _____  \  |  |  |  | |  `--'  | |  |\  \----.|  |      |  |  |  |
|_______/  /__/     \__\ |__|  |__|  \______/  | _| `._____|| _|      |__|  |__|
```
|
|
| ### DA-MIXED-LLAMA3.2 |
|
|
An experimental Danish language model built on the LLaMA-3.2 architecture, combining morphological and BPE (byte-pair encoding) tokenization strategies. This model investigates how mixed tokenization affects Danish language understanding and generation.
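The card does not document the actual tokenizer, so as an illustration of the general idea only, here is a toy sketch of a mixed tokenization pass: words are first split on a tiny hand-picked set of Danish inflectional suffixes (the morphological step), and any remaining long piece falls back to a naive fixed-size split standing in for BPE. The suffix list, the `##` continuation marker, and all function names are hypothetical, not taken from this model.

```python
# Illustrative sketch only -- NOT the model's real tokenizer.
# Morphological step: strip one known Danish suffix; fallback step:
# chop long residual pieces into fixed-size chunks (a stand-in for BPE).

DANISH_SUFFIXES = ("erne", "ene", "er", "en", "et")  # toy suffix inventory

def morph_split(word):
    """Split off one known suffix, if present, marking it with '##'."""
    for suf in DANISH_SUFFIXES:
        if word.endswith(suf) and len(word) > len(suf) + 1:
            return [word[: -len(suf)], "##" + suf]
    return [word]

def bpe_like_split(piece, max_len=4):
    """Fallback: cut a piece into fixed-size chunks (stand-in for real BPE)."""
    return [piece[i : i + max_len] for i in range(0, len(piece), max_len)]

def mixed_tokenize(text):
    tokens = []
    for word in text.lower().split():
        for piece in morph_split(word):
            if piece.startswith("##") or len(piece) <= 4:
                tokens.append(piece)
            else:
                tokens.extend(bpe_like_split(piece))
    return tokens

print(mixed_tokenize("hestene løber"))  # ['hest', '##ene', 'løb', '##er']
```

The intuition behind mixing the two strategies: frequent inflectional endings get stable, linguistically meaningful tokens, while rare or unseen stems are still covered by the subword fallback.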