---
base_model: unsloth/llama-3.2-3b-instruct-unsloth-bnb-4bit
tags:
- text-generation-inference
- transformers
- unsloth
- llama
- trl
license: apache-2.0
language:
- en
---

This is only an example from Unsloth.ai. This model, based on Llama 3.2 3B, was fine-tuned to understand the "Byte Latent Transformer: Patches Scale Better Than Tokens" [research paper](https://ai.meta.com/research/publications/byte-latent-transformer-patches-scale-better-than-tokens/) published in December 2024.

# Uploaded model

- **Developed by:** eugrug-60
- **License:** apache-2.0
- **Finetuned from model:** unsloth/llama-3.2-3b-instruct-unsloth-bnb-4bit

This Llama model was trained 2x faster with [Unsloth](https://github.com/unslothai/unsloth) and Hugging Face's TRL library.

[](https://github.com/unslothai/unsloth)