The Llama 3 Herd of Models
Paper • 2407.21783 • Published • 118
## Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("toolevalxm/llama3.1-8b_spec")
tokenizer = AutoTokenizer.from_pretrained("toolevalxm/llama3.1-8b_spec")
```

## Base Model

This model is fine-tuned from meta-llama/Llama-3.1-8B.

## Citation

If you use this model, please cite our paper.

```bibtex
@article{grattafiori2024llama,
  title={The Llama 3 Herd of Models},
  author={Grattafiori, Aaron and Dubey, Abhimanyu and Jauhri, Abhinav and Pandey, Abhinav and Kadian, Abhishek and Al-Dahle, Ahmad and Letman, Aiesha and Mathur, Akhil and Schelten, Alan and Vaughan, Alex and others},
  journal={arXiv preprint arXiv:2407.21783},
  year={2024}
}
```
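Once the model and tokenizer are loaded, text generation follows the standard `transformers` pattern. A minimal sketch is shown below; the prompt text and decoding parameters (`max_new_tokens`, `temperature`) are illustrative assumptions, not recommendations from the model authors, and running it requires downloading the model weights:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "toolevalxm/llama3.1-8b_spec"
tokenizer = AutoTokenizer.from_pretrained(model_id)
# device_map="auto" places the model on GPU if one is available.
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Illustrative prompt; replace with your own task.
prompt = "Explain the main idea of the Llama 3 paper in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Decoding parameters here are assumptions, not tuned values.
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.7)

# Decode only the newly generated tokens, skipping the echoed prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```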
## License

The license for this model is llama3.1.