---
license: apache-2.0
---

**License:** As a model derived from Llama 2, this model must also comply with the Llama 2 license.
# Pruned-LLaMA-1.3B

Pruned-LLaMA-1.3B is a model pruned and further pre-trained from [meta-llama/Llama-2-7b-hf](https://huggingface.co/meta-llama/Llama-2-7b-hf).
## Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("minhchuxuan/pruned-1.3b")
tokenizer = AutoTokenizer.from_pretrained("minhchuxuan/pruned-1.3b")
```
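A minimal generation sketch building on the loading code above. The prompt, dtype, and decoding settings here are illustrative assumptions, not recommendations from the model authors; running this requires downloading the checkpoint.

```python
# Hedged sketch: tokenize a prompt, run greedy generation, and decode.
# Generation settings (max_new_tokens, fp16, greedy decoding) are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained(
    "minhchuxuan/pruned-1.3b",
    torch_dtype=torch.float16,  # assumption: half precision to reduce memory
    device_map="auto",          # place layers on available GPU(s)/CPU
)
tokenizer = AutoTokenizer.from_pretrained("minhchuxuan/pruned-1.3b")

prompt = "The capital of France is"  # hypothetical example prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=32, do_sample=False)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

`do_sample=False` gives deterministic (greedy) output; swap in `do_sample=True` with a `temperature` for varied completions.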