License: must comply with the Llama 2 license, since this model is derived from Llama 2.

Pruned-LLaMA-2.7B is a model pruned from meta-llama/Llama-2-7b-hf and then further pre-trained.

## Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the pruned checkpoint and its tokenizer from the Hub
model = AutoModelForCausalLM.from_pretrained("minhchuxuan/pruned-2.7b")
tokenizer = AutoTokenizer.from_pretrained("minhchuxuan/pruned-2.7b")
```
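Once the model and tokenizer are loaded, text can be generated with the standard `transformers` generation API. A minimal sketch follows; the prompt and decoding settings are illustrative assumptions, not part of the model card.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("minhchuxuan/pruned-2.7b")
tokenizer = AutoTokenizer.from_pretrained("minhchuxuan/pruned-2.7b")

# Illustrative prompt; greedy decoding with a small token budget
prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```

Since the model keeps the Llama 2 architecture, it also works with any tooling that accepts `LlamaForCausalLM` checkpoints.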