Merged LLaMA Model

This is a merged version of the LLaMA2-13B model, produced by merging layers via hyperboloid projections. The resulting model retains 31 of the base model's 40 transformer layers while preserving most of its performance across benchmarks.
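As a rough sanity check on the reported size, the parameter count can be estimated from the layer count. This is a sketch that assumes LLaMA2-13B's 40 transformer layers and an approximately uniform distribution of parameters across layers (embedding and head parameters make this only approximate):

```python
# Rough parameter estimate for the layer-merged model.
# Assumptions: 13B total parameters, 40 transformer layers in the base
# model, parameters spread roughly uniformly across layers.
full_params_b = 13.0   # LLaMA2-13B parameters, in billions
full_layers = 40       # transformer layers in the base model
kept_layers = 31       # layers retained after merging

merged_params_b = full_params_b * kept_layers / full_layers
print(f"~{merged_params_b:.1f}B params")
```

This lands close to the ~10B parameter count reported for the model, which suggests the size reduction comes almost entirely from the removed layers.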

Model size: 10B params
Tensor type: F16
Format: Safetensors

Model tree for namannn/llama2-13b-hyperbolic-cluster-pruned
- Finetunes of this model: 6
- Quantizations: 1 model
- Spaces using this model: 1
