# Solar-Open-100B-pruned-5pct

This model is a pruned version of upstage/Solar-Open-100B.

## Pruning Details

| Property | Value |
|---|---|
| Original Model | upstage/Solar-Open-100B |
| Original Parameters | 47.92B |
| Pruned Parameters | 7.77B |
| Compression Ratio | 0.1622 (6.2x smaller) |
| Strategy | Layer + FFN (Recommended) |
| Importance Metric | Magnitude |
| Layers Removed | 26 |
| FFN Reduction | 0.00% |
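The compression figures in the table follow directly from the parameter counts; a quick sanity check (in Python, using the rounded counts above, so the ratio lands slightly off the table's 0.1622):

```python
original_params = 47.92e9  # parameter count of upstage/Solar-Open-100B (from the table)
pruned_params = 7.77e9     # parameter count after pruning

ratio = pruned_params / original_params   # fraction of parameters kept, ~0.162
shrink = original_params / pruned_params  # how many times smaller, ~6.2x

print(f"ratio={ratio:.4f}, shrink={shrink:.1f}x")
```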

## Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("llaa33219/Solar-Open-100B-pruned-5pct")
tokenizer = AutoTokenizer.from_pretrained("llaa33219/Solar-Open-100B-pruned-5pct")
```

## Notes

This model was created using structured pruning techniques including:

  • Layer pruning (removing entire transformer layers)
  • FFN dimension pruning (reducing intermediate layer sizes)

Layer importance was scored by weight magnitude, and the lowest-scoring layers were removed so that the most influential weights are preserved.
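As a rough illustration of the idea (a minimal sketch, not the actual script used to produce this model), magnitude-based layer pruning scores each layer by the average absolute value of its weights and drops the lowest-scoring layers:

```python
def layer_importance(weights):
    """Score a layer by the mean absolute magnitude of its weights."""
    return sum(abs(w) for w in weights) / len(weights)

def prune_layers(layers, n_remove):
    """Drop the n_remove layers with the lowest magnitude scores,
    keeping the survivors in their original order."""
    order = sorted(range(len(layers)), key=lambda i: layer_importance(layers[i]))
    removed = set(order[:n_remove])
    return [layer for i, layer in enumerate(layers) if i not in removed]

# Toy example: three "layers" represented as flat weight lists.
layers = [
    [0.9, -1.1, 0.8],    # high magnitude  -> kept
    [0.01, -0.02, 0.0],  # low magnitude   -> removed
    [0.5, 0.6, -0.4],    # medium          -> kept
]
pruned = prune_layers(layers, n_remove=1)
print(len(pruned))  # 2 layers survive
```

In a real transformer the scores would be computed over each layer's weight tensors and the surviving layers re-stitched into the model; FFN dimension pruning applies the same scoring to columns of the intermediate projection instead of whole layers.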
