# Pruned Language Model: olusegunola/phi-3-mini-pruned-wanda
This is a structurally pruned version of the microsoft/Phi-3-mini-4k-instruct model, produced with the Wanda (Pruning by Weights and Activations) method, which scores each weight by its magnitude multiplied by the norm of the corresponding input activations and removes the lowest-scoring weights.
- Base Model: microsoft/Phi-3-mini-4k-instruct
- Parameter Reduction: 44.26% of total trainable parameters were removed.
- Average Reconstruction Loss (MSE): 0.048602
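
For reference, here is a minimal sketch of the Wanda scoring rule applied to a single linear layer. The function name, the calibration-activation shape, and the per-output-row pruning granularity are illustrative assumptions; this is not the exact script used to produce this checkpoint.

```python
import torch

def wanda_prune_layer(weight: torch.Tensor,
                      activations: torch.Tensor,
                      sparsity: float = 0.4426) -> torch.Tensor:
    """Prune one linear layer's weights using the Wanda score.

    weight:      (out_features, in_features) weight matrix.
    activations: (num_samples, in_features) calibration inputs to this layer.
    sparsity:    fraction of weights to zero per row (0.4426 mirrors the
                 44.26% reduction reported above; illustrative only).
    """
    # Wanda importance: |W_ij| * ||X_j||_2, i.e. weight magnitude scaled by
    # the L2 norm of the matching input feature across calibration samples.
    act_norm = activations.norm(p=2, dim=0)       # (in_features,)
    score = weight.abs() * act_norm.unsqueeze(0)  # (out_features, in_features)

    # Zero out the lowest-scoring fraction of weights within each output row.
    k = int(weight.shape[1] * sparsity)
    pruned = weight.clone()
    if k > 0:
        _, idx = torch.topk(score, k, dim=1, largest=False)
        pruned.scatter_(1, idx, 0.0)
    return pruned
```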
## Evaluation Results
Performance on common language-model benchmarks (0-shot, limited to 100 samples per task):
| Task | Metric | Value |
|---|---|---|
| Evaluation not run | n/a | n/a |
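
The checkpoint can be loaded like any Hugging Face causal LM. The snippet below is a minimal sketch; the dtype, device placement, and `trust_remote_code` flag are assumptions based on how the base Phi-3 model is typically loaded, not confirmed details of this repo.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "olusegunola/phi-3-mini-pruned-wanda"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
    trust_remote_code=True,  # Phi-3 checkpoints often require this
)

# Simple generation check on the pruned model.
inputs = tokenizer(
    "Explain weight pruning in one sentence.", return_tensors="pt"
).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```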