gpt2-mild_pruning

This is a pruned version of gpt2, generated as part of the Atropos pruning exercise.

Pruning Details

  • Strategy: mild_pruning
  • Method: PyTorch magnitude-based unstructured pruning
  • Generated by: Atropos pruning exercise scripts
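Magnitude-based unstructured pruning zeroes out the individual weights with the smallest absolute values, leaving the tensor shapes unchanged. The actual run used PyTorch's pruning utilities; the core idea can be sketched in plain Python (the `magnitude_prune` helper below is illustrative, not the exercise's actual code):

```python
def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude entries until `sparsity`
    fraction of the weights are zero. Shapes are preserved:
    pruned weights are set to 0.0, not removed."""
    n_prune = int(len(weights) * sparsity)
    if n_prune == 0:
        return list(weights)
    # Threshold = magnitude of the n_prune-th smallest weight.
    threshold = sorted(abs(w) for w in weights)[n_prune - 1]
    pruned, zeroed = [], 0
    for w in weights:
        if abs(w) <= threshold and zeroed < n_prune:
            pruned.append(0.0)
            zeroed += 1
        else:
            pruned.append(w)
    return pruned

# Pruning 50% of a small weight vector zeroes the two
# smallest-magnitude entries (0.05 and -0.1):
print(magnitude_prune([0.5, -0.1, 2.0, 0.05], 0.5))
# → [0.5, 0.0, 2.0, 0.0]
```

In PyTorch this corresponds to `torch.nn.utils.prune.l1_unstructured`, applied per layer.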

Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("gpt2-mild_pruning")
tokenizer = AutoTokenizer.from_pretrained("gpt2-mild_pruning")
```

About Atropos

Atropos estimates ROI for pruning and quantization optimizations in LLM deployments.

This model was pruned to validate Atropos projections against actual results.
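Because unstructured pruning zeroes weights rather than removing them, the achieved sparsity can be verified directly by counting zeros in the weight tensors. A minimal, framework-agnostic sketch (the `flat_weights` argument is assumed to be a layer's weights flattened to a plain list, e.g. via `tensor.flatten().tolist()` in PyTorch):

```python
def measured_sparsity(flat_weights):
    """Fraction of weights that are exactly zero after pruning."""
    if not flat_weights:
        return 0.0
    zeros = sum(1 for w in flat_weights if w == 0.0)
    return zeros / len(flat_weights)

# Half of these four weights are zero:
print(measured_sparsity([0.0, 1.0, 0.0, 2.0]))
# → 0.5
```

Comparing this measured figure against the Atropos projection is the validation step described above.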

Citation

If you use this model, please cite the original model and Atropos:

@software{atropos,
  title = {Atropos: ROI Estimation for LLM Pruning},
  year = {2026},
}
Model Details

  • Model size: 0.1B params
  • Tensor type: F32
  • Format: Safetensors