---
language: en
license: mit  # Adjust based on original model
tags:
- pruning
- sparse
- atropos
- code-generation
---

# gpt2-mild_pruning

This is a pruned version of [gpt2](https://huggingface.co/gpt2), generated as part of the Atropos pruning exercise.

## Pruning Details

- **Strategy:** mild_pruning
- **Method:** PyTorch magnitude-based unstructured pruning
- **Generated by:** Atropos pruning exercise scripts

## Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("gpt2-mild_pruning")
tokenizer = AutoTokenizer.from_pretrained("gpt2-mild_pruning")
```

## About Atropos

[Atropos](https://github.com/its-not-rocket-science/atropos) estimates the return on investment (ROI) of pruning and quantization optimizations in LLM deployments. This model was pruned to validate Atropos projections against actual results.

## Citation

If you use this model, please cite the original model and Atropos:

```bibtex
@software{atropos,
  title = {Atropos: ROI Estimation for LLM Pruning},
  year  = {2026},
}
```
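For reference, magnitude-based unstructured pruning zeros out the individual weights with the smallest absolute values, leaving the layer shapes unchanged. The sketch below illustrates the idea in plain Python on a flat list of weights; the actual exercise applies the equivalent operation per layer (e.g. via `torch.nn.utils.prune.l1_unstructured`), and the exact sparsity level used for `mild_pruning` is not specified here.

```python
def magnitude_prune(weights, sparsity):
    """Zero out the smallest-magnitude fraction `sparsity` of `weights`.

    A plain-Python sketch of unstructured magnitude pruning. The real
    pipeline operates on PyTorch tensors layer by layer; this version
    only demonstrates the selection rule.
    """
    n_prune = int(len(weights) * sparsity)
    if n_prune == 0:
        return list(weights)
    # Threshold = magnitude of the n_prune-th smallest weight; everything
    # at or below it is set to zero (ties may prune slightly more).
    threshold = sorted(abs(w) for w in weights)[n_prune - 1]
    return [0.0 if abs(w) <= threshold else w for w in weights]


# Pruning 50% of four weights removes the two smallest magnitudes.
print(magnitude_prune([0.5, -0.1, 0.3, -0.02], 0.5))
```

Because the pruning is unstructured, the resulting model is the same size on disk unless stored in a sparse format; the speed and memory benefits that Atropos estimates depend on sparse-aware kernels or subsequent compression.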