---
license: apache-2.0
tags:
- pruned
- python
- optimized
base_model: LGAI-EXAONE/EXAONE-4.0-1.2B
---
# EXAONE-4.0-1.2B-python-heavy
This model is a **heavy** pruned version of [LGAI-EXAONE/EXAONE-4.0-1.2B](https://huggingface.co/LGAI-EXAONE/EXAONE-4.0-1.2B), specialized for **Python** tasks.
## Pruning Details
- **Base Model**: LGAI-EXAONE/EXAONE-4.0-1.2B
- **Specialization**: Python
- **Prune Mode**: Heavy
- **Method**: Activation-based weight pruning
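
The card does not specify the exact pruning algorithm, so the sketch below is only an illustration of one common form of activation-based weight pruning: scoring each weight by its magnitude times the average activation magnitude of its input feature, then zeroing the lowest-scoring fraction. The function name, scoring rule, and shapes are assumptions, not the method actually used for this model.

```python
import numpy as np

def activation_prune(weight, activations, sparsity):
    """Zero out the lowest-importance weights.

    weight:      (out_features, in_features) weight matrix
    activations: (batch, in_features) sample activations feeding this layer
    sparsity:    fraction of weights to remove, e.g. 0.5
    """
    # Per-input-feature activation scale, averaged over the calibration batch.
    act_scale = np.abs(activations).mean(axis=0)            # (in_features,)
    # Importance of each weight = |W| weighted by how active its input is.
    importance = np.abs(weight) * act_scale                 # broadcasts over rows
    # Threshold chosen so that `sparsity` of the weights fall below it.
    k = int(weight.size * sparsity)
    threshold = np.sort(importance.ravel())[k]
    mask = importance >= threshold
    return weight * mask

# Tiny demo on random data.
rng = np.random.default_rng(0)
W = rng.normal(size=(8, 16))
X = rng.normal(size=(32, 16))
W_pruned = activation_prune(W, X, sparsity=0.5)
```

Real pipelines typically apply this layer by layer using calibration data from the target domain (here, Python code), which is what steers the pruned model toward the specialization.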
## Performance Comparison
| Category | Original | Pruned |
|----------|----------|--------|
| Python | 20.0% | 40.0% |
| HTML | 6.7% | 6.7% |
| Trivia | 86.7% | 86.7% |
| Math | 60.0% | 60.0% |
| Reasoning | N/A | N/A |
| Medical | 93.3% | 80.0% |
| Linux | 93.3% | 86.7% |
| Writing | 46.7% | 40.0% |

## Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("CompactAI/EXAONE-4.0-1.2B-python-heavy")
tokenizer = AutoTokenizer.from_pretrained("CompactAI/EXAONE-4.0-1.2B-python-heavy")

# Generate a Python completion
inputs = tokenizer("def fibonacci(n):", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
## License
This model inherits the Apache-2.0 license from the base model.