---
tags:
- adaptive-sparse-training
- energy-efficient
- sustainability
metrics:
- accuracy
- energy_savings
license: mit
language:
- en
---
# resnet18 (AST-Trained)
**Trained with 65% less energy than standard training**
## Model Details
- **Architecture:** resnet18
- **Dataset:** CIFAR-10
- **Training Method:** Adaptive Sparse Training (AST)
- **Target Activation Rate:** 35%
## Performance
- **Accuracy:** 68.09%
- **Energy Savings:** 65%
- **Training Epochs:** 10
## Sustainability Report
This model was trained using Adaptive Sparse Training, which dynamically selects
the most important training samples. This resulted in:
- ⚡ **65% energy savings** compared to standard training
- 🌍 **Lower carbon footprint**
- ⏱️ **Faster training time**
- 🎯 **Maintained accuracy** (minimal degradation)
## How to Use
```python
import torch
from torchvision import models

# Load the AST-trained checkpoint
model = models.resnet18(num_classes=10)
state_dict = torch.load("pytorch_model.bin", map_location="cpu")
model.load_state_dict(state_dict)
model.eval()

# Inference on CIFAR-10-sized inputs (3x32x32); replace the random
# tensor with real images normalized the same way as during training
with torch.no_grad():
    images = torch.randn(1, 3, 32, 32)
    logits = model(images)
    predicted_class = logits.argmax(dim=1)
```
## Training Details
**AST Configuration:**
- Target Activation Rate: 35%
- Adaptive PI Controller: Enabled
- Mixed Precision (AMP): Enabled
## Reproducing This Model
```bash
pip install adaptive-sparse-training
python -c "
from adaptive_sparse_training import AdaptiveSparseTrainer, ASTConfig
config = ASTConfig(target_activation_rate=0.35)
# ... (full training code)
"
```
## Citation
If you use this model or AST, please cite:
```bibtex
@software{adaptive_sparse_training,
  title={Adaptive Sparse Training},
  author={Idiakhoa, Oluwafemi},
  year={2024},
  url={https://github.com/oluwafemidiakhoa/adaptive-sparse-training}
}
```
## Acknowledgments
Trained using the `adaptive-sparse-training` package. Special thanks to the PyTorch and HuggingFace communities.
---
*This model card was auto-generated by the AST Training Dashboard.*