---
license: apache-2.0
tags:
- transformer
- education
- arithmetic
- algorithmic-learning
language:
- en
pipeline_tag: text-generation
---

# addGPT: Teaching Transformers to Add

A transformer encoder-decoder that learns integer addition, demonstrating algorithmic learning capabilities.

## Model Description

- **Architecture**: Transformer encoder-decoder
- **Parameters**: ~2.7M (tiny config)
- **Task**: Integer addition (operands up to 5 digits)
- **Accuracy**: >99% on a held-out test set
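
The task is framed as sequence-to-sequence generation over the textual form of each problem. As a minimal sketch of what a training pair might look like (the exact serialization, e.g. the separator token and digit order, is an assumption here; see the repository for the real format):

```python
import random

def make_example(rng: random.Random, max_digits: int = 5) -> tuple[str, str]:
    """Generate one addition problem as (source, target) strings.

    NOTE: the "a+b" -> "sum" serialization is an illustrative assumption;
    the repository defines the actual tokenization.
    """
    hi = 10 ** max_digits - 1
    a, b = rng.randint(0, hi), rng.randint(0, hi)
    return f"{a}+{b}", str(a + b)

src, tgt = make_example(random.Random(0))
```

Seeding the generator makes the sampled problems reproducible across runs.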

## Usage

```python
# Download the checkpoint
from huggingface_hub import hf_hub_download

checkpoint_path = hf_hub_download(repo_id="m-bano/addGPT", filename="ckpt_step12500.pt")

# Or use the provided script:
# python download_checkpoint.py
```

See the [GitHub repository](https://github.com/mbano/addGPT) for complete code and usage instructions.
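
Once the file is local, it can be loaded with plain PyTorch. The sketch below assumes the checkpoint is a standard `torch.save` payload; the actual key layout (weights, optimizer state, step counter) is defined by the training script in the repository:

```python
import torch

def load_checkpoint(path: str) -> dict:
    """Load a checkpoint onto the CPU, regardless of the device it was saved on.

    The key layout of the returned dict (e.g. "model", "step") is an
    assumption; inspect its keys to see what the file actually contains.
    """
    return torch.load(path, map_location="cpu")

# ckpt = load_checkpoint(checkpoint_path)
# print(ckpt.keys())
```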

## Training Details

- **Framework**: PyTorch 2.0+
- **Training Time**: ~10 minutes on a GPU
- **Dataset**: 6.4M randomly generated addition problems
- **Optimizer**: AdamW with cosine annealing
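
The optimizer setup above can be sketched in PyTorch as follows. The learning rate, weight decay, and schedule length are illustrative placeholders (12,500 steps is inferred from the checkpoint filename), not the exact values behind the released weights:

```python
import torch
from torch.optim.lr_scheduler import CosineAnnealingLR

model = torch.nn.Linear(8, 8)  # stand-in for the real transformer
optimizer = torch.optim.AdamW(model.parameters(), lr=3e-4, weight_decay=0.1)
scheduler = CosineAnnealingLR(optimizer, T_max=12_500)  # anneal lr toward 0 over the run

for step in range(100):  # placeholder loop: forward/backward would go here
    optimizer.step()
    scheduler.step()
```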

## Configuration

```yaml
n_emb: 512       # embedding dimension
n_heads: 8       # attention heads
n_blocks: 4      # transformer blocks
max_src_len: 5   # 5-digit operands
```
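
In code, the same hyperparameters map naturally onto a small dataclass; this is only a sketch, and the repository may use its own config object, though the field names are assumed to match the YAML above:

```python
from dataclasses import dataclass

@dataclass
class AddGPTConfig:
    """Mirrors the YAML config above; field names are an assumption."""
    n_emb: int = 512       # embedding dimension
    n_heads: int = 8       # attention heads
    n_blocks: int = 4      # transformer blocks
    max_src_len: int = 5   # 5-digit operands

cfg = AddGPTConfig()
```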

## License

Apache 2.0