---
library_name: transformers
license: apache-2.0
datasets:
- HuggingFaceTB/smollm-corpus
language:
- en
pipeline_tag: text-generation
---

# **Doge 120M MoE checkpoint**

Doge uses `wsd_scheduler` as its training scheduler, which divides the learning-rate schedule into three stages: `warmup`, `stable`, and `decay`. This makes it possible to continue training on any new dataset from any checkpoint in the `stable` stage without loss spikes.
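The three-stage schedule can be sketched as a plain learning-rate function. This is a minimal illustration, not the exact implementation used to train Doge; the linear decay shape and the parameter names (`warmup_steps`, `stable_steps`, `decay_steps`) are assumptions:

```python
def wsd_lr(step, peak_lr, warmup_steps, stable_steps, decay_steps):
    """Warmup-Stable-Decay sketch: linear warmup to peak_lr,
    constant plateau, then linear decay toward zero."""
    if step < warmup_steps:
        return peak_lr * step / warmup_steps              # warmup stage
    if step < warmup_steps + stable_steps:
        return peak_lr                                    # stable stage
    decay_step = step - warmup_steps - stable_steps
    return peak_lr * max(0.0, 1.0 - decay_step / decay_steps)  # decay stage

# With the Doge-120M-MoE settings (peak lr 6e-3, 1600 warmup steps,
# 12800 stable steps), the learning rate is 6e-3 anywhere in the
# stable stage, which is where continued training should resume.
```

Resuming from a `stable`-stage checkpoint with the listed initial learning rate simply restarts the plateau, which is why no re-warmup is needed.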
Here are the initial learning rates required to continue training from each checkpoint:
- **[Doge-20M](https://huggingface.co/SmallDoge/Doge-20M-checkpoint)**: 8e-3
- **[Doge-20M-MoE](https://huggingface.co/SmallDoge/Doge-20M-MoE-checkpoint)**: 8e-3
- **[Doge-60M](https://huggingface.co/SmallDoge/Doge-60M-checkpoint)**: 6e-3
- **[Doge-120M-MoE](https://huggingface.co/SmallDoge/Doge-120M-MoE-checkpoint)**: 6e-3
- **[Doge-160M](https://huggingface.co/SmallDoge/Doge-160M-checkpoint)**: 4e-3
- **[Doge-480M-MoE](https://huggingface.co/SmallDoge/Doge-480M-MoE-checkpoint)**: 4e-3
- **[Doge-320M](https://huggingface.co/SmallDoge/Doge-320M-checkpoint)**: 2e-3
- **[Doge-1.4B-MoE](https://huggingface.co/SmallDoge/Doge-960M-MoE-checkpoint)**: 2e-3

| Model | Learning Rate | Schedule | Warmup Steps | Stable Steps |
|-------|---------------|----------|--------------|--------------|
| [Doge-20M](https://huggingface.co/SmallDoge/Doge-20M-checkpoint) | 8e-3 | wsd_scheduler | 800 | 6400 |
| [Doge-20M-MoE](https://huggingface.co/SmallDoge/Doge-20M-MoE-checkpoint) | 8e-3 | wsd_scheduler | 800 | 6400 |
| [Doge-60M](https://huggingface.co/SmallDoge/Doge-60M-checkpoint) | 6e-3 | wsd_scheduler | 1600 | 12800 |
| [Doge-120M-MoE](https://huggingface.co/SmallDoge/Doge-120M-MoE-checkpoint) | 6e-3 | wsd_scheduler | 1600 | 12800 |
| [Doge-160M](https://huggingface.co/SmallDoge/Doge-160M-checkpoint) | 4e-3 | wsd_scheduler | 2400 | 19200 |
| [Doge-480M-MoE](https://huggingface.co/SmallDoge/Doge-480M-MoE-checkpoint) | 4e-3 | wsd_scheduler | 2400 | 19200 |
| [Doge-320M](https://huggingface.co/SmallDoge/Doge-320M-checkpoint) | 2e-3 | wsd_scheduler | 3200 | 25600 |
| [Doge-1.4B-MoE](https://huggingface.co/SmallDoge/Doge-960M-MoE-checkpoint) | 2e-3 | wsd_scheduler | 3200 | 25600 |