---
license: apache-2.0
library_name: transformers
---
# EarlyCheckpoint
<!-- markdownlint-disable first-line-h1 -->
<!-- markdownlint-disable html -->
<!-- markdownlint-disable no-duplicate-header -->
<div align="center">
<img src="figures/fig1.png" width="60%" alt="EarlyCheckpoint" />
</div>
<hr>
<div align="center" style="line-height: 1;">
<a href="LICENSE" style="margin: 2px;">
<img alt="License" src="figures/fig2.png" style="display: inline-block; vertical-align: middle;"/>
</a>
</div>
## 1. Introduction
EarlyCheckpoint is the first saved checkpoint from our training run, captured at step 100, near the very beginning of training. It serves as a baseline against which later checkpoints can be compared to measure training progress.
<p align="center">
<img width="80%" src="figures/fig3.png">
</p>
This model represents the initial state of training. It is useful for ablation studies and for analyzing training dynamics, e.g. measuring how far later checkpoints have drifted from the starting point.
## 2. Model Information
| Property | Value |
|---|---|
| Architecture | BERT |
| Training Step | step_100 |
| License | Apache-2.0 |
## 3. How to Use
```python
from transformers import AutoModel, AutoTokenizer

# Load the checkpoint and its tokenizer.
# Adjust the repo id to include the owning namespace if loading from the Hub,
# e.g. "<org>/EarlyCheckpoint-v1".
model = AutoModel.from_pretrained("EarlyCheckpoint-v1")
tokenizer = AutoTokenizer.from_pretrained("EarlyCheckpoint-v1")
```
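Once loaded, the model returns per-token hidden states; mean pooling over the attention mask is a common way to reduce these to a single sentence vector. A minimal sketch, assuming the `model` and `tokenizer` objects from the snippet above:

```python
import torch

def mean_pool(last_hidden_state, attention_mask):
    """Average the token embeddings, ignoring padding positions."""
    mask = attention_mask.unsqueeze(-1).float()
    return (last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)

# Hypothetical usage with the model/tokenizer loaded above:
# inputs = tokenizer("Hello world", return_tensors="pt")
# with torch.no_grad():
#     outputs = model(**inputs)
# sentence_vec = mean_pool(outputs.last_hidden_state, inputs["attention_mask"])
```

Because this is an early checkpoint, the resulting embeddings are close to the initialization and are mainly useful as a comparison point, not for downstream tasks.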
## 4. License
This model is licensed under the [Apache-2.0 License](LICENSE).
## 5. Contact
For questions, please open an issue on our GitHub repository.