---
license: apache-2.0
library_name: transformers
---
# EarlyCheckpoint
<!-- markdownlint-disable first-line-h1 -->
<!-- markdownlint-disable html -->
<!-- markdownlint-disable no-duplicate-header -->
<div align="center">
<img src="figures/fig1.png" width="60%" alt="EarlyCheckpoint" />
</div>
<hr>
<div align="center" style="line-height: 1;">
<a href="LICENSE" style="margin: 2px;">
<img alt="License" src="figures/fig2.png" style="display: inline-block; vertical-align: middle;"/>
</a>
</div>
## 1. Introduction
EarlyCheckpoint is the first checkpoint saved during our training run, captured at training step 100, near the very beginning of training. It serves as a baseline for measuring subsequent training progress.
<p align="center">
<img width="80%" src="figures/fig3.png">
</p>
This model represents the initial state of training and is useful for ablation studies and understanding training dynamics.
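One way an early checkpoint supports ablation studies is as a fixed reference point: you can measure how far a later checkpoint has drifted from it in parameter space. A minimal sketch of that comparison, using NumPy arrays as stand-ins for model weights (the `parameter_drift` helper and the toy state dicts below are illustrative, not part of this repository):

```python
import numpy as np

def parameter_drift(early_weights, later_weights):
    """L2 distance between two checkpoints' parameters, summed over all tensors."""
    total = 0.0
    for name, w_early in early_weights.items():
        w_later = later_weights[name]
        total += float(np.sum((w_later - w_early) ** 2))
    return total ** 0.5

# Toy "checkpoints": a dict of named parameter tensors per model
early = {"dense.weight": np.zeros((2, 2)), "dense.bias": np.zeros(2)}
later = {"dense.weight": np.ones((2, 2)), "dense.bias": np.zeros(2)}

print(parameter_drift(early, later))  # prints 2.0
```

With real checkpoints you would build the same dicts from each model's `state_dict()` and track the drift across training steps.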
## 2. Model Information
| Property | Value |
|---|---|
| Architecture | BERT |
| Training Step | step_100 |
| License | Apache-2.0 |
## 3. How to Use
```python
from transformers import AutoModel, AutoTokenizer

# Load the checkpoint weights and the matching tokenizer from the Hub
tokenizer = AutoTokenizer.from_pretrained("EarlyCheckpoint-v1")
model = AutoModel.from_pretrained("EarlyCheckpoint-v1")
```
## 4. License
This model is licensed under the [Apache-2.0 License](LICENSE).
## 5. Contact
For questions, please open an issue on our GitHub repository.