---
license: apache-2.0
library_name: transformers
---

# EarlyCheckpoint
## 1. Introduction

EarlyCheckpoint is the first saved checkpoint from our training run, captured at the very beginning of training. It serves as a baseline for comparing training progress.

This model represents the initial state of training and is useful for ablation studies and for understanding training dynamics.

## 2. Model Information

| Property | Value |
|---|---|
| Architecture | BERT |
| Training Step | step_100 |
| License | Apache-2.0 |

## 3. How to Use

```python
from transformers import AutoModel, AutoTokenizer

model = AutoModel.from_pretrained("EarlyCheckpoint-v1")
tokenizer = AutoTokenizer.from_pretrained("EarlyCheckpoint-v1")
```

## 4. License

This model is licensed under the [Apache-2.0 License](LICENSE).

## 5. Contact

Open an issue on our GitHub repository for questions.
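Once the model and tokenizer are loaded, a single forward pass yields the hidden states that a baseline comparison would use. The sketch below illustrates the pattern with a tiny, randomly initialized BERT configuration as a stand-in (the config sizes and toy token ids are illustrative, not properties of this checkpoint), so it runs without downloading any weights:

```python
import torch
from transformers import BertConfig, BertModel

# Tiny randomly initialized BERT as a stand-in for the real checkpoint;
# with the actual model you would use AutoModel.from_pretrained(...) as above.
config = BertConfig(hidden_size=64, num_hidden_layers=2,
                    num_attention_heads=2, intermediate_size=128)
model = BertModel(config)
model.eval()

# Toy token ids; in practice these come from the tokenizer.
input_ids = torch.tensor([[101, 2023, 2003, 1037, 3231, 102]])
with torch.no_grad():
    outputs = model(input_ids=input_ids)

# last_hidden_state has shape (batch, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)
```

Hidden states extracted this way from EarlyCheckpoint can then be compared against those from later checkpoints to track how representations evolve during training.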