How to use damgomz/fp_bs2_lr3e4_x1 with the Transformers library:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("fill-mask", model="damgomz/fp_bs2_lr3e4_x1")
```

```python
# Load model directly
from transformers import AutoTokenizer, AutoModelForMaskedLM

tokenizer = AutoTokenizer.from_pretrained("damgomz/fp_bs2_lr3e4_x1")
model = AutoModelForMaskedLM.from_pretrained("damgomz/fp_bs2_lr3e4_x1")
```
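As a quick sanity check, the pipeline can be called on a masked sentence. The input text below is only illustrative, not from the model's training data:

```python
from transformers import pipeline

pipe = pipeline("fill-mask", model="damgomz/fp_bs2_lr3e4_x1")

# ALBERT-based tokenizers use "[MASK]" as the mask token.
results = pipe("The treatment reduced the [MASK] of the patients.")

# Each candidate comes with a score and the filled-in sequence.
for r in results:
    print(r["token_str"], round(r["score"], 4))
```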
Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|---|---|
| Duration (in seconds) | [More Information Needed] |
| Emissions (CO2eq in kg) | [More Information Needed] |
| CPU power (W) | [NO CPU] |
| GPU power (W) | [No GPU] |
| RAM power (W) | [More Information Needed] |
| CPU energy (kWh) | [No CPU] |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | [More Information Needed] |
| Consumed energy (kWh) | [More Information Needed] |
| Country name | [More Information Needed] |
| Cloud provider | [No Cloud] |
| Cloud region | [No Cloud] |
| CPU count | [No CPU] |
| CPU model | [No CPU] |
| GPU count | [No GPU] |
| GPU model | [No GPU] |
Environmental Impact (for one core)
| Metric | Value |
|---|---|
| CPU energy (kWh) | [No CPU] |
| Emissions (CO2eq in kg) | [More Information Needed] |
Note
5 July 2024!
My Config
| Config | Value |
|---|---|
| checkpoint | albert-base-v2 |
| model_name | fp_bs2_lr3e4_x1 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 0.0003 |
| batch_size | 2 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 325918 |
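The train_test_split value of 0.2 means one fifth of the data is held out for evaluation. A minimal sketch of such a split, using toy data and a hypothetical helper rather than the actual training code:

```python
import random

def train_test_split(data, test_prop=0.2, seed=0):
    # Shuffle a copy, then hold out the final test_prop fraction for testing.
    rng = random.Random(seed)
    items = list(data)
    rng.shuffle(items)
    cut = int(len(items) * (1 - test_prop))
    return items[:cut], items[cut:]

train, test = train_test_split(range(10))
print(len(train), len(test))  # → 8 2
```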
Training and Testing steps
| Epoch | Train Loss | Test Loss |
|---|---|---|
| 0.0 | 13.190971 | 8.201885 |
| 0.5 | 7.566634 | 7.504056 |
| 1.0 | 7.393551 | 7.312474 |
| 1.5 | 7.285843 | 7.245680 |
| 2.0 | 7.277350 | 7.181346 |
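The test loss above decreases at every checkpoint; a short sketch that verifies this from the logged values (copied from the table):

```python
# Test losses at half-epoch checkpoints, copied from the table above.
test_losses = [8.201885, 7.504056, 7.312474, 7.245680, 7.181346]

# Drop between consecutive checkpoints; positive means improvement.
deltas = [round(a - b, 6) for a, b in zip(test_losses, test_losses[1:])]
print(deltas)  # → [0.697829, 0.191582, 0.066794, 0.064334]
```

Most of the improvement comes in the first half epoch, after which the curve flattens.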