---
language: en
tags:
- fill-mask
kwargs:
  timestamp: '2024-05-24T07:32:11'
  project_name: ft_8_4e6_mlm_cv_emissions_tracker
  run_id: 540c605b-00bf-4a6b-9831-e338e3f3ffd0
  duration: 69752.5319519043
  emissions: 0.0422083433539418
  emissions_rate: 6.051155731958987e-07
  cpu_power: 42.5
  gpu_power: 0.0
  ram_power: 3.75
  cpu_energy: 0.8234658257799006
  gpu_energy: 0
  ram_energy: 0.072658319226404
  energy_consumed: 0.8961241450063017
  country_name: Switzerland
  country_iso_code: CHE
  region: .nan
  cloud_provider: .nan
  cloud_region: .nan
  os: Linux-5.14.0-70.30.1.el9_0.x86_64-x86_64-with-glibc2.34
  python_version: 3.10.4
  codecarbon_version: 2.3.4
  cpu_count: 2
  cpu_model: Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz
  gpu_count: .nan
  gpu_model: .nan
  longitude: .nan
  latitude: .nan
  ram_total_size: 10
  tracking_mode: machine
  on_cloud: N
  pue: 1.0
---

## Environmental Impact (CODE CARBON DEFAULT)

| | Metric | Value | |
| |--------------------------|---------------------------------| |
| | Duration (in seconds) | 69752.5319519043 | |
| Emissions (CO2eq in kg)  | 0.0422083433539418 |
| | CPU power (W) | 42.5 | |
| | GPU power (W) | [No GPU] | |
| | RAM power (W) | 3.75 | |
| | CPU energy (kWh) | 0.8234658257799006 | |
| | GPU energy (kWh) | [No GPU] | |
| | RAM energy (kWh) | 0.072658319226404 | |
| | Consumed energy (kWh) | 0.8961241450063017 | |
| | Country name | Switzerland | |
| | Cloud provider | nan | |
| | Cloud region | nan | |
| | CPU count | 2 | |
| | CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz | |
| | GPU count | nan | |
| | GPU model | nan | |
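The headline figures above are internally consistent: each energy value is power draw times run duration, and consumed energy is the CPU and RAM sums (there is no GPU on this run). A minimal sketch of that arithmetic, using the values from the table (codecarbon integrates over sampling intervals, so the reconstruction differs from the table only in the last digits):

```python
# Sanity-check the codecarbon figures: energy (kWh) = power (W) x time (s) / 3.6e6.
duration_s = 69752.5319519043
cpu_power_w = 42.5
ram_power_w = 3.75

KWH_PER_WS = 1 / 3.6e6  # 1 kWh = 3.6 million watt-seconds

cpu_energy_kwh = cpu_power_w * duration_s * KWH_PER_WS
ram_energy_kwh = ram_power_w * duration_s * KWH_PER_WS
total_energy_kwh = cpu_energy_kwh + ram_energy_kwh

# Implied grid carbon intensity for the reported Swiss location.
emissions_kg = 0.0422083433539418
intensity_g_per_kwh = emissions_kg / total_energy_kwh * 1000

print(f"CPU energy:   {cpu_energy_kwh:.4f} kWh (table: 0.8235)")
print(f"RAM energy:   {ram_energy_kwh:.4f} kWh (table: 0.0727)")
print(f"Total energy: {total_energy_kwh:.4f} kWh (table: 0.8961)")
print(f"Implied intensity: {intensity_g_per_kwh:.1f} g CO2eq/kWh")
```

The implied intensity of roughly 47 g CO2eq/kWh is what codecarbon applied for Switzerland's grid on this run.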

## Environmental Impact (for one core)

| | Metric | Value | |
| |--------------------------|---------------------------------| |
| | CPU energy (kWh) | 0.13427362400741574 | |
| Emissions (CO2eq in kg)  | 0.027319741681162513 |

## Note

21 May 2024

## My Config

| | Config | Value | |
| |--------------------------|-----------------| |
| | checkpoint | damgomz/ThunBERT_bs16_lr5_MLM | |
| | model_name | ft_8_4e6_mlm_cv | |
| | sequence_length | 400 | |
| | num_epoch | 6 | |
| | learning_rate | 4e-06 | |
| | batch_size | 8 | |
| | weight_decay | 0.0 | |
| | warm_up_prop | 0.0 | |
| | drop_out_prob | 0.1 | |
| | packing_length | 100 | |
| | train_test_split | 0.2 | |
| | num_steps | 32586 | |
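The step count in the config is consistent with the epoch count and batch size; a small sketch of the implied sizes follows. The training-set size is inferred from these numbers, not stated anywhere in this card, so treat it as an estimate:

```python
# Hypothetical reconstruction from the config table above.
num_steps = 32586   # total optimizer steps over the whole run
num_epoch = 6
batch_size = 8

steps_per_epoch = num_steps // num_epoch         # 5431 steps per epoch
# Upper-bound estimate: the last batch of an epoch may be partial.
approx_train_examples = steps_per_epoch * batch_size

print(steps_per_epoch, approx_train_examples)    # 5431 43448
```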

## Training and Testing steps

| Epoch | Train Loss | Test Loss | Accuracy | Recall |
|---|---|---|---|---|
| 0 | 0.420520 | 0.346607 | 0.847082 | 0.877867 |
| 1 | 0.318169 | 0.338375 | 0.845903 | 0.855383 |
| 2 | 0.266432 | 0.351313 | 0.844136 | 0.855602 |
| 3 | 0.199349 | 0.386016 | 0.838390 | 0.867479 |
| 4 | 0.111738 | 0.455603 | 0.836033 | 0.839846 |
| 5 | 0.045670 | 0.558804 | 0.831907 | 0.844940 |
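The curves suggest overfitting from epoch 2 onward: train loss keeps falling while test loss rises. A short sketch of picking the best checkpoint by test loss, with the values copied from the table above:

```python
# Test loss per epoch, copied from the training table (epochs are 0-indexed).
test_losses = [0.346607, 0.338375, 0.351313, 0.386016, 0.455603, 0.558804]

# The checkpoint worth keeping is the one with the lowest test loss.
best_epoch = min(range(len(test_losses)), key=test_losses.__getitem__)
print(best_epoch, test_losses[best_epoch])  # 1 0.338375
```

By this criterion, training past epoch 1 only improved the train loss; early stopping (or selecting the epoch-1 checkpoint) would have been the better choice here.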