---
language: en
tags:
- fill-mask
---

## Environmental Impact (CODE CARBON DEFAULT)

| Metric                   | Value                           |
|--------------------------|---------------------------------|
| Duration (in seconds)    | [More Information Needed]  |
| Emissions (CO2eq in kg)  | [More Information Needed] |
| CPU power (W)            | [NO CPU]  |
| GPU power (W)            | [No GPU]  |
| RAM power (W)            | [More Information Needed]  |
| CPU energy (kWh)         | [No CPU]  |
| GPU energy (kWh)         | [No GPU]  |
| RAM energy (kWh)         | [More Information Needed]  |
| Consumed energy (kWh)    | [More Information Needed]  |
| Country name             | [More Information Needed]  |
| Cloud provider           | [No Cloud]  |
| Cloud region             | [No Cloud]  |
| CPU count                | [No CPU]  |
| CPU model                | [No CPU]  |
| GPU count                | [No GPU]  |
| GPU model                | [No GPU]  |

## Environmental Impact (for one core)

| Metric                   | Value                           |
|--------------------------|---------------------------------|
| CPU energy (kWh)         | [No CPU]  |
| Emissions (CO2eq in kg)  | [More Information Needed] |

## Note

11 May 2024

## My Config

| Config                   | Value           |
|--------------------------|-----------------|
| checkpoint               | albert-base-v2  |
| model_name               | ThunBERT_bs8_lr4_MLM |
| sequence_length          | 400  |
| num_epoch                | 6  |
| learning_rate            | 0.0005  |
| batch_size               | 8  |
| weight_decay             | 0.0  |
| warm_up_prop             | 0.0  |
| drop_out_prob            | 0.1 |
| packing_length           | 100 |
| train_test_split         | 0.2 |
| num_steps                | 81680 |

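The `packing_length` of 100 suggests that tokenized sequences are concatenated and split into fixed-length chunks before batching. A minimal sketch of such packing, assuming this interpretation (the function name and details are illustrative, not the actual training code):

```python
def pack_sequences(token_ids_list, packing_length=100):
    """Concatenate token-id sequences and split into fixed-length chunks,
    dropping the trailing remainder that does not fill a full chunk."""
    flat = [tok for seq in token_ids_list for tok in seq]
    n_chunks = len(flat) // packing_length
    return [flat[i * packing_length:(i + 1) * packing_length]
            for i in range(n_chunks)]

# Example: three short "sequences" packed into chunks of length 4.
chunks = pack_sequences([[1, 2, 3], [4, 5], [6, 7, 8, 9]], packing_length=4)
# -> [[1, 2, 3, 4], [5, 6, 7, 8]]; the leftover token 9 is dropped.
```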
## Training and Testing steps

| Epoch | Train Loss | Test Loss |
|-------|------------|-----------|
| 0.0 | 10.543088 | 10.480216 |
| 0.5 | 7.136020 | 7.056888 |
| 1.0 | 7.012019 | 7.004119 |
| 1.5 | 6.990851 | 6.982289 |
| 2.0 | 7.008927 | 6.981744 |
| 2.5 | 6.986691 | 6.975362 |
| 3.0 | 6.971014 | 6.972791 |
| 3.5 | 6.971395 | 6.976788 |
| 4.0 | 6.964603 | 6.968367 |
| 4.5 | 6.960862 | 6.960208 |
| 5.0 | 6.952116 | 6.964635 |
| 5.5 | 6.951814 | 6.951701 |
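The loss drops sharply in the first half-epoch and then plateaus near 6.95, with train and test losses tracking each other closely, so the model is not overfitting. A small sketch computing the final train/test gap and the implied masked-LM perplexity (the loss values are copied from the table; the perplexity interpretation assumes the losses are mean cross-entropy in nats):

```python
import math

# Final logged losses (epoch 5.5) from the table above.
train_loss = 6.951814
test_loss = 6.951701

gap = train_loss - test_loss          # ~1e-4: train and test track closely
perplexity = math.exp(test_loss)      # implied masked-LM perplexity
print(f"gap={gap:.6f}, perplexity={perplexity:.1f}")
```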