|
---
language: en
tags:
- fill-mask
---
|
|
|
## Environmental Impact (CodeCarbon default)
|
|
|
| Metric                  | Value                     |
|-------------------------|---------------------------|
| Duration (in seconds)   | [More Information Needed] |
| Emissions (CO2eq in kg) | [More Information Needed] |
| CPU power (W)           | [No CPU]                  |
| GPU power (W)           | [No GPU]                  |
| RAM power (W)           | [More Information Needed] |
| CPU energy (kWh)        | [No CPU]                  |
| GPU energy (kWh)        | [No GPU]                  |
| RAM energy (kWh)        | [More Information Needed] |
| Consumed energy (kWh)   | [More Information Needed] |
| Country name            | [More Information Needed] |
| Cloud provider          | [No Cloud]                |
| Cloud region            | [No Cloud]                |
| CPU count               | [No CPU]                  |
| CPU model               | [No CPU]                  |
| GPU count               | [No GPU]                  |
| GPU model               | [No GPU]                  |
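Once the missing measurements are filled in, they combine according to CodeCarbon's basic accounting: total consumed energy is the sum of the per-component (CPU, GPU, RAM) energies, and emissions are that energy multiplied by the local grid's carbon intensity. A minimal sketch of that arithmetic; the numeric values below are illustrative assumptions, not this run's measurements:

```python
def consumed_energy_kwh(cpu_kwh: float, gpu_kwh: float, ram_kwh: float) -> float:
    """Total consumed energy is the sum of the per-component energies."""
    return cpu_kwh + gpu_kwh + ram_kwh

def emissions_kg(consumed_kwh: float, carbon_intensity_kg_per_kwh: float) -> float:
    """Emissions (kg CO2eq) = energy consumed x carbon intensity of the grid."""
    return consumed_kwh * carbon_intensity_kg_per_kwh

# Illustrative placeholder numbers only -- not this run's measurements.
energy = consumed_energy_kwh(cpu_kwh=0.0, gpu_kwh=0.0, ram_kwh=0.05)
print(emissions_kg(energy, carbon_intensity_kg_per_kwh=0.4))  # kg CO2eq
```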
|
|
|
## Environmental Impact (for one core) |
|
|
|
| Metric                  | Value                     |
|-------------------------|---------------------------|
| CPU energy (kWh)        | [No CPU]                  |
| Emissions (CO2eq in kg) | [More Information Needed] |
|
|
|
## Note |
|
|
|
2 May 2024 |
|
|
|
## My Config |
|
|
|
| Config            | Value                 |
|-------------------|-----------------------|
| checkpoint        | albert-base-v2        |
| model_name        | BERTrand_bs64_lr5_MLM |
| sequence_length   | 400                   |
| num_epoch         | 12                    |
| learning_rate     | 5e-05                 |
| batch_size        | 64                    |
| weight_decay      | 0.0                   |
| warm_up_prop      | 0                     |
| drop_out_prob     | 0.1                   |
| packing_length    | 100                   |
| train_test_split  | 0.2                   |
| num_steps         | 3270                  |
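The `num_steps` figure follows from the batch size, the number of epochs, and the size of the training split. A minimal sketch of that arithmetic; the dataset size below is a hypothetical placeholder, since the card does not state it, and the exact count can also depend on details like gradient accumulation or batch dropping:

```python
import math

def total_steps(n_examples: int, train_test_split: float,
                batch_size: int, num_epochs: int) -> int:
    """Optimizer steps for num_epochs passes over the training split."""
    n_train = int(n_examples * (1.0 - train_test_split))  # fraction held out for test
    steps_per_epoch = math.ceil(n_train / batch_size)
    return steps_per_epoch * num_epochs

# Hypothetical dataset size, for illustration only.
print(total_steps(n_examples=20_000, train_test_split=0.2, batch_size=64, num_epochs=12))  # 3000
```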
|
|
|
## Training and Testing steps |
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| Epoch | Train Loss | Test Loss |
|-------|------------|-----------|
| 0.0   | 14.601907  | 13.069239 |
| 0.5   | 7.447329   | 6.978540  |
| 1.0   | 6.960886   | 6.950151  |
| 1.5   | 4.932429   | 2.806009  |
| 2.0   | 2.701225   | 2.655897  |
| 2.5   | 2.606577   | 2.581771  |
| 3.0   | 2.548158   | 2.557307  |
| 3.5   | 2.505096   | 2.519658  |
| 4.0   | 2.481079   | 2.493212  |
| 4.5   | 2.450955   | 2.467917  |
| 5.0   | 2.436830   | 2.450555  |
| 5.5   | 2.415499   | 2.444447  |
| 6.0   | 2.396800   | 2.423250  |
| 6.5   | 2.386172   | 2.408114  |
| 7.0   | 2.366631   | 2.397662  |
| 7.5   | 2.364014   | 2.391399  |
| 8.0   | 2.338708   | 2.381241  |
| 8.5   | 2.341727   | 2.370836  |
| 9.0   | 2.321789   | 2.366722  |
| 9.5   | 2.321147   | 2.363208  |
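The loss values above can be sanity-checked programmatically: after the large drop around epoch 1.5, the test loss decreases monotonically through epoch 9.5, with no sign of overfitting yet. A minimal sketch using the table's test-loss values from epoch 2.0 onward:

```python
# Test-loss values from the table above, epochs 2.0 through 9.5.
test_loss = [2.655897, 2.581771, 2.557307, 2.519658, 2.493212,
             2.467917, 2.450555, 2.444447, 2.423250, 2.408114,
             2.397662, 2.391399, 2.381241, 2.370836, 2.366722, 2.363208]

def strictly_decreasing(values: list[float]) -> bool:
    """True if each value is lower than the one before it."""
    return all(b < a for a, b in zip(values, values[1:]))

print(strictly_decreasing(test_loss))  # True
```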
|
|