---
language: en
tags:
- fill-mask
---

## Environmental Impact (CODE CARBON DEFAULT)

| Metric                   | Value                     |
|--------------------------|---------------------------|
| Duration (in seconds)    | [More Information Needed] |
| Emissions (CO2eq in kg)  | [More Information Needed] |
| CPU power (W)            | [No CPU]                  |
| GPU power (W)            | [No GPU]                  |
| RAM power (W)            | [More Information Needed] |
| CPU energy (kWh)         | [No CPU]                  |
| GPU energy (kWh)         | [No GPU]                  |
| RAM energy (kWh)         | [More Information Needed] |
| Consumed energy (kWh)    | [More Information Needed] |
| Country name             | [More Information Needed] |
| Cloud provider           | [No Cloud]                |
| Cloud region             | [No Cloud]                |
| CPU count                | [No CPU]                  |
| CPU model                | [No CPU]                  |
| GPU count                | [No GPU]                  |
| GPU model                | [No GPU]                  |
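The power rows (W) and energy rows (kWh) in the table above are linked by the run duration. As a minimal sketch of that relationship (the `energy_kwh` helper below is illustrative only, not a CodeCarbon API):

```python
def energy_kwh(power_w: float, duration_s: float) -> float:
    """Convert an average power draw in watts and a duration in seconds
    to energy in kilowatt-hours (1 kWh = 3.6e6 joules)."""
    return power_w * duration_s / 3.6e6

# Example: a 10 W RAM draw sustained for one hour consumes 0.01 kWh.
print(energy_kwh(10.0, 3600.0))
```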
|
## Environmental Impact (for one core)

| Metric                   | Value                     |
|--------------------------|---------------------------|
| CPU energy (kWh)         | [No CPU]                  |
| Emissions (CO2eq in kg)  | [More Information Needed] |
|
## Note

20 May 2024
|
## My Config

| Config            | Value               |
|-------------------|---------------------|
| checkpoint        | albert-base-v2      |
| model_name        | ft_bs32_lr7_base_x8 |
| sequence_length   | 400                 |
| num_epoch         | 20                  |
| learning_rate     | 5e-07               |
| batch_size        | 32                  |
| weight_decay      | 0.0                 |
| warm_up_prop      | 0.0                 |
| drop_out_prob     | 0.1                 |
| packing_length    | 100                 |
| train_test_split  | 0.2                 |
| num_steps         | 108600              |
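The `num_steps` value in the config is consistent with the other settings. Assuming it counts optimizer steps over all epochs with full batches and no gradient accumulation (an assumption, not stated in the card), the implied training-set size can be back-computed:

```python
# Values from the config table above
num_steps = 108600
num_epoch = 20
batch_size = 32

steps_per_epoch = num_steps // num_epoch        # optimizer steps per epoch
train_examples = steps_per_epoch * batch_size   # implied training-set size
print(steps_per_epoch, train_examples)          # 5430 173760
```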
|
## Training and Testing steps

| Epoch | Train Loss | Test Loss | Accuracy | Recall   |
|-------|------------|-----------|----------|----------|
| 0     | 0.599385   | 0.533520  | 0.732695 | 0.743865 |
| 1     | 0.497255   | 0.495337  | 0.756996 | 0.874233 |
| 2     | 0.456973   | 0.457591  | 0.777614 | 0.812883 |
| 3     | 0.428078   | 0.435462  | 0.792342 | 0.811350 |
| 4     | 0.405985   | 0.418146  | 0.806333 | 0.865031 |
| 5     | 0.386763   | 0.402823  | 0.818851 | 0.852761 |
| 6     | 0.370968   | 0.398841  | 0.818115 | 0.819018 |
| 7     | 0.361504   | 0.389461  | 0.822533 | 0.865031 |
| 8     | 0.348315   | 0.386434  | 0.828424 | 0.881902 |
| 9     | 0.339924   | 0.381690  | 0.829897 | 0.820552 |
| 10    | 0.333508   | 0.379336  | 0.829161 | 0.869632 |
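A small helper for picking a checkpoint from the results above (data copied verbatim from the table; selecting by lowest test loss or by highest accuracy are both common choices, shown here as a sketch):

```python
# (epoch, train_loss, test_loss, accuracy, recall), copied from the table above
results = [
    (0, 0.599385, 0.533520, 0.732695, 0.743865),
    (1, 0.497255, 0.495337, 0.756996, 0.874233),
    (2, 0.456973, 0.457591, 0.777614, 0.812883),
    (3, 0.428078, 0.435462, 0.792342, 0.811350),
    (4, 0.405985, 0.418146, 0.806333, 0.865031),
    (5, 0.386763, 0.402823, 0.818851, 0.852761),
    (6, 0.370968, 0.398841, 0.818115, 0.819018),
    (7, 0.361504, 0.389461, 0.822533, 0.865031),
    (8, 0.348315, 0.386434, 0.828424, 0.881902),
    (9, 0.339924, 0.381690, 0.829897, 0.820552),
    (10, 0.333508, 0.379336, 0.829161, 0.869632),
]

best_by_loss = min(results, key=lambda r: r[2])   # lowest test loss
best_by_acc = max(results, key=lambda r: r[3])    # highest accuracy
print(best_by_loss[0], best_by_acc[0])            # 10 9
```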