---
language: en
tags:
- fill-mask
---

## Environmental Impact (CODE CARBON DEFAULT)

| Metric                  | Value                     |
|-------------------------|---------------------------|
| Duration (in seconds)   | [More Information Needed] |
| Emissions (CO2eq in kg) | [More Information Needed] |
| CPU power (W)           | [No CPU]                  |
| GPU power (W)           | [No GPU]                  |
| RAM power (W)           | [More Information Needed] |
| CPU energy (kWh)        | [No CPU]                  |
| GPU energy (kWh)        | [No GPU]                  |
| RAM energy (kWh)        | [More Information Needed] |
| Consumed energy (kWh)   | [More Information Needed] |
| Country name            | [More Information Needed] |
| Cloud provider          | [No Cloud]                |
| Cloud region            | [No Cloud]                |
| CPU count               | [No CPU]                  |
| CPU model               | [No CPU]                  |
| GPU count               | [No GPU]                  |
| GPU model               | [No GPU]                  |

## Environmental Impact (for one core)

| Metric                  | Value                     |
|-------------------------|---------------------------|
| CPU energy (kWh)        | [No CPU]                  |
| Emissions (CO2eq in kg) | [More Information Needed] |

## Note

30 April 2024

## My Config

| Config           | Value             |
|------------------|-------------------|
| checkpoint       | albert-base-v2    |
| model_name       | BERTrand_bs64_lr5 |
| sequence_length  | 400               |
| num_epoch        | 12                |
| learning_rate    | 5e-05             |
| batch_size       | 64                |
| weight_decay     | 0.0               |
| warm_up_prop     | 0                 |
| drop_out_prob    | 0.1               |
| packing_length   | 100               |
| train_test_split | 0.2               |
| num_steps        | 3148              |

## Training and Testing steps

| Epoch | Train Loss | Test Loss |
|-------|------------|-----------|
| 0.0   | 15.402081  | 13.817045 |
| 0.5   | 8.054414   | 7.829068  |
| 1.0   | 4.816239   | 3.114260  |
| 1.5   | 2.206430   | 2.955595  |
| 2.0   | 2.189819   | 2.872115  |
| 2.5   | 2.418134   | 2.865437  |
| 3.0   | 2.349051   | 2.810524  |
| 3.5   | 2.102283   | 2.820134  |
| 4.0   | 1.907061   | 2.957294  |
| 4.5   | 2.326205   | 2.785392  |
| 5.0   | 2.257292   | 2.737638  |
| 5.5   | 2.127350   | 2.733068  |
| 6.0   | 1.883285   | 2.774372  |
| 6.5   | 2.100682   | 2.667502  |
| 7.0   | 2.194973   | 2.628296  |
| 7.5   | 2.163919   | 2.643665  |
| 8.0   | 1.850441   | 2.637510  |
| 8.5   | 1.968181   | 2.632833  |
| 9.0   | 2.121989   | 2.625116  |
| 9.5   | 2.136497   | 2.646418  |
| 10.0  | 1.891819   | 2.655790  |
| 10.5  | 1.822331   | 2.596789  |
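For reference, the hyperparameters in the "My Config" table can be collected into a plain Python dict, e.g. to re-run a comparable fine-tune. This is a minimal sketch: the dict keys simply mirror the table, and the derived `warmup_steps` assumes `warm_up_prop` is a fraction of `num_steps` (that interpretation is an assumption, not stated in the card).

```python
# Hyperparameters copied from the "My Config" table above.
config = {
    "checkpoint": "albert-base-v2",
    "model_name": "BERTrand_bs64_lr5",
    "sequence_length": 400,
    "num_epoch": 12,
    "learning_rate": 5e-05,
    "batch_size": 64,
    "weight_decay": 0.0,
    "warm_up_prop": 0,
    "drop_out_prob": 0.1,
    "packing_length": 100,
    "train_test_split": 0.2,
    "num_steps": 3148,
}

# Derived value (assumption: warm_up_prop is a fraction of total steps).
warmup_steps = int(config["warm_up_prop"] * config["num_steps"])
print(warmup_steps)  # → 0
```

With `warm_up_prop` set to 0, no learning-rate warmup is applied regardless of how the proportion is interpreted.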