---
language: en
tags:
- fill-mask
---

## Environmental Impact (CODE CARBON DEFAULT)

| Metric                   | Value                     |
|--------------------------|---------------------------|
| Duration (in seconds)    | [More Information Needed] |
| Emissions (CO2eq in kg)  | [More Information Needed] |
| CPU power (W)            | [No CPU]                  |
| GPU power (W)            | [No GPU]                  |
| RAM power (W)            | [More Information Needed] |
| CPU energy (kWh)         | [No CPU]                  |
| GPU energy (kWh)         | [No GPU]                  |
| RAM energy (kWh)         | [More Information Needed] |
| Consumed energy (kWh)    | [More Information Needed] |
| Country name             | [More Information Needed] |
| Cloud provider           | [No Cloud]                |
| Cloud region             | [No Cloud]                |
| CPU count                | [No CPU]                  |
| CPU model                | [No CPU]                  |
| GPU count                | [No GPU]                  |
| GPU model                | [No GPU]                  |

## Environmental Impact (for one core)

| Metric                   | Value                     |
|--------------------------|---------------------------|
| CPU energy (kWh)         | [No CPU]                  |
| Emissions (CO2eq in kg)  | [More Information Needed] |

## Note

20 May 2024

## My Config

| Config            | Value                |
|-------------------|----------------------|
| checkpoint        | albert-base-v2       |
| model_name        | ft_bs64_1lr6_base_x8 |
| sequence_length   | 400                  |
| num_epoch         | 20                   |
| learning_rate     | 1e-06                |
| batch_size        | 64                   |
| weight_decay      | 0.0                  |
| warm_up_prop      | 0.0                  |
| drop_out_prob     | 0.1                  |
| packing_length    | 100                  |
| train_test_split  | 0.2                  |
| num_steps         | 108600               |

## Training and Testing steps

| Epoch | Train Loss | Test Loss | Accuracy | Recall   |
|-------|------------|-----------|----------|----------|
| 0     | 0.600790   | 0.530991  | 0.729750 | 0.803681 |
| 1     | 0.490914   | 0.515528  | 0.750368 | 0.937117 |
| 2     | 0.454214   | 0.454406  | 0.784242 | 0.837423 |
| 3     | 0.420142   | 0.442943  | 0.794551 | 0.891104 |
| 4     | 0.391595   | 0.418620  | 0.801915 | 0.854294 |
| 5     | 0.372766   | 0.409371  | 0.809278 | 0.808282 |
| 6     | 0.351249   | 0.411269  | 0.816642 | 0.884969 |
| 7     | 0.337517   | 0.410223  | 0.814433 | 0.897239 |
| 8     | 0.331506   | 0.397039  | 0.825479 | 0.881902 |
| 9     | 0.318005   | 0.416837  | 0.804124 | 0.730061 |
| 10    | 0.309892   | 0.406303  | 0.818115 | 0.907975 |
| 11    | 0.301596   | 0.390790  | 0.832106 | 0.840491 |
| 12    | 0.295900   | 0.391270  | 0.832842 | 0.865031 |
| 13    | 0.284800   | 0.392566  | 0.832106 | 0.860429 |
| 14    | 0.271658   | 0.397234  | 0.829897 | 0.823620 |
| 15    | 0.267081   | 0.404172  | 0.837261 | 0.883436 |
| 16    | 0.251110   | 0.409858  | 0.828424 | 0.823620 |
| 17    | 0.246977   | 0.409189  | 0.829161 | 0.852761 |
| 18    | 0.235927   | 0.413321  | 0.828424 | 0.838957 |
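
When picking a checkpoint from a log like the one above, test loss bottoms out at epoch 11 while accuracy peaks at epoch 15, so the choice depends on which metric you care about. A minimal sketch of that selection (metric values copied from the table; the tuple layout and variable names are my own, not part of the training script):

```python
# (epoch, test_loss, accuracy) triples copied from the table above
log = [
    (0, 0.530991, 0.729750), (1, 0.515528, 0.750368), (2, 0.454406, 0.784242),
    (3, 0.442943, 0.794551), (4, 0.418620, 0.801915), (5, 0.409371, 0.809278),
    (6, 0.411269, 0.816642), (7, 0.410223, 0.814433), (8, 0.397039, 0.825479),
    (9, 0.416837, 0.804124), (10, 0.406303, 0.818115), (11, 0.390790, 0.832106),
    (12, 0.391270, 0.832842), (13, 0.392566, 0.832106), (14, 0.397234, 0.829897),
    (15, 0.404172, 0.837261), (16, 0.409858, 0.828424), (17, 0.409189, 0.829161),
    (18, 0.413321, 0.828424),
]

# Epoch with the lowest test loss, and epoch with the highest accuracy
best_loss_epoch = min(log, key=lambda row: row[1])[0]  # -> 11
best_acc_epoch = max(log, key=lambda row: row[2])[0]   # -> 15
print(best_loss_epoch, best_acc_epoch)
```

Note that test loss starts climbing again after epoch 11 while accuracy stays roughly flat, which is a common early-stopping signal.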