---
language: en
tags:
- fill-mask
kwargs:
  timestamp: '2024-05-22T16:01:26'
  project_name: ft_32_1e6_mlm_cv_emissions_tracker
  run_id: 91a1ef6e-5836-4dfc-8beb-10a57fac7d60
  duration: 84112.26557207108
  emissions: 0.0508976082410351
  emissions_rate: 6.051151742836349e-07
  cpu_power: 42.5
  gpu_power: 0.0
  ram_power: 3.75
  cpu_energy: 0.9929897035482864
  gpu_energy: 0
  ram_energy: 0.0876159787309665
  energy_consumed: 1.080605682279255
  country_name: Switzerland
  country_iso_code: CHE
  region: .nan
  cloud_provider: .nan
  cloud_region: .nan
  os: Linux-5.14.0-70.30.1.el9_0.x86_64-x86_64-with-glibc2.34
  python_version: 3.10.4
  codecarbon_version: 2.3.4
  cpu_count: 2
  cpu_model: Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz
  gpu_count: .nan
  gpu_model: .nan
  longitude: .nan
  latitude: .nan
  ram_total_size: 10
  tracking_mode: machine
  on_cloud: N
  pue: 1.0
---

## Environmental Impact (CODE CARBON DEFAULT)

| Metric                  | Value                                         |
|-------------------------|-----------------------------------------------|
| Duration (in seconds)   | 84112.26557207108                             |
| Emissions (CO2eq in kg) | 0.0508976082410351                            |
| CPU power (W)           | 42.5                                          |
| GPU power (W)           | [No GPU]                                      |
| RAM power (W)           | 3.75                                          |
| CPU energy (kWh)        | 0.9929897035482864                            |
| GPU energy (kWh)        | [No GPU]                                      |
| RAM energy (kWh)        | 0.0876159787309665                            |
| Consumed energy (kWh)   | 1.080605682279255                             |
| Country name            | Switzerland                                   |
| Cloud provider          | nan                                           |
| Cloud region            | nan                                           |
| CPU count               | 2                                             |
| CPU model               | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count               | nan                                           |
| GPU model               | nan                                           |
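
The figures above were recorded with CodeCarbon 2.3.4 in `machine` tracking mode (no GPU, PUE 1.0). A minimal sketch of how such a measurement is typically set up is shown below; the `train()` call is a placeholder, not the actual training script behind this run.

```python
from codecarbon import EmissionsTracker

# Track the whole machine (CPU + RAM; no GPU in this run) and report kg CO2eq.
tracker = EmissionsTracker(
    project_name="ft_32_1e6_mlm_cv_emissions_tracker",
    tracking_mode="machine",
)

tracker.start()
try:
    train()  # placeholder for the fine-tuning loop
finally:
    emissions_kg = tracker.stop()  # emissions for the tracked block, in kg CO2eq

print(f"Emissions: {emissions_kg} kg CO2eq")
```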

## Environmental Impact (for one core)

| Metric                  | Value              |
|-------------------------|--------------------|
| CPU energy (kWh)        | 0.1619161112262368 |
| Emissions (CO2eq in kg) | 0.0329439706823945 |
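
CodeCarbon converts energy to emissions using the carbon intensity of the local grid; dividing the totals in the default table gives the intensity implied for this run in Switzerland. The snippet below simply reproduces that arithmetic from the values reported above.

```python
# Values copied from the "CODE CARBON DEFAULT" table above.
energy_consumed_kwh = 1.080605682279255  # CPU + RAM energy
emissions_kg = 0.0508976082410351        # reported emissions, kg CO2eq

intensity = emissions_kg / energy_consumed_kwh
print(f"Implied carbon intensity: {intensity:.4f} kg CO2eq/kWh")  # ~0.047, i.e. ~47 g/kWh
```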

## Note

21 May 2024

## My Config

| Config            | Value                         |
|-------------------|-------------------------------|
| checkpoint        | damgomz/ThunBERT_bs16_lr5_MLM |
| model_name        | ft_32_1e6_mlm_cv              |
| sequence_length   | 400                           |
| num_epoch         | 6                             |
| learning_rate     | 1e-06                         |
| batch_size        | 32                            |
| weight_decay      | 0.0                           |
| warm_up_prop      | 0.0                           |
| drop_out_prob     | 0.1                           |
| packing_length    | 100                           |
| train_test_split  | 0.2                           |
| num_steps         | 32586                         |
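
A rough sketch of how these hyperparameters could map onto a standard `transformers` fine-tuning setup is given below. It is illustrative only: the dataset, collator, and `Trainer` wiring are assumptions rather than the actual script behind this run.

```python
from transformers import (
    AutoConfig,
    AutoModelForMaskedLM,
    AutoTokenizer,
    TrainingArguments,
)

checkpoint = "damgomz/ThunBERT_bs16_lr5_MLM"

# drop_out_prob from the table; the remaining architecture settings come from the checkpoint.
config = AutoConfig.from_pretrained(checkpoint, hidden_dropout_prob=0.1)
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForMaskedLM.from_pretrained(checkpoint, config=config)

# Hyperparameters mirroring the table (warm_up_prop -> warmup_ratio);
# inputs would be tokenized with max_length=400 (sequence_length).
training_args = TrainingArguments(
    output_dir="ft_32_1e6_mlm_cv",
    num_train_epochs=6,
    learning_rate=1e-6,
    per_device_train_batch_size=32,
    weight_decay=0.0,
    warmup_ratio=0.0,
)
```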

## Training and Testing steps

| Epoch | Train Loss | Test Loss | Accuracy | Recall   |
|-------|------------|-----------|----------|----------|
| 0     | 0.617806   | 0.538588  | 0.735561 | 0.825708 |
| 1     | 0.488718   | 0.449449  | 0.789484 | 0.849987 |
| 2     | 0.413039   | 0.395091  | 0.815557 | 0.857885 |
| 3     | 0.367947   | 0.374194  | 0.826311 | 0.848885 |
| 4     | 0.343483   | 0.362730  | 0.832203 | 0.865677 |
| 5     | 0.325950   | 0.357382  | 0.836770 | 0.847558 |
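
Accuracy and recall per epoch are presumably evaluated on the held-out split (train_test_split = 0.2 above); the exact metric definitions are not documented here. A generic sketch with scikit-learn would look like the following, with `y_true` and `y_pred` standing in for the test labels and model predictions.

```python
from sklearn.metrics import accuracy_score, recall_score

# Placeholder labels/predictions; in practice these come from the 20% test split.
y_true = [1, 0, 1, 1, 0]
y_pred = [1, 0, 0, 1, 0]

print("Accuracy:", accuracy_score(y_true, y_pred))
print("Recall:  ", recall_score(y_true, y_pred))
```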