---
language: en
tags:
- fill-mask
kwargs:
timestamp: '2024-05-21T06:51:24'
project_name: ft_bs32_lr7_base_x8_emissions_tracker
run_id: d86ab92c-42c5-46ae-a58f-cb705b0a7a8b
duration: 29443.913482666016
emissions: 0.0192615910805578
emissions_rate: 6.54179040836314e-07
cpu_power: 42.5
gpu_power: 0.0
ram_power: 7.5
cpu_energy: 0.3476012287669722
gpu_energy: 0
ram_energy: 0.0613410671621561
energy_consumed: 0.4089422959291282
country_name: Switzerland
country_iso_code: CHE
region: .nan
cloud_provider: .nan
cloud_region: .nan
os: Linux-5.14.0-70.30.1.el9_0.x86_64-x86_64-with-glibc2.34
python_version: 3.10.4
codecarbon_version: 2.3.4
cpu_count: 3
cpu_model: Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz
gpu_count: .nan
gpu_model: .nan
longitude: .nan
latitude: .nan
ram_total_size: 20
tracking_mode: machine
on_cloud: N
pue: 1.0
---
## Environmental Impact (CodeCarbon default)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 29443.913482666016 |
| Emissions (CO2eq in kg)  | 0.0192615910805578 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 7.5 |
| CPU energy (kWh) | 0.3476012287669722 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0613410671621561 |
| Consumed energy (kWh) | 0.4089422959291282 |
| Country name | Switzerland |
| Cloud provider           | N/A (not on cloud) |
| Cloud region             | N/A (not on cloud) |
| CPU count | 3 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count                | [No GPU] |
| GPU model                | [No GPU] |
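
The figures above were recorded with CodeCarbon (v2.3.4, per the metadata header). As a minimal sketch, an `EmissionsTracker` wired around the training run as shown below would produce this kind of report; the `train()` placeholder stands in for the actual fine-tuning loop and is not part of this repository.

```python
# Minimal CodeCarbon sketch (assumption: not the exact script used for this run).
from codecarbon import EmissionsTracker

def train():
    # Placeholder for the actual fine-tuning loop.
    pass

tracker = EmissionsTracker(
    project_name="ft_bs32_lr7_base_x8_emissions_tracker",  # matches the metadata above
    tracking_mode="machine",                               # as reported in the header
)
tracker.start()
try:
    train()
finally:
    emissions_kg = tracker.stop()  # returns estimated emissions in kg CO2eq
    print(f"Estimated emissions: {emissions_kg} kg CO2eq")
```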
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.056679533454132076 |
| Emissions (CO2eq in kg)  | 0.011532199447377522 |
## Note
20 May 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_bs32_lr7_base_x8 |
| sequence_length | 400 |
| num_epoch | 20 |
| learning_rate | 5e-07 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 108600 |
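
As a hedged illustration, the configuration above corresponds to a standard masked-language-modelling fine-tune of `albert-base-v2` with the Hugging Face `transformers` Trainer. The sketch below only mirrors the hyperparameters from the table; the dataset, the `packing_length` logic, and the train/test split pipeline are not shown, and the masking probability and output directory are assumptions rather than values from this card.

```python
# Sketch of the fine-tuning setup implied by the config table
# (assumption: the actual training script may differ).
from transformers import (
    AutoModelForMaskedLM,
    AutoTokenizer,
    DataCollatorForLanguageModeling,
    TrainingArguments,
)

checkpoint = "albert-base-v2"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForMaskedLM.from_pretrained(checkpoint)

# Masked-LM collator; the 0.15 masking probability is an assumption,
# it is not listed in the config table.
data_collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm_probability=0.15)

training_args = TrainingArguments(
    output_dir="ft_bs32_lr7_base_x8",   # assumed output directory
    num_train_epochs=20,
    learning_rate=5e-07,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    weight_decay=0.0,
    warmup_ratio=0.0,
)
```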
## Training and Testing steps
| Epoch | Train Loss | Test Loss | Accuracy | Recall |
|---|---|---|---|---|
| 0 | 0.599385 | 0.533520 | 0.732695 | 0.743865 |
| 1 | 0.497255 | 0.495337 | 0.756996 | 0.874233 |
| 2 | 0.456973 | 0.457591 | 0.777614 | 0.812883 |
| 3 | 0.428078 | 0.435462 | 0.792342 | 0.811350 |
| 4 | 0.405985 | 0.418146 | 0.806333 | 0.865031 |
| 5 | 0.386763 | 0.402823 | 0.818851 | 0.852761 |
| 6 | 0.370968 | 0.398841 | 0.818115 | 0.819018 |
| 7 | 0.361504 | 0.389461 | 0.822533 | 0.865031 |
| 8 | 0.348315 | 0.386434 | 0.828424 | 0.881902 |
| 9 | 0.339924 | 0.381690 | 0.829897 | 0.820552 |
| 10 | 0.333508 | 0.379336 | 0.829161 | 0.869632 |
| 11 | 0.327714 | 0.375907 | 0.831370 | 0.860429 |
| 12 | 0.319972 | 0.372091 | 0.835052 | 0.861963 |
| 13 | 0.311965 | 0.373268 | 0.833579 | 0.829755 |
| 14 | 0.307354 | 0.374971 | 0.834315 | 0.835890 |
| 15 | 0.303944 | 0.373268 | 0.835052 | 0.874233 |
| 16 | 0.297742 | 0.387149 | 0.831370 | 0.906442 |
| 17 | 0.288179 | 0.376481 | 0.837997 | 0.878834 |
| 18 | 0.284836 | 0.380563 | 0.834315 | 0.892638 |
| 19 | 0.279182 | 0.376233 | 0.835788 | 0.843558 |
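
Since the card is tagged `fill-mask`, the fine-tuned checkpoint can presumably be queried through the standard pipeline. The repository id below (`damgomz/ft_bs32_lr7_base_x8`) is inferred from the model name and uploader and should be verified against the actual repo before use.

```python
# Usage sketch (assumption: repo id inferred from the card, verify before use).
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="damgomz/ft_bs32_lr7_base_x8")

# Use the tokenizer's own mask token to build the masked sentence.
masked = f"The capital of Switzerland is {fill_mask.tokenizer.mask_token}."
print(fill_mask(masked))
```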