---
language: en
tags:
- fill-mask
kwargs:
timestamp: '2024-05-21T06:31:25'
project_name: ft_bs16_lr7_mlm_emissions_tracker
run_id: b6614c15-0b17-42cf-a4e3-7b88ff581e67
duration: 31470.129777431488
emissions: 0.0190430699900628
emissions_rate: 6.051157120972355e-07
cpu_power: 42.5
gpu_power: 0.0
ram_power: 3.75
cpu_energy: 0.3715217473053264
gpu_energy: 0
ram_energy: 0.0327811335265635
energy_consumed: 0.4043028808318904
country_name: Switzerland
country_iso_code: CHE
region: .nan
cloud_provider: .nan
cloud_region: .nan
os: Linux-5.14.0-70.30.1.el9_0.x86_64-x86_64-with-glibc2.34
python_version: 3.10.4
codecarbon_version: 2.3.4
cpu_count: 2
cpu_model: Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz
gpu_count: .nan
gpu_model: .nan
longitude: .nan
latitude: .nan
ram_total_size: 10
tracking_mode: machine
on_cloud: N
pue: 1.0
---
## Environmental Impact (CodeCarbon default)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 31470.129777431488 |
| Emissions (CO2eq in kg)  | 0.0190430699900628 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.3715217473053264 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0327811335265635 |
| Consumed energy (kWh) | 0.4043028808318904 |
| Country name | Switzerland |
| Cloud provider | N/A |
| Cloud region | N/A |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | N/A |
| GPU model | N/A |
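
These figures were produced by CodeCarbon (v2.3.4, project `ft_bs16_lr7_mlm_emissions_tracker`, machine tracking mode). As a rough sketch only, assuming the standard `EmissionsTracker` API and a placeholder `train()` function (the actual training script is not included in this card), the tracking looks roughly like:

```python
# Rough sketch (assumption): how a run like this is typically wrapped with
# CodeCarbon; matches the project_name and tracking_mode reported above.
from codecarbon import EmissionsTracker

def train():
    ...  # placeholder for the actual fine-tuning loop (not part of this card)

tracker = EmissionsTracker(
    project_name="ft_bs16_lr7_mlm_emissions_tracker",
    tracking_mode="machine",  # card reports tracking_mode: machine
)
tracker.start()
try:
    train()
finally:
    emissions_kg = tracker.stop()  # estimated CO2eq in kg (~0.019 for this run)
    print(f"Estimated emissions: {emissions_kg} kg CO2eq")
```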
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.06057999982155562 |
| Emissions (CO2eq in kg)  | 0.012325800829494 |
## Note
20 May 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | damgomz/ThunBERT_bs16_lr5_MLM |
| model_name | ft_bs16_lr7_mlm |
| sequence_length | 400 |
| num_epoch | 15 |
| learning_rate | 5e-07 |
| batch_size | 16 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 81450 |
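
The training script itself is not part of this card. The sketch below is purely illustrative: it only shows how the hyperparameters above could be wired into a BERT-style setup; AdamW, the dropout override, and the omitted data pipeline are assumptions, not confirmed by the card.

```python
# Illustrative sketch only (assumption): mirrors the config table above.
# AdamW and the dropout override assume a BERT-style model; the card does
# not confirm either.
import torch
from transformers import AutoModel, AutoTokenizer

config = {
    "checkpoint": "damgomz/ThunBERT_bs16_lr5_MLM",
    "sequence_length": 400,
    "num_epoch": 15,
    "learning_rate": 5e-07,
    "batch_size": 16,
    "weight_decay": 0.0,
    "warm_up_prop": 0.0,
    "drop_out_prob": 0.1,
    "train_test_split": 0.2,
}

tokenizer = AutoTokenizer.from_pretrained(config["checkpoint"])
model = AutoModel.from_pretrained(
    config["checkpoint"],
    hidden_dropout_prob=config["drop_out_prob"],  # assumes a BERT-style config
)
optimizer = torch.optim.AdamW(
    model.parameters(),
    lr=config["learning_rate"],
    weight_decay=config["weight_decay"],
)
# The task head, the data pipeline (sequence_length / packing_length), and the
# 15-epoch loop over num_steps = 81450 batches are omitted here.
```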
## Training and Testing steps
| Epoch | Train Loss | Test Loss | Accuracy | Recall |
|---|---|---|---|---|
| 0 | 0.628769 | 0.554894 | 0.730486 | 0.842025 |
| 1 | 0.510461 | 0.486829 | 0.763623 | 0.797546 |
| 2 | 0.449970 | 0.445788 | 0.786451 | 0.888037 |
| 3 | 0.410732 | 0.416862 | 0.807806 | 0.884969 |
| 4 | 0.380523 | 0.396044 | 0.812960 | 0.872699 |
| 5 | 0.359862 | 0.388476 | 0.820324 | 0.909509 |
| 6 | 0.342461 | 0.369396 | 0.834315 | 0.874233 |
| 7 | 0.330469 | 0.362060 | 0.840943 | 0.861963 |
| 8 | 0.319533 | 0.359950 | 0.840943 | 0.889571 |
| 9 | 0.310329 | 0.358102 | 0.843888 | 0.892638 |
| 10 | 0.300148 | 0.363338 | 0.840206 | 0.904908 |
| 11 | 0.291830 | 0.362882 | 0.830633 | 0.791411 |
| 12 | 0.285529 | 0.354668 | 0.840206 | 0.849693 |
| 13 | 0.277152 | 0.358292 | 0.837261 | 0.823620 |
| 14 | 0.264916 | 0.364439 | 0.844624 | 0.897239 |
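
## How to use
Given the `fill-mask` tag, the fine-tuned weights should load like any other masked-language model on the Hub. A minimal usage sketch, assuming the repository id is `damgomz/ft_bs16_lr7_mlm` (inferred from `model_name` and the author namespace) and a BERT-style `[MASK]` token; adjust both if they differ:

```python
# Minimal usage sketch (assumption): the repo id is inferred from model_name
# and the author namespace, and the mask token assumes a BERT-style tokenizer.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="damgomz/ft_bs16_lr7_mlm")
print(fill_mask("The capital of Switzerland is [MASK]."))
```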