---
language: en
tags:
- fill-mask
kwargs:
  timestamp: '2024-05-17T15:06:22'
  project_name: ft_bs64_lr6_base_x4_emissions_tracker
  run_id: 02402ada-d0fb-4b04-b706-3b97861a2ad3
  duration: 18129.05792498589
  emissions: 0.0118596516104099
  emissions_rate: 6.541791448558767e-07
  cpu_power: 42.5
  gpu_power: 0.0
  ram_power: 7.5
  cpu_energy: 0.2140232932743098
  gpu_energy: 0
  ram_energy: 0.0377686349312464
  energy_consumed: 0.2517919282055566
  country_name: Switzerland
  country_iso_code: CHE
  region: .nan
  cloud_provider: .nan
  cloud_region: .nan
  os: Linux-5.14.0-70.30.1.el9_0.x86_64-x86_64-with-glibc2.34
  python_version: 3.10.4
  codecarbon_version: 2.3.4
  cpu_count: 3
  cpu_model: Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz
  gpu_count: .nan
  gpu_model: .nan
  longitude: .nan
  latitude: .nan
  ram_total_size: 20
  tracking_mode: machine
  on_cloud: N
  pue: 1.0
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 18129.05792498589 |
| Emissions (CO2eq in kg)   | 0.0118596516104099 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 7.5 |
| CPU energy (kWh) | 0.2140232932743098 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0377686349312464 |
| Consumed energy (kWh) | 0.2517919282055566 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 3 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.03489843650559783 |
| Emissions (CO2eq in kg)  | 0.00710054768728614 |
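
The figures above were collected with CodeCarbon (version 2.3.4, machine tracking mode, as recorded in the metadata header). The sketch below is a minimal illustration of how such a report can be produced, not the exact tracking script; wrapping the training loop in `tracker.start()`/`tracker.stop()` is an assumption.

```python
from codecarbon import EmissionsTracker

# Track whole-machine power draw and emissions while training runs.
tracker = EmissionsTracker(
    project_name="ft_bs64_lr6_base_x4_emissions_tracker",  # from the metadata header
    tracking_mode="machine",                               # matches tracking_mode above
)
tracker.start()
# ... run the fine-tuning loop here ...
emissions_kg = tracker.stop()  # total emissions in kg CO2eq
print(f"Emissions: {emissions_kg} kg CO2eq")
```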
## Note
17 May 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ft_bs64_lr6_base_x4 |
| sequence_length | 400 |
| num_epoch | 12 |
| learning_rate | 5e-06 |
| batch_size | 64 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 65160 |
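
The configuration above corresponds to fine-tuning `albert-base-v2` with Hugging Face `transformers`. The sketch below is a minimal, hypothetical reconstruction of that setup, not the actual training script; mapping `warm_up_prop` to `warmup_ratio`, `drop_out_prob` to ALBERT's `hidden_dropout_prob`, and the use of `TrainingArguments` are assumptions.

```python
from transformers import AutoModelForMaskedLM, AutoTokenizer, TrainingArguments

checkpoint = "albert-base-v2"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)

# drop_out_prob = 0.1, assumed to map onto ALBERT's hidden dropout.
model = AutoModelForMaskedLM.from_pretrained(checkpoint, hidden_dropout_prob=0.1)

# Hyperparameters taken from the table above; sequence_length = 400 and
# packing_length = 100 would apply at tokenization / data-preparation time.
training_args = TrainingArguments(
    output_dir="ft_bs64_lr6_base_x4",
    num_train_epochs=12,
    learning_rate=5e-6,
    per_device_train_batch_size=64,
    weight_decay=0.0,
    warmup_ratio=0.0,
)
```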
## Training and Testing steps
| Epoch | Train Loss | Test Loss | Accuracy | Recall |
|---|---|---|---|---|
| 0 | 0.494021 | 0.421029 | 0.805596 | 0.891104 |
| 1 | 0.370485 | 0.376504 | 0.825479 | 0.888037 |
| 2 | 0.325639 | 0.374071 | 0.835788 | 0.897239 |
| 3 | 0.287644 | 0.374302 | 0.830633 | 0.906442 |
| 4 | 0.259016 | 0.370951 | 0.849779 | 0.877301 |
| 5 | 0.221142 | 0.409426 | 0.824742 | 0.815951 |
| 6 | 0.192179 | 0.451972 | 0.821797 | 0.852761 |
| 7 | 0.119463 | 0.534428 | 0.812960 | 0.800613 |
| 8 | 0.075451 | 0.618039 | 0.817378 | 0.757669 |
| 9 | 0.051555 | 0.791213 | 0.799705 | 0.702454 |
| 10 | 0.024909 | 0.752735 | 0.809278 | 0.826687 |
| 11 | 0.019373 | 0.822910 | 0.813697 | 0.802147 |
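
This card is tagged `fill-mask`, so the fine-tuned checkpoint can be loaded with a masked-language-modelling pipeline. A minimal usage sketch, assuming the model is published on the Hub as `damgomz/ft_bs64_lr6_base_x4` (adjust the repository id if it differs):

```python
from transformers import pipeline

# Repository id assumed from this card's model_name; swap in the actual Hub id.
fill_mask = pipeline("fill-mask", model="damgomz/ft_bs64_lr6_base_x4")

# ALBERT tokenizers use [MASK] as the mask token.
print(fill_mask("The model was trained on a cluster in [MASK]."))
```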