---
language: en
tags:
- fill-mask
---
## Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | [More Information Needed] |
| Emissions (CO2eq in kg)  | [More Information Needed]       |
| CPU power (W) | [NO CPU] |
| GPU power (W) | [No GPU] |
| RAM power (W) | [More Information Needed] |
| CPU energy (kWh) | [No CPU] |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | [More Information Needed] |
| Consumed energy (kWh) | [More Information Needed] |
| Country name | [More Information Needed] |
| Cloud provider | [No Cloud] |
| Cloud region | [No Cloud] |
| CPU count | [No CPU] |
| CPU model | [No CPU] |
| GPU count | [No GPU] |
| GPU model | [No GPU] |
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | [No CPU] |
| Emissions (CO2eq in kg)  | [More Information Needed]       |
## Note
2 May 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | BERTrand_bs16_lr5_MLM |
| sequence_length | 400 |
| num_epoch | 12 |
| learning_rate | 5e-05 |
| batch_size | 16 |
| weight_decay | 0.0 |
| warm_up_prop | 0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 13033 |
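As a rough consistency check on the configuration above, the implied steps per epoch and training-set size can be estimated. This assumes `num_steps` counts optimizer steps across all 12 epochs, which the card does not state explicitly:

```python
# Config values taken from the table above
num_steps = 13033
num_epoch = 12
batch_size = 16

# Assumption: num_steps spans all epochs, so divide to get steps per epoch
steps_per_epoch = num_steps / num_epoch
print(round(steps_per_epoch))  # ~1086 steps per epoch

# Each step processes one batch, so the training set holds roughly
# steps_per_epoch * batch_size packed sequences
approx_train_samples = round(steps_per_epoch) * batch_size
print(approx_train_samples)  # ~17376 packed sequences
```

With `train_test_split` at 0.2, this would put the full packed dataset at roughly 21,700 sequences, though that figure is an inference rather than a reported value.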
## Training and Testing steps
| Epoch | Train Loss | Test Loss |
|-------|------------|-----------|
| 0.0 | 14.743229 | 13.097182 |
| 0.5 | 7.100540 | 6.967106 |
| 1.0 | 6.948475 | 6.939338 |
| 1.5 | 6.938035 | 6.938765 |
| 2.0 | 6.931950 | 6.935680 |
| 2.5 | 6.923858 | 6.936854 |
| 3.0 | 6.920174 | 6.932032 |
| 3.5 | 6.920139 | 6.914489 |
| 4.0 | 6.911595 | 6.913473 |
| 4.5 | 6.905194 | 6.909112 |
| 5.0 | 6.903432 | 6.907618 |
| 5.5 | 6.901336 | 6.901513 |
| 6.0 | 6.896765 | 6.905171 |
| 6.5 | 6.900726 | 6.894976 |
| 7.0 | 6.881094 | 6.894709 |
| 7.5 | 6.885156 | 6.894880 |
| 8.0 | 6.883777 | 6.893790 |
| 8.5 | 6.884839 | 6.888695 |
| 9.0 | 6.881206 | 6.888314 |
| 9.5 | 6.875793 | 6.884028 |
| 10.0 | 6.871364 | 6.881168 |
| 10.5 | 6.880161 | 6.887640 |
| 11.0 | 6.878505 | 6.883618 |
| 11.5 | 6.873283 | 6.881647 |
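The loss history above can also be scanned programmatically, for example to locate the checkpoint with the lowest test loss. The triples below are copied verbatim from the table:

```python
# (epoch, train_loss, test_loss) triples copied from the table above
history = [
    (0.0, 14.743229, 13.097182), (0.5, 7.100540, 6.967106),
    (1.0, 6.948475, 6.939338),   (1.5, 6.938035, 6.938765),
    (2.0, 6.931950, 6.935680),   (2.5, 6.923858, 6.936854),
    (3.0, 6.920174, 6.932032),   (3.5, 6.920139, 6.914489),
    (4.0, 6.911595, 6.913473),   (4.5, 6.905194, 6.909112),
    (5.0, 6.903432, 6.907618),   (5.5, 6.901336, 6.901513),
    (6.0, 6.896765, 6.905171),   (6.5, 6.900726, 6.894976),
    (7.0, 6.881094, 6.894709),   (7.5, 6.885156, 6.894880),
    (8.0, 6.883777, 6.893790),   (8.5, 6.884839, 6.888695),
    (9.0, 6.881206, 6.888314),   (9.5, 6.875793, 6.884028),
    (10.0, 6.871364, 6.881168),  (10.5, 6.880161, 6.887640),
    (11.0, 6.878505, 6.883618),  (11.5, 6.873283, 6.881647),
]

# Pick the row with the smallest test loss
best_epoch, _, best_test = min(history, key=lambda row: row[2])
print(best_epoch, best_test)  # epoch 10.0, test loss 6.881168
```

Test loss bottoms out at epoch 10.0 and drifts slightly upward afterwards, suggesting the later epochs add little.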