## Environmental Impact (CodeCarbon, default)

| Metric | Value |
|---|---|
| Duration (s) | 39504.639969825745 |
| Emissions (kg CO2eq) | 0.0239048803736773 |
| CPU power (W) | 42.5 |
| GPU power (W) | n/a (no GPU) |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.4663734684311684 |
| GPU energy (kWh) | n/a (no GPU) |
| RAM energy (kWh) | 0.0411503712681432 |
| Consumed energy (kWh) | 0.5075238396993128 |
| Country | Switzerland |
| Cloud provider | n/a |
| Cloud region | n/a |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | n/a |
| GPU model | n/a |
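The energy figures above can be cross-checked from the reported power draws and duration: energy in kWh is power in watts times duration in seconds, divided by 3.6 × 10⁶ J/kWh. A minimal sketch with values copied from the table (the implied grid carbon intensity at the end is an inference from these numbers, not a CodeCarbon output):

```python
# Recompute the CodeCarbon energy figures from power draw and duration.
# Energy (kWh) = power (W) * duration (s) / 3.6e6 (J per kWh).
duration_s = 39504.639969825745
cpu_power_w = 42.5
ram_power_w = 3.75

cpu_kwh = cpu_power_w * duration_s / 3.6e6
ram_kwh = ram_power_w * duration_s / 3.6e6
total_kwh = cpu_kwh + ram_kwh  # no GPU, so CPU + RAM only

# Implied grid carbon intensity (kg CO2eq per kWh) for this run
emissions_kg = 0.0239048803736773
intensity = emissions_kg / total_kwh

print(f"CPU {cpu_kwh:.4f} kWh, RAM {ram_kwh:.4f} kWh, total {total_kwh:.4f} kWh")
print(f"implied intensity ~{intensity:.3f} kg CO2eq/kWh")
```

The recomputed values agree with the table to within rounding, and the implied intensity (~0.047 kg CO2eq/kWh) is consistent with Switzerland's low-carbon grid.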
## Environmental Impact (for one core)

| Metric | Value |
|---|---|
| CPU energy (kWh) | 0.07604643194191456 |
| Emissions (kg CO2eq) | 0.015472650654848416 |
Note: 19 June 2024.
## My Config

| Config | Value |
|---|---|
| checkpoint | albert-base-v2 |
| model_name | BERTrand_base_x12 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.8e-05 |
| batch_size | 8 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 36660 |
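The configuration above can be captured as a small typed object before being handed to a training loop. This is a sketch mirroring the table, not the author's actual code; the class name `TrainConfig` is hypothetical:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TrainConfig:
    """Hyperparameters mirroring the config table above."""
    checkpoint: str = "albert-base-v2"
    model_name: str = "BERTrand_base_x12"
    sequence_length: int = 400
    num_epoch: int = 6
    learning_rate: float = 1.8e-05
    batch_size: int = 8
    weight_decay: float = 0.0
    warm_up_prop: float = 0.0
    drop_out_prob: float = 0.1
    packing_length: int = 100
    train_test_split: float = 0.2

cfg = TrainConfig()
print(cfg.checkpoint, cfg.learning_rate)
```

Freezing the dataclass makes the run configuration immutable, so a logged config always matches what was actually trained.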
## Training and Testing Steps

| Epoch | Train Loss | Test Loss | F-beta Score | TN | FP | FN | TP |
|---|---|---|---|---|---|---|---|
| 0 | 0.000000 | 0.691535 | 0.003252 | 753 | 9 | 764 | 2 |
| 1 | 0.306302 | 0.254817 | 0.931034 | 640 | 122 | 37 | 729 |
| 2 | 0.213653 | 0.277663 | 0.916473 | 658 | 104 | 55 | 711 |
| 3 | 0.187680 | 0.269253 | 0.896732 | 687 | 75 | 80 | 686 |
| 4 | 0.149168 | 0.265883 | 0.931210 | 632 | 130 | 35 | 731 |
| 5 | 0.106768 | 0.276629 | 0.927544 | 656 | 106 | 44 | 722 |
| 6 | 0.074045 | 0.330394 | 0.924080 | 637 | 125 | 43 | 723 |
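The F-beta scores in the table can be reproduced from the confusion-matrix counts using F_beta = (1 + beta²)·TP / ((1 + beta²)·TP + beta²·FN + FP). The reported values match beta = 2 (recall-weighted) exactly; that choice of beta is inferred from the numbers, not stated in the source:

```python
def fbeta_from_counts(tp: int, fp: int, fn: int, beta: float = 2.0) -> float:
    """F-beta score computed directly from confusion-matrix counts."""
    b2 = beta ** 2
    return (1 + b2) * tp / ((1 + b2) * tp + b2 * fn + fp)

# Epoch 1 row: TP=729, FP=122, FN=37
print(round(fbeta_from_counts(729, 122, 37), 6))  # 0.931034

# Epoch 4 row: TP=731, FP=130, FN=35
print(round(fbeta_from_counts(731, 130, 35), 6))  # 0.93121
```

With beta = 2, recall counts four times as much as precision, which explains why epoch 4 (FN = 35) edges out epoch 3 (FN = 80) despite epoch 3's better precision.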