Environmental Impact (CodeCarbon default)
Metric | Value |
---|---|
Duration (s) | 14284.39 (≈ 3 h 58 min) |
Emissions (kg CO2eq) | 0.009345 |
CPU power (W) | 42.5 |
GPU power (W) | n/a (no GPU) |
RAM power (W) | 7.5 |
CPU energy (kWh) | 0.168635 |
GPU energy (kWh) | n/a (no GPU) |
RAM energy (kWh) | 0.029759 |
Consumed energy (kWh) | 0.198394 |
Country name | Switzerland |
Cloud provider | n/a |
Cloud region | n/a |
CPU count | 3 |
CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
GPU count | n/a |
GPU model | n/a |
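As a sanity check, the energy figures in the table follow directly from the reported power draws and duration, and the implied grid carbon intensity (~47 g CO2eq/kWh) is consistent with Switzerland's low-carbon electricity mix. This is a rough cross-check against the reported values, not part of the CodeCarbon output:

```python
# Cross-check of the CodeCarbon figures above (inputs copied from the table).
duration_s = 14284.394858121872   # run duration in seconds
cpu_power_w = 42.5                # reported CPU power draw
ram_power_w = 7.5                 # reported RAM power draw

# Energy (kWh) = power (W) * time (s) / 3.6e6
cpu_energy_kwh = cpu_power_w * duration_s / 3.6e6
ram_energy_kwh = ram_power_w * duration_s / 3.6e6
consumed_kwh = cpu_energy_kwh + ram_energy_kwh

print(round(cpu_energy_kwh, 6))   # ~0.168635, matches the table
print(round(consumed_kwh, 6))     # ~0.198394, matches the table

# Implied grid carbon intensity (kg CO2eq per kWh)
emissions_kg = 0.0093445332329461
print(round(emissions_kg / consumed_kwh, 4))  # ~0.0471 kg CO2eq/kWh
```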
Environmental Impact (for one core)
Metric | Value |
---|---|
CPU energy (kWh) | 0.027497 |
Emissions (kg CO2eq) | 0.005595 |
Note: 17 May 2024
My Config
Config | Value |
---|---|
checkpoint | albert-base-v2 |
model_name | ft_bs64_lr6_base_x2 |
sequence_length | 400 |
num_epoch | 6 |
learning_rate | 5e-06 |
batch_size | 64 |
weight_decay | 0.0 |
warm_up_prop | 0.0 |
drop_out_prob | 0.1 |
packing_length | 100 |
train_test_split | 0.2 |
num_steps | 32580 |
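The config above implicitly pins down the training-set size. Assuming the usual relation `num_steps = num_epoch × ceil(train_size / batch_size)` (an assumption, not stated in the card), the 32,580 optimizer steps work out to 5,430 steps per epoch and roughly 347,000 training examples:

```python
# Back-of-the-envelope inference from the config table above.
# The step/size relation is an assumption, not taken from the card.
num_steps = 32580
num_epoch = 6
batch_size = 64

steps_per_epoch = num_steps // num_epoch           # 5430
train_examples_max = steps_per_epoch * batch_size  # 347520, upper bound (last batch may be partial)

# With train_test_split = 0.2, the training split is 80% of the data,
# so the full dataset would be on the order of ~434k rows.
full_dataset_approx = train_examples_max / 0.8

print(steps_per_epoch, train_examples_max, int(full_dataset_approx))
```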
Training and Testing Steps
Epoch | Train Loss | Test Loss | Accuracy | Recall |
---|---|---|---|---|
0 | 0.509996 | 0.432674 | 0.800442 | 0.815951 |
1 | 0.378199 | 0.369624 | 0.834315 | 0.825153 |
2 | 0.330051 | 0.377433 | 0.835788 | 0.924847 |
3 | 0.294745 | 0.346708 | 0.849779 | 0.878834 |
4 | 0.271853 | 0.396932 | 0.829897 | 0.757669 |
5 | 0.212074 | 0.384297 | 0.846097 | 0.848160 |
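Test loss bottoms out at epoch 3 while training loss keeps falling, a typical early-overfitting pattern. A minimal sketch for picking the best checkpoint from the rows above (values copied verbatim from the table):

```python
# (epoch, train_loss, test_loss, accuracy, recall) rows from the table above
results = [
    (0, 0.509996, 0.432674, 0.800442, 0.815951),
    (1, 0.378199, 0.369624, 0.834315, 0.825153),
    (2, 0.330051, 0.377433, 0.835788, 0.924847),
    (3, 0.294745, 0.346708, 0.849779, 0.878834),
    (4, 0.271853, 0.396932, 0.829897, 0.757669),
    (5, 0.212074, 0.384297, 0.846097, 0.848160),
]

# Best epoch by held-out accuracy; here it coincides with the lowest test loss.
best = max(results, key=lambda r: r[3])
print(best[0], best[3])  # epoch 3, accuracy 0.849779
```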