Environmental Impact (CODE CARBON DEFAULT)
| Metric | Value |
|---|---|
| Duration (in seconds) | 95635.66425299644 |
| Emissions (CO2eq in kg) | 0.0578705814908073 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 1.1290293123470423 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0996193679528929 |
| Consumed energy (kWh) | 1.2286486802999383 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
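
The energy figures above follow the usual CodeCarbon convention of integrating average component power over the run duration, with emissions obtained from the national grid carbon intensity. A minimal sketch of that arithmetic is shown below; the Swiss carbon-intensity value is inferred from the reported emissions and consumed energy, not stated in the card.

```python
# Sketch of the CodeCarbon-style arithmetic behind the table above.
# CARBON_INTENSITY is an assumption inferred as emissions / consumed energy.
DURATION_S = 95635.66425299644          # run duration in seconds
CPU_POWER_W = 42.5                      # average CPU power draw
RAM_POWER_W = 3.75                      # average RAM power draw
CARBON_INTENSITY_KG_PER_KWH = 0.0471    # ~Switzerland, inferred (assumption)

def energy_kwh(power_w: float, duration_s: float) -> float:
    """Convert an average power (W) sustained over a duration (s) to energy in kWh."""
    return power_w * duration_s / 3_600_000

cpu_energy = energy_kwh(CPU_POWER_W, DURATION_S)        # ~1.129 kWh
ram_energy = energy_kwh(RAM_POWER_W, DURATION_S)        # ~0.100 kWh
consumed = cpu_energy + ram_energy                      # ~1.229 kWh (no GPU)
emissions_kg = consumed * CARBON_INTENSITY_KG_PER_KWH   # ~0.058 kg CO2eq

print(f"CPU {cpu_energy:.4f} kWh, RAM {ram_energy:.4f} kWh, "
      f"total {consumed:.4f} kWh, {emissions_kg:.4f} kg CO2eq")
```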
Environmental Impact (for one core)
| Metric | Value |
|---|---|
| CPU energy (kWh) | 0.18409865368701817 |
| Emissions (CO2eq in kg) | 0.03745730183242361 |
Note
19 June 2024
My Config
| Config | Value |
|---|---|
| checkpoint | albert-base-v2 |
| model_name | ft_4_11e6_base_x4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1.1e-05 |
| batch_size | 4 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 29328 |
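
The configuration above maps onto a standard Hugging Face fine-tuning setup. The sketch below translates those values into `transformers.TrainingArguments`; the dataset, number of labels, and evaluation details are not specified in the card, so they are placeholders and labeled as assumptions.

```python
# Hypothetical fine-tuning sketch matching the config table; the dataset and
# num_labels are assumptions, not taken from the card.
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

checkpoint = "albert-base-v2"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(
    checkpoint,
    num_labels=2,                    # assumption: binary classification
    classifier_dropout_prob=0.1,     # drop_out_prob
)

args = TrainingArguments(
    output_dir="ft_4_11e6_base_x4",  # model_name
    num_train_epochs=6,              # num_epoch
    learning_rate=1.1e-5,            # learning_rate
    per_device_train_batch_size=4,   # batch_size
    weight_decay=0.0,                # weight_decay
    warmup_ratio=0.0,                # warm_up_prop
    eval_strategy="epoch",           # `evaluation_strategy` on older transformers
)

# train_dataset / eval_dataset would come from an 80/20 split (train_test_split = 0.2)
# with inputs tokenized to sequence_length = 400.
# trainer = Trainer(model=model, args=args,
#                   train_dataset=train_dataset, eval_dataset=eval_dataset)
# trainer.train()
```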
Training and Testing steps
| Epoch | Train Loss | Test Loss | F-beta Score |
|---|---|---|---|
| 0 | 0.000000 | 0.735825 | 0.214299 |
| 1 | 0.294253 | 0.237996 | 0.884702 |
| 2 | 0.195713 | 0.225240 | 0.927937 |
| 3 | 0.143365 | 0.248854 | 0.897371 |
| 4 | 0.100420 | 0.290587 | 0.902752 |
| 5 | 0.066621 | 0.303921 | 0.915935 |
| 6 | 0.044707 | 0.403602 | 0.876595 |
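
Epoch 2 gives both the lowest test loss and the highest F-beta score in the table. The card does not state which beta the F-beta score uses or how labels are encoded, so the evaluation sketch below treats both as assumptions.

```python
# Hypothetical evaluation sketch; the beta value and label encoding are
# assumptions, since the card does not specify them.
from sklearn.metrics import fbeta_score

def evaluate_fbeta(y_true, y_pred, beta=2.0):
    """Compute the F-beta score for binary predictions (beta=2.0 is an assumption)."""
    return fbeta_score(y_true, y_pred, beta=beta)

# Example usage with dummy labels and predictions:
y_true = [0, 1, 1, 0, 1, 1]
y_pred = [0, 1, 0, 0, 1, 1]
print(f"F-beta: {evaluate_fbeta(y_true, y_pred):.3f}")
```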