## Environmental Impact (CodeCarbon default)
| Metric | Value |
|---|---|
| Duration (seconds) | 84055.49200820923 |
| Emissions (CO2eq in kg) | 0.050863255182789 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 3.75 |
| CPU energy (kWh) | 0.992319499941666 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0875568335287273 |
| Consumed energy (kWh) | 1.0798763334703938 |
| Country name | Switzerland |
| Cloud provider | n/a |
| Cloud region | n/a |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | n/a |
| GPU model | n/a |
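As a sanity check on the table above, the consumed energy should be the sum of the CPU and RAM energy, and dividing the reported emissions by that total gives the implied grid carbon intensity (roughly 47 g CO2eq/kWh, consistent with Switzerland's low-carbon grid). A minimal sketch:

```python
# Values copied from the CodeCarbon table above.
cpu_energy_kwh = 0.992319499941666
ram_energy_kwh = 0.0875568335287273
emissions_kg = 0.050863255182789

# Consumed energy should equal CPU + RAM energy (no GPU on this run).
consumed_kwh = cpu_energy_kwh + ram_energy_kwh
print(f"Consumed energy: {consumed_kwh:.6f} kWh")

# Implied carbon intensity of the electricity used, in g CO2eq per kWh.
intensity_g_per_kwh = emissions_kg / consumed_kwh * 1000
print(f"Implied carbon intensity: {intensity_g_per_kwh:.1f} g CO2eq/kWh")
```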
## Environmental Impact (for one core)

| Metric | Value |
|---|---|
| CPU energy (kWh) | 0.16180682211580277 |
| Emissions (CO2eq in kg) | 0.032921734369881946 |
## Note

21 May 2024
## My Config

| Config | Value |
|---|---|
| checkpoint | albert-base-v2 |
| model_name | ft_32_1e6_base |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 1e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 32586 |
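The `num_steps` value can be cross-checked against the other settings: 32586 total steps over 6 epochs is 5431 steps per epoch, which at `batch_size` 32 implies roughly 173.8k training examples, or about 217k examples before the 80/20 split. A sketch of that arithmetic, under the assumption that `num_steps` counts optimizer steps across all epochs:

```python
# Values copied from the config table above.
num_steps = 32586
num_epoch = 6
batch_size = 32
train_test_split = 0.2

steps_per_epoch = num_steps // num_epoch                 # 5431 steps per epoch
approx_train_examples = steps_per_epoch * batch_size     # upper bound if the last batch is partial
approx_total_examples = approx_train_examples / (1 - train_test_split)
print(steps_per_epoch, approx_train_examples, round(approx_total_examples))
```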
## Training and Testing Steps

| Epoch | Train Loss | Test Loss | Accuracy | Recall |
|---|---|---|---|---|
| 0 | 0.601054 | 0.541133 | 0.737178 | 0.841699 |
| 1 | 0.497217 | 0.475979 | 0.778429 | 0.853965 |
| 2 | 0.427722 | 0.442907 | 0.795959 | 0.866545 |
| 3 | 0.370634 | 0.431111 | 0.797575 | 0.823108 |
| 4 | 0.322800 | 0.436151 | 0.801262 | 0.808254 |
| 5 | 0.273025 | 0.449454 | 0.796988 | 0.803851 |
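From the table above, test loss bottoms out at epoch 3 while accuracy peaks at epoch 4; the widening gap between train and test loss after epoch 3 suggests the model begins to overfit. A small sketch that picks those epochs programmatically from the reported metrics:

```python
# (epoch, train_loss, test_loss, accuracy, recall) rows copied from the table above.
results = [
    (0, 0.601054, 0.541133, 0.737178, 0.841699),
    (1, 0.497217, 0.475979, 0.778429, 0.853965),
    (2, 0.427722, 0.442907, 0.795959, 0.866545),
    (3, 0.370634, 0.431111, 0.797575, 0.823108),
    (4, 0.322800, 0.436151, 0.801262, 0.808254),
    (5, 0.273025, 0.449454, 0.796988, 0.803851),
]

best_by_loss = min(results, key=lambda r: r[2])  # epoch with the lowest test loss
best_by_acc = max(results, key=lambda r: r[3])   # epoch with the highest accuracy
print(f"Lowest test loss at epoch {best_by_loss[0]}, "
      f"highest accuracy at epoch {best_by_acc[0]}")
```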