
## Environmental Impact (CODE CARBON DEFAULT)

| Metric | Value |
|---|---|
| Duration (in seconds) | 29669.0817964077 |
| Emissions (CO2eq in kg) | 0.0194088841146787 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 7.5 |
| CPU energy (kWh) | 0.3502593684590531 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0618101017152269 |
| Consumed energy (kWh) | 0.4120694701742792 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 3 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
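The figures above are internally consistent: CodeCarbon estimates energy as power × duration, so the CPU and RAM energies can be recomputed from the reported power draws and run duration (a sanity-check sketch, assuming constant power over the run; small rounding differences versus the tracker's integrated measurements are expected):

```python
# Recompute the CodeCarbon energy figures: energy (kWh) = power (W) * duration (s) / 3.6e6
duration_s = 29669.0817964077
cpu_power_w = 42.5
ram_power_w = 7.5

cpu_kwh = cpu_power_w * duration_s / 3.6e6   # ~0.3503 kWh, matching the table
ram_kwh = ram_power_w * duration_s / 3.6e6   # ~0.0618 kWh, matching the table
total_kwh = cpu_kwh + ram_kwh                # no GPU, so GPU energy contributes nothing

print(f"CPU: {cpu_kwh:.4f} kWh, RAM: {ram_kwh:.4f} kWh, total: {total_kwh:.4f} kWh")
```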

## Environmental Impact (for one core)

| Metric | Value |
|---|---|
| CPU energy (kWh) | 0.05711298245808482 |
| Emissions (CO2eq in kg) | 0.011620390370259682 |

## Note

20 May 2024

## My Config

| Config | Value |
|---|---|
| checkpoint | albert-base-v2 |
| model_name | ft_bs32_1lr6_base_x8 |
| sequence_length | 400 |
| num_epoch | 20 |
| learning_rate | 1e-06 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 108600 |
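The config values fit together arithmetically. Assuming `num_steps` counts optimizer steps accumulated over all epochs with no gradient accumulation (an assumption, not stated in the card), the steps per epoch and an approximate training-set size fall out directly:

```python
# Back-of-the-envelope check on the config above.
# Assumption: num_steps is the total optimizer-step count across all epochs,
# with one step per batch and no gradient accumulation.
num_steps = 108600
num_epoch = 20
batch_size = 32

steps_per_epoch = num_steps // num_epoch        # 108600 / 20 = 5430
approx_train_examples = steps_per_epoch * batch_size  # upper bound; the last batch may be partial

print(steps_per_epoch, approx_train_examples)
```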

## Training and Testing steps

| Epoch | Train Loss | Test Loss | Accuracy | Recall |
|---|---|---|---|---|
| 0 | 0.563129 | 0.502131 | 0.748159 | 0.777607 |
| 1 | 0.467373 | 0.454147 | 0.782769 | 0.842025 |
| 2 | 0.418498 | 0.435779 | 0.794551 | 0.907975 |
| 3 | 0.379170 | 0.403679 | 0.811487 | 0.895706 |
| 4 | 0.358712 | 0.382256 | 0.827688 | 0.858896 |
| 5 | 0.340088 | 0.380777 | 0.834315 | 0.880368 |
| 6 | 0.326862 | 0.395078 | 0.823270 | 0.897239 |
| 7 | 0.314514 | 0.419026 | 0.816642 | 0.929448 |
| 8 | 0.302010 | 0.378412 | 0.832842 | 0.834356 |
| 9 | 0.293725 | 0.385449 | 0.824006 | 0.797546 |
| 10 | 0.286153 | 0.380928 | 0.835052 | 0.874233 |
| 11 | 0.267783 | 0.388242 | 0.836524 | 0.877301 |
| 12 | 0.255809 | 0.398119 | 0.830633 | 0.831288 |
| 13 | 0.245926 | 0.413752 | 0.819588 | 0.797546 |
| 14 | 0.236472 | 0.416892 | 0.815906 | 0.794479 |
| 15 | 0.223494 | 0.431361 | 0.830633 | 0.872699 |
| 16 | 0.207387 | 0.438017 | 0.815169 | 0.808282 |
| 17 | 0.198799 | 0.445411 | 0.819588 | 0.819018 |
| 18 | 0.182488 | 0.460939 | 0.821060 | 0.837423 |
| 19 | 0.168675 | 0.513154 | 0.817378 | 0.900307 |
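In the log above, train loss falls monotonically while test loss bottoms out around epoch 8 and then climbs, a typical overfitting signature. A minimal sketch for picking a checkpoint by minimum test loss (the per-epoch values are copied from the table):

```python
# Select the best epoch from the training log by minimum test loss.
test_loss = [0.502131, 0.454147, 0.435779, 0.403679, 0.382256, 0.380777,
             0.395078, 0.419026, 0.378412, 0.385449, 0.380928, 0.388242,
             0.398119, 0.413752, 0.416892, 0.431361, 0.438017, 0.445411,
             0.460939, 0.513154]
accuracy = [0.748159, 0.782769, 0.794551, 0.811487, 0.827688, 0.834315,
            0.823270, 0.816642, 0.832842, 0.824006, 0.835052, 0.836524,
            0.830633, 0.819588, 0.815906, 0.830633, 0.815169, 0.819588,
            0.821060, 0.817378]

best = min(range(len(test_loss)), key=test_loss.__getitem__)
print(f"best epoch by test loss: {best} "
      f"(test loss {test_loss[best]}, accuracy {accuracy[best]})")
```

Note that the epoch with the lowest test loss (epoch 8) is not the one with the highest accuracy (epoch 11); which criterion to use depends on the downstream task.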
Model size: 61.3M params (Safetensors, F32)