
## Environmental Impact (CODE CARBON DEFAULT)

| Metric | Value |
|--------|-------|
| Duration (in seconds) | 26423.72340154648 |
| Emissions (CO2eq in kg) | 0.0172858379899915 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 7.5 |
| CPU energy (kWh) | 0.3119461647588344 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 0.0550489731361468 |
| Consumed energy (kWh) | 0.3669951378949813 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 3 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |

## Environmental Impact (for one core)

| Metric | Value |
|--------|-------|
| CPU energy (kWh) | 0.050865667547976966 |
| Emissions (CO2eq in kg) | 0.010349291665605703 |
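The figures above follow codecarbon's default output. Below is a minimal sketch of how such metrics can be logged, assuming the fine-tuning run was simply wrapped in codecarbon's `EmissionsTracker` with default settings; `train_model` is a placeholder, not the actual training function used for this card.

```python
from codecarbon import EmissionsTracker


def train_model() -> None:
    """Placeholder for the actual fine-tuning loop (not part of the original card)."""
    pass


tracker = EmissionsTracker()  # default configuration, matching "CODE CARBON DEFAULT"
tracker.start()
try:
    train_model()
finally:
    # stop() returns the estimated emissions in kg CO2eq and writes emissions.csv
    emissions_kg = tracker.stop()

print(f"Estimated emissions: {emissions_kg} kg CO2eq")
```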

## Note

17 May 2024

## My Config

| Config | Value |
|--------|-------|
| checkpoint | albert-base-v2 |
| model_name | ft_bs64_lr7_base_x2 |
| sequence_length | 400 |
| num_epoch | 15 |
| learning_rate | 5e-07 |
| batch_size | 64 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 81450 |
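As a rough illustration, the configuration above could map onto a standard Hugging Face fine-tuning setup as sketched below. This is an assumption-laden sketch, not the author's actual training script: the dataset preparation and `Trainer` wiring are omitted, binary classification (`num_labels=2`) is inferred from the accuracy/recall metrics, and which dropout parameter `drop_out_prob` controls, as well as how `packing_length` and `train_test_split` are applied, are guesses.

```python
from transformers import (
    AlbertForSequenceClassification,
    AutoTokenizer,
    TrainingArguments,
)

checkpoint = "albert-base-v2"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AlbertForSequenceClassification.from_pretrained(
    checkpoint,
    num_labels=2,                    # assumed binary task (accuracy/recall reported)
    classifier_dropout_prob=0.1,     # drop_out_prob (mapping is an assumption)
)

training_args = TrainingArguments(
    output_dir="ft_bs64_lr7_base_x2",  # model_name
    num_train_epochs=15,               # num_epoch
    learning_rate=5e-07,               # learning_rate
    per_device_train_batch_size=64,    # batch_size
    per_device_eval_batch_size=64,
    weight_decay=0.0,                  # weight_decay
    warmup_ratio=0.0,                  # warm_up_prop
)

# Inputs would be truncated/padded to sequence_length=400 at tokenization time, e.g.
# tokenizer(texts, truncation=True, padding="max_length", max_length=400).
```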

## Training and Testing steps

| Epoch | Train Loss | Test Loss | Accuracy | Recall |
|-------|------------|-----------|----------|--------|
| 0 | 0.664386 | 0.632545 | 0.740795 | 0.837423 |
| 1 | 0.589979 | 0.557067 | 0.743741 | 0.838957 |
| 2 | 0.512979 | 0.506961 | 0.767305 | 0.808282 |
| 3 | 0.471688 | 0.479002 | 0.779823 | 0.869632 |
| 4 | 0.439011 | 0.454375 | 0.788660 | 0.803681 |
| 5 | 0.413775 | 0.434841 | 0.802651 | 0.848160 |
| 6 | 0.392609 | 0.420262 | 0.807806 | 0.842025 |
| 7 | 0.380271 | 0.409428 | 0.809278 | 0.803681 |
| 8 | 0.365458 | 0.399789 | 0.825479 | 0.861963 |
| 9 | 0.353928 | 0.391207 | 0.829161 | 0.858896 |
| 10 | 0.342954 | 0.388762 | 0.827688 | 0.863497 |
| 11 | 0.335871 | 0.389029 | 0.827688 | 0.880368 |
| 12 | 0.328735 | 0.381536 | 0.827688 | 0.863497 |
| 13 | 0.320389 | 0.374983 | 0.827688 | 0.837423 |
| 14 | 0.314211 | 0.374905 | 0.826951 | 0.826687 |
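For loading the fine-tuned classifier and running a single prediction, a hedged usage sketch is shown below. The repository id is a placeholder (the card does not state the Hub id), and the meaning of the predicted class index depends on the label mapping of the original task.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "<user>/ft_bs64_lr7_base_x2"  # placeholder: replace with the actual Hub repository id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

inputs = tokenizer(
    "Example input text.",
    truncation=True,
    padding="max_length",
    max_length=400,  # sequence_length from the config above
    return_tensors="pt",
)
with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(dim=-1).item()
print(predicted_class)
```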