
Environmental Impact (CodeCarbon, default configuration)

| Metric | Value |
|---|---|
| Duration (s) | 22,799.57 |
| Emissions (kg CO2eq) | 0.014020 |
| CPU power (W) | 42.5 |
| GPU power (W) | n/a (no GPU) |
| RAM power (W) | 4.5 |
| CPU energy (kWh) | 0.26916 |
| GPU energy (kWh) | n/a (no GPU) |
| RAM energy (kWh) | 0.028499 |
| Consumed energy (kWh) | 0.29766 |
| Country | Switzerland |
| Cloud provider | n/a |
| Cloud region | n/a |
| CPU count | 2 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | n/a |
| GPU model | n/a |
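
The table above follows CodeCarbon's accounting model: energy is power integrated over the run duration (kWh = W × s / 3,600,000), and emissions are that energy multiplied by the grid's carbon intensity. Below is a minimal sketch of wrapping a run with CodeCarbon's `EmissionsTracker`; the tracker settings actually used for this card are not documented, so the defaults and project name are assumptions.

```python
# Minimal sketch: measuring a workload with CodeCarbon's EmissionsTracker.
# The tracker configuration used for this card is not documented; the default
# settings and project name below are assumptions.
import time
from codecarbon import EmissionsTracker

tracker = EmissionsTracker(project_name="ft_bs32_lr7_base")
tracker.start()
try:
    time.sleep(5)  # stand-in for the actual fine-tuning loop
finally:
    emissions_kg = tracker.stop()  # estimated kg CO2eq for the tracked block
    print(f"Estimated emissions: {emissions_kg:.6f} kg CO2eq")
```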

Environmental Impact (for one core)

| Metric | Value |
|---|---|
| CPU energy (kWh) | 0.043889 |
| Emissions (kg CO2eq) | 0.0089298 |
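
These figures can be cross-checked with simple arithmetic: energy is power × duration / 3.6e6, and dividing emissions by energy gives the implied emission factor. The full-run numbers imply roughly 0.047 kg CO2eq/kWh (consistent with a low-carbon grid such as Switzerland's), while the one-core figures imply roughly 0.203 kg CO2eq/kWh, i.e. a different reference intensity, and an assumed per-core power of about 6.9 W.

```python
# Cross-check of the reported values (pure arithmetic, no measurement).
duration_s = 22_799.57

# Energy (kWh) = power (W) * duration (s) / 3.6e6
print(42.5 * duration_s / 3.6e6)      # ~0.26916 kWh CPU energy (matches the table)
print(4.5 * duration_s / 3.6e6)       # ~0.02850 kWh RAM energy (matches the table)

# Implied emission factors (kg CO2eq per kWh)
print(0.014020 / 0.29766)             # ~0.0471 for the full run
print(0.0089298 / 0.043889)           # ~0.2035 for the one-core estimate

# Implied per-core power behind the one-core energy figure
print(0.043889 * 3.6e6 / duration_s)  # ~6.93 W
```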

Note

16 May 2024

My Config

| Config | Value |
|---|---|
| checkpoint | albert-base-v2 |
| model_name | ft_bs32_lr7_base |
| sequence_length | 400 |
| num_epoch | 10 |
| learning_rate | 5e-07 |
| batch_size | 32 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 54300 |
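
These hyperparameters are consistent with a standard Hugging Face `Trainer` fine-tune of `albert-base-v2` for sequence classification. The sketch below is only an assumed mapping of the config onto `TrainingArguments`; the actual training script, dataset, and the role of `packing_length` are not described in this card.

```python
# Hypothetical reconstruction of the training setup implied by the config table.
# Only the hyperparameter values come from the card; the dataset, task head,
# and the handling of packing_length are assumptions (a toy dataset is used here).
from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

checkpoint = "albert-base-v2"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSequenceClassification.from_pretrained(
    checkpoint,
    num_labels=2,                  # assumption: binary task (accuracy/recall reported)
    classifier_dropout_prob=0.1,   # drop_out_prob
)

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=400)  # sequence_length

# Toy placeholder data; the card does not name the dataset.
raw = Dataset.from_dict({
    "text": ["example a", "example b", "example c", "example d", "example e"],
    "label": [0, 1, 0, 1, 0],
})
splits = raw.train_test_split(test_size=0.2)       # train_test_split
tokenized = splits.map(tokenize, batched=True)

args = TrainingArguments(
    output_dir="ft_bs32_lr7_base",                 # model_name
    num_train_epochs=10,                           # num_epoch
    learning_rate=5e-7,                            # learning_rate
    per_device_train_batch_size=32,                # batch_size
    weight_decay=0.0,                              # weight_decay
    warmup_ratio=0.0,                              # warm_up_prop
    eval_strategy="epoch",
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=tokenized["train"],
    eval_dataset=tokenized["test"],
    tokenizer=tokenizer,
)
trainer.train()
```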

Training and Testing steps

| Epoch | Train Loss | Test Loss | Accuracy | Recall |
|---|---|---|---|---|
| 0 | 0.631623 | 0.594133 | 0.694404 | 0.823620 |
| 1 | 0.558485 | 0.541231 | 0.733432 | 0.809816 |
| 2 | 0.500646 | 0.493926 | 0.761414 | 0.760736 |
| 3 | 0.449850 | 0.456491 | 0.795287 | 0.805215 |
| 4 | 0.414101 | 0.442600 | 0.801178 | 0.874233 |
| 5 | 0.378759 | 0.425400 | 0.809278 | 0.852761 |
| 6 | 0.350256 | 0.424491 | 0.810751 | 0.868098 |
| 7 | 0.323519 | 0.420775 | 0.810015 | 0.837423 |
| 8 | 0.289084 | 0.426480 | 0.810751 | 0.838957 |
| 9 | 0.261814 | 0.436926 | 0.804860 | 0.822086 |
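
Test loss bottoms out at epoch 7 (0.420775) while train loss keeps falling, and accuracy plateaus around 0.81 from epoch 5 onward, which suggests mild overfitting in the final epochs. A small, hypothetical helper for picking the best epoch from these numbers (the card does not state which checkpoint was kept):

```python
# Pick the epoch with the lowest test loss from the table above.
# (Hypothetical helper; the card does not say which checkpoint was retained.)
test_loss = [0.594133, 0.541231, 0.493926, 0.456491, 0.442600,
             0.425400, 0.424491, 0.420775, 0.426480, 0.436926]
accuracy  = [0.694404, 0.733432, 0.761414, 0.795287, 0.801178,
             0.809278, 0.810751, 0.810015, 0.810751, 0.804860]

best_epoch = min(range(len(test_loss)), key=test_loss.__getitem__)
print(best_epoch, test_loss[best_epoch], accuracy[best_epoch])  # 7 0.420775 0.810015
```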
Model size: 11.7M parameters (Safetensors, F32)