gysbert_historical_fmp2_ogtok_output_sentiment

This model was trained from scratch on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.5922
  • Accuracy: 0.7853
  • F1: 0.7324
  • Precision: 0.7274
  • Recall: 0.7384
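
As a usage illustration, here is a minimal sketch of loading the checkpoint for inference with the `transformers` pipeline API. The repository id, the example sentence, and the label set are assumptions for illustration only; the card does not state them.

```python
from transformers import pipeline

# Hypothetical repository id -- substitute the actual hub path of this model.
MODEL_ID = "your-org/gysbert_historical_fmp2_ogtok_output_sentiment"

# Load the fine-tuned checkpoint as a text-classification (sentiment) pipeline.
classifier = pipeline("text-classification", model=MODEL_ID)

# Example input; historical Dutch is assumed from the model name ("GysBERT").
print(classifier("Het was een schone ende vreugdevolle dag."))
# -> [{'label': ..., 'score': ...}] with labels as configured at training time
```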

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 5e-06
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_steps: 200
  • num_epochs: 30
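
As a point of reference, the sketch below maps these hyperparameters onto a `transformers.TrainingArguments` object. This is a plausible reconstruction, not the authors' actual training script; `output_dir` and the evaluation cadence are assumptions (the results table below reports metrics every 100 steps).

```python
from transformers import TrainingArguments

# Sketch of TrainingArguments mirroring the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="gysbert_historical_fmp2_ogtok_output_sentiment",  # assumed
    learning_rate=5e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=200,
    num_train_epochs=30,
    evaluation_strategy="steps",  # assumed from the 100-step eval cadence
    eval_steps=100,
)
```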

Training results

| Training Loss | Epoch  | Step | Validation Loss | Accuracy | F1     | Precision | Recall |
|---------------|--------|------|-----------------|----------|--------|-----------|--------|
| 1.0609        | 0.3030 | 100  | 0.9465          | 0.5911   | 0.2477 | 0.1970    | 0.3333 |
| 0.8599        | 0.6061 | 200  | 0.7103          | 0.6882   | 0.4985 | 0.6382    | 0.5230 |
| 0.6695        | 0.9091 | 300  | 0.6296          | 0.7087   | 0.4932 | 0.4571    | 0.5580 |
| 0.6168        | 1.2121 | 400  | 0.5866          | 0.7445   | 0.5998 | 0.6782    | 0.6232 |
| 0.5644        | 1.5152 | 500  | 0.5479          | 0.7734   | 0.6650 | 0.7096    | 0.6655 |
| 0.5245        | 1.8182 | 600  | 0.5417          | 0.7666   | 0.6721 | 0.7117    | 0.6800 |
| 0.4996        | 2.1212 | 700  | 0.5318          | 0.7700   | 0.6970 | 0.6978    | 0.7042 |
| 0.441         | 2.4242 | 800  | 0.5161          | 0.7785   | 0.7004 | 0.7086    | 0.7008 |
| 0.4527        | 2.7273 | 900  | 0.5275          | 0.7666   | 0.6984 | 0.6919    | 0.7121 |
| 0.4624        | 3.0303 | 1000 | 0.5324          | 0.7598   | 0.6910 | 0.6820    | 0.7034 |
| 0.376         | 3.3333 | 1100 | 0.5353          | 0.7751   | 0.7010 | 0.6996    | 0.7050 |
| 0.3767        | 3.6364 | 1200 | 0.5633          | 0.7700   | 0.7044 | 0.6953    | 0.7189 |
| 0.3902        | 3.9394 | 1300 | 0.5420          | 0.7751   | 0.7101 | 0.7038    | 0.7194 |
| 0.313         | 4.2424 | 1400 | 0.5688          | 0.7802   | 0.7167 | 0.7094    | 0.7276 |
| 0.3085        | 4.5455 | 1500 | 0.5813          | 0.7717   | 0.7063 | 0.6978    | 0.7179 |
| 0.3268        | 4.8485 | 1600 | 0.5843          | 0.7768   | 0.7107 | 0.7019    | 0.7224 |
| 0.2858        | 5.1515 | 1700 | 0.6147          | 0.7768   | 0.7102 | 0.7028    | 0.7236 |
| 0.2608        | 5.4545 | 1800 | 0.6328          | 0.7666   | 0.6979 | 0.6938    | 0.7048 |
| 0.2529        | 5.7576 | 1900 | 0.6575          | 0.7717   | 0.6988 | 0.6984    | 0.7039 |
| 0.2221        | 6.0606 | 2000 | 0.6823          | 0.7615   | 0.6938 | 0.6892    | 0.6999 |
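
The recall of exactly 0.3333 at step 100 is consistent with macro-averaged metrics over three classes, with the early model collapsing to a single label. Under that (unconfirmed) assumption, a `compute_metrics` function along these lines would reproduce the metric columns above:

```python
import numpy as np
from sklearn.metrics import accuracy_score, precision_recall_fscore_support

def compute_metrics(eval_pred):
    # eval_pred is the (logits, labels) pair passed by transformers.Trainer.
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    # Macro averaging is an assumption inferred from the 1/3 recall of the
    # collapsed first checkpoint; the card does not state the averaging mode.
    precision, recall, f1, _ = precision_recall_fscore_support(
        labels, preds, average="macro", zero_division=0
    )
    return {
        "accuracy": accuracy_score(labels, preds),
        "f1": f1,
        "precision": precision,
        "recall": recall,
    }
```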

Framework versions

  • Transformers 4.40.2
  • Pytorch 2.1.2
  • Datasets 2.18.0
  • Tokenizers 0.19.1