
product-review-information-density-detection-distilbert

This model is a fine-tuned version of distilbert/distilbert-base-uncased; the fine-tuning dataset is not specified on this card. It achieves the following results on the evaluation set:

  • Loss: 1.2972
  • Accuracy: 0.8387

Model description

More information needed

Intended uses & limitations

More information needed
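
The card does not describe the intended task in detail, but the model name and the sequence-classification base suggest it classifies product-review text by information density. A minimal inference sketch follows; the Hub repo id (owner prefix), the example review, and the label names are assumptions, not details taken from this card.

```python
from transformers import pipeline

# Assumption: replace "<owner>/..." with the actual Hub repo id of this model.
classifier = pipeline(
    "text-classification",
    model="<owner>/product-review-information-density-detection-distilbert",
)

review = "Battery lasts about 9 hours at 50% brightness and a full charge takes 90 minutes."
print(classifier(review))
# e.g. [{'label': 'LABEL_1', 'score': 0.97}] -- labels and scores are illustrative only
```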

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 2e-05
  • train_batch_size: 48
  • eval_batch_size: 48
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
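
For reference, below is a sketch of a `TrainingArguments` configuration that mirrors these values. The `output_dir` and the per-epoch evaluation strategy are assumptions; the Adam betas and epsilon listed above are already the library defaults.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="product-review-information-density-detection-distilbert",  # assumed name
    learning_rate=2e-5,
    per_device_train_batch_size=48,
    per_device_eval_batch_size=48,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumption: the results table below reports per-epoch validation
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are the defaults, matching the list above
)
```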

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 67   | 0.5551          | 0.7438   |
| No log        | 2.0   | 134  | 0.4422          | 0.8163   |
| No log        | 3.0   | 201  | 0.4285          | 0.84     |
| No log        | 4.0   | 268  | 0.4707          | 0.8263   |
| No log        | 5.0   | 335  | 0.5597          | 0.825    |
| No log        | 6.0   | 402  | 0.6377          | 0.8387   |
| No log        | 7.0   | 469  | 0.7444          | 0.8363   |
| 0.2608        | 8.0   | 536  | 0.7492          | 0.8413   |
| 0.2608        | 9.0   | 603  | 0.7549          | 0.8387   |
| 0.2608        | 10.0  | 670  | 0.8264          | 0.845    |
| 0.2608        | 11.0  | 737  | 1.0370          | 0.8187   |
| 0.2608        | 12.0  | 804  | 0.9359          | 0.8313   |
| 0.2608        | 13.0  | 871  | 0.9810          | 0.8387   |
| 0.2608        | 14.0  | 938  | 1.0293          | 0.84     |
| 0.0251        | 15.0  | 1005 | 1.0647          | 0.8263   |
| 0.0251        | 16.0  | 1072 | 1.0693          | 0.83     |
| 0.0251        | 17.0  | 1139 | 1.0656          | 0.8425   |
| 0.0251        | 18.0  | 1206 | 1.1193          | 0.8313   |
| 0.0251        | 19.0  | 1273 | 1.1583          | 0.8187   |
| 0.0251        | 20.0  | 1340 | 1.1257          | 0.8387   |
| 0.0251        | 21.0  | 1407 | 1.1632          | 0.825    |
| 0.0251        | 22.0  | 1474 | 1.2419          | 0.8213   |
| 0.0108        | 23.0  | 1541 | 1.1635          | 0.84     |
| 0.0108        | 24.0  | 1608 | 1.1951          | 0.8287   |
| 0.0108        | 25.0  | 1675 | 1.1710          | 0.845    |
| 0.0108        | 26.0  | 1742 | 1.2204          | 0.83     |
| 0.0108        | 27.0  | 1809 | 1.2166          | 0.8413   |
| 0.0108        | 28.0  | 1876 | 1.2335          | 0.8363   |
| 0.0108        | 29.0  | 1943 | 1.2355          | 0.8363   |
| 0.007         | 30.0  | 2010 | 1.2423          | 0.8425   |
| 0.007         | 31.0  | 2077 | 1.2511          | 0.8425   |
| 0.007         | 32.0  | 2144 | 1.2563          | 0.84     |
| 0.007         | 33.0  | 2211 | 1.2501          | 0.8413   |
| 0.007         | 34.0  | 2278 | 1.2431          | 0.8375   |
| 0.007         | 35.0  | 2345 | 1.2553          | 0.8387   |
| 0.007         | 36.0  | 2412 | 1.2635          | 0.8425   |
| 0.007         | 37.0  | 2479 | 1.2970          | 0.835    |
| 0.0061        | 38.0  | 2546 | 1.2894          | 0.8375   |
| 0.0061        | 39.0  | 2613 | 1.2773          | 0.84     |
| 0.0061        | 40.0  | 2680 | 1.2836          | 0.84     |
| 0.0061        | 41.0  | 2747 | 1.2916          | 0.8375   |
| 0.0061        | 42.0  | 2814 | 1.2869          | 0.8387   |
| 0.0061        | 43.0  | 2881 | 1.3032          | 0.8287   |
| 0.0061        | 44.0  | 2948 | 1.3056          | 0.8413   |
| 0.0047        | 45.0  | 3015 | 1.2813          | 0.8438   |
| 0.0047        | 46.0  | 3082 | 1.2811          | 0.8413   |
| 0.0047        | 47.0  | 3149 | 1.2858          | 0.8413   |
| 0.0047        | 48.0  | 3216 | 1.2960          | 0.8387   |
| 0.0047        | 49.0  | 3283 | 1.2971          | 0.8387   |
| 0.0047        | 50.0  | 3350 | 1.2972          | 0.8387   |

Framework versions

  • Transformers 4.39.1
  • Pytorch 2.1.0+cu121
  • Datasets 2.18.0
  • Tokenizers 0.15.2
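
To reproduce this environment, the versions above can be pinned with pip; installing the CUDA 12.1 build of PyTorch may require the appropriate wheel index for your platform, which is an assumption about the local setup.

```python
# Sketch of an environment pin matching the framework versions above.
import subprocess
import sys

subprocess.check_call([
    sys.executable, "-m", "pip", "install",
    "transformers==4.39.1",
    "torch==2.1.0",
    "datasets==2.18.0",
    "tokenizers==0.15.2",
])
```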