
German-MedBERT

This model is a fine-tuned version of smanjil/German-MedBERT on an unspecified dataset. It achieves the following results on the evaluation set (a loading sketch follows the results):

  • Loss: 0.5145
  • F1: 0.4561
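
The task is not documented here, but the single F1 score suggests a sequence-classification fine-tune. Below is a minimal loading sketch under that assumption; the repo ID is a hypothetical placeholder, not the model's actual path:

```python
from transformers import AutoModelForSequenceClassification, AutoTokenizer, pipeline

# Hypothetical repo ID -- replace with the actual fine-tuned checkpoint.
model_id = "your-username/German-MedBERT-finetuned"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

# Wrap in a text-classification pipeline for quick inference.
clf = pipeline("text-classification", model=model, tokenizer=tokenizer)
print(clf("Der Patient klagt über starke Kopfschmerzen."))
```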

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):

  • learning_rate: 2e-07
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 50
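
These settings map directly onto the Hugging Face Trainer API. The sketch below reproduces them; the dataset, label count, and metric wiring are assumptions, since the card leaves them undocumented, and the dummy data is only a placeholder:

```python
import numpy as np
from datasets import Dataset
from sklearn.metrics import f1_score
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

model_id = "smanjil/German-MedBERT"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

# Placeholder data: the actual training/evaluation dataset is not documented.
dummy = Dataset.from_dict(
    {"text": ["Beispieltext eins.", "Beispieltext zwei."], "label": [0, 1]}
)
dummy = dummy.map(lambda b: tokenizer(b["text"], truncation=True), batched=True)

args = TrainingArguments(
    output_dir="german-medbert-finetuned",
    learning_rate=2e-7,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    evaluation_strategy="epoch",  # matches the per-epoch rows in the results table
)

def compute_metrics(eval_pred):
    # Binary F1, consistent with the single F1 value reported (an assumption).
    logits, labels = eval_pred
    preds = np.argmax(logits, axis=-1)
    return {"f1": f1_score(labels, preds)}

# The Trainer's default optimizer (AdamW with betas=(0.9, 0.999), eps=1e-8)
# matches the optimizer settings listed above.
trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dummy,
    eval_dataset=dummy,
    tokenizer=tokenizer,  # enables dynamic padding via DataCollatorWithPadding
    compute_metrics=compute_metrics,
)
trainer.train()
```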

Training results

| Training Loss | Epoch | Step | Validation Loss | F1     |
|:-------------:|:-----:|:----:|:---------------:|:------:|
| 0.693         | 1.0   | 189  | 0.6754          | 0.0698 |
| 0.6853        | 2.0   | 378  | 0.6626          | 0.0339 |
| 0.6654        | 3.0   | 567  | 0.6499          | 0.0488 |
| 0.6562        | 4.0   | 756  | 0.6399          | 0.0541 |
| 0.6554        | 5.0   | 945  | 0.6335          | 0.0556 |
| 0.6394        | 6.0   | 1134 | 0.6260          | 0.0571 |
| 0.6452        | 7.0   | 1323 | 0.6220          | 0.0571 |
| 0.6257        | 8.0   | 1512 | 0.6161          | 0.0571 |
| 0.6334        | 9.0   | 1701 | 0.6117          | 0.0571 |
| 0.6302        | 10.0  | 1890 | 0.6068          | 0.0571 |
| 0.6151        | 11.0  | 2079 | 0.6011          | 0.0571 |
| 0.6121        | 12.0  | 2268 | 0.5961          | 0.0571 |
| 0.6097        | 13.0  | 2457 | 0.5915          | 0.0571 |
| 0.5929        | 14.0  | 2646 | 0.5865          | 0.0556 |
| 0.5955        | 15.0  | 2835 | 0.5822          | 0.0556 |
| 0.5893        | 16.0  | 3024 | 0.5776          | 0.1053 |
| 0.5936        | 17.0  | 3213 | 0.5731          | 0.1    |
| 0.5769        | 18.0  | 3402 | 0.5687          | 0.1    |
| 0.5692        | 19.0  | 3591 | 0.5646          | 0.1    |
| 0.5739        | 20.0  | 3780 | 0.5604          | 0.2326 |
| 0.5705        | 21.0  | 3969 | 0.5564          | 0.2326 |
| 0.5651        | 22.0  | 4158 | 0.5525          | 0.2727 |
| 0.5654        | 23.0  | 4347 | 0.5494          | 0.2727 |
| 0.5527        | 24.0  | 4536 | 0.5456          | 0.2727 |
| 0.5542        | 25.0  | 4725 | 0.5425          | 0.2727 |
| 0.5464        | 26.0  | 4914 | 0.5395          | 0.2727 |
| 0.5383        | 27.0  | 5103 | 0.5364          | 0.3111 |
| 0.5323        | 28.0  | 5292 | 0.5348          | 0.3111 |
| 0.5343        | 29.0  | 5481 | 0.5318          | 0.3404 |
| 0.5305        | 30.0  | 5670 | 0.5299          | 0.4082 |
| 0.5252        | 31.0  | 5859 | 0.5278          | 0.4    |
| 0.516         | 32.0  | 6048 | 0.5270          | 0.3922 |
| 0.5181        | 33.0  | 6237 | 0.5243          | 0.4231 |
| 0.5202        | 34.0  | 6426 | 0.5230          | 0.4231 |
| 0.5068        | 35.0  | 6615 | 0.5224          | 0.4231 |
| 0.514         | 36.0  | 6804 | 0.5205          | 0.4528 |
| 0.5014        | 37.0  | 6993 | 0.5194          | 0.4528 |
| 0.4899        | 38.0  | 7182 | 0.5188          | 0.4444 |
| 0.5104        | 39.0  | 7371 | 0.5164          | 0.4364 |
| 0.4823        | 40.0  | 7560 | 0.5174          | 0.4444 |
| 0.515         | 41.0  | 7749 | 0.5155          | 0.4364 |
| 0.4906        | 42.0  | 7938 | 0.5154          | 0.4364 |
| 0.4853        | 43.0  | 8127 | 0.5158          | 0.4364 |
| 0.5006        | 44.0  | 8316 | 0.5153          | 0.4364 |
| 0.503         | 45.0  | 8505 | 0.5146          | 0.4561 |
| 0.4915        | 46.0  | 8694 | 0.5141          | 0.4561 |
| 0.4903        | 47.0  | 8883 | 0.5144          | 0.4561 |
| 0.4892        | 48.0  | 9072 | 0.5146          | 0.4561 |
| 0.4939        | 49.0  | 9261 | 0.5146          | 0.4561 |
| 0.5007        | 50.0  | 9450 | 0.5145          | 0.4561 |

Framework versions

  • Transformers 4.32.1
  • Pytorch 2.1.2
  • Datasets 2.12.0
  • Tokenizers 0.13.3
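
A small sketch to check that a local environment matches these pinned versions:

```python
import datasets
import tokenizers
import torch
import transformers

# Expected versions as listed in this card.
for mod, expected in [
    (transformers, "4.32.1"),
    (torch, "2.1.2"),
    (datasets, "2.12.0"),
    (tokenizers, "0.13.3"),
]:
    print(f"{mod.__name__}: {mod.__version__} (card: {expected})")
```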