longformer-base-4096-bne-es-finetuned

This model is a fine-tuned version of joheras/longformer-base-4096-bne-es-finetuned-clinais on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 1.4271
  • Precision: 0.5208
  • Recall: 0.6368
  • F1: 0.5730
  • Accuracy: 0.8522
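
The card does not include usage code, so here is a minimal loading sketch, not taken from the original. The repo id below is hypothetical (replace it with this model's actual Hub path), and the "token-classification" task is an assumption inferred from the seqeval-style metrics (precision/recall/F1/accuracy) reported above.

```python
# Minimal usage sketch; repo id and task are assumptions, see the note above.
from transformers import pipeline

# Hypothetical repo id; substitute the actual Hub path of this model.
model_id = "joheras/longformer-base-4096-bne-es-finetuned"

nlp = pipeline("token-classification", model=model_id, aggregation_strategy="simple")

# Longformer accepts long inputs (up to 4096 tokens), e.g. Spanish clinical notes.
print(nlp("El paciente presenta fiebre y cefalea desde hace tres días."))
```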

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a sketch of the equivalent TrainingArguments follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
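
As a sketch of how these values map onto transformers.TrainingArguments, assuming the standard Trainer setup that auto-generates cards like this one; the output path and the per-epoch evaluation strategy are assumptions, not stated in the original:

```python
# Hyperparameter sketch under the assumptions noted above.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="longformer-base-4096-bne-es-finetuned",  # illustrative path
    learning_rate=2e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=100,
    evaluation_strategy="epoch",  # assumption: the results table reports one row per epoch
    # Adam settings matching the card: betas=(0.9, 0.999), epsilon=1e-08
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
)
```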

Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:---------:|:------:|:------:|:--------:|
| No log | 1.0 | 196 | 0.6543 | 0.2024 | 0.3679 | 0.2611 | 0.8004 |
| No log | 2.0 | 392 | 0.5462 | 0.2347 | 0.4047 | 0.2971 | 0.8296 |
| 0.7514 | 3.0 | 588 | 0.5546 | 0.2902 | 0.5075 | 0.3693 | 0.8378 |
| 0.7514 | 4.0 | 784 | 0.5725 | 0.2885 | 0.5019 | 0.3664 | 0.8333 |
| 0.7514 | 5.0 | 980 | 0.5880 | 0.3329 | 0.5358 | 0.4107 | 0.8438 |
| 0.3391 | 6.0 | 1176 | 0.6615 | 0.3506 | 0.5547 | 0.4297 | 0.8386 |
| 0.3391 | 7.0 | 1372 | 0.6662 | 0.3534 | 0.5651 | 0.4348 | 0.8322 |
| 0.1955 | 8.0 | 1568 | 0.6838 | 0.3834 | 0.5726 | 0.4593 | 0.8300 |
| 0.1955 | 9.0 | 1764 | 0.7898 | 0.3901 | 0.5774 | 0.4656 | 0.8336 |
| 0.1955 | 10.0 | 1960 | 0.7843 | 0.4179 | 0.5934 | 0.4904 | 0.8329 |
| 0.12 | 11.0 | 2156 | 0.7695 | 0.3794 | 0.5774 | 0.4579 | 0.8228 |
| 0.12 | 12.0 | 2352 | 0.7917 | 0.4077 | 0.5858 | 0.4808 | 0.8468 |
| 0.0816 | 13.0 | 2548 | 0.7947 | 0.4171 | 0.6075 | 0.4946 | 0.8453 |
| 0.0816 | 14.0 | 2744 | 0.9385 | 0.4403 | 0.6160 | 0.5136 | 0.8391 |
| 0.0816 | 15.0 | 2940 | 0.9491 | 0.4163 | 0.5868 | 0.4871 | 0.8316 |
| 0.0554 | 16.0 | 3136 | 0.9586 | 0.4696 | 0.6189 | 0.5340 | 0.8373 |
| 0.0554 | 17.0 | 3332 | 0.9221 | 0.4565 | 0.6038 | 0.5199 | 0.8478 |
| 0.0387 | 18.0 | 3528 | 0.9156 | 0.4600 | 0.6245 | 0.5298 | 0.8486 |
| 0.0387 | 19.0 | 3724 | 0.9759 | 0.4587 | 0.6132 | 0.5248 | 0.8392 |
| 0.0387 | 20.0 | 3920 | 0.9874 | 0.4636 | 0.6075 | 0.5259 | 0.8406 |
| 0.0302 | 21.0 | 4116 | 1.0031 | 0.4697 | 0.6075 | 0.5298 | 0.8477 |
| 0.0302 | 22.0 | 4312 | 0.9735 | 0.4897 | 0.6292 | 0.5508 | 0.8523 |
| 0.0216 | 23.0 | 4508 | 1.0142 | 0.4893 | 0.6255 | 0.5491 | 0.8481 |
| 0.0216 | 24.0 | 4704 | 1.0030 | 0.4761 | 0.6387 | 0.5455 | 0.8540 |
| 0.0216 | 25.0 | 4900 | 1.0644 | 0.4745 | 0.6132 | 0.5350 | 0.8447 |
| 0.0195 | 26.0 | 5096 | 1.0565 | 0.4694 | 0.6217 | 0.5349 | 0.8441 |
| 0.0195 | 27.0 | 5292 | 1.0729 | 0.4781 | 0.6189 | 0.5395 | 0.8488 |
| 0.0195 | 28.0 | 5488 | 1.0865 | 0.4586 | 0.6274 | 0.5299 | 0.8420 |
| 0.0141 | 29.0 | 5684 | 1.1745 | 0.4805 | 0.6151 | 0.5395 | 0.8412 |
| 0.0141 | 30.0 | 5880 | 1.1908 | 0.4657 | 0.6217 | 0.5325 | 0.8378 |
| 0.0124 | 31.0 | 6076 | 1.2164 | 0.5062 | 0.6160 | 0.5557 | 0.8455 |
| 0.0124 | 32.0 | 6272 | 1.1651 | 0.4480 | 0.6217 | 0.5207 | 0.8365 |
| 0.0124 | 33.0 | 6468 | 1.1851 | 0.4700 | 0.6217 | 0.5353 | 0.8383 |
| 0.0105 | 34.0 | 6664 | 1.1538 | 0.4836 | 0.6274 | 0.5462 | 0.8506 |
| 0.0105 | 35.0 | 6860 | 1.2399 | 0.4739 | 0.6349 | 0.5427 | 0.8417 |
| 0.0093 | 36.0 | 7056 | 1.1659 | 0.4920 | 0.6387 | 0.5558 | 0.8450 |
| 0.0093 | 37.0 | 7252 | 1.1778 | 0.4955 | 0.6283 | 0.5541 | 0.8518 |
| 0.0093 | 38.0 | 7448 | 1.2633 | 0.4958 | 0.6179 | 0.5502 | 0.8432 |
| 0.0075 | 39.0 | 7644 | 1.1656 | 0.4960 | 0.6443 | 0.5605 | 0.8490 |
| 0.0075 | 40.0 | 7840 | 1.2003 | 0.4876 | 0.6292 | 0.5494 | 0.8479 |
| 0.0063 | 41.0 | 8036 | 1.2807 | 0.4828 | 0.6349 | 0.5485 | 0.8405 |
| 0.0063 | 42.0 | 8232 | 1.2237 | 0.5130 | 0.6330 | 0.5667 | 0.8528 |
| 0.0063 | 43.0 | 8428 | 1.2233 | 0.4812 | 0.6406 | 0.5496 | 0.8502 |
| 0.0047 | 44.0 | 8624 | 1.2412 | 0.4746 | 0.6179 | 0.5369 | 0.8467 |
| 0.0047 | 45.0 | 8820 | 1.2988 | 0.4985 | 0.6377 | 0.5596 | 0.8470 |
| 0.0049 | 46.0 | 9016 | 1.3227 | 0.4944 | 0.6264 | 0.5526 | 0.8474 |
| 0.0049 | 47.0 | 9212 | 1.3627 | 0.5054 | 0.6226 | 0.5579 | 0.8481 |
| 0.0049 | 48.0 | 9408 | 1.3941 | 0.5169 | 0.6208 | 0.5641 | 0.8404 |
| 0.005 | 49.0 | 9604 | 1.3395 | 0.5108 | 0.6264 | 0.5627 | 0.8457 |
| 0.005 | 50.0 | 9800 | 1.2560 | 0.5027 | 0.6208 | 0.5555 | 0.8517 |
| 0.005 | 51.0 | 9996 | 1.3470 | 0.4715 | 0.6160 | 0.5342 | 0.8438 |
| 0.005 | 52.0 | 10192 | 1.2791 | 0.5109 | 0.6208 | 0.5605 | 0.8517 |
| 0.005 | 53.0 | 10388 | 1.3045 | 0.4788 | 0.6274 | 0.5431 | 0.8480 |
| 0.0042 | 54.0 | 10584 | 1.3052 | 0.4955 | 0.6292 | 0.5544 | 0.8466 |
| 0.0042 | 55.0 | 10780 | 1.3140 | 0.5248 | 0.6292 | 0.5723 | 0.8503 |
| 0.0042 | 56.0 | 10976 | 1.2651 | 0.4776 | 0.6236 | 0.5409 | 0.8445 |
| 0.0045 | 57.0 | 11172 | 1.2664 | 0.4871 | 0.6255 | 0.5477 | 0.8489 |
| 0.0045 | 58.0 | 11368 | 1.3141 | 0.4974 | 0.6226 | 0.5530 | 0.8485 |
| 0.0018 | 59.0 | 11564 | 1.3525 | 0.5123 | 0.6274 | 0.5640 | 0.8460 |
| 0.0018 | 60.0 | 11760 | 1.3694 | 0.5188 | 0.6368 | 0.5718 | 0.8496 |
| 0.0018 | 61.0 | 11956 | 1.3892 | 0.5219 | 0.6292 | 0.5706 | 0.8440 |
| 0.0031 | 62.0 | 12152 | 1.3371 | 0.4951 | 0.6208 | 0.5509 | 0.8475 |
| 0.0031 | 63.0 | 12348 | 1.3313 | 0.5173 | 0.6349 | 0.5701 | 0.8554 |
| 0.002 | 64.0 | 12544 | 1.3916 | 0.5246 | 0.6349 | 0.5745 | 0.8503 |
| 0.002 | 65.0 | 12740 | 1.3874 | 0.5274 | 0.6358 | 0.5766 | 0.8490 |
| 0.002 | 66.0 | 12936 | 1.3903 | 0.4970 | 0.6292 | 0.5554 | 0.8459 |
| 0.0026 | 67.0 | 13132 | 1.3595 | 0.5090 | 0.6406 | 0.5673 | 0.8480 |
| 0.0026 | 68.0 | 13328 | 1.3849 | 0.5019 | 0.6368 | 0.5613 | 0.8478 |
| 0.0026 | 69.0 | 13524 | 1.3434 | 0.5148 | 0.6396 | 0.5705 | 0.8550 |
| 0.0026 | 70.0 | 13720 | 1.3593 | 0.5402 | 0.6396 | 0.5857 | 0.8561 |
| 0.0026 | 71.0 | 13916 | 1.3833 | 0.5227 | 0.6406 | 0.5757 | 0.8503 |
| 0.0014 | 72.0 | 14112 | 1.3807 | 0.4930 | 0.6283 | 0.5525 | 0.8464 |
| 0.0014 | 73.0 | 14308 | 1.4330 | 0.5060 | 0.6330 | 0.5624 | 0.8478 |
| 0.0009 | 74.0 | 14504 | 1.3308 | 0.5236 | 0.6274 | 0.5708 | 0.8603 |
| 0.0009 | 75.0 | 14700 | 1.3397 | 0.4837 | 0.6170 | 0.5423 | 0.8515 |
| 0.0009 | 76.0 | 14896 | 1.3821 | 0.5008 | 0.6245 | 0.5558 | 0.8481 |
| 0.0015 | 77.0 | 15092 | 1.3438 | 0.5030 | 0.6349 | 0.5613 | 0.8502 |
| 0.0015 | 78.0 | 15288 | 1.3522 | 0.5011 | 0.6208 | 0.5546 | 0.8476 |
| 0.0015 | 79.0 | 15484 | 1.3951 | 0.5134 | 0.6311 | 0.5662 | 0.8528 |
| 0.0008 | 80.0 | 15680 | 1.3744 | 0.5126 | 0.6330 | 0.5665 | 0.8510 |
| 0.0008 | 81.0 | 15876 | 1.4252 | 0.4864 | 0.6245 | 0.5469 | 0.8441 |
| 0.0006 | 82.0 | 16072 | 1.4555 | 0.5050 | 0.6255 | 0.5588 | 0.8445 |
| 0.0006 | 83.0 | 16268 | 1.4168 | 0.5107 | 0.6302 | 0.5642 | 0.8492 |
| 0.0006 | 84.0 | 16464 | 1.4010 | 0.4915 | 0.6302 | 0.5523 | 0.8493 |
| 0.0004 | 85.0 | 16660 | 1.3800 | 0.5161 | 0.6340 | 0.5690 | 0.8554 |
| 0.0004 | 86.0 | 16856 | 1.4098 | 0.5083 | 0.6321 | 0.5635 | 0.8520 |
| 0.0004 | 87.0 | 17052 | 1.3664 | 0.5122 | 0.6340 | 0.5666 | 0.8545 |
| 0.0004 | 88.0 | 17248 | 1.3677 | 0.5325 | 0.6415 | 0.5819 | 0.8515 |
| 0.0004 | 89.0 | 17444 | 1.4179 | 0.4912 | 0.6321 | 0.5528 | 0.8504 |
| 0.0012 | 90.0 | 17640 | 1.3850 | 0.5007 | 0.6340 | 0.5595 | 0.8533 |
| 0.0012 | 91.0 | 17836 | 1.4313 | 0.4937 | 0.6302 | 0.5537 | 0.8477 |
| 0.001 | 92.0 | 18032 | 1.4223 | 0.5056 | 0.6415 | 0.5655 | 0.8502 |
| 0.001 | 93.0 | 18228 | 1.4322 | 0.5030 | 0.6406 | 0.5635 | 0.8500 |
| 0.001 | 94.0 | 18424 | 1.4387 | 0.5098 | 0.6406 | 0.5677 | 0.8491 |
| 0.0002 | 95.0 | 18620 | 1.4464 | 0.5188 | 0.6377 | 0.5722 | 0.8501 |
| 0.0002 | 96.0 | 18816 | 1.4219 | 0.5208 | 0.6387 | 0.5737 | 0.8518 |
| 0.0011 | 97.0 | 19012 | 1.4226 | 0.5141 | 0.6368 | 0.5689 | 0.8513 |
| 0.0011 | 98.0 | 19208 | 1.4335 | 0.5195 | 0.6396 | 0.5734 | 0.8516 |
| 0.0011 | 99.0 | 19404 | 1.4271 | 0.5152 | 0.6396 | 0.5707 | 0.8528 |
| 0.0006 | 100.0 | 19600 | 1.4271 | 0.5208 | 0.6368 | 0.5730 | 0.8522 |
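
The card does not state how precision, recall, F1, and accuracy were computed, but this combination is characteristic of seqeval-based token-classification evaluation. The following is a sketch of the common Trainer compute_metrics pattern under that assumption; the label list is hypothetical and for illustration only.

```python
# Hypothetical metric-computation sketch; see the assumptions noted above.
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")
label_list = ["O", "B-ENT", "I-ENT"]  # hypothetical labels, not from the card

def compute_metrics(eval_preds):
    logits, labels = eval_preds
    predictions = np.argmax(logits, axis=-1)
    # Drop special/padded positions, which the Trainer marks with label id -100.
    true_predictions = [
        [label_list[p] for (p, l) in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for (p, l) in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    results = seqeval.compute(predictions=true_predictions, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```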

Framework versions

  • Transformers 4.28.1
  • Pytorch 2.0.0+cu117
  • Datasets 2.12.0
  • Tokenizers 0.13.3