# bert-finetuned-ner
This model is a fine-tuned version of bert-base-cased on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.0274
- Precision: 0.9550
- Recall: 0.9638
- F1: 0.9594
- Accuracy: 0.9973
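The model can be used directly with the Transformers token-classification pipeline. A minimal inference sketch, assuming the checkpoint is published on the Hub as `deivism/bert-finetuned-ner` (the input sentence is illustrative):

```python
from transformers import pipeline

# Load the fine-tuned checkpoint; aggregation_strategy="simple" merges
# sub-word pieces back into whole entity spans.
ner = pipeline(
    "token-classification",
    model="deivism/bert-finetuned-ner",
    aggregation_strategy="simple",
)

# Hypothetical example sentence.
print(ner("My name is Wolfgang and I live in Berlin."))
```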
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 2e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100
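These settings map onto a `TrainingArguments` configuration roughly as follows. This is a reconstruction from the list above, not the original training script; `output_dir` is a placeholder, and epoch-level evaluation is assumed from the per-epoch validation rows in the results table below.

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="bert-finetuned-ner",  # placeholder output directory
    learning_rate=2e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    seed=42,
    optim="adamw_torch",   # AdamW with betas=(0.9, 0.999) and epsilon=1e-08 (the defaults)
    eval_strategy="epoch",  # assumed: the table below reports validation metrics once per epoch
)
```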
### Training results
Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
---|---|---|---|---|---|---|---|
No log | 1.0 | 148 | 0.0305 | 0.8341 | 0.8789 | 0.8559 | 0.9934 |
No log | 2.0 | 296 | 0.0215 | 0.8834 | 0.9355 | 0.9087 | 0.9953 |
No log | 3.0 | 444 | 0.0195 | 0.9140 | 0.9435 | 0.9285 | 0.9961 |
0.0655 | 4.0 | 592 | 0.0195 | 0.9282 | 0.9498 | 0.9389 | 0.9964 |
0.0655 | 5.0 | 740 | 0.0203 | 0.9177 | 0.9539 | 0.9355 | 0.9962 |
0.0655 | 6.0 | 888 | 0.0201 | 0.9401 | 0.9552 | 0.9475 | 0.9966 |
0.0056 | 7.0 | 1036 | 0.0200 | 0.9355 | 0.9535 | 0.9444 | 0.9968 |
0.0056 | 8.0 | 1184 | 0.0208 | 0.9393 | 0.9569 | 0.9480 | 0.9967 |
0.0056 | 9.0 | 1332 | 0.0215 | 0.9380 | 0.9549 | 0.9464 | 0.9968 |
0.0056 | 10.0 | 1480 | 0.0232 | 0.9188 | 0.9582 | 0.9381 | 0.9960 |
0.0024 | 11.0 | 1628 | 0.0212 | 0.9334 | 0.9554 | 0.9442 | 0.9967 |
0.0024 | 12.0 | 1776 | 0.0223 | 0.9383 | 0.9598 | 0.9489 | 0.9968 |
0.0024 | 13.0 | 1924 | 0.0225 | 0.9394 | 0.9542 | 0.9468 | 0.9967 |
0.0012 | 14.0 | 2072 | 0.0232 | 0.9415 | 0.9560 | 0.9487 | 0.9968 |
0.0012 | 15.0 | 2220 | 0.0238 | 0.9413 | 0.9580 | 0.9496 | 0.9967 |
0.0012 | 16.0 | 2368 | 0.0239 | 0.9396 | 0.9582 | 0.9488 | 0.9966 |
0.001 | 17.0 | 2516 | 0.0230 | 0.9328 | 0.9563 | 0.9444 | 0.9966 |
0.001 | 18.0 | 2664 | 0.0243 | 0.9342 | 0.9577 | 0.9458 | 0.9966 |
0.001 | 19.0 | 2812 | 0.0246 | 0.9423 | 0.9576 | 0.9499 | 0.9969 |
0.001 | 20.0 | 2960 | 0.0240 | 0.9355 | 0.9576 | 0.9464 | 0.9967 |
0.0006 | 21.0 | 3108 | 0.0241 | 0.9477 | 0.9599 | 0.9538 | 0.9970 |
0.0006 | 22.0 | 3256 | 0.0236 | 0.9443 | 0.9569 | 0.9505 | 0.9968 |
0.0006 | 23.0 | 3404 | 0.0244 | 0.9461 | 0.9578 | 0.9519 | 0.9969 |
0.0006 | 24.0 | 3552 | 0.0248 | 0.9417 | 0.9600 | 0.9508 | 0.9969 |
0.0006 | 25.0 | 3700 | 0.0246 | 0.9336 | 0.9590 | 0.9461 | 0.9966 |
0.0006 | 26.0 | 3848 | 0.0236 | 0.9421 | 0.9589 | 0.9504 | 0.9968 |
0.0006 | 27.0 | 3996 | 0.0244 | 0.9441 | 0.9612 | 0.9526 | 0.9969 |
0.0004 | 28.0 | 4144 | 0.0250 | 0.9462 | 0.9594 | 0.9528 | 0.9969 |
0.0004 | 29.0 | 4292 | 0.0249 | 0.9430 | 0.9622 | 0.9525 | 0.9969 |
0.0004 | 30.0 | 4440 | 0.0252 | 0.9439 | 0.9612 | 0.9525 | 0.9969 |
0.0003 | 31.0 | 4588 | 0.0253 | 0.9480 | 0.9552 | 0.9515 | 0.9968 |
0.0003 | 32.0 | 4736 | 0.0229 | 0.9484 | 0.9619 | 0.9551 | 0.9969 |
0.0003 | 33.0 | 4884 | 0.0235 | 0.9485 | 0.9608 | 0.9546 | 0.9970 |
0.0003 | 34.0 | 5032 | 0.0247 | 0.9438 | 0.9611 | 0.9524 | 0.9969 |
0.0003 | 35.0 | 5180 | 0.0248 | 0.9481 | 0.9598 | 0.9539 | 0.9970 |
0.0003 | 36.0 | 5328 | 0.0245 | 0.9441 | 0.9621 | 0.9530 | 0.9969 |
0.0003 | 37.0 | 5476 | 0.0255 | 0.9417 | 0.9602 | 0.9508 | 0.9967 |
0.0002 | 38.0 | 5624 | 0.0255 | 0.9416 | 0.9595 | 0.9505 | 0.9969 |
0.0002 | 39.0 | 5772 | 0.0246 | 0.9524 | 0.9611 | 0.9567 | 0.9971 |
0.0002 | 40.0 | 5920 | 0.0254 | 0.9435 | 0.9611 | 0.9522 | 0.9969 |
0.0003 | 41.0 | 6068 | 0.0252 | 0.9386 | 0.9608 | 0.9496 | 0.9966 |
0.0003 | 42.0 | 6216 | 0.0257 | 0.9385 | 0.9601 | 0.9492 | 0.9968 |
0.0003 | 43.0 | 6364 | 0.0251 | 0.9491 | 0.9591 | 0.9541 | 0.9970 |
0.0002 | 44.0 | 6512 | 0.0251 | 0.9448 | 0.9610 | 0.9528 | 0.9970 |
0.0002 | 45.0 | 6660 | 0.0252 | 0.9508 | 0.9622 | 0.9565 | 0.9972 |
0.0002 | 46.0 | 6808 | 0.0252 | 0.9486 | 0.9613 | 0.9549 | 0.9971 |
0.0002 | 47.0 | 6956 | 0.0262 | 0.9498 | 0.9618 | 0.9558 | 0.9971 |
0.0001 | 48.0 | 7104 | 0.0263 | 0.9520 | 0.9624 | 0.9572 | 0.9971 |
0.0001 | 49.0 | 7252 | 0.0263 | 0.9521 | 0.9624 | 0.9573 | 0.9971 |
0.0001 | 50.0 | 7400 | 0.0260 | 0.9526 | 0.9618 | 0.9572 | 0.9972 |
0.0001 | 51.0 | 7548 | 0.0248 | 0.9493 | 0.9634 | 0.9563 | 0.9971 |
0.0001 | 52.0 | 7696 | 0.0255 | 0.9502 | 0.9618 | 0.9560 | 0.9971 |
0.0001 | 53.0 | 7844 | 0.0258 | 0.9522 | 0.9617 | 0.9569 | 0.9972 |
0.0001 | 54.0 | 7992 | 0.0258 | 0.9481 | 0.9615 | 0.9548 | 0.9970 |
0.0001 | 55.0 | 8140 | 0.0251 | 0.9520 | 0.9617 | 0.9568 | 0.9972 |
0.0001 | 56.0 | 8288 | 0.0250 | 0.9509 | 0.9608 | 0.9558 | 0.9972 |
0.0001 | 57.0 | 8436 | 0.0260 | 0.9462 | 0.9601 | 0.9531 | 0.9972 |
0.0001 | 58.0 | 8584 | 0.0252 | 0.9563 | 0.9628 | 0.9595 | 0.9973 |
0.0001 | 59.0 | 8732 | 0.0247 | 0.9506 | 0.9624 | 0.9565 | 0.9972 |
0.0001 | 60.0 | 8880 | 0.0251 | 0.9510 | 0.9611 | 0.9560 | 0.9972 |
0.0001 | 61.0 | 9028 | 0.0255 | 0.9495 | 0.9614 | 0.9554 | 0.9972 |
0.0001 | 62.0 | 9176 | 0.0259 | 0.9537 | 0.9613 | 0.9575 | 0.9972 |
0.0001 | 63.0 | 9324 | 0.0259 | 0.9506 | 0.9609 | 0.9557 | 0.9972 |
0.0001 | 64.0 | 9472 | 0.0260 | 0.9544 | 0.9595 | 0.9569 | 0.9972 |
0.0 | 65.0 | 9620 | 0.0253 | 0.9511 | 0.9604 | 0.9557 | 0.9972 |
0.0 | 66.0 | 9768 | 0.0257 | 0.9526 | 0.9604 | 0.9565 | 0.9972 |
0.0 | 67.0 | 9916 | 0.0263 | 0.9528 | 0.9605 | 0.9566 | 0.9972 |
0.0 | 68.0 | 10064 | 0.0271 | 0.9544 | 0.9598 | 0.9571 | 0.9972 |
0.0 | 69.0 | 10212 | 0.0269 | 0.9530 | 0.9611 | 0.9571 | 0.9972 |
0.0 | 70.0 | 10360 | 0.0273 | 0.9514 | 0.9609 | 0.9561 | 0.9972 |
0.0 | 71.0 | 10508 | 0.0275 | 0.9535 | 0.9612 | 0.9573 | 0.9972 |
0.0 | 72.0 | 10656 | 0.0275 | 0.9524 | 0.9632 | 0.9578 | 0.9972 |
0.0 | 73.0 | 10804 | 0.0279 | 0.9537 | 0.9596 | 0.9566 | 0.9972 |
0.0 | 74.0 | 10952 | 0.0277 | 0.9475 | 0.9633 | 0.9554 | 0.9970 |
0.0 | 75.0 | 11100 | 0.0272 | 0.9537 | 0.9614 | 0.9575 | 0.9972 |
0.0 | 76.0 | 11248 | 0.0269 | 0.9541 | 0.9619 | 0.9580 | 0.9972 |
0.0 | 77.0 | 11396 | 0.0271 | 0.9552 | 0.9625 | 0.9588 | 0.9972 |
0.0 | 78.0 | 11544 | 0.0274 | 0.9457 | 0.9619 | 0.9537 | 0.9970 |
0.0 | 79.0 | 11692 | 0.0273 | 0.9524 | 0.9616 | 0.9570 | 0.9972 |
0.0 | 80.0 | 11840 | 0.0275 | 0.9530 | 0.9632 | 0.9581 | 0.9972 |
0.0 | 81.0 | 11988 | 0.0271 | 0.9496 | 0.9639 | 0.9567 | 0.9971 |
0.0 | 82.0 | 12136 | 0.0280 | 0.9537 | 0.9614 | 0.9575 | 0.9972 |
0.0 | 83.0 | 12284 | 0.0277 | 0.9499 | 0.9642 | 0.9570 | 0.9970 |
0.0 | 84.0 | 12432 | 0.0275 | 0.9517 | 0.9621 | 0.9569 | 0.9971 |
0.0 | 85.0 | 12580 | 0.0277 | 0.9524 | 0.9635 | 0.9579 | 0.9972 |
0.0 | 86.0 | 12728 | 0.0275 | 0.9517 | 0.9648 | 0.9582 | 0.9972 |
0.0 | 87.0 | 12876 | 0.0276 | 0.9519 | 0.9636 | 0.9577 | 0.9972 |
0.0 | 88.0 | 13024 | 0.0276 | 0.9541 | 0.9647 | 0.9594 | 0.9972 |
0.0 | 89.0 | 13172 | 0.0275 | 0.9500 | 0.9642 | 0.9571 | 0.9971 |
0.0 | 90.0 | 13320 | 0.0276 | 0.9532 | 0.9635 | 0.9584 | 0.9972 |
0.0 | 91.0 | 13468 | 0.0273 | 0.9542 | 0.9636 | 0.9589 | 0.9972 |
0.0 | 92.0 | 13616 | 0.0274 | 0.9541 | 0.9636 | 0.9588 | 0.9973 |
0.0 | 93.0 | 13764 | 0.0274 | 0.9552 | 0.9638 | 0.9595 | 0.9973 |
0.0 | 94.0 | 13912 | 0.0275 | 0.9547 | 0.9636 | 0.9591 | 0.9973 |
0.0 | 95.0 | 14060 | 0.0274 | 0.9557 | 0.9639 | 0.9598 | 0.9973 |
0.0 | 96.0 | 14208 | 0.0274 | 0.9548 | 0.9638 | 0.9593 | 0.9973 |
0.0 | 97.0 | 14356 | 0.0274 | 0.9550 | 0.9641 | 0.9595 | 0.9973 |
0.0 | 98.0 | 14504 | 0.0275 | 0.9552 | 0.9643 | 0.9597 | 0.9973 |
0.0 | 99.0 | 14652 | 0.0274 | 0.9549 | 0.9638 | 0.9593 | 0.9973 |
0.0 | 100.0 | 14800 | 0.0274 | 0.9550 | 0.9638 | 0.9594 | 0.9973 |
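The precision, recall, and F1 columns above are entity-level scores of the kind typically computed with `seqeval` for token-classification models (an assumption; the card does not name the metric implementation). A minimal sketch with hypothetical BIO tag sequences:

```python
from seqeval.metrics import accuracy_score, f1_score, precision_score, recall_score

# Hypothetical gold and predicted tag sequences; the model's actual label
# set is not documented in this card.
y_true = [["B-PER", "I-PER", "O", "O", "B-LOC"]]
y_pred = [["B-PER", "I-PER", "O", "B-LOC", "B-LOC"]]

print(precision_score(y_true, y_pred))  # entity-level precision
print(recall_score(y_true, y_pred))     # entity-level recall
print(f1_score(y_true, y_pred))         # entity-level F1
print(accuracy_score(y_true, y_pred))   # token-level accuracy
```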
### Framework versions
- Transformers 4.46.3
- Pytorch 2.5.1+cu121
- Datasets 3.1.0
- Tokenizers 0.20.3