nyt-ingredient-tagger-gte-small

This model is a fine-tuned version of thenlper/gte-small on the nyt_ingredients dataset. It tags tokens in recipe ingredient lines with Comment, Name, Qty, Range End, and Unit labels, and achieves the following results on the evaluation set (a note on the metric format follows the results):

  • Loss: 0.8647
  • Overall Precision: 0.8470
  • Overall Recall: 0.9005
  • Overall F1: 0.8729
  • Overall Accuracy: 0.8468

Per-label results (entity-level):

| Label | Precision | Recall | F1 | Support |
|-----------|--------|--------|--------|------|
| Comment   | 0.7003 | 0.8211 | 0.7559 | 7184 |
| Name      | 0.8125 | 0.8315 | 0.8219 | 9303 |
| Qty       | 0.9870 | 0.9920 | 0.9895 | 7522 |
| Range End | 0.7394 | 0.9375 | 0.8268 | 112  |
| Unit      | 0.9283 | 0.9870 | 0.9568 | 6009 |
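
The per-label entries follow the dictionary format produced by the seqeval metric (entity-level precision, recall, F1, and support). Below is a minimal sketch of computing metrics in this format with the evaluate library; the BIO-style tag strings such as B-NAME/I-NAME are an assumption for illustration, not necessarily the dataset's exact labels.

```python
import evaluate

# seqeval computes entity-level precision/recall/F1 per label plus overall scores.
seqeval = evaluate.load("seqeval")

# Toy predictions/references; BIO tags like "B-NAME" are assumed, not the dataset's exact strings.
predictions = [["B-QTY", "B-UNIT", "B-NAME", "I-NAME", "O"]]
references  = [["B-QTY", "B-UNIT", "B-NAME", "I-NAME", "B-COMMENT"]]

results = seqeval.compute(predictions=predictions, references=references)
print(results["NAME"])                                   # {'precision': ..., 'recall': ..., 'f1': ..., 'number': ...}
print(results["overall_f1"], results["overall_accuracy"])
```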

Model description

nyt-ingredient-tagger-gte-small adds a token-classification head on top of the thenlper/gte-small text encoder and fine-tunes it to label each token of a recipe ingredient line as Comment, Name, Qty, Range End, or Unit.

Intended uses & limitations

The model is intended for tagging recipe ingredient lines, i.e. splitting phrases such as "2 cups chopped onion" into quantity, unit, name, and comment spans; a usage sketch follows. It has only been evaluated on the nyt_ingredients evaluation split, so performance on other recipe corpora is not documented.
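
A minimal usage sketch, assuming the checkpoint loads as a standard transformers token-classification pipeline (the example ingredient line is illustrative):

```python
from transformers import pipeline

# Load the tagger; aggregation_strategy="simple" merges wordpieces into labeled spans.
tagger = pipeline(
    "token-classification",
    model="napsternxg/nyt-ingredient-tagger-gte-small",
    aggregation_strategy="simple",
)

# Returns a list of dicts with entity_group, word, score, and character offsets.
print(tagger("1 1/2 cups finely chopped red onion"))
```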

Training and evaluation data

The model was fine-tuned and evaluated on the nyt_ingredients dataset of labeled New York Times recipe ingredient phrases; a loading sketch follows.
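
A sketch of loading the data with the datasets library; the hub id napsternxg/nyt_ingredients is an assumption based on the model's namespace and may need adjusting:

```python
from datasets import load_dataset

# Assumed hub id for the nyt_ingredients dataset; adjust if it lives under a different namespace.
ds = load_dataset("napsternxg/nyt_ingredients")

# Inspect the available splits and one record (assuming a "train" split exists).
print(ds)
print(ds["train"][0])
```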

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch reproducing them follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 5
  • label_smoothing_factor: 0.1
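
A minimal sketch of these settings expressed as Hugging Face TrainingArguments; output_dir is a placeholder, and any argument not listed above keeps its Trainer default:

```python
from transformers import TrainingArguments

# Mirror of the hyperparameters listed above (output_dir is a placeholder).
training_args = TrainingArguments(
    output_dir="nyt-ingredient-tagger-gte-small",
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    label_smoothing_factor=0.1,
)
```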

Training results

Per-label cells report precision / recall / F1; label support is constant across checkpoints (Comment 6735, Name 8801, Qty 7088, Range End 91, Unit 5697).

| Training Loss | Epoch | Step | Validation Loss | Comment (P / R / F1) | Name (P / R / F1) | Qty (P / R / F1) | Range End (P / R / F1) | Unit (P / R / F1) | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 0.9427 | 0.2 | 1000 | 0.9321 | 0.6077 / 0.7789 / 0.6827 | 0.7829 / 0.8126 / 0.7975 | 0.9753 / 0.9914 / 0.9833 | 0.5205 / 0.9780 / 0.6794 | 0.9144 / 0.9830 / 0.9475 | 0.8032 | 0.8839 | 0.8416 | 0.8183 |
| 0.9169 | 0.4 | 2000 | 0.9112 | 0.6358 / 0.7860 / 0.7030 | 0.7820 / 0.8166 / 0.7989 | 0.9747 / 0.9927 / 0.9836 | 0.6277 / 0.9451 / 0.7544 | 0.9176 / 0.9854 / 0.9503 | 0.8137 | 0.8875 | 0.8490 | 0.8256 |
| 0.9 | 0.59 | 3000 | 0.9021 | 0.6538 / 0.8059 / 0.7220 | 0.7921 / 0.8207 / 0.8061 | 0.9837 / 0.9904 / 0.9871 | 0.6792 / 0.7912 / 0.7310 | 0.9139 / 0.9854 / 0.9483 | 0.8231 | 0.8925 | 0.8564 | 0.8302 |
| 0.9061 | 0.79 | 4000 | 0.8912 | 0.6613 / 0.7906 / 0.7202 | 0.7953 / 0.8214 / 0.8081 | 0.9852 / 0.9886 / 0.9869 | 0.6560 / 0.9011 / 0.7593 | 0.9215 / 0.9851 / 0.9522 | 0.8289 | 0.8889 | 0.8578 | 0.8326 |
| 0.8889 | 0.99 | 5000 | 0.8908 | 0.6653 / 0.7721 / 0.7147 | 0.7960 / 0.8215 / 0.8085 | 0.9858 / 0.9887 / 0.9873 | 0.6439 / 0.9341 / 0.7623 | 0.9228 / 0.9837 / 0.9523 | 0.8317 | 0.8844 | 0.8572 | 0.8311 |
| 0.88 | 1.19 | 6000 | 0.8873 | 0.6602 / 0.8049 / 0.7254 | 0.7929 / 0.8210 / 0.8067 | 0.9828 / 0.9910 / 0.9869 | 0.6350 / 0.9560 / 0.7632 | 0.9224 / 0.9849 / 0.9526 | 0.8266 | 0.8929 | 0.8585 | 0.8338 |
| 0.8751 | 1.39 | 7000 | 0.8866 | 0.6716 / 0.8018 / 0.7309 | 0.8021 / 0.8224 / 0.8121 | 0.9850 / 0.9894 / 0.9872 | 0.6259 / 0.9560 / 0.7565 | 0.9255 / 0.9809 / 0.9524 | 0.8341 | 0.8914 | 0.8618 | 0.8348 |
| 0.8816 | 1.58 | 8000 | 0.8823 | 0.6672 / 0.8148 / 0.7337 | 0.7975 / 0.8209 / 0.8091 | 0.9851 / 0.9889 / 0.9870 | 0.6439 / 0.9341 / 0.7623 | 0.9253 / 0.9833 / 0.9535 | 0.8308 | 0.8943 | 0.8614 | 0.8360 |
| 0.8756 | 1.78 | 9000 | 0.8817 | 0.6767 / 0.7973 / 0.7321 | 0.8020 / 0.8227 / 0.8122 | 0.9835 / 0.9901 / 0.9868 | 0.7043 / 0.8901 / 0.7864 | 0.9210 / 0.9860 / 0.9524 | 0.8355 | 0.8914 | 0.8625 | 0.8375 |
| 0.8695 | 1.98 | 10000 | 0.8788 | 0.6813 / 0.8033 / 0.7373 | 0.7976 / 0.8218 / 0.8095 | 0.9779 / 0.9929 / 0.9854 | 0.6268 / 0.9780 / 0.7639 | 0.9206 / 0.9851 / 0.9518 | 0.8337 | 0.8934 | 0.8625 | 0.8376 |
| 0.8537 | 2.18 | 11000 | 0.8804 | 0.6863 / 0.8067 / 0.7417 | 0.8019 / 0.8245 / 0.8130 | 0.9831 / 0.9908 / 0.9869 | 0.6825 / 0.9451 / 0.7926 | 0.9248 / 0.9826 / 0.9529 | 0.8385 | 0.8938 | 0.8653 | 0.8384 |
| 0.854 | 2.38 | 12000 | 0.8817 | 0.6864 / 0.8117 / 0.7438 | 0.8056 / 0.8249 / 0.8151 | 0.9833 / 0.9911 / 0.9872 | 0.6742 / 0.9780 / 0.7982 | 0.9207 / 0.9860 / 0.9522 | 0.8387 | 0.8960 | 0.8664 | 0.8390 |
| 0.8582 | 2.57 | 13000 | 0.8746 | 0.6880 / 0.8189 / 0.7477 | 0.8028 / 0.8273 / 0.8148 | 0.9822 / 0.9914 / 0.9868 | 0.6694 / 0.9121 / 0.7721 | 0.9260 / 0.9803 / 0.9524 | 0.8387 | 0.8972 | 0.8669 | 0.8402 |
| 0.8554 | 2.77 | 14000 | 0.8743 | 0.6871 / 0.8212 / 0.7482 | 0.8065 / 0.8268 / 0.8165 | 0.9842 / 0.9911 / 0.9876 | 0.6720 / 0.9231 / 0.7778 | 0.9235 / 0.9854 / 0.9535 | 0.8394 | 0.8986 | 0.8680 | 0.8416 |
| 0.86 | 2.97 | 15000 | 0.8735 | 0.6899 / 0.8095 / 0.7449 | 0.8031 / 0.8259 / 0.8144 | 0.9833 / 0.9914 / 0.9874 | 0.6612 / 0.8791 / 0.7547 | 0.9261 / 0.9833 / 0.9539 | 0.8401 | 0.8950 | 0.8667 | 0.8423 |
| 0.845 | 3.17 | 16000 | 0.8782 | 0.7035 / 0.7835 / 0.7414 | 0.7948 / 0.8217 / 0.8080 | 0.9824 / 0.9920 / 0.9872 | 0.7196 / 0.8462 / 0.7778 | 0.9267 / 0.9810 / 0.9531 | 0.8432 | 0.8872 | 0.8646 | 0.8397 |
| 0.846 | 3.37 | 17000 | 0.8759 | 0.6963 / 0.8025 / 0.7457 | 0.7992 / 0.8246 / 0.8117 | 0.9842 / 0.9908 / 0.9875 | 0.6694 / 0.8901 / 0.7642 | 0.9246 / 0.9835 / 0.9531 | 0.8412 | 0.8929 | 0.8663 | 0.8403 |
| 0.8392 | 3.56 | 18000 | 0.8759 | 0.7022 / 0.7991 / 0.7476 | 0.8001 / 0.8262 / 0.8129 | 0.9824 / 0.9922 / 0.9873 | 0.6891 / 0.9011 / 0.7810 | 0.9239 / 0.9851 / 0.9535 | 0.8431 | 0.8933 | 0.8675 | 0.8409 |
| 0.8375 | 3.76 | 19000 | 0.8780 | 0.6948 / 0.8148 / 0.7500 | 0.8054 / 0.8255 / 0.8153 | 0.9838 / 0.9917 / 0.9877 | 0.6864 / 0.8901 / 0.7751 | 0.9236 / 0.9831 / 0.9525 | 0.8419 | 0.8962 | 0.8682 | 0.8413 |
| 0.8366 | 3.96 | 20000 | 0.8742 | 0.7003 / 0.8095 / 0.7510 | 0.8043 / 0.8254 / 0.8147 | 0.9832 / 0.9913 / 0.9872 | 0.7054 / 0.8681 / 0.7783 | 0.9234 / 0.9842 / 0.9528 | 0.8435 | 0.8950 | 0.8685 | 0.8417 |
| 0.8189 | 4.16 | 21000 | 0.8799 | 0.7027 / 0.8114 / 0.7532 | 0.8056 / 0.8260 / 0.8157 | 0.9825 / 0.9921 / 0.9873 | 0.6864 / 0.8901 / 0.7751 | 0.9251 / 0.9846 / 0.9539 | 0.8447 | 0.8960 | 0.8696 | 0.8425 |
| 0.8269 | 4.36 | 22000 | 0.8781 | 0.6995 / 0.8033 / 0.7478 | 0.8004 / 0.8243 / 0.8122 | 0.9833 / 0.9910 / 0.9871 | 0.7080 / 0.8791 / 0.7843 | 0.9244 / 0.9837 / 0.9531 | 0.8425 | 0.8930 | 0.8670 | 0.8418 |
| 0.829 | 4.55 | 23000 | 0.8794 | 0.7019 / 0.8031 / 0.7491 | 0.7988 / 0.8246 / 0.8115 | 0.9836 / 0.9914 / 0.9875 | 0.6991 / 0.8681 / 0.7745 | 0.9249 / 0.9837 / 0.9534 | 0.8429 | 0.8931 | 0.8673 | 0.8422 |
| 0.8183 | 4.75 | 24000 | 0.8815 | 0.6988 / 0.8138 / 0.7520 | 0.8061 / 0.8267 / 0.8163 | 0.9835 / 0.9908 / 0.9871 | 0.6838 / 0.8791 / 0.7692 | 0.9244 / 0.9837 / 0.9531 | 0.8435 | 0.8962 | 0.8691 | 0.8428 |
| 0.8201 | 4.95 | 25000 | 0.8808 | 0.7004 / 0.8088 / 0.7507 | 0.8026 / 0.8248 / 0.8136 | 0.9835 / 0.9914 / 0.9874 | 0.6923 / 0.8901 / 0.7788 | 0.9247 / 0.9835 / 0.9532 | 0.8432 | 0.8946 | 0.8682 | 0.8429 |

Framework versions

  • Transformers 4.34.0
  • Pytorch 2.0.1+cu118
  • Datasets 2.14.5
  • Tokenizers 0.14.1