---
license: cc-by-nc-4.0
datasets:
  - vumichien/meals-data-gliner
language:
  - en
library_name: gliner
---

# vumichien/meals-gliner

This model is a fine-tuned version of deberta-v3-base-small on a synthetic meals dataset generated by Mistral 8B. It achieves the following results:

- Precision: 84.79%
- Recall: 75.04%
- F1 score: 79.62%
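
A minimal usage sketch with the gliner library follows. The entity labels below are illustrative assumptions, since this card does not list the label set the model was trained on; GLiNER accepts arbitrary label strings at inference time.

```python
from gliner import GLiNER

# Load the fine-tuned model from the Hub.
model = GLiNER.from_pretrained("vumichien/meals-gliner")

text = "I had a bowl of chicken ramen with two soft-boiled eggs for lunch."

# Assumed labels for illustration; swap in the label set used during training.
labels = ["food", "ingredient", "quantity"]

# predict_entities returns a list of dicts with text, label, score, and offsets.
entities = model.predict_entities(text, labels, threshold=0.5)
for ent in entities:
    print(f'{ent["text"]} => {ent["label"]} ({ent["score"]:.2f})')
```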

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
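
The metadata block points to vumichien/meals-data-gliner. A hedged sketch for inspecting it with the datasets library; the split name and record schema are assumptions:

```python
from datasets import load_dataset

# Dataset ID comes from this card's metadata; "train" split is an assumption.
ds = load_dataset("vumichien/meals-data-gliner", split="train")
print(ds[0])  # inspect one record to see the annotation schema
```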

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a YAML sketch follows the list):

- num_steps: 30000
- train_batch_size: 8
- eval_every: 3000
- warmup_ratio: 0.1
- scheduler_type: "cosine"
- loss_alpha: -1
- loss_gamma: 0
- label_smoothing: 0
- loss_reduction: "sum"
- lr_encoder: 1e-5
- lr_others: 5e-5
- weight_decay_encoder: 0.01
- weight_decay_other: 0.01
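
GLiNER's reference training script typically reads these values from a YAML config file. A sketch of how this run's settings could be laid out in that format; the file name and any structure beyond the keys listed above are assumptions:

```yaml
# config.yaml (sketch) — values copied verbatim from the list above
num_steps: 30000
train_batch_size: 8
eval_every: 3000
warmup_ratio: 0.1
scheduler_type: "cosine"
loss_alpha: -1
loss_gamma: 0
label_smoothing: 0
loss_reduction: "sum"
lr_encoder: 1e-5
lr_others: 5e-5
weight_decay_encoder: 0.01
weight_decay_other: 0.01
```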

### Training results

| Epoch | Training Loss |
|------:|--------------:|
| 1 | No log |
| 2 | 2008.786600 |
| 3 | 2008.786600 |
| 4 | 117.661100 |
| 5 | 84.863400 |
| 6 | 84.863400 |
| 7 | 66.872200 |
| 8 | 66.872200 |
| 9 | 58.574600 |
| 10 | 53.905900 |
| 11 | 53.905900 |
| 12 | 48.563900 |
| 13 | 48.563900 |
| 14 | 43.970700 |
| 15 | 38.940100 |
| 16 | 38.940100 |
| 17 | 35.543100 |
| 18 | 35.543100 |
| 19 | 33.050500 |
| 20 | 30.091100 |
| 21 | 30.091100 |
| 22 | 27.275200 |
| 23 | 27.275200 |
| 24 | 25.327500 |
| 25 | 23.171200 |
| 26 | 23.171200 |
| 27 | 20.940300 |
| 28 | 19.034100 |
| 29 | 19.034100 |
| 30 | 17.366400 |
| 31 | 17.366400 |
| 32 | 16.570800 |
| 33 | 15.673200 |
| 34 | 15.673200 |
| 35 | 14.457500 |
| 36 | 14.457500 |
| 37 | 13.064500 |
| 38 | 12.786100 |
| 39 | 12.786100 |
| 40 | 11.934400 |
| 41 | 11.934400 |
| 42 | 11.225800 |
| 43 | 10.106500 |
| 44 | 10.106500 |
| 45 | 9.200000 |
| 46 | 9.200000 |
| 47 | 9.449100 |
| 48 | 8.979400 |
| 49 | 8.979400 |
| 50 | 7.840100 |
| 51 | 7.949600 |
| 52 | 7.949600 |
| 53 | 7.233800 |
| 54 | 7.233800 |
| 55 | 7.383200 |
| 56 | 6.114800 |
| 57 | 6.114800 |
| 58 | 6.421800 |
| 59 | 6.421800 |
| 60 | 6.191000 |
| 61 | 5.932200 |
| 62 | 5.932200 |
| 63 | 5.706100 |
| 64 | 5.706100 |
| 65 | 5.567800 |
| 66 | 5.104100 |
| 67 | 5.104100 |
| 68 | 5.407800 |
| 69 | 5.407800 |
| 70 | 5.607500 |
| 71 | 4.967500 |
| 72 | 4.967500 |
| 73 | 5.362100 |
| 74 | 5.362100 |
| 75 | 5.425800 |
| 76 | 5.283100 |
| 77 | 5.283100 |
| 78 | 4.250000 |
| 79 | 4.330900 |
| 80 | 4.330900 |
| 81 | 4.088400 |
| 82 | 4.088400 |
| 83 | 4.512400 |
| 84 | 4.513500 |
| 85 | 4.513500 |
| 86 | 4.327000 |
| 87 | 4.327000 |
| 88 | 5.152200 |
| 89 | 3.776100 |
| 90 | 3.776100 |
| 91 | 3.762500 |
| 92 | 3.762500 |
| 93 | 4.054900 |
| 94 | 3.579700 |
| 95 | 3.579700 |
| 96 | 3.391500 |
| 97 | 3.391500 |
| 98 | 4.863200 |