---
license: cc-by-nc-4.0
datasets:
- stockmark/ner-wikipedia-dataset
language:
- ja
metrics:
- f1
- precision
- recall
tags:
- NER
- information extraction
- relation extraction
- summarization
- sentiment extraction
- question-answering
pipeline_tag: token-classification
library_name: gliner
---

# vumichien/ner-jp-gliner

This model is a fine-tuned version of [ku-nlp/deberta-v3-base-japanese](https://huggingface.co/ku-nlp/deberta-v3-base-japanese), trained with the [GLiNER](https://github.com/urchade/GLiNER) framework on the Japanese NER Wikipedia dataset ([stockmark/ner-wikipedia-dataset](https://huggingface.co/datasets/stockmark/ner-wikipedia-dataset)).
It achieves the following results:
- Precision: 96.07%
- Recall: 89.16%
- F1 score: 92.49%
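
These are standard span-level scores: precision is the share of predicted entities that are correct, recall is the share of gold entities recovered, and F1 is their harmonic mean. A minimal, illustrative computation over `(start, end, label)` triples (not the exact evaluation script used for this model):

```python
def span_prf(gold, pred):
    """Span-level precision/recall/F1 with exact (start, end, label) matching."""
    gold_set, pred_set = set(gold), set(pred)
    tp = len(gold_set & pred_set)  # exact-match true positives
    precision = tp / len(pred_set) if pred_set else 0.0
    recall = tp / len(gold_set) if gold_set else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

# Example: one mislabeled span lowers both precision and recall.
gold = [(0, 4, "人名"), (10, 13, "地名")]
pred = [(0, 4, "人名"), (10, 13, "施設名")]
print(span_prf(gold, pred))  # (0.5, 0.5, 0.5)
```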

## Model description

This is a [GLiNER](https://github.com/urchade/GLiNER)-style model for Japanese named entity recognition, built on a [ku-nlp/deberta-v3-base-japanese](https://huggingface.co/ku-nlp/deberta-v3-base-japanese) encoder. GLiNER models match text spans against entity type names supplied at inference time, rather than predicting from a fixed tag set.

## Intended uses & limitations

The model is intended for Japanese named entity recognition. Because entity types are passed in as labels at inference time, types beyond the training categories can in principle be queried, though accuracy on unseen types is not evaluated here. See the usage sketch below.
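
A minimal usage sketch, assuming the model loads with the standard `gliner` library API (`GLiNER.from_pretrained` / `predict_entities`); the Japanese labels below are assumed to follow the eight categories of stockmark/ner-wikipedia-dataset:

```python
# pip install gliner
from gliner import GLiNER

model = GLiNER.from_pretrained("vumichien/ner-jp-gliner")

# "Eiichi Shibusawa was a businessman from Saitama Prefecture who helped found the Dai-Ichi National Bank."
text = "渋沢栄一は埼玉県出身の実業家で、第一国立銀行の設立に関わった。"

# Entity labels assumed to match the stockmark/ner-wikipedia-dataset categories:
# person, corporation, political organization, other organization,
# location, facility, product, event.
labels = ["人名", "法人名", "政治的組織名", "その他の組織名",
          "地名", "施設名", "製品名", "イベント名"]

for entity in model.predict_entities(text, labels, threshold=0.5):
    print(entity["text"], "=>", entity["label"])
```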

## Training and evaluation data

The model was trained and evaluated on [stockmark/ner-wikipedia-dataset](https://huggingface.co/datasets/stockmark/ner-wikipedia-dataset), a Japanese NER corpus built from Wikipedia text and annotated with eight entity categories.

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- num_steps: 30000
- train_batch_size: 8
- eval_every: 3000
- warmup_ratio: 0.1
- scheduler_type: "cosine"
- loss_alpha: -1
- loss_gamma: 0
- label_smoothing: 0
- loss_reduction: "sum"
- lr_encoder: 1e-5
- lr_others: 5e-5
- weight_decay_encoder: 0.01
- weight_decay_other: 0.01
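
For reference, these names match the keys of a GLiNER-style YAML training config (as used by the urchade/GLiNER training scripts; assumed, not verified against this exact run). A sketch that writes the settings back out as such a config:

```python
import yaml  # pip install pyyaml

config = {
    "num_steps": 30000,
    "train_batch_size": 8,
    "eval_every": 3000,
    "warmup_ratio": 0.1,
    "scheduler_type": "cosine",
    # loss_alpha = -1 typically disables focal-loss class weighting,
    # and gamma = 0 reduces the focal loss to plain cross-entropy.
    "loss_alpha": -1,
    "loss_gamma": 0,
    "label_smoothing": 0,
    "loss_reduction": "sum",
    "lr_encoder": 1e-5,
    "lr_others": 5e-5,
    "weight_decay_encoder": 0.01,
    "weight_decay_other": 0.01,
}

with open("config.yaml", "w") as f:
    yaml.safe_dump(config, f, sort_keys=False)
```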

### Training results

| Epoch | Training Loss |
|:-----:|:-------------:|
| 1     | 1291.582200   |
| 2     | 53.290100     |
| 3     | 44.137400     |
| 4     | 35.286200     |
| 5     | 20.865500     |
| 6     | 15.890000     |
| 7     | 13.369600     |
| 8     | 11.599500     |
| 9     | 9.773400      |
| 10    | 8.372600      |
| 11    | 7.256200      |
| 12    | 6.521800      |
| 13    | 7.203800      |
| 14    | 7.032900      |
| 15    | 6.189700      |
| 16    | 6.897400      |
| 17    | 6.031700      |
| 18    | 5.329600      |
| 19    | 5.411300      |
| 20    | 5.253800      |
| 21    | 4.522000      |
| 22    | 5.107700      |
| 23    | 4.163300      |
| 24    | 4.185400      |
| 25    | 3.403100      |
| 26    | 3.272400      |
| 27    | 2.387800      |
| 28    | 3.039400      |
| 29    | 2.383000      |
| 30    | 1.895300      |
| 31    | 1.748700      |
| 32    | 1.864300      |
| 33    | 2.343000      |
| 34    | 1.356600      |
| 35    | 1.182000      |
| 36    | 0.894700      |
| 37    | 0.954900      |