---
license: apache-2.0
base_model: hfl/chinese-roberta-wwm-ext
tags:
- generated_from_trainer
metrics:
- precision
- recall
- f1
- accuracy
model-index:
- name: RoBERTa-ext-lora-chinese-finetuned-ner
  results: []
---


# RoBERTa-ext-lora-chinese-finetuned-ner

This model is a fine-tuned version of [hfl/chinese-roberta-wwm-ext](https://huggingface.co/hfl/chinese-roberta-wwm-ext) on an unknown dataset.
It achieves the following results on the evaluation set (a sketch of how such metrics are typically computed follows the list):
- Loss: 0.6506
- Precision: 0.6463
- Recall: 0.7339
- F1: 0.6873
- Accuracy: 0.9081
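
These reported figures match the epoch-46 row (step 5796) in the training table below, which has the best validation F1 of the run. The card does not say how the metrics were computed; for token-classification cards generated by the `Trainer`, they are typically entity-level seqeval scores. A minimal sketch of such a `compute_metrics` function, assuming seqeval and a hypothetical label list:

```python
# Sketch of a seqeval-based compute_metrics for Trainer (assumption: the card's
# precision/recall/F1 are entity-level seqeval scores). Requires
# `pip install evaluate seqeval`. The label_list below is a placeholder.
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")
label_list = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC"]  # hypothetical

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    # Special tokens are labeled -100 by the data collator and must be skipped.
    true_predictions = [
        [label_list[p] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    true_labels = [
        [label_list[l] for p, l in zip(pred, lab) if l != -100]
        for pred, lab in zip(predictions, labels)
    ]
    results = seqeval.compute(predictions=true_predictions, references=true_labels)
    return {
        "precision": results["overall_precision"],
        "recall": results["overall_recall"],
        "f1": results["overall_f1"],
        "accuracy": results["overall_accuracy"],
    }
```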

## Model description

This appears to be a LoRA adapter for token classification (named-entity recognition) trained on top of [hfl/chinese-roberta-wwm-ext](https://huggingface.co/hfl/chinese-roberta-wwm-ext), a whole-word-masking Chinese RoBERTa. The entity label set and LoRA configuration are not documented in this card.

## Intended uses & limitations

The model is presumably intended for named-entity recognition on Chinese text. Since the training data and entity label set are undocumented, its domain coverage and supported entity types are unknown.
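
Since the name indicates a LoRA adapter on top of `hfl/chinese-roberta-wwm-ext`, loading it for inference would look roughly like the sketch below. The adapter repo id, `num_labels`, and the example sentence are assumptions, not documented in this card.

```python
# Minimal inference sketch, assuming this repo hosts a peft LoRA adapter
# trained for token classification on hfl/chinese-roberta-wwm-ext.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification
from peft import PeftModel

BASE_ID = "hfl/chinese-roberta-wwm-ext"
ADAPTER_ID = "RoBERTa-ext-lora-chinese-finetuned-ner"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(BASE_ID)
# num_labels must match the fine-tuned head; 7 is a placeholder assumption.
base = AutoModelForTokenClassification.from_pretrained(BASE_ID, num_labels=7)
model = PeftModel.from_pretrained(base, ADAPTER_ID)
model.eval()

# Example sentence: "Huawei Technologies is headquartered in Shenzhen."
text = "华为技术有限公司总部位于深圳。"
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
pred_ids = logits.argmax(dim=-1)[0].tolist()
for token, pid in zip(tokenizer.convert_ids_to_tokens(inputs["input_ids"][0]), pred_ids):
    print(token, base.config.id2label[pid])
```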

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):
- learning_rate: 0.001
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 50
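
The card records only the `Trainer` hyperparameters above; the LoRA settings are not documented. A reproduction sketch under those constraints, with the `LoraConfig` values as placeholders:

```python
# Reproduction sketch. Everything marked "placeholder" is an assumption;
# only the TrainingArguments values come from the card.
from peft import LoraConfig, TaskType, get_peft_model
from transformers import AutoModelForTokenClassification, TrainingArguments

base = AutoModelForTokenClassification.from_pretrained(
    "hfl/chinese-roberta-wwm-ext",
    num_labels=7,  # placeholder; must match the actual label set
)
lora_config = LoraConfig(
    task_type=TaskType.TOKEN_CLS,
    r=8, lora_alpha=16, lora_dropout=0.1,  # placeholders, not from the card
)
model = get_peft_model(base, lora_config)

training_args = TrainingArguments(
    output_dir="RoBERTa-ext-lora-chinese-finetuned-ner",
    learning_rate=1e-3,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=50,
    evaluation_strategy="epoch",  # assumption: the table reports per-epoch eval
    # Adam betas (0.9, 0.999) and epsilon 1e-08 are transformers' defaults.
)
# Pass model and training_args to a Trainer together with the (undocumented)
# tokenized dataset and the compute_metrics function sketched above.
```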

### Training results

| Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1     | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
| 0.6585        | 1.0   | 126  | 0.3495          | 0.4949    | 0.6045 | 0.5443 | 0.8916   |
| 0.3145        | 2.0   | 252  | 0.3116          | 0.5286    | 0.6644 | 0.5888 | 0.9000   |
| 0.2644        | 3.0   | 378  | 0.3112          | 0.5425    | 0.7012 | 0.6117 | 0.9029   |
| 0.2373        | 4.0   | 504  | 0.3028          | 0.5696    | 0.7090 | 0.6317 | 0.9058   |
| 0.2078        | 5.0   | 630  | 0.3141          | 0.5933    | 0.7102 | 0.6465 | 0.9059   |
| 0.192         | 6.0   | 756  | 0.3091          | 0.5842    | 0.7037 | 0.6384 | 0.9069   |
| 0.1708        | 7.0   | 882  | 0.3224          | 0.5803    | 0.7165 | 0.6413 | 0.9046   |
| 0.1557        | 8.0   | 1008 | 0.3306          | 0.6088    | 0.6833 | 0.6439 | 0.9067   |
| 0.1424        | 9.0   | 1134 | 0.3243          | 0.6031    | 0.6961 | 0.6463 | 0.9089   |
| 0.1285        | 10.0  | 1260 | 0.3451          | 0.6041    | 0.7142 | 0.6546 | 0.9064   |
| 0.1223        | 11.0  | 1386 | 0.3578          | 0.6016    | 0.7080 | 0.6505 | 0.9054   |
| 0.1111        | 12.0  | 1512 | 0.3615          | 0.6167    | 0.7158 | 0.6625 | 0.9080   |
| 0.1025        | 13.0  | 1638 | 0.3918          | 0.6073    | 0.7163 | 0.6573 | 0.9051   |
| 0.093         | 14.0  | 1764 | 0.3957          | 0.6119    | 0.7329 | 0.6670 | 0.9078   |
| 0.0858        | 15.0  | 1890 | 0.4050          | 0.6179    | 0.7052 | 0.6587 | 0.9067   |
| 0.0769        | 16.0  | 2016 | 0.4218          | 0.6178    | 0.7170 | 0.6637 | 0.9067   |
| 0.0716        | 17.0  | 2142 | 0.4213          | 0.6090    | 0.7223 | 0.6608 | 0.9057   |
| 0.0657        | 18.0  | 2268 | 0.4486          | 0.6028    | 0.7299 | 0.6603 | 0.9048   |
| 0.0652        | 19.0  | 2394 | 0.4388          | 0.6301    | 0.7193 | 0.6718 | 0.9058   |
| 0.0653        | 20.0  | 2520 | 0.4563          | 0.6165    | 0.7002 | 0.6557 | 0.9039   |
| 0.0568        | 21.0  | 2646 | 0.4549          | 0.6057    | 0.7283 | 0.6614 | 0.9034   |
| 0.0522        | 22.0  | 2772 | 0.4711          | 0.6216    | 0.7205 | 0.6674 | 0.9066   |
| 0.0486        | 23.0  | 2898 | 0.4995          | 0.6267    | 0.7148 | 0.6678 | 0.9062   |
| 0.0477        | 24.0  | 3024 | 0.4938          | 0.6228    | 0.7261 | 0.6705 | 0.9056   |
| 0.0415        | 25.0  | 3150 | 0.5129          | 0.6365    | 0.7180 | 0.6748 | 0.9075   |
| 0.0404        | 26.0  | 3276 | 0.5096          | 0.6287    | 0.7258 | 0.6738 | 0.9072   |
| 0.0364        | 27.0  | 3402 | 0.5331          | 0.6390    | 0.7205 | 0.6773 | 0.9058   |
| 0.0362        | 28.0  | 3528 | 0.5572          | 0.6317    | 0.7263 | 0.6757 | 0.9061   |
| 0.0331        | 29.0  | 3654 | 0.5603          | 0.6377    | 0.7228 | 0.6776 | 0.9050   |
| 0.0316        | 30.0  | 3780 | 0.5588          | 0.6304    | 0.7256 | 0.6746 | 0.9050   |
| 0.0321        | 31.0  | 3906 | 0.5579          | 0.6366    | 0.7190 | 0.6753 | 0.9067   |
| 0.0283        | 32.0  | 4032 | 0.5785          | 0.6469    | 0.7163 | 0.6798 | 0.9067   |
| 0.0284        | 33.0  | 4158 | 0.5698          | 0.6357    | 0.7246 | 0.6773 | 0.9073   |
| 0.0256        | 34.0  | 4284 | 0.5816          | 0.6333    | 0.7314 | 0.6788 | 0.9066   |
| 0.0231        | 35.0  | 4410 | 0.6032          | 0.6273    | 0.7276 | 0.6737 | 0.9059   |
| 0.0223        | 36.0  | 4536 | 0.6044          | 0.6317    | 0.7238 | 0.6746 | 0.9067   |
| 0.0225        | 37.0  | 4662 | 0.6007          | 0.6243    | 0.7246 | 0.6707 | 0.9060   |
| 0.0209        | 38.0  | 4788 | 0.6072          | 0.6325    | 0.7213 | 0.6740 | 0.9067   |
| 0.0199        | 39.0  | 4914 | 0.6145          | 0.6379    | 0.7261 | 0.6791 | 0.9075   |
| 0.018         | 40.0  | 5040 | 0.6299          | 0.6412    | 0.7341 | 0.6845 | 0.9070   |
| 0.0174        | 41.0  | 5166 | 0.6264          | 0.6448    | 0.7299 | 0.6847 | 0.9069   |
| 0.0172        | 42.0  | 5292 | 0.6389          | 0.6409    | 0.7369 | 0.6856 | 0.9061   |
| 0.0155        | 43.0  | 5418 | 0.6489          | 0.6381    | 0.7241 | 0.6784 | 0.9071   |
| 0.0156        | 44.0  | 5544 | 0.6418          | 0.6384    | 0.7339 | 0.6828 | 0.9065   |
| 0.0143        | 45.0  | 5670 | 0.6503          | 0.6378    | 0.7243 | 0.6783 | 0.9066   |
| 0.0149        | 46.0  | 5796 | 0.6506          | 0.6463    | 0.7339 | 0.6873 | 0.9081   |
| 0.0135        | 47.0  | 5922 | 0.6497          | 0.6432    | 0.7294 | 0.6836 | 0.9072   |
| 0.0135        | 48.0  | 6048 | 0.6563          | 0.6389    | 0.7256 | 0.6795 | 0.9069   |
| 0.013         | 49.0  | 6174 | 0.6599          | 0.6377    | 0.7306 | 0.6810 | 0.9066   |
| 0.0125        | 50.0  | 6300 | 0.6582          | 0.6380    | 0.7278 | 0.6800 | 0.9070   |


### Framework versions

- Transformers 4.35.2
- Pytorch 2.1.0+cu121
- Datasets 2.16.1
- Tokenizers 0.15.0