asahi417 committed
Commit 76fada7 (1 parent: f35527e)

model update

Files changed (1): README.md (+6 -6)
README.md CHANGED
@@ -6,7 +6,7 @@ metrics:
 - precision
 - recall
 model-index:
-- name: tner/roberta-large-tweetner7-2020-selflabel2020-concat
+- name: tner/roberta-large-tweetner7-2020-selflabel2020-all
   results:
   - task:
     name: Token Classification
@@ -76,7 +76,7 @@ widget:
 - text: "Get the all-analog Classic Vinyl Edition of `Takin' Off` Album from {{@Herbie Hancock@}} via {{USERNAME}} link below: {{URL}}"
   example_title: "NER Example 1"
 ---
-# tner/roberta-large-tweetner7-2020-selflabel2020-concat
+# tner/roberta-large-tweetner7-2020-selflabel2020-all
 
 This model is a fine-tuned version of [roberta-large](https://huggingface.co/roberta-large) on the
 [tner/tweetner7](https://huggingface.co/datasets/tner/tweetner7) dataset (`train` split). The model is fine-tuned on a self-labeled dataset, namely the `extra_2020` split of [tner/tweetner7](https://huggingface.co/datasets/tner/tweetner7) annotated by [tner/roberta-large-tweetner7-2020](https://huggingface.co/tner/roberta-large-tweetner7-2020). See [https://github.com/asahi417/tner/tree/master/examples/tweetner7_paper#model-fine-tuning-self-labeling](https://github.com/asahi417/tner/tree/master/examples/tweetner7_paper#model-fine-tuning-self-labeling) for details on reproducing the model.
@@ -108,8 +108,8 @@ For F1 scores, the confidence interval is obtained by bootstrap as below:
 - 90%: [0.6459013617167609, 0.6637399915981033]
 - 95%: [0.6439605146787715, 0.6661442289789786]
 
-Full evaluation can be found at [metric file of NER](https://huggingface.co/tner/roberta-large-tweetner7-2020-selflabel2020-concat/raw/main/eval/metric.json)
-and [metric file of entity span](https://huggingface.co/tner/roberta-large-tweetner7-2020-selflabel2020-concat/raw/main/eval/metric_span.json).
+Full evaluation can be found at [metric file of NER](https://huggingface.co/tner/roberta-large-tweetner7-2020-selflabel2020-all/raw/main/eval/metric.json)
+and [metric file of entity span](https://huggingface.co/tner/roberta-large-tweetner7-2020-selflabel2020-all/raw/main/eval/metric_span.json).
 
 ### Usage
 This model can be used through the [tner library](https://github.com/asahi417/tner). Install the library via pip
@@ -119,7 +119,7 @@ pip install tner
 and activate the model as below.
 ```python
 from tner import TransformersNER
-model = TransformersNER("tner/roberta-large-tweetner7-2020-selflabel2020-concat")
+model = TransformersNER("tner/roberta-large-tweetner7-2020-selflabel2020-all")
 model.predict(["Jacob Collier is a Grammy awarded English artist from London"])
 ```
 It can be used via the transformers library, but this is not recommended as the CRF layer is not supported at the moment.
@@ -143,7 +143,7 @@ The following hyperparameters were used during training:
 - lr_warmup_step_ratio: 0.15
 - max_grad_norm: 1
 
-The full configuration can be found at [fine-tuning parameter file](https://huggingface.co/tner/roberta-large-tweetner7-2020-selflabel2020-concat/raw/main/trainer_config.json).
+The full configuration can be found at [fine-tuning parameter file](https://huggingface.co/tner/roberta-large-tweetner7-2020-selflabel2020-all/raw/main/trainer_config.json).
 
 ### Reference
 If you use any resource from T-NER, please consider citing our [paper](https://aclanthology.org/2021.eacl-demos.7/).
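
The confidence intervals quoted in the diff above (90%/95% bands around test F1) are obtained by bootstrap. Below is a minimal sketch of how such intervals could be computed; it is not the evaluation script behind the model card, and entity-level scoring with `seqeval` plus tweet-level resampling are assumptions.

```python
# Minimal sketch: percentile-bootstrap CI for entity-level F1.
# Assumptions (not from the model card): seqeval scoring, resampling whole tweets.
import random
from seqeval.metrics import f1_score

def bootstrap_f1_ci(y_true, y_pred, n_resamples=1000, levels=(0.90, 0.95), seed=0):
    """y_true / y_pred: lists of per-tweet IOB tag sequences."""
    rng = random.Random(seed)
    n = len(y_true)
    scores = []
    for _ in range(n_resamples):
        idx = [rng.randrange(n) for _ in range(n)]  # resample tweets with replacement
        scores.append(f1_score([y_true[i] for i in idx], [y_pred[i] for i in idx]))
    scores.sort()
    ci = {}
    for level in levels:
        lo = scores[int(round((1 - level) / 2 * (n_resamples - 1)))]
        hi = scores[int(round((1 + level) / 2 * (n_resamples - 1)))]
        ci[level] = (lo, hi)
    return ci
```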
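
On the README's note that the model "can be used via the transformers library, but this is not recommended as the CRF layer is not supported": a minimal sketch of that fallback is shown below. It is an assumption about usage rather than part of the model card, and because CRF decoding is skipped the predicted tags may differ from the tner library's output.

```python
# Hedged sketch: loading the checkpoint with a plain transformers pipeline (no CRF decoding).
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="tner/roberta-large-tweetner7-2020-selflabel2020-all",
    aggregation_strategy="simple",  # merge sub-word pieces into entity spans
)
print(ner("Jacob Collier is a Grammy awarded English artist from London"))
```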