GuillemGSubies committed
Commit
352e67d
1 Parent(s): b593755

Add README.md

Files changed (1): README.md (+6 -6)
@@ -7,18 +7,18 @@ tags:
 - bert-base-spanish-wwm-cased
 license: cc-by-4.0
 datasets:
-- "bigbio/distemist"
+- "ehealth_kd"
 metrics:
 - f1
 
 model-index:
-- name: IIC/bert-base-spanish-wwm-cased-distemist
+- name: IIC/bert-base-spanish-wwm-cased-ehealth_kd
   results:
   - task:
       type: token-classification
     dataset:
-      name: distemist
-      type: bigbio/distemist
+      name: eHealth-KD
+      type: ehealth_kd
       split: test
     metrics:
     - name: f1
@@ -28,9 +28,9 @@ pipeline_tag: token-classification
 
 ---
 
-# bert-base-spanish-wwm-cased-distemist
+# bert-base-spanish-wwm-cased-ehealth_kd
 
-This model is a finetuned version of bert-base-spanish-wwm-cased for the distemist dataset used in a benchmark in the paper TODO. The model has a F1 of 0.843
+This model is a fine-tuned version of bert-base-spanish-wwm-cased for the eHealth-KD dataset, used in the benchmark described in the paper TODO. It achieves an F1 score of 0.843.
 
 Please refer to the original publication for more information TODO LINK