pucpr committed
Commit 0d889f9
1 Parent(s): b1823ec

Update README.md

Files changed (1)
  1. README.md +32 -0
README.md CHANGED
@@ -10,10 +10,42 @@ datasets:
  thumbnail: "https://raw.githubusercontent.com/HAILab-PUCPR/BioBERTpt/master/images/logo-biobertpr1.png"
  ---
 
+ <img src="https://raw.githubusercontent.com/HAILab-PUCPR/BioBERTpt/master/images/logo-biobertpr1.png" alt="Logo BioBERTpt">
+
  # Portuguese Clinical NER - Medical
 
  The Medical NER model is part of the [BioBERTpt project](https://www.aclweb.org/anthology/2020.clinicalnlp-1.7/), in which 13 clinical entity models (compatible with UMLS) were trained. All NER models from the "pucpr" user were trained on the Brazilian clinical corpus [SemClinBr](https://github.com/HAILab-PUCPR/SemClinBr), for 10 epochs with IOB2-format labels, starting from the BioBERTpt(all) model.
 
+ ## Acknowledgements
+
+ This study was financed in part by the Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - Brasil (CAPES) - Finance Code 001.
+
+ ## Citation
+
+ ```
+ @inproceedings{schneider-etal-2020-biobertpt,
+     title = "{B}io{BERT}pt - A {P}ortuguese Neural Language Model for Clinical Named Entity Recognition",
+     author = "Schneider, Elisa Terumi Rubel and
+       de Souza, Jo{\~a}o Vitor Andrioli and
+       Knafou, Julien and
+       Oliveira, Lucas Emanuel Silva e and
+       Copara, Jenny and
+       Gumiel, Yohan Bonescki and
+       Oliveira, Lucas Ferro Antunes de and
+       Paraiso, Emerson Cabrera and
+       Teodoro, Douglas and
+       Barra, Cl{\'a}udia Maria Cabral Moro",
+     booktitle = "Proceedings of the 3rd Clinical Natural Language Processing Workshop",
+     month = nov,
+     year = "2020",
+     address = "Online",
+     publisher = "Association for Computational Linguistics",
+     url = "https://www.aclweb.org/anthology/2020.clinicalnlp-1.7",
+     pages = "65--72",
+     abstract = "With the growing number of electronic health record data, clinical NLP tasks have become increasingly relevant to unlock valuable information from unstructured clinical text. Although the performance of downstream NLP tasks, such as named-entity recognition (NER), in English corpus has recently improved by contextualised language models, less research is available for clinical texts in low resource languages. Our goal is to assess a deep contextual embedding model for Portuguese, so called BioBERTpt, to support clinical and biomedical NER. We transfer learned information encoded in a multilingual-BERT model to a corpora of clinical narratives and biomedical-scientific papers in Brazilian Portuguese. To evaluate the performance of BioBERTpt, we ran NER experiments on two annotated corpora containing clinical narratives and compared the results with existing BERT models. Our in-domain model outperformed the baseline model in F1-score by 2.72{\%}, achieving higher performance in 11 out of 13 assessed entities. We demonstrate that enriching contextual embedding models with domain literature can play an important role in improving performance for specific NLP tasks. The transfer learning process enhanced the Portuguese biomedical NER model by reducing the necessity of labeled data and the demand for retraining a whole new model.",
+ }
+ ```
+
  ## Questions?
 
  Post a GitHub issue on the [BioBERTpt repo](https://github.com/HAILab-PUCPR/BioBERTpt).
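
The updated README documents a token-classification (NER) model, so a short usage sketch may help alongside this diff: loading the model with the Hugging Face transformers pipeline. The model id `pucpr/clinicalnerpt-medical` and the example sentence are assumptions for illustration, not stated in this commit; substitute the model's actual repository name on the Hub.

```python
# Minimal usage sketch (assumptions noted in comments): load a BioBERTpt-based
# clinical NER model and run it with the token-classification pipeline.
from transformers import AutoModelForTokenClassification, AutoTokenizer, pipeline

model_id = "pucpr/clinicalnerpt-medical"  # assumed model id; adjust as needed

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# aggregation_strategy="simple" merges IOB2 sub-token predictions into entity spans.
ner = pipeline("ner", model=model, tokenizer=tokenizer, aggregation_strategy="simple")

# Illustrative Brazilian Portuguese clinical sentence.
print(ner("Paciente em uso de dipirona para controle da dor."))
```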