pierreguillou committed
Commit 6836c7e (1 parent: cfbac95)

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -50,7 +50,7 @@ Due to the small size of BERTimbau base and finetuning dataset, the model overfi
  - **accuracy**: 0.9759397808828684
  - **loss**: 0.10249536484479904
 
- **Note**: the model [pierreguillou/bert-base-cased-pt-lenerbr](https://huggingface.co/pierreguillou/bert-base-cased-pt-lenerbr) is a language model that was created through the finetuning of the model [BERTimbau base](https://huggingface.co/neuralmind/bert-base-portuguese-cased) on the dataset [LeNER-Br language modeling](https://huggingface.co/datasets/pierreguillou/lener_br_finetuning_language_model) by using a MASK objective. This first specialization of the language model before fine finetuning on the NER task improved a bit the model quality. To prove it, here are the results of the NER model finetuned from the model [BERTimbau base](https://huggingface.co/neuralmind/bert-base-portuguese-cased) (a non-specialized language model):
+ **Note**: the model [pierreguillou/bert-base-cased-pt-lenerbr](https://huggingface.co/pierreguillou/bert-base-cased-pt-lenerbr) is a language model that was created by finetuning [BERTimbau base](https://huggingface.co/neuralmind/bert-base-portuguese-cased) on the dataset [LeNER-Br language modeling](https://huggingface.co/datasets/pierreguillou/lener_br_finetuning_language_model) with a masked language modeling (MLM) objective. This first specialization of the language model before finetuning on the NER task slightly improved the model quality. As evidence, here are the results of the NER model finetuned directly from [BERTimbau base](https://huggingface.co/neuralmind/bert-base-portuguese-cased) (a non-specialized language model):
  - **f1**: 0.8716487228203504
  - **precision**: 0.8559286898839138
  - **recall**: 0.8879569892473118
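As a quick sanity check on the metrics in the diff above, the reported f1 is consistent with the harmonic mean of the reported precision and recall. A minimal sketch (the metric values are copied from the README; only the formula is added here):

```python
# Sanity check: F1 is the harmonic mean of precision and recall.
precision = 0.8559286898839138  # reported precision of the non-specialized NER model
recall = 0.8879569892473118     # reported recall

# F1 = 2 * P * R / (P + R)
f1 = 2 * precision * recall / (precision + recall)
print(f1)  # matches the reported f1 of ~0.87165
```

This confirms the three numbers were reported together consistently rather than from different evaluation runs.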