---
license: apache-2.0
---
# EngEmBERT3

## Model description

Cased, fine-tuned BERT model for English. It was trained on manually annotated Hungarian parliamentary speeches scraped from parlament.hu and translated to English with the Google Translate API.
## Intended uses & limitations

The model can be used like any other (cased) BERT model. It has been tested on recognizing positive, negative, and neutral sentences in parliamentary pre-agenda speeches, with the following label mapping:

- `Label_0`: Negative
- `Label_1`: Neutral
- `Label_2`: Positive
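As a sketch of how the model might be loaded for inference, the snippet below uses the `transformers` text-classification pipeline. The repository identifier `poltextlab/EngEmBERT3` and the default `LABEL_*` output names are assumptions inferred from this card, not confirmed by it; adjust them to match the actual checkpoint.

```python
# Minimal inference sketch (assumption: the checkpoint is published as
# "poltextlab/EngEmBERT3" on the Hugging Face Hub and keeps the default
# LABEL_0/LABEL_1/LABEL_2 output names from fine-tuning).
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="poltextlab/EngEmBERT3",  # hypothetical repo id, adjust as needed
)

# Per the card: Label_0 = Negative, Label_1 = Neutral, Label_2 = Positive
for result in classifier(
    ["I fully support this proposal.", "This bill is a disgrace."]
):
    print(result["label"], round(result["score"], 3))
```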
## Training

Fine-tuned version of the original bert-base-cased model, trained on the HunEmPoli corpus translated with the Google Translate API.
## Eval results

| class | precision | recall | f1-score | support |
|---|---|---|---|---|
| 0 (Negative) | 0.87 | 0.87 | 0.87 | 1118 |
| 1 (Neutral) | 1.00 | 0.26 | 0.41 | 35 |
| 2 (Positive) | 0.78 | 0.82 | 0.80 | 748 |
| accuracy | | | 0.83 | 1901 |
| macro avg | 0.88 | 0.65 | 0.69 | 1901 |
| weighted avg | 0.84 | 0.83 | 0.83 | 1901 |