
## Model description

Cased fine-tuned BERT model for English, trained on manually annotated Hungarian parliamentary speeches scraped from parlament.hu and translated to English with the Google Translate API.

## Intended uses & limitations

The model can be used like any other cased BERT model. It has been tested on recognizing positive, negative, and neutral sentences in parliamentary pre-agenda speeches, with the following label mapping (see the inference sketch after the list):

- `Label_0`: Neutral
- `Label_1`: Positive
- `Label_2`: Negative
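
A minimal inference sketch using the `transformers` pipeline. The model id below is a placeholder for this repository's id on the Hub, and the `LABEL_*` keys assume the default Hugging Face label naming; both are assumptions, not part of the original card.

```python
from transformers import pipeline

MODEL_ID = "MODEL_ID"  # placeholder: substitute this repository's Hub id

classifier = pipeline("text-classification", model=MODEL_ID)

# Map the raw label names to the sentiments listed above
# (assumes the default "LABEL_0" / "LABEL_1" / "LABEL_2" naming).
label_names = {"LABEL_0": "Neutral", "LABEL_1": "Positive", "LABEL_2": "Negative"}

result = classifier("The committee's proposal is a welcome step forward.")[0]
print(label_names.get(result["label"], result["label"]), result["score"])
```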

## Training

Fine-tuned version of the original `bert-base-cased` model, trained on the HunEmPoli corpus translated to English with the Google Translate API.
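
A minimal sketch of this fine-tuning setup. The tiny in-memory dataset stands in for the translated HunEmPoli corpus, and the hyperparameters are illustrative assumptions, not the authors' actual configuration.

```python
from datasets import Dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("bert-base-cased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-cased", num_labels=3  # 0: Neutral, 1: Positive, 2: Negative
)

# Placeholder examples; in practice these are the translated speeches.
train_ds = Dataset.from_dict({
    "text": ["We fully support this proposal.", "This bill is a disgrace."],
    "label": [1, 2],
})

def tokenize(batch):
    return tokenizer(batch["text"], truncation=True)

train_ds = train_ds.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="out", num_train_epochs=3),
    train_dataset=train_ds,
    tokenizer=tokenizer,  # enables dynamic padding via the default collator
)
trainer.train()
```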

## Eval results

| Class        | Precision | Recall | F1-score | Support |
|--------------|-----------|--------|----------|---------|
| 0 (Neutral)  | 0.93      | 0.40   | 0.56     | 35      |
| 1 (Positive) | 0.80      | 0.84   | 0.82     | 748     |
| 2 (Negative) | 0.88      | 0.87   | 0.88     | 1118    |
| accuracy     |           |        | 0.85     | 1901    |
| macro avg    | 0.87      | 0.70   | 0.75     | 1901    |
| weighted avg | 0.85      | 0.85   | 0.85     | 1901    |
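
For reference, a report in this format can be reproduced with scikit-learn's `classification_report`; the `y_true` and `y_pred` values below are placeholder labels, not the actual test data.

```python
from sklearn.metrics import classification_report

y_true = [0, 1, 2, 2, 1]   # gold labels (placeholder)
y_pred = [0, 1, 2, 1, 1]   # model predictions (placeholder)
print(classification_report(
    y_true, y_pred, target_names=["Neutral", "Positive", "Negative"]
))
```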