---
datasets:
  - SZTAKI-HLT/HunSum-2-abstractive
language:
  - hu
metrics:
  - rouge
pipeline_tag: text2text-generation
inference:
  parameters:
    num_beams: 5
    length_penalty: 2
    max_length: 128
    no_repeat_ngram_size: 3
    early_stopping: false
tags:
  - hubert
  - bert
  - summarization
license: apache-2.0
---

# Model Card for Bert2Bert-HunSum-2

Bert2Bert-HunSum-2 is a Hungarian abstractive summarization model trained on the SZTAKI-HLT/HunSum-2-abstractive dataset. It is based on SZTAKI-HLT/hubert-base-cc.

## Intended uses & limitations

- **Model type:** Text Summarization
- **Language(s) (NLP):** Hungarian
- **Resource(s) for more information:**

## Parameters

- **Batch Size:** 13
- **Learning Rate:** 5e-5
- **Weight Decay:** 0.01
- **Warmup Steps:** 16000
- **Epochs:** 10
- **no_repeat_ngram_size:** 3
- **num_beams:** 5
- **early_stopping:** False
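The generation settings above can be wired into a short inference sketch with the `transformers` library. The Hub model ID below is an assumption based on this card's title and organization; adjust it to the actual checkpoint name.

```python
# Assumed Hub ID, inferred from this card; replace with the real checkpoint name.
MODEL_NAME = "SZTAKI-HLT/Bert2Bert-HunSum-2"

# Generation parameters as listed in this card's inference settings.
GEN_KWARGS = {
    "num_beams": 5,
    "length_penalty": 2.0,
    "max_length": 128,
    "no_repeat_ngram_size": 3,
    "early_stopping": False,
}

def summarize(text: str) -> str:
    """Generate an abstractive Hungarian summary for `text`."""
    # Imported lazily so this sketch can be loaded without transformers installed.
    from transformers import AutoTokenizer, EncoderDecoderModel

    tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
    model = EncoderDecoderModel.from_pretrained(MODEL_NAME)
    inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)
    output_ids = model.generate(
        inputs.input_ids,
        attention_mask=inputs.attention_mask,
        **GEN_KWARGS,
    )
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```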

## Results

| Metric  | Value |
|---------|-------|
| ROUGE-1 | 40.95 |
| ROUGE-2 | 14.18 |
| ROUGE-L | 27.42 |
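As a rough illustration of what the scores above measure, ROUGE-1 F1 is the unigram-overlap F-score between a candidate summary and a reference. A minimal pure-Python sketch (production evaluation would use a proper ROUGE implementation with stemming and tokenization):

```python
from collections import Counter

def rouge1_f1(reference: str, candidate: str) -> float:
    """Whitespace-tokenized ROUGE-1 F1: clipped unigram overlap."""
    ref_counts = Counter(reference.split())
    cand_counts = Counter(candidate.split())
    # Clipped overlap: each candidate token counts at most as often
    # as it appears in the reference.
    overlap = sum(min(ref_counts[tok], cnt) for tok, cnt in cand_counts.items())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand_counts.values())
    recall = overlap / sum(ref_counts.values())
    return 2 * precision * recall / (precision + recall)
```

An identical candidate and reference yield 1.0; disjoint texts yield 0.0.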