
Language Models Fine-tuning on Question Generation: lmqg/mbart-large-cc25-jaquad

This model is a fine-tuned version of facebook/mbart-large-cc25 for the question generation task on the lmqg/qg_jaquad dataset (dataset_name: default).

Overview

Usage


```python
from transformers import pipeline

model_path = 'lmqg/mbart-large-cc25-jaquad'
pipe = pipeline("text2text-generation", model_path)

# Question generation: the answer span in the paragraph is marked with <hl> tokens
# (here the highlighted answer is 6月28日, i.e. June 28)
question = pipe('generate question: ゾフィーは貴族出身ではあったが王族出身ではなく、ハプスブルク家の皇位継承者であるフランツ・フェルディナントとの結婚は貴賤結婚となった。皇帝フランツ・ヨーゼフは、2人の間に生まれた子孫が皇位を継がないことを条件として結婚を承認していた。視察が予定されている<hl>6月28日<hl>は2人の14回目の結婚記念日であった。')
```
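The `<hl>` tokens in the example mark the answer span inside the paragraph, matching the `paragraph_answer` input type listed under the training hyperparameters. A minimal sketch of building such an input (the helper name is hypothetical, and an English paragraph is used for readability):

```python
def build_qg_input(paragraph: str, answer: str) -> str:
    """Wrap the first occurrence of the answer span in <hl> markers and
    prepend the task prefix, mirroring the paragraph_answer format above.
    Illustrative helper only; not part of the lmqg library."""
    if answer not in paragraph:
        raise ValueError("answer must be a substring of the paragraph")
    highlighted = paragraph.replace(answer, f"<hl>{answer}<hl>", 1)
    return "generate question: " + highlighted

text = "The visit scheduled for June 28 was their 14th wedding anniversary."
print(build_qg_input(text, "June 28"))
```

The resulting string can be passed directly to the pipeline call shown above.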

Evaluation Metrics

| Dataset | Type | BLEU4 | ROUGE-L | METEOR | BERTScore | MoverScore | Link |
|:---|:---|---:|---:|---:|---:|---:|:---|
| lmqg/qg_jaquad | default | 0.3216 | 0.5295 | 0.2997 | 0.8226 | 0.5988 | link |

Training hyperparameters

The following hyperparameters were used during fine-tuning:

  • dataset_path: lmqg/qg_jaquad
  • dataset_name: default
  • input_types: ['paragraph_answer']
  • output_types: ['question']
  • prefix_types: None
  • model: facebook/mbart-large-cc25
  • max_length: 512
  • max_length_output: 32
  • epoch: 12
  • batch: 64
  • lr: 0.0001
  • fp16: False
  • random_seed: 1
  • gradient_accumulation_steps: 1
  • label_smoothing: 0.15

The full configuration can be found in the fine-tuning config file.

Citation

TBA

