---
license: apache-2.0
datasets:
  - xquad
language:
  - multilingual
library_name: transformers
tags:
  - cross-lingual
  - extractive-question-answering
metrics:
  - f1
  - exact_match
---

## Description

This is the best-performing "mBERT-qa-en, skd" model from the paper *Promoting Generalized Cross-lingual Question Answering in Few-resource Scenarios via Self-knowledge Distillation*.

See the official GitHub repository for the code used to implement the methods described in the paper.

More info coming soon!
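Since the card declares `library_name: transformers` and the task is extractive question answering, the model can presumably be loaded with the standard `question-answering` pipeline. This is a minimal sketch; the repo id below is a hypothetical placeholder and should be replaced with this model's actual Hugging Face identifier.

```python
from transformers import pipeline


def load_qa_pipeline(model_id: str = "ccasimiro/mbert-qa-en-skd"):
    """Build an extractive-QA pipeline for the model.

    The default model_id is a hypothetical placeholder, not a
    confirmed repository name.
    """
    return pipeline("question-answering", model=model_id)


if __name__ == "__main__":
    qa = load_qa_pipeline()
    # The pipeline returns a dict with "answer", "score", "start", and "end".
    result = qa(
        question="Where is the Eiffel Tower located?",
        context="The Eiffel Tower is a landmark located in Paris, France.",
    )
    print(result["answer"])
```

Because the model is tagged as cross-lingual, the same pipeline call should accept questions and contexts in the other XQuAD languages without any change to the code.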

## How to Cite

To cite our work, use the following BibTeX entry:

```bibtex
@misc{carrino2023promoting,
      title={Promoting Generalized Cross-lingual Question Answering in Few-resource Scenarios via Self-knowledge Distillation},
      author={Casimiro Pio Carrino and Carlos Escolano and José A. R. Fonollosa},
      year={2023},
      eprint={2309.17134},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```