piotr-rybak committed
Commit d78d036
1 Parent(s): d3af9cf

Update README.md

Files changed (1)
  1. README.md +19 -7
README.md CHANGED
@@ -213,12 +213,24 @@ CC BY-SA 4.0
 ### Citation Information
 
 ```
-@misc{rybak2024polqa,
-    title={PolQA: Polish Question Answering Dataset},
-    author={Piotr Rybak and Piotr Przybyła and Maciej Ogrodniczuk},
-    year={2024},
-    eprint={2212.08897},
-    archivePrefix={arXiv},
-    primaryClass={cs.CL}
+@inproceedings{rybak-etal-2024-polqa-polish,
+    title = "{P}ol{QA}: {P}olish Question Answering Dataset",
+    author = "Rybak, Piotr and
+      Przyby{\l}a, Piotr and
+      Ogrodniczuk, Maciej",
+    editor = "Calzolari, Nicoletta and
+      Kan, Min-Yen and
+      Hoste, Veronique and
+      Lenci, Alessandro and
+      Sakti, Sakriani and
+      Xue, Nianwen",
+    booktitle = "Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)",
+    month = may,
+    year = "2024",
+    address = "Torino, Italia",
+    publisher = "ELRA and ICCL",
+    url = "https://aclanthology.org/2024.lrec-main.1125",
+    pages = "12846--12855",
+    abstract = "Recently proposed systems for open-domain question answering (OpenQA) require large amounts of training data to achieve state-of-the-art performance. However, data annotation is known to be time-consuming and therefore expensive to acquire. As a result, the appropriate datasets are available only for a handful of languages (mainly English and Chinese). In this work, we introduce and publicly release PolQA, the first Polish dataset for OpenQA. It consists of 7,000 questions, 87,525 manually labeled evidence passages, and a corpus of over 7,097,322 candidate passages. Each question is classified according to its formulation, type, as well as entity type of the answer. This resource allows us to evaluate the impact of different annotation choices on the performance of the QA system and propose an efficient annotation strategy that increases the passage retrieval accuracy@10 by 10.55 p.p. while reducing the annotation cost by 82{\%}.",
 }
 ```