maximoss committed
Commit 564da8e
Parent: 1cd70f0

Update README.md

Files changed (1): README.md (+23 −1)
README.md CHANGED
@@ -55,6 +55,28 @@ The task of automatic detection of contradictions between sentences is a sentenc
 **BibTeX:**
 
 ````BibTeX
+@inproceedings{skandalis-etal-2024-new-datasets,
+    title = "New Datasets for Automatic Detection of Textual Entailment and of Contradictions between Sentences in {F}rench",
+    author = "Skandalis, Maximos and
+      Moot, Richard and
+      Retor{\'e}, Christian and
+      Robillard, Simon",
+    editor = "Calzolari, Nicoletta and
+      Kan, Min-Yen and
+      Hoste, Veronique and
+      Lenci, Alessandro and
+      Sakti, Sakriani and
+      Xue, Nianwen",
+    booktitle = "Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)",
+    month = may,
+    year = "2024",
+    address = "Torino, Italy",
+    publisher = "ELRA and ICCL",
+    url = "https://aclanthology.org/2024.lrec-main.1065",
+    pages = "12173--12186",
+    abstract = "This paper introduces DACCORD, an original dataset in French for automatic detection of contradictions between sentences. It also presents new, manually translated versions of two datasets, namely the well known dataset RTE3 and the recent dataset GQNLI, from English to French, for the task of natural language inference / recognising textual entailment, which is a sentence-pair classification task. These datasets help increase the admittedly limited number of datasets in French available for these tasks. DACCORD consists of 1034 pairs of sentences and is the first dataset exclusively dedicated to this task and covering among others the topic of the Russian invasion in Ukraine. RTE3-FR contains 800 examples for each of its validation and test subsets, while GQNLI-FR is composed of 300 pairs of sentences and focuses specifically on the use of generalised quantifiers. Our experiments on these datasets show that they are more challenging than the two already existing datasets for the mainstream NLI task in French (XNLI, FraCaS). For languages other than English, most deep learning models for NLI tasks currently have only XNLI available as a training set. Additional datasets, such as ours for French, could permit different training and evaluation strategies, producing more robust results and reducing the inevitable biases present in any single dataset.",
+}
+
 @inproceedings{skandalis-etal-2023-daccord,
     title = "{DACCORD} : un jeu de donn{\'e}es pour la D{\'e}tection Automatique d{'}{\'e}non{C}{\'e}s {CO}nt{R}a{D}ictoires en fran{\c{c}}ais",
     author = "Skandalis, Maximos and
@@ -74,7 +96,7 @@ The task of automatic detection of contradictions between sentences is a sentenc
 
 **ACL:**
 
-Maximos Skandalis, Richard Moot, Christian Retoré, and Simon Robillard. 2024. *New datasets for automatic detection of textual entailment and of contradictions between sentences in French*. The 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024), Turin, Italy. European Language Resources Association (ELRA) and International Committee on Computational Linguistics (ICCL).
+Maximos Skandalis, Richard Moot, Christian Retoré, and Simon Robillard. 2024. [New Datasets for Automatic Detection of Textual Entailment and of Contradictions between Sentences in French](https://aclanthology.org/2024.lrec-main.1065). In *Proceedings of the 2024 Joint International Conference on Computational Linguistics, Language Resources and Evaluation (LREC-COLING 2024)*, pages 12173–12186, Torino, Italy. ELRA and ICCL.
 
 And
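
The README this commit edits is a Hugging Face dataset card, so readers citing the paper will usually also want to load the data. Below is a minimal sketch using the `datasets` library; the repo id `maximoss/daccord` and the split name are assumptions for illustration only and do not come from this commit, so substitute the ids shown on the actual dataset cards for DACCORD, RTE3-FR, or GQNLI-FR.

````python
# Minimal sketch: loading a French NLI / contradiction-detection dataset
# from the Hugging Face Hub. The repo id "maximoss/daccord" and the
# "train" split are hypothetical placeholders.
from datasets import load_dataset

dataset = load_dataset("maximoss/daccord")  # hypothetical repo id

# Each example is a sentence pair with a classification label
# (for DACCORD: contradiction vs. no contradiction).
for example in dataset["train"].select(range(3)):
    print(example)
````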