---
language: fr
license: mit
tags:
- zero-shot-classification
- nli
pipeline_tag: zero-shot-classification
widget:
- text: "Selon certains physiciens, un univers parallèle, miroir du nôtre ou relevant de ce que l'on appelle la théorie des branes, autoriserait des neutrons à sortir de notre Univers pour y entrer à nouveau. L'idée a été testée une nouvelle fois avec le réacteur nucléaire de l'Institut Laue-Langevin à Grenoble, plus précisément en utilisant le détecteur de l'expérience Stereo initialement conçu pour chasser des particules de matière noire potentielles, les neutrinos stériles."
  candidate_labels: "politique, science, sport, santé"
  hypothesis_template: "Ce texte parle de {}."
datasets:
- flue
---

DistilCamemBERT-NLI
===================

We present DistilCamemBERT-NLI, a version of [DistilCamemBERT](https://huggingface.co/cmarkea/distilcamembert-base) fine-tuned for the Natural Language Inference (NLI) task in French, also known as Recognizing Textual Entailment (RTE). The model is trained on the XNLI dataset, where the task is to determine whether a premise entails, contradicts, or neither entails nor contradicts a hypothesis.
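
As an illustration of the NLI formulation, the sketch below scores a premise/hypothesis pair directly with the fine-tuned classification head. The premise and hypothesis sentences are invented for the example, and the label names are read from the model configuration rather than assumed.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_name = "cmarkea/distilcamembert-base-nli"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

# Example premise/hypothesis pair (invented for illustration).
premise = "Le chat dort paisiblement sur le canapé."
hypothesis = "Un animal est en train de se reposer."

# Encode the pair as a single sequence, as expected for sentence-pair classification.
inputs = tokenizer(premise, hypothesis, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Report the probability of each class, using the label names stored in the config.
probs = logits.softmax(dim=-1).squeeze().tolist()
for idx, prob in enumerate(probs):
    print(f"{model.config.id2label[idx]}: {prob:.3f}")
```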

This model is comparable to [BaptisteDoyen/camembert-base-xnli](https://huggingface.co/BaptisteDoyen/camembert-base-xnli), which is based on the [CamemBERT](https://huggingface.co/camembert-base) model. The drawback of CamemBERT-based models appears at scaling time, for example in the production phase, where the inference cost can become a technological issue. To counteract this effect, we propose this model, which divides the inference time by two for the same power consumption, thanks to DistilCamemBERT.
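
Since the model is exposed through the zero-shot-classification pipeline (see the metadata above), a minimal usage sketch, reusing the widget example, candidate labels, and hypothesis template from the front matter, could look like this:

```python
from transformers import pipeline

classifier = pipeline(
    task="zero-shot-classification",
    model="cmarkea/distilcamembert-base-nli",
)

# Widget example from the front matter (truncated here for brevity).
sequence = (
    "Selon certains physiciens, un univers parallèle, miroir du nôtre ou relevant de "
    "ce que l'on appelle la théorie des branes, autoriserait des neutrons à sortir de "
    "notre Univers pour y entrer à nouveau."
)

result = classifier(
    sequence,
    candidate_labels=["politique", "science", "sport", "santé"],
    hypothesis_template="Ce texte parle de {}.",
)
print(result["labels"][0])   # most plausible topic among the candidate labels
print(result["scores"][0])   # its associated score
```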

Dataset
=======

The [XNLI](https://huggingface.co/datasets/xnli) dataset is composed of 392,702 premise/hypothesis pairs for training and 5,010 pairs for testing.
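
As a quick sanity check, the split sizes above can be verified with the datasets library (a minimal sketch, assuming the `xnli` dataset with its `fr` configuration):

```python
from datasets import load_dataset

# French configuration of XNLI: premise/hypothesis pairs with an entailment label.
xnli_fr = load_dataset("xnli", "fr")

print(len(xnli_fr["train"]))  # 392,702 training pairs
print(len(xnli_fr["test"]))   # 5,010 test pairs
print(xnli_fr["train"].features["label"].names)  # class names used by the dataset
print(xnli_fr["train"][0])    # one premise/hypothesis example
```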

The evaluation results on the test set are the following:

| **class**         | **precision (%)** | **f1-score (%)** | **support** |
| :---------------: | :---------------: | :--------------: | :---------: |
| **global**        | 77.70             | 77.45            | 5,010       |
| **contradiction** | 78.00             | 79.54            | 1,670       |
| **entailment**    | 82.90             | 78.87            | 1,670       |
| **neutral**       | 72.18             | 74.04            | 1,670       |

The table below compares this model with other NLI models available for French, in terms of inference time (ms) and Matthews correlation coefficient (MCC x100) on the test set:

| **model**                                                                                                  | **time (ms)** | **MCC (x100)** |
| :--------------------------------------------------------------------------------------------------------: | :-----------: | :------------: |
| [cmarkea/distilcamembert-base-nli](https://huggingface.co/cmarkea/distilcamembert-base-nli)                | **51.35**     | **66.24**      |
| [BaptisteDoyen/camembert-base-xnli](https://huggingface.co/BaptisteDoyen/camembert-base-xnli)              | 105.0         | 72.67          |
| [MoritzLaurer/mDeBERTa-v3-base-mnli-xnli](https://huggingface.co/MoritzLaurer/mDeBERTa-v3-base-mnli-xnli)  | 299.18        | 75.15          |
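
The exact benchmarking protocol behind these figures is not detailed here; the sketch below only illustrates, under simplified assumptions (per-sample CPU timing with `time.perf_counter`, scikit-learn's `matthews_corrcoef`, and a model whose label names match those of the dataset), how such a comparison could be set up.

```python
import time

import torch
from datasets import load_dataset
from sklearn.metrics import matthews_corrcoef
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_name = "cmarkea/distilcamembert-base-nli"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
model.eval()

test_set = load_dataset("xnli", "fr", split="test")
label_names = test_set.features["label"].names  # dataset's class names

predictions, references, timings = [], [], []
for example in test_set:
    inputs = tokenizer(example["premise"], example["hypothesis"], return_tensors="pt")
    start = time.perf_counter()
    with torch.no_grad():
        logits = model(**inputs).logits
    timings.append(time.perf_counter() - start)
    # Assumes the model's label names (e.g. "entailment") match the dataset's;
    # adjust this mapping if the two configurations differ.
    predicted_name = model.config.id2label[int(logits.argmax(dim=-1))].lower()
    predictions.append(label_names.index(predicted_name))
    references.append(example["label"])

print(f"mean inference time: {1000 * sum(timings) / len(timings):.2f} ms")
print(f"MCC (x100): {100 * matthews_corrcoef(references, predictions):.2f}")
```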