MNLI: a bitext classification task to predict whether one sentence entails another. The fine-tuned model concatenates the two sentences, appends an EOS token, and passes the sequence to both the BART encoder and decoder. In contrast to BERT, which classifies from a special prepended token, the representation of the final EOS token is used to classify the relation between the sentences.
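The mechanism above can be sketched as follows. This is a toy illustration with made-up dimensions and random weights, not the actual BART model: the embedding lookup stands in for the full pretrained encoder-decoder, and the label names are the standard MNLI classes. In the real model, the final EOS token attends over the whole concatenated pair, so its decoder state summarizes both sentences.

```python
import numpy as np

rng = np.random.default_rng(0)

EOS = 2                      # hypothetical EOS token id
premise = [5, 11, 7]         # toy token ids for sentence 1
hypothesis = [9, 4]          # toy token ids for sentence 2

# Concatenate the two sentences and append an EOS token.
input_ids = premise + hypothesis + [EOS]

vocab, d_model, n_labels = 16, 8, 3
embed = rng.normal(size=(vocab, d_model))

# Stand-in for the encoder-decoder: plain embedding lookup.
# A real BART forward pass would run full cross-attention here.
decoder_states = embed[input_ids]            # (seq_len, d_model)

# Classification head applied only to the final (EOS) token's
# representation, unlike BERT's use of the first token.
W = rng.normal(size=(d_model, n_labels))
logits = decoder_states[-1] @ W              # (n_labels,)

labels = ["entailment", "neutral", "contradiction"]
prediction = labels[int(np.argmax(logits))]
print(prediction)
```

The key design point is which position feeds the classifier: because BART's decoder is autoregressive, only the last token's state has seen the entire sequence, so the EOS representation plays the role that the first-token representation plays in BERT.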
ELI5: a long-form abstractive question answering dataset. Models generate answers conditioned on the concatenation of a question and supporting documents.
ConvAI2: a dialogue response generation task, conditioned on context and a persona.
CNN/DM: a news summarization dataset. Summaries here tend to be closely related to the source sentences, making the task relatively extractive.