---
license: cc-by-4.0
---
This is a model based on mT5-L that predicts a binary label for a given article and summary for Q5 (main idea(s)), as defined in the [SEAHORSE paper](https://arxiv.org/abs/2305.13194) (Clark et al., 2023).

It is trained similarly to the models in the [TRUE paper (Honovich et al., 2022)](https://arxiv.org/pdf/2204.04991.pdf) on human ratings from the SEAHORSE dataset in 6 languages:
- German
- English
- Spanish
- Russian
- Turkish
- Vietnamese

The input format for the model is: "premise: ARTICLE hypothesis: SUMMARY".
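As a rough illustration of that format, here is a minimal inference sketch using the Hugging Face Transformers library. The checkpoint identifier below is a placeholder assumption (substitute this repository's actual model ID), and decoding the output as a textual "0"/"1" label follows the TRUE-style setup referenced above rather than anything this card specifies.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Placeholder model ID (assumption): replace with this repository's actual identifier.
model_id = "google/seahorse-large-q5"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

article = "The city council voted on Tuesday to expand the bike lane network downtown."
summary = "The council approved more downtown bike lanes."

# Build the input in the format described above: "premise: ARTICLE hypothesis: SUMMARY".
text = f"premise: {article} hypothesis: {summary}"
inputs = tokenizer(text, return_tensors="pt")

# TRUE-style metrics typically decode to a textual "1" (positive) or "0" (negative) label;
# a continuous score can also be derived from the probability assigned to the "1" token.
output_ids = model.generate(**inputs, max_new_tokens=2)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```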
There is also an XXL version of this model, as well as metrics trained for each of the other 5 dimensions described in the original paper.

The full citation for the SEAHORSE paper is:
```
@misc{clark2023seahorse,
      title={SEAHORSE: A Multilingual, Multifaceted Dataset for Summarization Evaluation},
      author={Elizabeth Clark and Shruti Rijhwani and Sebastian Gehrmann and Joshua Maynez and Roee Aharoni and Vitaly Nikolaev and Thibault Sellam and Aditya Siddhant and Dipanjan Das and Ankur P. Parikh},
      year={2023},
      eprint={2305.13194},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```