igorgavi committed on
Commit 54137d9
1 Parent(s): f6147de

Update README.md

Files changed (1): README.md (+25, -2)
README.md CHANGED
@@ -57,11 +57,34 @@ model are the already existing and vastly applied BART-Large CNN, Pegasus-XSUM a
  the Sumy Python Library and include SumyRandom, SumyLuhn, SumyLsa, SumyLexRank, SumyTextRank, SumySumBasic, SumyKL and SumyReduction. Each of the
  methods used for text summarization will be described individually in the following sections.
 
- | Model | Params |
- |:-----:|:------:|
 
  ![architecture](https://github.com/marcosdib/S2Query/Classification_Architecture_model.png)
 
+ ## Methods
+
+ # SumyRandom
+
+ # SumyLuhn
+
+ # SumyLsa
+
+ # SumyLexRank
+
+ # SumyTextRank
+
+ # SumySumBasic
+
+ # SumyKL
+
+ # SumyReduction
+
+ # BART-Large CNN
+
+ # Pegasus-XSUM
+
+ # mT5 Multilingual XLSUM
+
+
  ## Model variations
 
  With the motivation to increase accuracy obtained with baseline implementation, we implemented a transfer learning
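
The eight extractive methods named in the updated README all come from the Sumy Python library. As a rough, minimal sketch of how one of them is typically invoked (the input text, language, summarizer choice, and sentence count below are illustrative assumptions, not taken from this repository):

```python
# Minimal sketch of running one of the Sumy extractive summarizers listed above.
# Requires the sumy package (and the NLTK "punkt" tokenizer data).
from sumy.parsers.plaintext import PlaintextParser
from sumy.nlp.tokenizers import Tokenizer
from sumy.summarizers.lex_rank import LexRankSummarizer  # e.g. the SumyLexRank method

# Placeholder input; any plain-text document is handled the same way.
text = (
    "Automatic summarization condenses a document into a shorter version. "
    "Extractive methods select the most informative sentences from the source "
    "instead of generating new ones."
)

# Parse the raw text and build the summarizer.
parser = PlaintextParser.from_string(text, Tokenizer("english"))
summarizer = LexRankSummarizer()

# Ask for a one-sentence summary; Sumy yields Sentence objects.
for sentence in summarizer(parser.document, sentences_count=1):
    print(sentence)
```

The other Sumy methods (SumyLuhn, SumyLsa, SumyTextRank, and so on) follow the same parse-then-summarize call, differing only in the summarizer class that is imported.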
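The abstractive baselines named in the new headings (BART-Large CNN, Pegasus-XSUM, mT5 Multilingual XLSUM) are transformer checkpoints that are normally driven through the Hugging Face summarization pipeline. A minimal sketch, assuming the public facebook/bart-large-cnn checkpoint and illustrative generation lengths (neither is taken from this repository):

```python
# Minimal sketch of abstractive summarization with a transformers pipeline.
# The checkpoint id and generation lengths are assumptions for illustration.
from transformers import pipeline

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

text = (
    "Automatic summarization condenses a document into a shorter version "
    "while preserving its key information."
)

# The pipeline returns a list of dicts with a "summary_text" field.
result = summarizer(text, max_length=40, min_length=10, do_sample=False)
print(result[0]["summary_text"])
```

A Pegasus-XSUM or mT5 XLSum checkpoint can be swapped in through the model argument without changing the rest of the call.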