habdine committed
Commit d80c258 · verified · 1 Parent(s): 0fac219

Update app.py

Files changed (1)
  1. app.py +2 -2
app.py CHANGED
@@ -40,11 +40,11 @@ def get_input(text) -> Iterator[str]:
 
 desc = f'''
 This is a demo for Greek News summarization using [greekbart-news24-abstract](https://huggingface.co/dascim/greekbart-news24-abstract), a finetuned version of [GreekBART](https://huggingface.co/dascim/greekbart).
-GreekBART is the first Greek sequence to sequence pretrained model. It is pretrained on 77GB of Greek raw text using the CNRS Jean Zay supercomputer. Our model is based on BART. Unlike already existing BERT-based Greek language models such as GreekBERT and Electra, GreekBART is particularly well-suited for generative tasks, since not only its encoder but also its decoder is pretrained. Our models are competitive to GreekBERT and XLM-R in discriminative tasks and it is the first BART BASE model that can generative tasks such as abstractive summarization for the Greek language.
+GreekBART is the first Greek sequence to sequence pretrained model. It is pretrained on 77GB of Greek raw text using the CNRS Jean Zay supercomputer. Our model is based on BART. Unlike already existing BERT-based Greek language models such as GreekBERT and Electra, GreekBART is particularly well-suited for generative tasks, since not only its encoder but also its decoder is pretrained.
 
 📑 Paper: [GreekBART: The First Pretrained Greek Sequence-to-Sequence Model](https://arxiv.org/abs/2304.00869)
 
 Enter your text (maximum of 1024 tokens of Greek news article) to get a summary.
 '''
-iface = gr.Interface(fn=get_input,inputs="text",outputs="text",title = "Greek News Summarizer",description=desc)
+iface = gr.Interface(fn=get_input,inputs="text",outputs="text",title = "🇬🇷 Greek News Summarizer 🇬🇷",description=desc)
 iface.launch()
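
For context, only the description string and the gr.Interface call appear in this diff; the body of get_input is not shown. The following is a minimal sketch of how the pieces could fit together in a complete app.py, assuming get_input wraps the transformers summarization pipeline around dascim/greekbart-news24-abstract and yields one final summary. The pipeline call, the truncation flag, and the shortened desc placeholder are illustrative assumptions, not part of the commit.

# Minimal sketch of a complete app.py around the lines changed above.
# Assumption: get_input uses the transformers summarization pipeline;
# this is not shown in the diff and is included only for illustration.
from typing import Iterator

import gradio as gr
from transformers import pipeline

# Fine-tuned checkpoint named in the demo description.
summarizer = pipeline("summarization", model="dascim/greekbart-news24-abstract")

def get_input(text) -> Iterator[str]:
    # Gradio treats a generator function as a streaming output:
    # each yielded string updates the text shown in the output box.
    # truncation=True keeps inputs within the model's 1024-token limit (assumption).
    result = summarizer(text, truncation=True)
    yield result[0]["summary_text"]

desc = "..."  # the markdown description shown in the diff above

iface = gr.Interface(fn=get_input, inputs="text", outputs="text",
                     title="🇬🇷 Greek News Summarizer 🇬🇷", description=desc)
iface.launch()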