gokceuludogan committed on
Commit 7ce3af2 · verified · 1 Parent(s): cffeb6e

Update apps/home.py

Files changed (1)
  1. apps/home.py +32 -32
apps/home.py CHANGED
@@ -27,38 +27,38 @@ def write():
  )
  st.markdown(
  """
- Welcome to our Huggingface space, where you can explore the capabilities of TURNA.
-
- **Key Features of TURNA:**
-
- - **Powerful Architecture:** TURNA contains 1.1B parameters, and was pre-trained with an encoder-decoder architecture following the UL2 framework on 43B tokens from various domains.
- - **Diverse Training Data:** Our model is trained on a varied dataset of 43 billion tokens, covering a wide array of domains.
- - **Broad Applications:** TURNA is fine-tuned for a variety of generation and understanding tasks, including:
- - Summarization
- - Paraphrasing
- - News title generation
- - Sentiment classification
- - Text categorization
- - Named entity recognition
- - Part-of-speech tagging
- - Semantic textual similarity
- - Natural language inference
-
- Explore various applications powered by **TURNA** using the **Navigation** bar.
-
- Refer to our [paper](https://arxiv.org/abs/2401.14373) for more details...
-
- ### Citation
- ```bibtex
- @misc{uludoğan2024turna,
- title={TURNA: A Turkish Encoder-Decoder Language Model for Enhanced Understanding and Generation},
- author={Gökçe Uludoğan and Zeynep Yirmibeşoğlu Balal and Furkan Akkurt and Melikşah Türker and Onur Güngör and Susan Üsküdarlı},
- year={2024},
- eprint={2401.14373},
- archivePrefix={arXiv},
- primaryClass={cs.CL}
- }
- ```
+ Welcome to our Huggingface space, where you can explore the capabilities of TURNA.
+
+ **Key Features of TURNA:**
+
+ - **Powerful Architecture:** TURNA contains 1.1B parameters, and was pre-trained with an encoder-decoder architecture following the UL2 framework on 43B tokens from various domains.
+ - **Diverse Training Data:** Our model is trained on a varied dataset of 43 billion tokens, covering a wide array of domains.
+ - **Broad Applications:** TURNA is fine-tuned for a variety of generation and understanding tasks, including:
+ - Summarization
+ - Paraphrasing
+ - News title generation
+ - Sentiment classification
+ - Text categorization
+ - Named entity recognition
+ - Part-of-speech tagging
+ - Semantic textual similarity
+ - Natural language inference
+
+ Explore various applications powered by **TURNA** using the **Navigation** bar.
+
+ Refer to our [paper](https://arxiv.org/abs/2401.14373) for more details...
+
+ ### Citation
+ ```bibtex
+ @misc{uludoğan2024turna,
+ title={TURNA: A Turkish Encoder-Decoder Language Model for Enhanced Understanding and Generation},
+ author={Gökçe Uludoğan and Zeynep Yirmibeşoğlu Balal and Furkan Akkurt and Melikşah Türker and Onur Güngör and Susan Üsküdarlı},
+ year={2024},
+ eprint={2401.14373},
+ archivePrefix={arXiv},
+ primaryClass={cs.CL}
+ }
+ ```
  """)
  st.markdown(
  """