Gabriel committed
Commit 9ec99d2
1 Parent(s): 9723ff3

Update text.py

Files changed (1): text.py (+2 -2)
text.py CHANGED
@@ -14,9 +14,9 @@ The underlying engines for the Abstractive part are transformer based model BART
 
 To see more in depth regarding the training go to model card: [Gabriel/bart-base-cnn-xsum-swe](https://huggingface.co/Gabriel/bart-base-cnn-xsum-swe).
 """
-## _
+
 sum_app_text_tab_2= """
-
+##
 The core idea behind the training procedure is sequential adoption through transfer learning, i.e multiple phases for fine-tuning a pretrained model on different datasets. The figure below illustrates how the skill level of the model increases at each step:
 ![alt text2](file/BART_SEQ.png)
 