Gabriel committed on
Commit
9723ff3
1 Parent(s): 45f7d73

Update text.py

Files changed (1)
  1. text.py +1 -1
text.py CHANGED
@@ -14,7 +14,7 @@ The underlying engines for the Abstractive part are transformer based model BART
 
 To see more in depth regarding the training go to model card: [Gabriel/bart-base-cnn-xsum-swe](https://huggingface.co/Gabriel/bart-base-cnn-xsum-swe).
 """
- ##
+ ## _
 sum_app_text_tab_2= """
 
 The core idea behind the training procedure is sequential adoption through transfer learning, i.e multiple phases for fine-tuning a pretrained model on different datasets. The figure below illustrates how the skill level of the model increases at each step:
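
The app text in the diff describes the training recipe behind Gabriel/bart-base-cnn-xsum-swe as sequential adaptation through transfer learning: several fine-tuning phases, each starting from the weights produced by the previous one. Below is a minimal sketch of that idea using the Hugging Face transformers and datasets libraries; the starting checkpoint, dataset identifiers, column names, and hyperparameters are placeholders for illustration, not the actual setup used for the model.

```python
# Sketch of sequential (multi-phase) fine-tuning for summarization.
# Dataset ids, column names, and hyperparameters below are assumptions.
from transformers import (
    AutoTokenizer,
    AutoModelForSeq2SeqLM,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
    DataCollatorForSeq2Seq,
)
from datasets import load_dataset

# Assumed starting point; each phase resumes from the previous phase's weights.
checkpoint = "facebook/bart-base"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)

def preprocess(batch):
    # Tokenize source documents and target summaries.
    # "document" / "summary" column names are placeholders for this sketch.
    model_inputs = tokenizer(batch["document"], max_length=1024, truncation=True)
    labels = tokenizer(text_target=batch["summary"], max_length=128, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

# Each phase fine-tunes on a different summarization dataset, reusing the weights
# from the previous phase -- the "sequential adaptation" idea in the app text.
phases = ["cnn_phase", "xsum_phase", "swedish_phase"]  # hypothetical dataset ids

for phase in phases:
    dataset = load_dataset(phase)  # placeholder identifier
    tokenized = dataset.map(preprocess, batched=True)
    model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)
    args = Seq2SeqTrainingArguments(
        output_dir=f"bart-{phase}",
        num_train_epochs=3,
        per_device_train_batch_size=8,
        predict_with_generate=True,
    )
    trainer = Seq2SeqTrainer(
        model=model,
        args=args,
        train_dataset=tokenized["train"],
        data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
        tokenizer=tokenizer,
    )
    trainer.train()
    trainer.save_model(f"bart-{phase}")
    checkpoint = f"bart-{phase}"  # next phase starts from this phase's weights
```

Each loop iteration reloads the checkpoint saved by the previous phase, which is the mechanism the app text refers to when it says the model's skill level increases at each step.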