jordiclive committed on
Commit df91210
1 Parent(s): 3b515a1

Update README.md

Files changed (1):
  1. README.md +2 -9
README.md CHANGED
@@ -19,7 +19,7 @@ metrics:
 
  # Multi-purpose Summarizer (Fine-tuned 3B google/flan-t5-xl on several Summarization datasets)
 
- <a href="https://colab.research.google.com/gist/pszemraj/5dc89199a631a9c6cfd7e386011452a0/demo-flan-t5-large-grammar-synthesis.ipynb">
+ <a href="https://colab.research.google.com/drive/1EYfnIoG-r5lL2-3oiO_YdYEVKB0pAa9h">
  <img src="https://colab.research.google.com/assets/colab-badge.svg" alt="Open In Colab"/>
  </a>
 
@@ -67,18 +67,11 @@ results = summarizer(
  )
  ```
 
- **For Batch Inference:** see [this discussion thread](https://huggingface.co/pszemraj/flan-t5-large-grammar-synthesis/discussions/1) for details, but essentially the dataset consists of several sentences at a time, and so I'd recommend running inference **in the same fashion:** batches of 64-96 tokens ish (or, 2-3 sentences split with regex)
-
- - it is also helpful to **first** check whether or not a given sentence needs grammar correction before using the text2text model. You can do this with BERT-type models fine-tuned on CoLA like `textattack/roberta-base-CoLA`
- - I made a notebook demonstrating batch inference [here](https://colab.research.google.com/gist/pszemraj/6e961b08970f98479511bb1e17cdb4f0/batch-grammar-check-correct-demo.ipynb)
-
-
-
  ---
 
  ## Training procedure
 
- - Training was done in BF16, deepspeed stage 2 for 6 epochs with ROUGE2 monitored on the validation set.
+ - Training was done in BF16, deepspeed stage 2 for 6 epochs with ROUGE-2 monitored on the validation set.
 
  ## Hardware
  - GPU count 8 NVIDIA A100-SXM4-40GB
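
The removed batch-inference note suggests splitting input into chunks of 2-3 sentences with a regex before running the model. A minimal sketch of that splitting step, assuming a naive punctuation-based regex (the function name and pattern are illustrative, not from the repo):

```python
import re

def chunk_sentences(text, per_chunk=3):
    # Naive split on ., !, or ? followed by whitespace; a sketch, not a
    # robust sentence tokenizer (abbreviations like "e.g." will mis-split).
    sentences = [s for s in re.split(r"(?<=[.!?])\s+", text.strip()) if s]
    # Group consecutive sentences into small chunks to feed the model batch-wise.
    return [
        " ".join(sentences[i:i + per_chunk])
        for i in range(0, len(sentences), per_chunk)
    ]

text = "First sentence. Second one! Third here? Fourth. Fifth."
print(chunk_sentences(text, per_chunk=2))
```

Each chunk can then be passed to the pipeline as one input, keeping each call near the 64-96 token range the note recommends.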
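
The training-procedure line mentions BF16 with DeepSpeed stage 2. A minimal sketch of what that combination looks like as a DeepSpeed config, written as a Python dict; the actual config used for this run is not in the commit, so the values here are illustrative assumptions using standard DeepSpeed config keys:

```python
# Sketch of a DeepSpeed ZeRO stage 2 + BF16 config, as implied by the
# training note above. Illustrative only; not the repo's real config.
ds_config = {
    "bf16": {"enabled": True},          # train in bfloat16
    "zero_optimization": {"stage": 2},  # ZeRO stage 2: shard optimizer state and gradients
}
print(ds_config)
```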