---
license: bsd
pipeline_tag: text2text-generation
---

This model fine-tunes flan-t5-base using scraped data as input, ChatGPT responses as output, and a limited set of instruction types. Training used the declare-lab flan-alpaca repo: https://github.com/declare-lab/flan-alpaca

Training setup:
- Epochs: 1
- Max input tokens: 512
- Max output tokens: 512
- Hardware: free Google Colab T4
- Training time: ~40 minutes
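Since this is a flan-t5-base fine-tune, inference can be sketched with the `transformers` library. This is a hypothetical example: the model ID is a placeholder (the card does not state the repo name), and it assumes `transformers` is installed; the 512-token limits mirror the training settings above.

```python
"""Hypothetical inference sketch for this fine-tuned flan-t5-base model.

MODEL_ID is a placeholder -- replace it with the actual Hugging Face
repo name. Assumes the `transformers` library is installed.
"""

MODEL_ID = "your-username/your-finetuned-flan-t5-base"  # placeholder
MAX_INPUT_TOKENS = 512   # input token limit used during fine-tuning
MAX_OUTPUT_TOKENS = 512  # output token limit used during fine-tuning


def generate(model_id: str, prompt: str) -> str:
    # Imported lazily so the file can be read without transformers installed.
    from transformers import T5ForConditionalGeneration, T5Tokenizer

    tokenizer = T5Tokenizer.from_pretrained(model_id)
    model = T5ForConditionalGeneration.from_pretrained(model_id)
    inputs = tokenizer(
        prompt, truncation=True, max_length=MAX_INPUT_TOKENS, return_tensors="pt"
    )
    output_ids = model.generate(**inputs, max_new_tokens=MAX_OUTPUT_TOKENS)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)


if __name__ == "__main__":
    print(generate(MODEL_ID, "Explain what a T4 GPU is."))
```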