GPT-2 (355M) fine-tuned on 0.5M PubMed abstracts. Used on writemeanabstract.com and in the following preprint:
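
A minimal usage sketch with the `transformers` library, assuming the model is published on the Hugging Face Hub; the identifier `your-username/gpt2-pubmed-medium` below is a placeholder, not the model's actual repo name:

```python
from transformers import GPT2LMHeadModel, GPT2Tokenizer

# Placeholder identifier; substitute the model's actual Hub repo name.
model_name = "your-username/gpt2-pubmed-medium"
tokenizer = GPT2Tokenizer.from_pretrained(model_name)
model = GPT2LMHeadModel.from_pretrained(model_name)

# Prompt with the opening of an abstract and let the model continue it.
prompt = "Objective: To evaluate the efficacy of"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_length=128,
    do_sample=True,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```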