Introduction

#1
by dannoncaffeine - opened

GPT2-124M-wikitext-v0.1

This is a practical hands-on result that helped me build a foundation with 🤗 Transformers and 🤗 Datasets. I fine-tuned GPT2 on wikitext (103-raw-v1) on a T4 GPU. It's an interesting start.
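The post doesn't include the training script, but the described setup (GPT2-124M, wikitext-103-raw-v1, 🤗 Transformers + 🤗 Datasets) can be sketched roughly as below. This is a minimal illustration, not the author's actual code: the hyperparameters (batch size, sequence length, epochs) and the output directory name are assumptions, and the heavy imports and downloads are kept inside `main()` so the sketch can be read without the libraries installed.

```python
def main():
    # Imports deferred into main(): this is only a sketch, and the
    # libraries/model/dataset downloads are heavy.
    from datasets import load_dataset
    from transformers import (
        AutoModelForCausalLM,
        AutoTokenizer,
        DataCollatorForLanguageModeling,
        Trainer,
        TrainingArguments,
    )

    tokenizer = AutoTokenizer.from_pretrained("gpt2")
    tokenizer.pad_token = tokenizer.eos_token  # GPT2 has no pad token by default

    # The wikitext config named in the post
    dataset = load_dataset("wikitext", "wikitext-103-raw-v1")

    def tokenize(batch):
        # max_length=512 is an assumed choice to fit T4 memory
        return tokenizer(batch["text"], truncation=True, max_length=512)

    tokenized = dataset.map(tokenize, batched=True, remove_columns=["text"])

    model = AutoModelForCausalLM.from_pretrained("gpt2")  # the 124M checkpoint

    args = TrainingArguments(
        output_dir="gpt2-124m-wikitext",  # assumed name
        per_device_train_batch_size=8,    # assumed; tune for T4 memory
        num_train_epochs=1,               # assumed
        fp16=True,                        # T4 supports mixed precision
    )

    trainer = Trainer(
        model=model,
        args=args,
        train_dataset=tokenized["train"],
        # mlm=False selects causal-LM collation (shifted-label objective)
        data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
    )
    trainer.train()
```

Calling `main()` launches the fine-tune; `DataCollatorForLanguageModeling(mlm=False)` handles label shifting for the causal objective, so no manual label column is needed.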
