GPT-2 for Skript

Complete your Skript code automatically with a finetuned GPT-2 model.

Training loss of 0.57 after roughly 2 epochs of training in total.

The training dataset contains roughly 1.2 million lines of Skript.

Inference Colab: https://colab.research.google.com/drive/1ujtLt7MOk7Nsag3q-BYK62Kpoe4Lr4PE
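
Besides the Colab notebook, the model can also be run locally. Below is a minimal sketch of local inference with the Hugging Face transformers library; the repo id "<user>/gpt2-skript" is a placeholder (replace it with this model's actual repo id), and the Skript prompt and sampling settings are only illustrative, not taken from the Colab.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id -- substitute the real model repository id here.
model_id = "<user>/gpt2-skript"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Example Skript prompt; the model continues the line(s) that follow it.
prompt = "on right click with a diamond sword:"
inputs = tokenizer(prompt, return_tensors="pt")

outputs = model.generate(
    **inputs,
    max_new_tokens=64,   # length of the generated completion
    do_sample=True,      # sample instead of greedy decoding
    top_p=0.95,
    temperature=0.8,
)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```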
