### Model information
    
    Fine-tuning data 1: https://www.kaggle.com/andradaolteanu/rickmorty-scripts
    Base model: e-tony/gpt2-rnm
    Epochs: 1
    Train runtime: 3.4982 seconds
    Loss: 3.0894


Training notebook: [Colab](https://colab.research.google.com/drive/1RawVxulLETFicWMY0YANUdP-H-e7Eeyc)
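
A minimal usage sketch with the Hugging Face `transformers` library is shown below. The repository id is an assumption (the base model's id is used as a stand-in); replace it with this model's actual Hub id before running.

```python
# Minimal sketch: generate Rick and Morty-style dialogue with a GPT-2 model.
# MODEL_ID is a placeholder (here set to the base model); swap in the fine-tuned repo id.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "e-tony/gpt2-rnm"  # placeholder repository id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID)

prompt = "Rick: Listen, Morty,"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a short continuation of the prompt.
outputs = model.generate(
    **inputs,
    max_length=60,
    do_sample=True,
    top_k=50,
    top_p=0.95,
    pad_token_id=tokenizer.eos_token_id,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```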

### Teachable NLP

Training a GPT-2 model normally requires writing code and GPU resources, but with Teachable NLP you can easily fine-tune a model and get an API to use it, free of charge.

Teachable NLP: [Teachable NLP](https://ainize.ai/teachable-nlp)

Tutorial: [Tutorial](https://forum.ainetwork.ai/t/teachable-nlp-how-to-use-teachable-nlp/65?utm_source=community&utm_medium=huggingface&utm_campaign=model&utm_content=teachable%20nlp)