---
language:
- en
thumbnail:
tags:
- text-generation
license:
datasets:
- quotes-500K
metrics:
- perplexity
---

# Quotes Generator

## Model description

This is a GPT-2 model fine-tuned on the Quotes-500K dataset.

## Intended uses & limitations

Given a user prompt, the model generates a motivational quote that begins with that prompt.
24
+
25
+ #### How to use
26
+
27
+ ```python
28
+ from transformers import AutoTokenizer, AutoModelWithLMHead
29
+
30
+ tokenizer = AutoTokenizer.from_pretrained("nandinib1999/quote-generator")
31
+ model = AutoModelWithLMHead.from_pretrained("nandinib1999/quote-generator")
32
+ ```

## Training data

The dataset was split into training, validation, and test sets for fine-tuning as follows:

<table style="width:30%">
  <tr>
    <th>train</th>
    <th>validation</th>
    <th>test</th>
  </tr>
  <tr>
    <td>349796</td>
    <td>99942</td>
    <td>49971</td>
  </tr>
</table>

## Training procedure

The model was fine-tuned for one epoch on a Google Colab GPU, starting from the pre-trained GPT-2 weights.

## Eval results

<table style="width:30%">
  <tr>
    <th>Epoch</th>
    <th>Perplexity</th>
  </tr>
  <tr>
    <td>1</td>
    <td>15.180</td>
  </tr>
</table>