Commit 74ad994 (parent: b7f14bf) by DarwinAnim8or: Update README.md
---
license: mit
datasets:
- DarwinAnim8or/greentext
language:
- en
tags:
- fun
- greentext
widget:
- text: ">be me"
  example_title: "be me"
co2_eq_emissions:
  emissions: 60
  source: "https://mlco2.github.io/impact/#compute"
  training_type: "fine-tuning"
  geographical_location: "Oregon, USA"
  hardware_used: "1 T4, Google Colab"
---

# GPT-Greentext-125m
A fine-tuned version of [GPT2-Medium](https://huggingface.co/gpt2-medium) trained on the greentext dataset (linked above).
A demo is available [here](https://huggingface.co/spaces/DarwinAnim8or/GPT-Greentext-Playground).
The demo playground is recommended over the inference box on the right.

# Training Procedure
This model was trained on the greentext dataset, using the Happy Transformer library, on Google Colab.
It was trained for 8 epochs with a learning rate of 1e-2.

# Biases & Limitations
This model likely carries the same biases and limitations as the original GPT2 it is based on, plus additional heavy biases from the greentext dataset.
It will likely generate offensive output.

# Intended Use
This model is meant for fun, nothing else.

# Sample Use
```python
# Import the model:
from happytransformer import HappyGeneration, GENSettings

happy_gen = HappyGeneration("GPT2", "DarwinAnim8or/GPT-Greentext-125m")

# Set generation settings:
args_top_k = GENSettings(no_repeat_ngram_size=3, do_sample=True, top_k=80, temperature=0.8, max_length=150, early_stopping=False)

# Generate a response:
result = happy_gen.generate_text(""">be me
>""", args=args_top_k)

print(result)
print(result.text)
```