This model provides a GPT-2 language model trained with SimCTG on the WritingPrompts benchmark [(Fan et al., 2018)](https://arxiv.org/abs/1805.04833), based on our paper [_A Contrastive Framework for Neural Text Generation_](https://arxiv.org/abs/2202.06417).

We provide a detailed tutorial on how to apply SimCTG and contrastive search in our [project repo](https://github.com/yxuansu/SimCTG#4-huggingface-style-tutorials-back-to-top). Below is a brief tutorial on how to use our approach for text generation.

## 1. Installation of SimCTG:
```bash
pip install simctg --upgrade
```

## 2. Initialize SimCTG Model:
```python
import torch
# load the SimCTG language model
from simctg.simctggpt import SimCTGGPT
model_name = r'cambridgeltl/simctg_writingprompts'
model = SimCTGGPT(model_name)
model.eval()
tokenizer = model.tokenizer
```

## 3. Prepare the Text Prefix:
```python
prefix_text = r"[ WP ] A kid doodling in a math class accidentally creates the world 's first functional magic circle in centuries . <|endoftext|>"
print('Prefix is: {}'.format(prefix_text))
tokens = tokenizer.tokenize(prefix_text)
input_ids = tokenizer.convert_tokens_to_ids(tokens)
input_ids = torch.LongTensor(input_ids).view(1, -1)
```
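
As rough intuition for what these three calls do, here is a self-contained toy sketch (plain Python with a hypothetical five-entry vocabulary, not the real GPT-2 tokenizer): `tokenize` splits the text into tokens, `convert_tokens_to_ids` maps each token to an integer id, and `view(1, -1)` adds a batch axis of size one.

```python
# Toy illustration (hypothetical vocabulary, NOT the real GPT-2 tokenizer)
# of the three-step encoding above.
toy_vocab = {"[": 0, "WP": 1, "]": 2, "A": 3, "kid": 4}  # made-up ids

def encode(text):
    tokens = text.split()                 # ~ tokenizer.tokenize(...)
    ids = [toy_vocab[t] for t in tokens]  # ~ convert_tokens_to_ids(...)
    return [ids]                          # ~ view(1, -1): batch of one

batch = encode("[ WP ] A kid")
print(batch)  # [[0, 1, 2, 3, 4]]
```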

## 4. Generate Text with Contrastive Search:
```python
beam_width, alpha, decoding_len = 5, 0.6, 200
output = model.fast_contrastive_search(input_ids=input_ids, beam_width=beam_width,
                                       alpha=alpha, decoding_len=decoding_len)
print("Output:\n" + 100 * '-')
print(tokenizer.decode(output))
'''
Prefix is: [ WP ] A kid doodling in a math class accidentally creates the world 's first functional magic circle in centuries . <|endoftext|>
Output:
----------------------------------------------------------------------------------------------------
[ WP ] A kid doodling in a math class accidentally creates the world's first functional magic circle in centuries. <|endoftext|> I looked at
the circle, it wasn't there. I couldn't see it, and my eyes were watering from the rain that had fallen over the school, the wind howling through
the windows and making a wispy noise as it passed through the air. `` What is it? '' I asked, trying to find the source of the noise. `` It's a
circle, '' the teacher said in a voice that sounded like it was from an old TV show or something like that. `` You can't make it out of there. ''
I looked around the room, there was no one there. It was as if I was in a dream, but no one seemed to notice me. Then I saw a flash of light, and
the circle appeared in front of me. I turned around to see what was going on, I had never seen anything like it before in my life. I ran up to the
teacher and asked, `` Are you sure this is real?
'''
```
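
For intuition on the `beam_width` and `alpha` hyperparameters: at each step, contrastive search considers the top `beam_width` candidates by model probability and scores each as (1 - alpha) × model confidence minus alpha × a degeneration penalty (the maximum cosine similarity between the candidate's representation and those of the previous context tokens). The toy sketch below (plain Python on made-up 2-d vectors, not the library implementation) illustrates only this scoring rule:

```python
import math

def contrastive_score(prob, cand_vec, context_vecs, alpha=0.6):
    """Score one candidate: model confidence minus a degeneration penalty
    (max cosine similarity between the candidate and the context)."""
    def cosine(u, v):
        dot = sum(a * b for a, b in zip(u, v))
        nu = math.sqrt(sum(a * a for a in u))
        nv = math.sqrt(sum(b * b for b in v))
        return dot / (nu * nv)
    penalty = max(cosine(cand_vec, c) for c in context_vecs)
    return (1 - alpha) * prob - alpha * penalty

# Made-up 2-d "hidden states": two candidates with equal probability;
# the one that repeats the context representation is penalized.
context = [[1.0, 0.0], [0.9, 0.1]]
repetitive = contrastive_score(0.5, [1.0, 0.0], context)
novel = contrastive_score(0.5, [0.0, 1.0], context)
```

With `alpha = 0` the penalty vanishes and the rule reduces to greedy search; larger `alpha` values push the model away from repetitive continuations.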

For more details of our work, please refer to our main [project repo](https://github.com/yxuansu/SimCTG).

## 5. Citation:
If you find our paper and resources useful, please kindly leave a star and cite our paper. Thanks!

```bibtex
@article{su2022contrastive,
  title={A Contrastive Framework for Neural Text Generation},
  author={Su, Yixuan and Lan, Tian and Wang, Yan and Yogatama, Dani and Kong, Lingpeng and Collier, Nigel},
  journal={arXiv preprint arXiv:2202.06417},
  year={2022}
}
```