---
license: openrail
metrics:
- bleu
pipeline_tag: text-generation
tags:
- code
---

## Story Generation Using GPT-2 in Hugging Face
This repository provides an example of how to use the GPT-2 language model with the Hugging Face Transformers library for story generation. GPT-2 is a powerful natural language processing model that can generate human-like text, and Transformers is a popular open-source library for working with NLP models.

## Requirements
- Python 3.6 or higher
- Hugging Face transformers library
- PyTorch or TensorFlow

## Installation
- Clone this repository: `git clone https://github.com/BaoToan1704/Deep-Learning/Final%20Project`
- Navigate to the repository directory: `cd "Final Project"`
- Install the required libraries: `pip install -r requirements.txt`

## Usage
- Download the GPT-2 pre-trained model: `python download_model.py`
- Edit `Gpt_2_to_generate_stories.ipynb` to set your desired prompt and generation settings.
- Open and run the notebook to generate text: `jupyter notebook Gpt_2_to_generate_stories.ipynb`
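The generation step inside the notebook can be sketched in a few lines with the Transformers `pipeline` API. This is a minimal illustration, assuming the standard `gpt2` checkpoint on the Hugging Face Hub; the notebook may load a fine-tuned variant instead, and the prompt here is just a placeholder.

```python
# Minimal story-generation sketch with the Hugging Face pipeline API.
# Assumes the stock "gpt2" checkpoint; swap in a fine-tuned model id as needed.
from transformers import pipeline, set_seed

set_seed(42)  # make sampling reproducible
generator = pipeline("text-generation", model="gpt2")

prompt = "Once upon a time, in a quiet village,"
result = generator(prompt, max_new_tokens=50, num_return_sequences=1)
story = result[0]["generated_text"]  # the prompt plus the generated continuation
print(story)
```

The first call downloads the model weights from the Hub, so it needs a network connection; subsequent runs use the local cache.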
  
## Customization
You can customize the GPT-2 model and the text generation settings by editing the `Gpt_2_to_generate_stories.ipynb` file. For example, you can change the prompt text, the number of tokens to generate, the temperature setting for the model, and more.
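The settings mentioned above map onto parameters of the model's `generate()` method. The sketch below shows where each one goes, again assuming the stock `gpt2` checkpoint; the specific prompt and parameter values are illustrative, not taken from the notebook.

```python
# Illustration of the tunable generation settings via the lower-level generate() API.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")

text_prompt = "The dragon opened its eyes and"
inputs = tokenizer(text_prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=40,       # number of tokens to generate
    do_sample=True,          # enable sampling so temperature takes effect
    temperature=0.8,         # lower = more predictable, higher = more varied
    top_p=0.95,              # nucleus-sampling cutoff
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no dedicated pad token
)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```

With `do_sample=False` the model decodes greedily and `temperature`/`top_p` are ignored, so keep sampling enabled when experimenting with those settings.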

## References
- Hugging Face Transformers library: `https://github.com/huggingface/transformers`
- Fine-tuned GPT-2 model from this project: `https://huggingface.co/baotoan2002/GPT-2`
- OpenAI GPT-2 model: `https://openai.com/models/gpt-2/`

## License
This repository is licensed under the OpenRAIL license. See the LICENSE file for details.

## Acknowledgments
- Special thanks to the Hugging Face team for their excellent work on the Transformers library.
- Thanks to OpenAI for providing the pre-trained GPT-2 model.