---
language: ar
---

# Sanaa
## Arabic GPT-2 demo

This is a small GPT-2 model retrained on Arabic Wikipedia as of September 2020.
Due to memory limits, only the first 600,000 lines of the Wiki dump were used.

Training notebook: https://colab.research.google.com/drive/1Z_935vTuZvbseOsExCjSprrqn1MsQT57

Training steps:
- Follow the beginning of Pierre Guillou's Portuguese GPT-2 notebook (https://github.com/piegu/fastai-projects/blob/master/finetuning-English-GPT2-any-language-Portuguese-HuggingFace-fastaiv2.ipynb) to download Arabic Wikipedia and run WikiExtractor
- Read the Beginner's Guide by Ng Wai Foong: https://medium.com/@ngwaifoong92/beginners-guide-to-retrain-gpt-2-117m-to-generate-custom-text-content-8bb5363d8b7f
- Following Ng Wai Foong's instructions, create an encoded .npz corpus (mine was very small and would benefit from many times more training data)
- Run generate_unconditional_samples.py and the other sample scripts to generate text
- Download the TensorFlow checkpoints
- Use my notebook code to write vocab.json and an empty merges.txt
- Copy config.json from a similar GPT-2 architecture and edit it as needed
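
The tokenizer-file step above can be sketched as follows. This is a minimal sketch, not the exact notebook code: the helper name `write_hf_tokenizer_files` is hypothetical, and it assumes the token-to-id mapping from the TensorFlow-side GPT-2 training (OpenAI's release stores it as encoder.json, in the same format Hugging Face expects for vocab.json):

```python
import json
from pathlib import Path

def write_hf_tokenizer_files(encoder_json: str, out_dir: str) -> None:
    """Hypothetical helper: turn a GPT-2-style encoder.json into the
    vocab.json + (empty) merges.txt pair a Hugging Face tokenizer expects."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    # encoder.json already maps token -> id, which is vocab.json's format
    vocab = json.loads(Path(encoder_json).read_text(encoding="utf-8"))
    (out / "vocab.json").write_text(
        json.dumps(vocab, ensure_ascii=False), encoding="utf-8")
    # Header line only: no BPE merge rules
    (out / "merges.txt").write_text("#version: 0.2\n", encoding="utf-8")
```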

```python
from transformers import AutoModel

# Load the TensorFlow checkpoint and save it in PyTorch format
am = AutoModel.from_pretrained('./argpt', from_tf=True)
am.save_pretrained("./")
```
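
Once converted, the model can be loaded with the standard GPT-2 classes. A minimal usage sketch (the helper name and sampling parameters are assumptions, not part of this repo; it requires `transformers` and `torch`, and `model_dir` must contain the files saved above plus the tokenizer files):

```python
def generate(prompt: str, model_dir: str = "./", max_length: int = 50) -> str:
    """Hypothetical helper: sample an Arabic continuation from the converted model."""
    from transformers import GPT2LMHeadModel, GPT2Tokenizer
    tok = GPT2Tokenizer.from_pretrained(model_dir)
    model = GPT2LMHeadModel.from_pretrained(model_dir)
    ids = tok.encode(prompt, return_tensors="pt")
    out = model.generate(ids, max_length=max_length, do_sample=True, top_p=0.95)
    return tok.decode(out[0], skip_special_tokens=True)
```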