baebee committed
Commit
6310e9d
1 Parent(s): b333055

Create README.md

Files changed (1): README.md +35 -0
---
license: openrail
language:
- tl
tags:
- language
- gpt
- remake
- v2
---
# Model Information
- Model Name: GPTagalog
- Version: 2
- Training Iterations: 143,000
- Training Epochs: 60
- Learning Rate: 6e-4
- Language: Tagalog
- Compatibility: Pickle (pkl) format (see the loading sketch below)
- Model Size: 30 MB
- Usage: Experimental, not suitable for commercial purposes
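
Since the checkpoint is distributed as a plain pickle file, loading it is straightforward. A minimal sketch, assuming a hypothetical filename `gptagalog-v2.pkl` (check the repository for the actual file name):

```python
import pickle

# Hypothetical checkpoint name; substitute the actual .pkl file
# from this repository.
CHECKPOINT_PATH = "gptagalog-v2.pkl"

# Caution: unpickling can execute arbitrary code, so only load
# checkpoints from sources you trust.
with open(CHECKPOINT_PATH, "rb") as f:
    model = pickle.load(f)

# Inspect what the pickle actually contains (a model object,
# a state dict, etc.) before trying to generate text with it.
print(type(model))
```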

# Model Description
GPTagalog was designed to explore what a language model trained on a small dataset is capable of, and to see how well it generates text in Tagalog.

# Training Details
Iterations and Epochs: GPTagalog was trained for 143,000 iterations over 60 epochs. This extended training run was intended to refine its language generation abilities.

Learning Rate: The model was trained with a learning rate of 6e-4, chosen to balance training speed against stable convergence.
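
The card does not state how the learning rate was scheduled across the 143,000 iterations. A common recipe for GPT-style training with a 6e-4 peak is linear warmup followed by cosine decay; the sketch below illustrates that schedule, where the warmup length and minimum rate are assumptions rather than reported settings:

```python
import math

MAX_ITERS = 143_000    # reported training iterations
PEAK_LR = 6e-4         # reported learning rate
WARMUP_ITERS = 2_000   # assumed; not stated in the model card
MIN_LR = PEAK_LR / 10  # assumed floor, a common convention

def lr_at(it: int) -> float:
    """Linear warmup to PEAK_LR, then cosine decay to MIN_LR."""
    if it < WARMUP_ITERS:
        return PEAK_LR * it / WARMUP_ITERS
    progress = (it - WARMUP_ITERS) / (MAX_ITERS - WARMUP_ITERS)
    return MIN_LR + 0.5 * (PEAK_LR - MIN_LR) * (1 + math.cos(math.pi * progress))

# Learning rate at the start, midpoint, and end of training.
print(lr_at(0), lr_at(MAX_ITERS // 2), lr_at(MAX_ITERS))
```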

Model Size: GPTagalog is relatively small, with a file size of 30 MB. This reflects its experimental scope and the limited resources available for training.
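
The card does not give a parameter count, but the file size supports a rough estimate. Assuming the pickle stores raw float32 weights with little framework overhead (an assumption; pickled checkpoints sometimes bundle optimizer state or metadata, which would lower the estimate):

```python
# Back-of-envelope parameter estimate from the reported 30 MB file size.
size_bytes = 30 * 1024 * 1024
bytes_per_param = 4  # float32; assumed storage format
print(size_bytes / bytes_per_param)  # ~7.9 million parameters
```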

# Usage Guidelines
Experimental Use: GPTagalog Version 2 is an experimental model and is not recommended for commercial purposes. It may have limitations in generating coherent and contextually accurate text.

Resource Constraints: Due to resource limitations, training was capped at 143,000 iterations and a maximum wall-clock time of 6 hours. This is considerably shorter than the training runs of larger models such as GPT-2, whose smallest variant has roughly 124 million parameters and takes days to train.