---
license: openrail
language:
- tl
tags:
- language
- gpt
- remake
- v2
---
Colab notebook used to train this model 👉👉 [gpt remaker](https://colab.research.google.com/drive/1O9uFQVP9EUhguwhx2qD4pk9PbRCdnijE?usp=sharing)
Both training and inference are included in the Colab. Happy coding!

# Model Information
- Model Name: GPTagalog
- Version: 2
- Training Iterations: 143,000
- Learning Rate: 6e-4
- Language: Tagalog
- Format: Pickle (.pkl) checkpoint, trained for CUDA devices
- Model Size: 30MB
- Training Time: Approximately 2 hours 30 minutes
- Usage: Experimental, not suitable for commercial purposes
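
Since the card distributes the model as a CUDA pickle checkpoint, loading it might look like the sketch below. This is an illustration only: the checkpoint filename is hypothetical (substitute the actual `.pkl` file from the Colab), and the unpickled object is assumed to be a ready-to-use PyTorch module.

```python
import os
import torch  # assumes PyTorch is installed

# Hypothetical checkpoint name; use the actual .pkl file from the Colab.
CHECKPOINT = "gptagalog-v2.pkl"

# The checkpoint was saved on CUDA, so map it to GPU when available
# and fall back to CPU otherwise.
device = "cuda" if torch.cuda.is_available() else "cpu"

if os.path.exists(CHECKPOINT):
    model = torch.load(CHECKPOINT, map_location=device)
    model.eval()  # switch to inference mode before generating text
```

Note that unpickling a full model (as opposed to a `state_dict`) requires the model's class definition to be importable, which is why running the training cells in the Colab first is the safest path.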

# Model Description
GPTagalog was designed to explore how far a language model can be trained on a small dataset and to evaluate how well it generates text in Tagalog.

# Training Details
Iterations and Epochs: GPTagalog was trained for 143,000 iterations across 60 epochs. This extended schedule aimed to refine its text-generation ability.
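
As a quick sanity check on these numbers, 143,000 iterations spread over 60 epochs implies roughly 2,383 optimizer steps per epoch:

```python
total_iterations = 143_000
epochs = 60

# Average optimizer steps per epoch implied by the reported totals.
iters_per_epoch = total_iterations / epochs
print(round(iters_per_epoch))  # prints 2383
```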

Learning Rate: The model was trained with a learning rate of 6e-4, which was chosen to optimize learning and convergence.

Model Size: GPTagalog is relatively small with a file size of 30MB. This small size is due to its experimental nature and limited resources.

# Usage Guidelines
Experimental Use: GPTagalog Version 2 is an experimental model and is not recommended for commercial purposes. It may have limitations in generating coherent and contextually accurate text.

Resource Constraints: Due to resource limitations, training was capped at 143,000 iterations within a 6-hour budget (the run itself finished in about 2.5 hours). This is far shorter than the training of larger models such as GPT-2, whose smallest variant has 124 million parameters and takes days to train.