---
license: openrail
language:
  - tl
tags:
  - language
  - gpt
  - remake
  - v2
---

## Model Information

- Model Name: GPTagalog
- Version: 2
- Training Iterations: 143,000
- Training Epochs: 60
- Learning Rate: 6e-4
- Language: Tagalog
- Compatibility: Pickle (`.pkl`) format
- Model Size: 30MB
- Usage: Experimental; not suitable for commercial purposes
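Since the checkpoint ships in pickle format, loading it in Python is a plain `pickle.load` call. The sketch below shows the round trip with a stand-in dictionary — the actual structure of the GPTagalog checkpoint is not documented here, so the keys are hypothetical. As always with pickle, only unpickle files from sources you trust.

```python
import io
import pickle

# Hypothetical checkpoint contents; the real .pkl layout may differ.
checkpoint = {"model_name": "GPTagalog", "version": 2, "iterations": 143_000}

# Serialize to an in-memory buffer for illustration.
buf = io.BytesIO()
pickle.dump(checkpoint, buf)

# Loading mirrors what you would do with the released file:
#   with open("gptagalog.pkl", "rb") as f:
#       checkpoint = pickle.load(f)
buf.seek(0)
loaded = pickle.load(buf)
print(loaded["model_name"])  # GPTagalog
```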

## Model Description

GPTagalog was built to explore how well a language model trained on a small dataset performs at generating text in Tagalog.

## Training Details

**Iterations and Epochs:** GPTagalog was trained for 143,000 iterations over 60 epochs. This extended training run aimed to refine its language generation abilities.

**Learning Rate:** The model was trained with a learning rate of 6e-4, a value chosen to balance learning speed and stable convergence.
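The card does not document the schedule applied to that base rate. Assuming a common cosine-decay setup (the `MIN_LR` floor below is an assumption, not a documented value), the per-iteration rate would look like:

```python
import math

BASE_LR = 6e-4          # learning rate stated in the model card
TOTAL_ITERS = 143_000   # training iterations stated in the model card
MIN_LR = 6e-5           # assumed floor; not documented

def lr_at(it: int) -> float:
    """Cosine decay from BASE_LR to MIN_LR over training (assumed schedule)."""
    progress = min(it / TOTAL_ITERS, 1.0)
    coeff = 0.5 * (1.0 + math.cos(math.pi * progress))  # 1 -> 0
    return MIN_LR + coeff * (BASE_LR - MIN_LR)

print(lr_at(0))        # starts at the stated 6e-4
print(lr_at(143_000))  # decays to the assumed floor
```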

**Model Size:** GPTagalog is relatively small, with a file size of 30MB, reflecting its experimental nature and the limited resources available for training.
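The 30MB file size gives a rough upper bound on the parameter count. Assuming the weights are stored as float32 (4 bytes per parameter — an assumption, since the card does not state the precision), a quick back-of-the-envelope estimate is:

```python
MODEL_SIZE_BYTES = 30 * 1024 * 1024  # 30MB file size from the card
BYTES_PER_PARAM = 4                  # assumes float32 weights

approx_params = MODEL_SIZE_BYTES // BYTES_PER_PARAM
print(f"~{approx_params / 1e6:.1f}M parameters")  # ~7.9M
```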

## Usage Guidelines

**Experimental Use:** GPTagalog Version 2 is an experimental model and is not recommended for commercial purposes. It may have limitations in generating coherent and contextually accurate text.

**Resource Constraints:** Due to resource limitations, training was capped at 143,000 iterations and a maximum of 6 hours. This is far shorter than the training runs of larger models such as GPT-2, whose smallest variant has 124 million parameters and takes days to train.