Minor grammatical error
README.md CHANGED
@@ -14,7 +14,7 @@ Javanese GPT-2 Small IMDB is a causal language model based on the [GPT-2 model](
 
 The model was originally the pretrained [Javanese GPT-2 Small model](https://huggingface.co/w11wo/javanese-gpt2-small) and is later fine-tuned on the Javanese IMDB movie review dataset. It achieved a perplexity of 55.09 on the validation dataset. Many of the techniques used are based on a Hugging Face tutorial [notebook](https://github.com/huggingface/notebooks/blob/master/examples/language_modeling.ipynb) written by [Sylvain Gugger](https://github.com/sgugger).
 
-Hugging Face's `Trainer` class from the [Transformers]((https://huggingface.co/transformers)) library was used to train the
+Hugging Face's `Trainer` class from the [Transformers]((https://huggingface.co/transformers)) library was used to train the model. PyTorch was used as the backend framework during training, but the model remains compatible with TensorFlow nonetheless.
 
 ## Model
 | Model | #params | Arch. | Training/Validation data (text) |
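
For readers unfamiliar with the `Trainer` API mentioned in the added line, the sketch below shows roughly how such a causal-LM fine-tuning run could be put together, loosely following the tutorial notebook referenced in the README. The data files, block size, and hyperparameters here are illustrative assumptions, not the model card's actual configuration.

```python
# Minimal fine-tuning sketch (not the author's original script):
# continue training the pretrained Javanese GPT-2 Small checkpoint
# on plain-text movie reviews with Hugging Face's Trainer.
from datasets import load_dataset
from transformers import (
    AutoModelForCausalLM,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
    default_data_collator,
)

checkpoint = "w11wo/javanese-gpt2-small"  # pretrained base model
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint)

# Hypothetical plain-text files, one Javanese IMDB review per line.
raw = load_dataset(
    "text", data_files={"train": "train.txt", "validation": "valid.txt"}
)

def tokenize(batch):
    return tokenizer(batch["text"])

tokenized = raw.map(tokenize, batched=True, remove_columns=["text"])

block_size = 128  # illustrative context length

def group_texts(examples):
    # Concatenate all token sequences and cut them into fixed-size blocks,
    # so every training example is exactly `block_size` tokens long.
    concatenated = {k: sum(examples[k], []) for k in examples.keys()}
    total = (len(concatenated["input_ids"]) // block_size) * block_size
    result = {
        k: [t[i : i + block_size] for i in range(0, total, block_size)]
        for k, t in concatenated.items()
    }
    result["labels"] = result["input_ids"].copy()
    return result

lm_datasets = tokenized.map(group_texts, batched=True)

args = TrainingArguments(
    output_dir="javanese-gpt2-small-imdb",
    evaluation_strategy="epoch",
    num_train_epochs=5,             # illustrative value
    per_device_train_batch_size=8,  # illustrative value
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=lm_datasets["train"],
    eval_dataset=lm_datasets["validation"],
    data_collator=default_data_collator,
)
trainer.train()
```

Because the saved weights are a standard PyTorch checkpoint, one way to exercise the TensorFlow compatibility mentioned in the new sentence is to load them with `TFGPT2LMHeadModel.from_pretrained("javanese-gpt2-small-imdb", from_pt=True)`.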