
Fine-tuned GPT-Neo 125M with the Sinhala Wikipedia Dataset

This model was fine-tuned for Sinhala text generation on articles from the Sinhala Wikipedia dataset. Only articles with a word count between 60 and 500 were used.
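
The word-count filtering described above can be reproduced roughly as follows. This is a minimal sketch, assuming the Hugging Face datasets library and the wikimedia/wikipedia dump with a Sinhala subset ("20231101.si"); the exact dump and preprocessing code used for this model are not documented on this card.

>>> from datasets import load_dataset
>>> # Assumed dataset ID and config, for illustration only
>>> wiki = load_dataset("wikimedia/wikipedia", "20231101.si", split="train")
>>> # Keep only articles whose word count falls between 60 and 500
>>> filtered = wiki.filter(lambda article: 60 <= len(article["text"].split()) <= 500)
>>> print(f"{len(filtered)} of {len(wiki)} articles kept")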

How to use

You can use this model directly with a pipeline for text generation. This example generates a different sequence each time it's run:

>>> from transformers import pipeline
>>> generator = pipeline('text-generation', model='Suchinthana/sinhala-gpt-neo-siwiki')
>>> generator("නිර්මාණය කිරීම සඳහා ", do_sample=True, max_length=500)
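
The pipeline call returns a list of dictionaries, each with a 'generated_text' key. If you prefer to work with the model and tokenizer directly instead of the pipeline, a roughly equivalent sketch is shown below; the sampling parameters here mirror the example above and are illustrative, not prescribed settings.

>>> from transformers import AutoModelForCausalLM, AutoTokenizer
>>> tokenizer = AutoTokenizer.from_pretrained("Suchinthana/sinhala-gpt-neo-siwiki")
>>> model = AutoModelForCausalLM.from_pretrained("Suchinthana/sinhala-gpt-neo-siwiki")
>>> inputs = tokenizer("නිර්මාණය කිරීම සඳහා ", return_tensors="pt")
>>> outputs = model.generate(**inputs, do_sample=True, max_length=500)
>>> print(tokenizer.decode(outputs[0], skip_special_tokens=True))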
