Model Description

This model was originally created for a Kaggle competition.

The model is also available on Kaggle.

Trained on a Turkish dataset of ~80k examples for 3 epochs. Training took around 19 hours on 2x RTX 4090 GPUs.

You can use the model with PEFT and Transformers, or from Kaggle.
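As a rough sketch (not the original training or inference code), loading the adapter on top of the base model with Transformers and PEFT could look like the following; the prompt text is only an illustrative example:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "google/gemma-2-2b"
adapter_id = "emre570/gemma-2-2b-tr-3epoch"

# Load the base model and tokenizer, then attach the fine-tuned adapter.
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(
    base_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
model = PeftModel.from_pretrained(model, adapter_id)

# Illustrative Turkish prompt ("What is the capital of Turkey?").
prompt = "Türkiye'nin başkenti neresidir?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```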

Important Notes

  • Use the model on a CUDA-capable GPU, since it was fine-tuned with bitsandbytes; see the sketch after this list.
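
The card only states that bitsandbytes was used during fine-tuning, so the exact quantization settings below are an assumption; a QLoRA-style 4-bit load on a CUDA GPU might look like this:

```python
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import PeftModel

# Assumed 4-bit (NF4) settings; the card does not specify the exact
# bitsandbytes configuration used during fine-tuning.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

base = AutoModelForCausalLM.from_pretrained(
    "google/gemma-2-2b",
    quantization_config=bnb_config,
    device_map="auto",  # bitsandbytes 4-bit loading requires a CUDA GPU
)
model = PeftModel.from_pretrained(base, "emre570/gemma-2-2b-tr-3epoch")
```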

Fine-tuned by emre570.


Model tree for emre570/gemma-2-2b-tr-3epoch

Base model: google/gemma-2-2b
This model: adapter on the base model
