Model Description

This model was originally created for a Kaggle competition.

You can also view the model on Kaggle.

Trained on a Turkish dataset of ~80k samples for 3 epochs. Training took around 19 hours on 2x RTX 4090 GPUs.

You can use the model with PEFT or Transformers, or directly on Kaggle.

Important Notes

  • Use the model on a CUDA-capable GPU, since it was fine-tuned with bitsandbytes quantization.
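Since the card names PEFT, Transformers, and bitsandbytes, a minimal loading sketch follows. It assumes a 4-bit bitsandbytes setup and a CUDA GPU; the quantization settings and the Turkish example prompt are illustrative assumptions, not documented training parameters.

```python
# Sketch: load the adapter on top of the base model with Transformers + PEFT.
# Assumes 4-bit bitsandbytes quantization and a CUDA-capable GPU.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import PeftModel

base_id = "google/gemma-2-2b"
adapter_id = "emre570/gemma-2-2b-tr-3epoch"

# Illustrative quantization config; adjust to your hardware.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id,
    quantization_config=bnb_config,
    device_map="auto",
)
# Attach the fine-tuned Turkish adapter to the quantized base model.
model = PeftModel.from_pretrained(base_model, adapter_id)

# Example Turkish prompt ("Hello, how are you?").
inputs = tokenizer("Merhaba, nasılsın?", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Loading the base model in 4-bit keeps memory usage low enough for a single consumer GPU; the adapter weights are applied on top without merging.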

Fine-tuned by emre570.


Model tree for emre570/gemma-2-2b-tr-3epoch

Base model: google/gemma-2-2b
This model is a PEFT adapter on the base model.
