
Model Card

Please check the google/mt5-base model card first. This model is a pruned version of mt5-base that keeps only the Turkish and English parts of the vocabulary. For the methodology, see the Russian counterpart of mT5-base, cointegrated/rut5-base.
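
The pruning idea described for rut5-base is roughly: find which mT5 tokens actually occur in the target languages, then shrink the shared embedding matrix and the LM head to just those tokens. Below is a minimal sketch of that idea, assuming kept_ids has already been computed from Turkish and English corpora (the matching SentencePiece vocabulary reduction is omitted); it is an illustration, not the exact script used for this model:

import torch
from transformers import T5ForConditionalGeneration

# Start from the full multilingual checkpoint.
model = T5ForConditionalGeneration.from_pretrained('google/mt5-base')

# Hypothetical placeholder: in practice these are the token ids observed in
# Turkish and English text, plus the special and <extra_id_*> tokens.
kept_ids = list(range(1000))
idx = torch.tensor(kept_ids)
new_size = len(kept_ids)

# Shrink the shared input embedding to the kept tokens.
new_emb = torch.nn.Embedding(new_size, model.config.d_model)
new_emb.weight.data = model.shared.weight.data[idx].clone()
model.set_input_embeddings(new_emb)

# mT5's LM head is untied from the embeddings, so it is shrunk separately.
new_head = torch.nn.Linear(model.config.d_model, new_size, bias=False)
new_head.weight.data = model.lm_head.weight.data[idx].clone()
model.lm_head = new_head

model.config.vocab_size = new_size
# The SentencePiece vocabulary must be reduced to the same token set
# (not shown) so that tokenizer ids stay aligned with the new matrices.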

Usage

First, import the required libraries:

from transformers import T5ForConditionalGeneration, T5Tokenizer
import torch

Then load the model and tokenizer:

model = T5ForConditionalGeneration.from_pretrained('bonur/t5-base-tr')
tokenizer = T5Tokenizer.from_pretrained('bonur/t5-base-tr')

To run inference on a given text, you can use the following code:

inputs = tokenizer("Bu hafta hasta olduğum için <extra_id_0> gittim.", return_tensors='pt')
with torch.no_grad():
    hypotheses = model.generate(
        **inputs,
        do_sample=True, top_p=0.95,
        num_return_sequences=2,
        repetition_penalty=2.75,
        max_length=32,
    )
for h in hypotheses:
    print(tokenizer.decode(h))

You can tune the generation parameters for better results, and the model is also ready to be fine-tuned on bilingual downstream tasks in English and Turkish.
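
As a rough illustration of fine-tuning (not the author's recipe), a single supervised training step on a toy Turkish-to-English pair could look like the sketch below; the "çevir:" task prefix and the learning rate are assumptions:

import torch
from transformers import T5ForConditionalGeneration, T5Tokenizer

model = T5ForConditionalGeneration.from_pretrained('bonur/t5-base-tr')
tokenizer = T5Tokenizer.from_pretrained('bonur/t5-base-tr')

optimizer = torch.optim.AdamW(model.parameters(), lr=1e-4)

# Toy Turkish-to-English pair; the "çevir:" prefix is only an assumption.
inputs = tokenizer("çevir: Bu hafta hastaydım.", return_tensors='pt')
labels = tokenizer("I was sick this week.", return_tensors='pt').input_ids

model.train()
loss = model(**inputs, labels=labels).loss  # teacher-forced cross-entropy
loss.backward()
optimizer.step()
optimizer.zero_grad()
print(float(loss))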
