T5ForConditionalGeneration files for the MADLAD-400 3B parameter machine-translation model.

```python
from transformers import T5ForConditionalGeneration, T5Tokenizer

model = T5ForConditionalGeneration.from_pretrained("iliemihai/madlad400-3b-mt")
tokenizer = T5Tokenizer.from_pretrained("iliemihai/madlad400-3b-mt")

# The <2xx> prefix selects the target language (here <2ro>, Romanian).
# The tokenizer appends the </s> end-of-sequence token automatically,
# so it is not written into the prompt.
text = "The quick brown fox jumped over the lazy dog."
input_ids = tokenizer.encode(f"<2ro> {text}", return_tensors="pt")

outputs = model.generate(
    input_ids,
    do_sample=True,
    top_p=0.92,
    top_k=50,
    temperature=0.3,
    max_length=256,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
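The target language in the snippet above is controlled entirely by the `<2xx>` tag prepended to the input (e.g. `<2ro>` for Romanian, `<2de>` for German). A minimal sketch of building such prompts, assuming only that tag convention; `build_madlad_prompt` is a hypothetical helper for illustration, not part of transformers:

```python
def build_madlad_prompt(text: str, lang_code: str) -> str:
    """Prefix text with a MADLAD-400 target-language tag like <2ro>.

    lang_code is the (usually two-letter) code of the target language;
    the model reads the tag and translates the remaining text into it.
    """
    return f"<2{lang_code}> {text}"


# Same sentence, different target languages:
ro_prompt = build_madlad_prompt("The quick brown fox jumped over the lazy dog.", "ro")
de_prompt = build_madlad_prompt("The quick brown fox jumped over the lazy dog.", "de")
print(ro_prompt)  # <2ro> The quick brown fox jumped over the lazy dog.
print(de_prompt)  # <2de> The quick brown fox jumped over the lazy dog.
```

The resulting string is what gets passed to `tokenizer.encode(...)` in the example above.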

**Kudos to jbochi for releasing the conversion code.** The Colab used to generate these files is here.

Model size: 2.94B params (Safetensors, F32 tensors).
