
How to use

from transformers import T5Tokenizer, T5ForConditionalGeneration

model_name = "KhantKyaw/T5-small_new_chatbot"
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

def generate_response(input_text):
    # Tokenize the prompt into input IDs for the model.
    input_ids = tokenizer.encode(input_text, return_tensors="pt")
    # Generate a reply with sampled beam search; no_repeat_ngram_size=2
    # prevents the model from repeating 2-grams in the output.
    outputs = model.generate(
        input_ids,
        min_length=5,
        max_length=300,
        do_sample=True,
        num_beams=5,
        no_repeat_ngram_size=2,
    )
    # Decode the generated token IDs back into text.
    return tokenizer.decode(outputs[0], skip_special_tokens=True)

print(generate_response("how to be healthy?"))
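If you prefer not to manage the tokenizer and model objects yourself, the same checkpoint can also be loaded through the transformers pipeline API. The sketch below is an assumption-based alternative, not part of the original card: it uses the standard text2text-generation task for T5-style models and forwards the same generation settings as above.

from transformers import pipeline

# Load the checkpoint as a text2text-generation pipeline
# (tokenization and decoding are handled internally).
chatbot = pipeline("text2text-generation", model="KhantKyaw/T5-small_new_chatbot")

# Generation keyword arguments are forwarded to model.generate().
result = chatbot(
    "how to be healthy?",
    min_length=5,
    max_length=300,
    do_sample=True,
    num_beams=5,
    no_repeat_ngram_size=2,
)
print(result[0]["generated_text"])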

Contributors (Team Machina): Khant Kyaw, Hein Min Htun, Htet Myat Noe Aung, Thant Zin Oo

Model size: 60.5M params · Tensor type: F32 · Format: Safetensors
