
Model Card for saadamin2k13/urdu_text_generation

This model card describes a byT5 model fine-tuned for the task of text generation from meaning representations (Discourse Representation Structures, DRS).

Model Details

We started from a pre-trained byt5-base model and fine-tuned it on the Parallel Meaning Bank (PMB) DRS-text pairs dataset. Furthermore, we enriched the gold_silver flavors of PMB (release 5.0.0) with different augmentation strategies.
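
For reference, below is a minimal fine-tuning sketch, not the exact training recipe used for this checkpoint. It assumes the DRS-text pairs are stored in a hypothetical tab-separated file pmb_drs_text.tsv (one "DRS<TAB>Urdu sentence" pair per line), and the hyperparameters are illustrative only.

from transformers import (AutoTokenizer, T5ForConditionalGeneration,
                          DataCollatorForSeq2Seq, Seq2SeqTrainer,
                          Seq2SeqTrainingArguments)
from datasets import load_dataset

# Load the pre-trained byt5-base checkpoint
tokenizer = AutoTokenizer.from_pretrained('google/byt5-base')
model = T5ForConditionalGeneration.from_pretrained('google/byt5-base')

# Hypothetical TSV with DRS input in the first column and target text in the second
raw = load_dataset('csv', data_files={'train': 'pmb_drs_text.tsv'},
                   delimiter='\t', column_names=['drs', 'text'])

def preprocess(batch):
    # Tokenize the DRS as encoder input and the target sentence as labels
    enc = tokenizer(batch['drs'], truncation=True, max_length=512)
    enc['labels'] = tokenizer(batch['text'], truncation=True, max_length=512)['input_ids']
    return enc

tokenized = raw.map(preprocess, batched=True, remove_columns=['drs', 'text'])

# Dynamic padding of inputs and labels within each batch
collator = DataCollatorForSeq2Seq(tokenizer, model=model)

# Illustrative hyperparameters; adjust to your hardware and data size
args = Seq2SeqTrainingArguments(output_dir='byt5-drs-ur',
                                per_device_train_batch_size=8,
                                num_train_epochs=3,
                                learning_rate=1e-4)

trainer = Seq2SeqTrainer(model=model, args=args,
                         train_dataset=tokenized['train'],
                         data_collator=collator, tokenizer=tokenizer)
trainer.train()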

Uses

To use the model for a quick test, run the code below.


from transformers import ByT5Tokenizer, T5ForConditionalGeneration

# Initialize the tokenizer and model
tokenizer = ByT5Tokenizer.from_pretrained('saadamin2k13/urdu_text_generation', max_length=512)

model = T5ForConditionalGeneration.from_pretrained('saadamin2k13/urdu_text_generation')

# Example sentence
example = "male.n.02 Name 'ٹام' yell.v.01 Agent -1 Time +1 time.n.08 TPR now"

# Tokenize and prepare the input
x = tokenizer(example, return_tensors='pt', padding=True, truncation=True, max_length=512)['input_ids']

# Generate output (byT5 is byte-level, so allow enough new tokens for a full sentence)
output = model.generate(x, max_new_tokens=512)

# Decode and print the output text
pred_text = tokenizer.decode(output[0], skip_special_tokens=True, clean_up_tokenization_spaces=False)
print(pred_text)
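
If you want to generate text for several DRS inputs at once, the same tokenizer and model also work in batch mode. The snippet below is a small sketch; the list of DRS strings is a placeholder you would replace with your own inputs.

# Batched generation sketch: pad the inputs to a common length and decode all outputs
examples = [
    "male.n.02 Name 'ٹام' yell.v.01 Agent -1 Time +1 time.n.08 TPR now",
    # add further DRS strings here
]
batch = tokenizer(examples, return_tensors='pt', padding=True,
                  truncation=True, max_length=512)
outputs = model.generate(**batch, max_new_tokens=512)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))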