---
license: apache-2.0
---

# My Fine-Tuned T5 Model
This is a fine-tuned T5 model for extracting structured information from automobile part descriptions.
## Usage
```python
from transformers import T5Tokenizer, T5ForConditionalGeneration

# Replace with the actual repository ID of this model on the Hub.
model_name = "your-username/your-repo-name"
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

# Example automobile description to extract structured information from.
input_text = "Brand: Ford, Model: Mustang, Year: 2018, Color: Red"
inputs = tokenizer(input_text, return_tensors="pt", max_length=128, truncation=True)

# Generate the structured output; max_new_tokens caps the output length, adjust as needed.
outputs = model.generate(**inputs, max_new_tokens=64)
decoded_output = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(decoded_output)
```
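If you have many descriptions to process, batching the tokenizer call is usually more efficient than looping one record at a time. The sketch below uses the same placeholder repository ID as above; the second input string is purely illustrative.

```python
from transformers import T5Tokenizer, T5ForConditionalGeneration

model_name = "your-username/your-repo-name"  # same placeholder as above
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

# Illustrative batch; replace with your own descriptions.
batch = [
    "Brand: Ford, Model: Mustang, Year: 2018, Color: Red",
    "Brand: Toyota, Model: Corolla, Year: 2020, Color: Blue",
]

# Pad so the batch forms a rectangular tensor; truncate overly long inputs.
inputs = tokenizer(batch, return_tensors="pt", padding=True, truncation=True, max_length=128)
outputs = model.generate(**inputs, max_new_tokens=64)

for text in tokenizer.batch_decode(outputs, skip_special_tokens=True):
    print(text)
```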