
T5-define

(This model is still a work in progress. If you use it for fine-tuning, make sure to save a local copy.)

This model is trained to generate word definitions from a word and a context sentence, using the subset of WordNet entries that have both an example and a definition. The model uses task prompts in the format 'define "[word]": [example sentence]'.
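
For illustration, a prompt in this format can be built with a small helper like the one below (the build_prompt name is just a hypothetical convenience, not part of the model or its training code):

def build_prompt(word: str, example: str) -> str:
    # Matches the prompt format described above: define "[word]": [example sentence]
    return f'define "{word}": {example}'

print(build_prompt("noseplow", "The children hid as the noseplow drove across the street"))
# define "noseplow": The children hid as the noseplow drove across the street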

In particular, the model acts as a one-shot learner for unseen words, since it has to infer the definition from a single example sentence.

How to run:

from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("marksverdhei/t5-base-define")
model = T5ForConditionalGeneration.from_pretrained("marksverdhei/t5-base-define")

# Prompt format: define "[word]": [example sentence]
prompt = "define \"noseplow\": The children hid as the noseplow drove across the street"

ids = tokenizer(prompt, return_tensors="pt").input_ids
# Drop the leading pad token and the trailing end-of-sequence token before decoding
generated_tokens = model.generate(ids)[0][1:-1]
print(tokenizer.decode(generated_tokens))
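
As a sketch of an alternative to the [1:-1] slice (assuming a reasonably recent transformers version), you can let the tokenizer skip special tokens instead:

generated = model.generate(ids, max_new_tokens=32)
print(tokenizer.decode(generated[0], skip_special_tokens=True))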

See this gist for the source code used to train the model:

https://gist.github.com/marksverdhei/0a13f67e65460b71c05fcf558a6a91ae
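
As a rough, hypothetical sketch (not the gist's actual code), each WordNet entry with an example and a definition can be turned into a seq2seq training pair, with the define-prompt as the input and the definition as the target:

def to_training_pair(tokenizer, word, example, definition, max_length=128):
    # Hypothetical helper: the input is the define-prompt, the label is the definition text
    features = tokenizer(f'define "{word}": {example}', truncation=True, max_length=max_length)
    features["labels"] = tokenizer(definition, truncation=True, max_length=max_length).input_ids
    return features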
