
T5-large for Word Sense Disambiguation

If you use this model in your research, please cite:

@article{wahle2021incorporating,
  title={Incorporating Word Sense Disambiguation in Neural Language Models},
  author={Wahle, Jan Philip and Ruas, Terry and Meuschke, Norman and Gipp, Bela},
  journal={arXiv preprint arXiv:2106.07967},
  year={2021}
}
This is the checkpoint for T5-large after being trained on the SemCor 3.0 dataset.

The model can be loaded to perform a few-shot classification like so:

from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model = AutoModelForSeq2SeqLM.from_pretrained("jpelhaw/t5-word-sense-disambiguation")
tokenizer = AutoTokenizer.from_pretrained("jpelhaw/t5-word-sense-disambiguation")

input = '''question: which description describes the word " java "\
           best in the following context? \
descriptions:[  " A drink consisting of an infusion of ground coffee beans ", 
                " a platform-independent programming language ", or
                " an island in Indonesia to the south of Borneo " ] 
context: I like to drink " java " in the morning .'''

example = tokenizer(input, add_special_tokens=True, return_tensors="pt")

output = model.generate(input_ids=example["input_ids"],
                        attention_mask=example["attention_mask"],
                        max_length=135)

answer = tokenizer.decode(output[0], skip_special_tokens=True)
# answer: "a drink consisting of an infusion of ground coffee beans"
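The prompt above can also be built programmatically for arbitrary words and candidate senses. A minimal sketch that reproduces the question/descriptions/context format; the `build_prompt` helper is hypothetical and not part of the released model:

```python
def build_prompt(word: str, descriptions: list[str], context: str) -> str:
    """Assemble a WSD prompt in the question/descriptions/context format."""
    # The model expects the target word and each description wrapped in
    # space-padded quotes, matching the training format.
    quoted = [f'" {d} "' for d in descriptions]
    options = ", ".join(quoted[:-1]) + ", or " + quoted[-1]
    return (
        f'question: which description describes the word " {word} " '
        f"best in the following context? "
        f"descriptions:[ {options} ] "
        f"context: {context}"
    )

prompt = build_prompt(
    "java",
    [
        "A drink consisting of an infusion of ground coffee beans",
        "a platform-independent programming language",
        "an island in Indonesia to the south of Borneo",
    ],
    'I like to drink " java " in the morning .',
)
```

The resulting string can be passed to the tokenizer in place of the hand-written `input` above.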
Model size: 738M parameters