FiNER Error

#7
by iampoppyxx - opened

I've been getting this error for a while, and I'm not sure what to do about it.
Code:

import tasknet as tn
pipe = tn.load_pipeline('sileod/deberta-v3-base-tasksource-nli','finer-139') 

Error:

AttributeError                            Traceback (most recent call last)

<ipython-input-7-a8319e37fc37> in <cell line: 2>()
      1 import tasknet as tn
----> 2 pipe = tn.load_pipeline('sileod/deberta-v3-base-tasksource-nli','finer-139')

/usr/local/lib/python3.10/dist-packages/tasknet/utils.py in load_pipeline(model_name, task_name, adapt_task_embedding, multilingual, device, return_all_scores)
    204     tokenizer = AutoTokenizer.from_pretrained(model_name)
    205     model = adapter.adapt_model_to_task(model, task_name)
--> 206     model.config.id2label = task["train"].features["labels"]._int2str
    207 
    208     task_index = adapter.config.tasks.index(task_name)

AttributeError: 'Sequence' object has no attribute '_int2str'

Hi, tasknet does not currently support pipelines for NER. I'll see what I can do next week.

(It should be doable to define a load_ner_pipeline with appropriate code changes in the meantime.)
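
For context, the AttributeError comes from load_pipeline expecting a flat ClassLabel label feature, while FiNER-139 is a token-level task whose labels feature is a Sequence wrapping a ClassLabel, so the label names live one level deeper, on .feature. Here is a minimal sketch of how the id2label mapping could be read in that case; the Hub path and column name below are assumptions, not something tasknet guarantees:

from datasets import load_dataset, Sequence

ds = load_dataset("nlpaueb/finer-139")            # assumed Hub path for FiNER-139
feat = ds["train"].features["ner_tags"]           # assumed name of the label column
label_feature = feat.feature if isinstance(feat, Sequence) else feat  # unwrap Sequence(ClassLabel)
id2label = dict(enumerate(label_feature.names))   # e.g. {0: "O", 1: "B-...", ...}
label2id = {name: i for i, name in id2label.items()}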

Could you perhaps share some resources on how to do that? Thanks for the quick reply.

@sileod is there any prebuilt model I can use for NER with FiNER? It's a pretty useful dataset for a problem I'm working on.

@sileod do you have any updates on this? I would love to know how to work around it. Thanks in advance.

Hi, I actually implemented the code to do that locally. However, the results are not good. I think you should use tasknet (https://github.com/sileod/tasknet/tree/main) to fine-tune your own model (possibly based on mine).
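
If you would rather stay with plain transformers instead of tasknet, a rough fine-tuning sketch starting from this checkpoint as the backbone could look like the following. The dataset path, column names, split names, and hyperparameters are assumptions for illustration, and this is not the recipe from the README:

from datasets import load_dataset
from transformers import (AutoModelForTokenClassification, AutoTokenizer,
                          DataCollatorForTokenClassification, Trainer, TrainingArguments)

ds = load_dataset("nlpaueb/finer-139")                          # assumed Hub path for FiNER-139
label_names = ds["train"].features["ner_tags"].feature.names   # assumed label column

model_name = "sileod/deberta-v3-base-tasksource-nli"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForTokenClassification.from_pretrained(
    model_name,
    num_labels=len(label_names),
    id2label=dict(enumerate(label_names)),
    label2id={n: i for i, n in enumerate(label_names)},
    ignore_mismatched_sizes=True,  # drops the NLI head and adds a fresh token-classification head
)

def tokenize_and_align(batch):
    # Tokenize pre-split words and copy each word's tag to its first sub-token only.
    enc = tokenizer(batch["tokens"], truncation=True, is_split_into_words=True)
    all_labels = []
    for i, tags in enumerate(batch["ner_tags"]):
        prev, aligned = None, []
        for wid in enc.word_ids(batch_index=i):
            aligned.append(-100 if wid is None or wid == prev else tags[wid])
            prev = wid
        all_labels.append(aligned)
    enc["labels"] = all_labels
    return enc

tokenized = ds.map(tokenize_and_align, batched=True, remove_columns=ds["train"].column_names)

args = TrainingArguments("deberta-v3-finer-139", learning_rate=3e-5,
                         per_device_train_batch_size=16, num_train_epochs=2)
trainer = Trainer(model=model, args=args,
                  train_dataset=tokenized["train"],
                  eval_dataset=tokenized["validation"],       # assumed split name
                  data_collator=DataCollatorForTokenClassification(tokenizer),
                  tokenizer=tokenizer)
trainer.train()

After training, the model can be used with transformers' token-classification pipeline, e.g. pipeline("token-classification", model=model, tokenizer=tokenizer, aggregation_strategy="simple").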

Hi, I added fine-tuning code to the README.
