---
language:
- en
tags:
- pytorch
- ner
- text generation
- seq2seq
inference: false
license: mit
datasets:
- conll2003
metrics:
- f1
---

# t5-base-qa-ner-conll

Unofficial implementation of [InstructionNER](https://arxiv.org/pdf/2203.03903v1.pdf): a t5-base model fine-tuned on the conll2003 dataset.

https://github.com/ovbystrova/InstructionNER

## Inference

```shell
git clone https://github.com/ovbystrova/InstructionNER
cd InstructionNER
```

```python
from instruction_ner.model import Model

model = Model(
    model_path_or_name="olgaduchovny/t5-base-ner-mit-restaurant",
    tokenizer_path_or_name="olgaduchovny/t5-base-ner-mit-restaurant"
)

options = ["LOC", "PER", "ORG", "MISC"]

instruction = "please extract entities and their types from the input sentence, " \
              "all entity types are in options"

text = "Once I visited Sovok in Nizny Novgorod. I had asian wok there. It was the best WOK i ever had" \
       "It was cheap but lemonades cost 5 dollars."

generation_kwargs = {
    "num_beams": 2,
    "max_length": 128
}

pred_spans = model.predict(
    text=text,
    generation_kwargs=generation_kwargs,
    instruction=instruction,
    options=options
)

>>> ('sovok is a Restaurant_Name, Nizny Novgorod is a Location, asian wok is a Dish, cheap is a Price, lemonades is a Dish, 5 dollars is a Price.',
     [(24, 38, 'Location'), (46, 55, 'Dish'), (100, 105, 'Price'), (110, 119, 'Dish'), (125, 134, 'Price')])
```
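
The second element of the prediction above is a list of `(start, end, label)` character spans over the input text. A small post-processing helper (hypothetical, not part of the InstructionNER library) can map those offsets back to surface strings:

```python
# Hypothetical helper, not part of the InstructionNER library:
# group (start, end, label) character spans by entity label.
def spans_to_entities(text: str, spans: list) -> dict:
    entities = {}
    for start, end, label in spans:
        entities.setdefault(label, []).append(text[start:end])
    return entities

# Assuming predict returned the (generated_text, spans) tuple shown above:
# spans_to_entities(text, pred_spans[1])
# -> {'Location': ['Nizny Novgorod'], 'Dish': ['asian wok', 'lemonades'],
#     'Price': ['cheap', '5 dollars']}
```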
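
If you prefer not to clone the repository, the checkpoint can also be loaded with the plain `transformers` seq2seq API. This is only a sketch: the repo id `olgaduchovny/t5-base-qa-ner-conll` is inferred from this card's title, and the prompt template below is an assumption loosely following the InstructionNER paper, not necessarily the exact template used during training.

```python
# Minimal sketch using the standard transformers API.
# Assumptions: the checkpoint is a regular T5 seq2seq model, and the
# "sentence / instruction / options" prompt format roughly matches training.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "olgaduchovny/t5-base-qa-ner-conll"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

text = "John lives in Berlin and works for Siemens."
instruction = ("please extract entities and their types from the input sentence, "
               "all entity types are in options")
options = "LOC, PER, ORG, MISC"

# Assumed prompt layout; adjust to the library's actual formatting if needed.
prompt = f"Sentence: {text} Instruction: {instruction} Options: {options}"

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, num_beams=2, max_length=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```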