---
license: cc
language:
  - en
---

💡 We'll release the code for using UniversalNER in this repo by EOW (08/13/2023)

# UniNER-7B-definition

**Description**: A UniNER-7B model trained from LLaMA-7B using the Pile-NER-definition data, without any human-labeled data. The data was collected by prompting gpt-3.5-turbo-0301 to label entities from passages and provide short-sentence definitions of their types. The data collection prompt is as follows:

**Instruction:**
```
Given a paragraph, your task is to extract all entities and concepts, and define their type using a short sentence. The output should be in the following format: [("entity", "definition of entity type in a short sentence"), ... ]
```
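For illustration only (this is not the authors' collection pipeline), the sketch below shows how a prompt in this format might be assembled and how the bracketed (entity, definition) output can be parsed. The passage, prompt layout, and response are made up for the example.

```python
import ast

# Hypothetical passage; the prompt layout below is an assumption, not the exact one used.
passage = "Marie Curie won the Nobel Prize in Physics in 1903."
instruction = (
    "Given a paragraph, your task is to extract all entities and concepts, "
    "and define their type using a short sentence. The output should be in the "
    'following format: [("entity", "definition of entity type in a short sentence"), ... ]'
)
prompt = f"{instruction}\n\nParagraph: {passage}"

# A made-up response in the expected format (invented for illustration).
response = '[("Marie Curie", "The entity is a person."), ("Nobel Prize in Physics", "The entity is an award.")]'

# The bracketed list of (entity, definition) pairs parses as a Python literal.
for entity, definition in ast.literal_eval(response):
    print(f"{entity} -> {definition}")
```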

Check our [paper](https://arxiv.org/abs/2308.03279) for more information.

## Comparison with UniNER-7B-type

The UniNER-7B-type model, trained on Pile-NER-type, excels in recognizing common and short NER tags (e.g., person, location) and performs better on NER datasets. On the other hand, UniNER-7B-definition demonstrates superior capabilities in understanding short-sentence definitions of entity types. Additionally, it exhibits enhanced robustness against variations in type paraphrasing.

## Inference

The template for inference instances is as follows:

**Prompting template:**
```
A virtual assistant answers questions from a user based on the provided text.
USER: Text: {Fill the input text here}
ASSISTANT: I've read this text.
USER: What describes {Fill the entity type here} in the text?
ASSISTANT: (model's predictions in JSON format)
```

Note: Inferences are based on one entity type at a time. For multiple entity types, create separate instances for each type.
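As a minimal sketch (not an official inference script), the template above can be filled in and run with the Hugging Face transformers library. The Hub model ID, generation settings, and example text below are assumptions; adapt them as needed.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hub ID for this model; replace with the actual repo path if it differs.
model_name = "Universal-NER/UniNER-7B-definition"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name, torch_dtype=torch.float16, device_map="auto"
)

def build_prompt(text: str, entity_type: str) -> str:
    # Fill the prompting template from the model card; one entity type per instance.
    return (
        "A virtual assistant answers questions from a user based on the provided text.\n"
        f"USER: Text: {text}\n"
        "ASSISTANT: I've read this text.\n"
        f"USER: What describes {entity_type} in the text?\n"
        "ASSISTANT:"
    )

# Example text and entity type, invented for illustration.
prompt = build_prompt("Barack Obama visited Paris in 2009.", "person")
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=128)
# Decode only the newly generated tokens (the model's JSON-formatted predictions).
print(tokenizer.decode(output[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

Since each instance covers a single entity type, loop over `build_prompt` with a different type to extract several.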

## License

This model and its associated data are released under the CC BY-NC 4.0 license. They are intended primarily for research purposes.

## Citation

```bibtex
@article{zhou2023universalner,
      title={UniversalNER: Targeted Distillation from Large Language Models for Open Named Entity Recognition},
      author={Wenxuan Zhou and Sheng Zhang and Yu Gu and Muhao Chen and Hoifung Poon},
      year={2023},
      eprint={2308.03279},
      archivePrefix={arXiv},
      primaryClass={cs.CL}
}
```