This is a multilingual NER system trained using a Frustratingly Easy Domain Adaptation (FEDA) architecture. It is based on LaBSE and supports several tagsets, all using the IOBES format:

  1. Wikiann (LOC, PER, ORG)
  2. SlavNER 19/21 (EVT, LOC, ORG, PER, PRO)
  3. Turku (DATE, EVT, LOC, ORG, PER, PRO, TIME)

PER: person, LOC: location, ORG: organization, EVT: event, PRO: product, MISC: miscellaneous, MEDIA: media, ART: artifact, TIME: time, DATE: date, GEOPOLIT: geopolitical.

You can select which tagset to use in the output by configuring the model. The model also handles uppercase words differently.
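
The exact configuration option for choosing a tagset depends on the model's own config (see the paper and GitHub repository linked below). As a rough illustration only, a standard `transformers` token-classification pipeline call might look like the sketch below; the model ID is a placeholder, not the real repository name.

```python
# A minimal inference sketch, not the model's official usage example.
# Assumptions: MODEL_ID is a placeholder for the actual Hugging Face repository
# name, and the checkpoint works with the standard token-classification
# pipeline. The custom FEDA head may require trust_remote_code=True, and
# selecting a tagset is done through the model's configuration (see the
# linked paper and repository for the exact options).
from transformers import pipeline

MODEL_ID = "EMBEDDIA/<this-model>"  # placeholder: replace with the real repo name

ner = pipeline(
    "token-classification",
    model=MODEL_ID,
    aggregation_strategy="simple",  # merge IOBES sub-token tags into entity spans
)

# Returns a list of dicts with 'entity_group', 'score', 'word', 'start', 'end',
# e.g. entity_group == 'PER' for the person name below.
print(ner("Janez Novak visited Ljubljana in March 2021."))
```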

More information about the model can be found in the paper (https://aclanthology.org/2021.bsnlp-1.12.pdf) and GitHub repository (https://github.com/EMBEDDIA/NER_FEDA).
