gpt2-small-serbian-upos

Model Description

This is a GPT-2 model for Serbian (in both Cyrillic and Latin script) for POS-tagging and dependency-parsing, derived from jerteh/gpt2-vrabac. Every word is tagged by UPOS (Universal Part-Of-Speech) and FEATS (morphological features).
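
For example, in a sentence like "Ово је кућа." ("This is a house."), the word "кућа" would receive the UPOS tag NOUN together with FEATS such as Case=Nom|Gender=Fem|Number=Sing, following the Universal Dependencies guidelines (this sentence is an illustration, not an example from the model card).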

How to Use

from transformers import pipeline
nlp = pipeline("upos", "KoichiYasuoka/gpt2-small-serbian-upos", trust_remote_code=True, aggregation_strategy="simple")
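
The call below is a minimal sketch of how the returned pipeline might be used; the input sentence and the exact shape of the output are assumptions based on the usual transformers token-classification interface, not taken from the model card:

# Illustrative input sentence (assumption, not from the model card)
results = nlp("Ово је кућа.")
# With aggregation_strategy="simple", each entry is expected to be a dict
# carrying the word, its UPOS/FEATS label, and character offsets
for token in results:
    print(token)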

or

import esupar
nlp = esupar.load("KoichiYasuoka/gpt2-small-serbian-upos")
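
esupar models return an analysis that prints in CoNLL-U format, one line per word with its UPOS tag, FEATS, and dependency head/relation; the sentence below is again only an illustration:

# Illustrative input sentence (assumption, not from the model card)
doc = nlp("Ово је кућа.")
# Printing the result yields a CoNLL-U table: FORM, UPOS, FEATS,
# and the dependency HEAD and DEPREL for every word
print(doc)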

See Also

esupar: Tokenizer, POS-tagger, and Dependency-parser with BERT/RoBERTa/DeBERTa/GPT models
