roberta-base-thai-spm-ud-goeswith
Model Description
This is a RoBERTa model pre-trained on Thai Wikipedia texts and fine-tuned for POS-tagging and dependency-parsing (using the goeswith relation for subwords), derived from roberta-base-thai-spm-upos.
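Thai is written without spaces between words, so the SentencePiece tokenizer may split a single word into several subword tokens; the goeswith relation lets the parser attach non-initial subwords to the initial one so that whole words can be reassembled from the parse. A minimal sketch for inspecting those subword splits, assuming only the tokenizer shipped with this repository:

from transformers import AutoTokenizer

# Load the SentencePiece tokenizer bundled with this model
tokenizer = AutoTokenizer.from_pretrained("KoichiYasuoka/roberta-base-thai-spm-ud-goeswith")

# Show how an unsegmented Thai sentence splits into subword tokens;
# non-initial pieces of a word are what goeswith links back together
print(tokenizer.tokenize("หลายหัวดีกว่าหัวเดียว"))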
How to Use
from transformers import pipeline

# The model ships a custom "universal-dependencies" pipeline,
# so trust_remote_code=True is required
nlp = pipeline(
    "universal-dependencies",
    "KoichiYasuoka/roberta-base-thai-spm-ud-goeswith",
    trust_remote_code=True,
    aggregation_strategy="simple",
)

# Parse an unsegmented Thai sentence
# (a proverb meaning roughly "many heads are better than one")
print(nlp("หลายหัวดีกว่าหัวเดียว"))
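Under the hood this is a standard token-classification head whose labels, as I read the goeswith approach in the reference below, combine a UPOS tag with a dependency relation; the custom pipeline decodes sequences of these labels into a dependency tree. A quick sketch to list the label inventory, using only the standard transformers API:

from transformers import AutoModelForTokenClassification

# Load just the fine-tuned token-classification head (no custom pipeline code)
model = AutoModelForTokenClassification.from_pretrained("KoichiYasuoka/roberta-base-thai-spm-ud-goeswith")

# Print every label the head can predict; the "universal-dependencies"
# pipeline above turns sequences of these labels into a parse tree
for i, label in sorted(model.config.id2label.items()):
    print(i, label)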
Reference
Koichi Yasuoka: Sequence-Labeling RoBERTa Model for Dependency-Parsing in Classical Chinese and Its Application to Vietnamese and Thai, ICBIR 2023: 8th International Conference on Business and Industrial Research (May 2023), pp. 169-173.