roberta-base-japanese-juman-ud-goeswith

Model Description

This is a RoBERTa model pretrained on Japanese Wikipedia and CC-100 texts for POS-tagging and dependency-parsing (using the goeswith relation for subwords), derived from roberta-base-japanese.

How to Use

from transformers import pipeline

nlp = pipeline("universal-dependencies",
               "KoichiYasuoka/roberta-base-japanese-juman-ud-goeswith",
               trust_remote_code=True,
               aggregation_strategy="simple")
print(nlp("全学年にわたって小学校の国語の教科書に挿し絵が用いられている"))

fugashi is required.
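For reference, the dependencies used by the example above can be installed with pip. This is a sketch assuming a standard Python environment; unidic-lite is an assumption here, included because fugashi needs a MeCab dictionary package to tokenize.

```shell
# Install the libraries the usage example depends on.
# unidic-lite is an assumed dictionary choice -- fugashi requires
# some MeCab dictionary package to be present.
pip install transformers fugashi unidic-lite
```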
