# Swallow-7b-plus-upos

## Model Description

This is a LLaMA model for POS-tagging, derived from Swallow-7b-plus-hf. Every short-unit word is tagged with its UPOS (Universal Part-Of-Speech) tag and FEATS.

## How to Use

```py
from transformers import pipeline
nlp = pipeline("upos", "KoichiYasuoka/Swallow-7b-plus-upos", trust_remote_code=True, aggregation_strategy="simple")
print(nlp("国境の長いトンネルを抜けると雪国であった。"))
```
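The pipeline returns results in the standard Transformers token-classification format: a list of dicts, one per aggregated span. A minimal sketch of post-processing that pairing, assuming the usual `entity_group`/`word` keys (the sample `result` below is hypothetical, not actual model output):

```python
# Hypothetical output shape from the pipeline above
# (token-classification format with aggregation_strategy="simple").
result = [
    {"entity_group": "NOUN", "word": "国境", "score": 0.99, "start": 0, "end": 2},
    {"entity_group": "ADP", "word": "の", "score": 0.99, "start": 2, "end": 3},
]

# Pair each short-unit word with its UPOS tag.
tagged = [(token["word"], token["entity_group"]) for token in result]
print(tagged)
```

The `start`/`end` offsets index into the input string, so the spans can also be mapped back onto the original sentence.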