
Swallow-7b-plus-char-upos

Model Description

This is a LLaMA model for POS tagging, derived from Swallow-7b-plus-hf. Every short-unit word is tagged with UPOS (Universal Part-Of-Speech) and FEATS.

How to Use

from transformers import pipeline
# the custom "upos" pipeline class in this repository requires trust_remote_code=True
nlp = pipeline("upos", "KoichiYasuoka/Swallow-7b-plus-char-upos", trust_remote_code=True, aggregation_strategy="simple")
# "Beyond the long border tunnel was the snow country."
print(nlp("国境の長いトンネルを抜けると雪国であった。"))
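With aggregation_strategy="simple" the pipeline returns a list of span dictionaries. The sketch below is a minimal post-processing example, assuming the custom pipeline follows the standard token-classification output keys ("word", "entity_group", "start", "end"):

from transformers import pipeline

nlp = pipeline("upos", "KoichiYasuoka/Swallow-7b-plus-char-upos", trust_remote_code=True, aggregation_strategy="simple")

def tag_words(text):
    # each aggregated span is assumed to expose the surface form ("word")
    # and its predicted UPOS/FEATS label ("entity_group")
    return [(span["word"], span["entity_group"]) for span in nlp(text)]

for word, tag in tag_words("国境の長いトンネルを抜けると雪国であった。"):
    print(word, tag)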

Reference

Koichi Yasuoka: "GPT系モデルの系列ラベリングによる品詞付与" (Part-of-Speech Tagging by Sequence Labeling with GPT-type Models), 東洋学へのコンピュータ利用, 38th Research Seminar (July 26, 2024), pp.3-10.
