
TaCOMET_ja

This is the Japanese TaCOMET model, a COMET model fine-tuned on the Japanese version of TimeATOMIC with a causal language modeling (CLM) objective. The data and this model are introduced in this paper and at LREC-COLING 2024 (TBA).

Preprocessing

The texts are segmented into words using Juman++ and tokenized using SentencePiece.
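Below is a minimal usage sketch, not an official recipe. It assumes the model follows the same convention as other nlp-waseda GPT-2 models and expects Juman++-segmented, whitespace-separated input, that Juman++ is available locally through pyknp, and that the model ID nlp-waseda/tacomet-gpt2-xl-japanese (the name of this repository) is used; the actual prompt format depends on TimeATOMIC and is left as a placeholder.

```python
# Sketch: segment raw Japanese text with Juman++ (via pyknp), then feed the
# space-separated words to the model's SentencePiece tokenizer.
from pyknp import Juman
from transformers import AutoTokenizer, AutoModelForCausalLM

jumanpp = Juman()  # requires a local Juman++ installation


def segment(text: str) -> str:
    # Hypothetical helper: return the text as whitespace-separated Juman++ words.
    return " ".join(mrph.midasi for mrph in jumanpp.analysis(text).mrph_list())


tokenizer = AutoTokenizer.from_pretrained("nlp-waseda/tacomet-gpt2-xl-japanese")
model = AutoModelForCausalLM.from_pretrained("nlp-waseda/tacomet-gpt2-xl-japanese")

prompt = segment("...")  # replace with an event/relation prompt in the TimeATOMIC format
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```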

BibTeX entry and citation info

@InProceedings{murata_nlp2024_tacomet,
    author =    "村田栄樹 and 河原大輔",
    title =     "TaCOMET: 時間を考慮したイベント常識生成モデル",
    booktitle = "言語処理学会第30回年次大会",
    year =      "2024",
    url =       "https://www.anlp.jp/proceedings/annual_meeting/2024/pdf_dir/P3-19.pdf",
    note =      "in Japanese"
}