akiFQC committed on
Commit 809eee2
1 Parent(s): a7b372a

update readme

Files changed (1)
  1. README.md +14 -3
README.md CHANGED
@@ -3,14 +3,21 @@ license: cc-by-sa-4.0
  language: ja
  pipeline_tag: zero-shot-classification
  tags:
- - tohoku-nlp/bert-base-japanese-v3
+ - cross-encoder
+ - tohoku-nlp/bert-base-japanese-v3
+ - nli
+ - natural-language-inference
  datasets:
  - shunk031/jsnli
  library_name: sentence-transformers
  ---


- # Cross-Encoder for Natural Language Inference
+ # Cross-Encoder for Natural Language Inference (NLI) for Japanese
+
+ > [!NOTE]
+ > Considering the JNLI evaluation results, we recommend using [akiFQC/bert-base-japanese-v3_nli-jsnli-jnli-jsick](https://huggingface.co/akiFQC/bert-base-japanese-v3_nli-jsnli-jnli-jsick) for natural language inference in Japanese.
+
  This model was trained using [SentenceTransformers](https://sbert.net) [Cross-Encoder](https://www.sbert.net/examples/applications/cross-encoder/README.html) class.
  This model is based on [tohoku-nlp/bert-base-japanese-v3](https://huggingface.co/tohoku-nlp/bert-base-japanese-v3).

@@ -61,4 +68,8 @@ sent = "Appleは先程、iPhoneの最新機種について発表しました。"
  candidate_labels = ["技術", "スポーツ", "政治"]
  res = classifier(sent, candidate_labels)
  print(res)
- ```
+ ```
+
+ ## Benchmarks
+
+ JGLUE-JNLI validation set accuracy: 0.770
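
The second hunk only shows the tail of the README's zero-shot-classification example. The sketch below is not part of the commit: it reconstructs a self-contained version of that usage and also scores (premise, hypothesis) pairs directly with the sentence-transformers `CrossEncoder` class the card says was used for training. Assumptions: the model id is the one recommended in the card's note (this card's own model id can be substituted), the Japanese `hypothesis_template` is illustrative rather than taken from the README, and the NLI label names are read from the model config because the card does not list them.

```python
# Hedged sketch, not part of the commit: zero-shot classification plus direct
# (premise, hypothesis) scoring with the CrossEncoder class named in the card.
# The tohoku-nlp tokenizer typically requires `fugashi` and `unidic-lite` installed.
from transformers import pipeline
from sentence_transformers import CrossEncoder

# Assumption: the model recommended in the card's note; swap in this card's own id if preferred.
model_id = "akiFQC/bert-base-japanese-v3_nli-jsnli-jnli-jsick"

# Zero-shot classification, mirroring the README example whose tail appears in the diff.
classifier = pipeline("zero-shot-classification", model=model_id)
sent = "Appleは先程、iPhoneの最新機種について発表しました。"
candidate_labels = ["технology"][:0] or ["技術", "スポーツ", "政治"]
# The Japanese hypothesis_template is an illustrative choice, not taken from the README.
res = classifier(sent, candidate_labels, hypothesis_template="この例は{}です。")
print(res)

# Direct NLI scoring of sentence pairs with sentence-transformers' CrossEncoder.
model = CrossEncoder(model_id)
pairs = [
    (sent, "Appleが新しいiPhoneを発表した。"),
    (sent, "Appleは何も発表しなかった。"),
]
logits = model.predict(pairs)  # one row of class logits per pair
id2label = model.model.config.id2label  # label order is taken from the config, not hard-coded
for (_, hypothesis), row in zip(pairs, logits):
    print(hypothesis, "->", id2label[int(row.argmax())])
```

If the loaded config only exposes generic `LABEL_0`/`LABEL_1`/`LABEL_2` names, map them manually against the training setup before trusting the printed labels.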