KDHyun08 committed
Commit dbd019c • 1 Parent(s): 038c388

Upload with huggingface_hub

Files changed (1):
  1. README.md +29 -7
README.md CHANGED
@@ -4,17 +4,17 @@ tags:
 - sentence-transformers
 - sentence-similarity
 - transformers
-lan: Korean
+language: ko
 ---
 
 # TAACO_Similarity
 
-This model is based on Sentence-transformers [sentence-transformers](https://www.SBERT.net) and was trained on KLUE's STS (Sentence Textual Similarity) dataset. It was built to measure inter-sentence
+This model is based on [sentence-transformers](https://www.SBERT.net) and was trained on KLUE's STS (Sentence Textual Similarity) dataset. It was built to measure inter-sentence
 semantic cohesion, one of the indices of K-TAACO (working title), a measurement tool for cohesion between Korean sentences that the author is developing. Further training on the inter-sentence similarity dataset of the Modu Corpus is also planned.
 
 ## Usage (Sentence-Transformers)
 
-To use this model, Sentence-Transformer [sentence-transformers](https://www.SBERT.net) must be installed.
+To use this model, [sentence-transformers](https://www.SBERT.net) must be installed.
 
 ```
 pip install -U sentence-transformers
@@ -33,7 +33,7 @@ print(embeddings)
 
 
 ## Usage (comparing similarity between actual sentences)
-After installing Sentence-transformers [sentence-transformers](https://www.SBERT.net), you can compare similarity between sentences as shown below.
+After installing [Sentence-transformers](https://www.SBERT.net), you can compare similarity between sentences as shown below.
 The `query` variable holds the source sentence that serves as the basis for comparison, and the sentences to compare against it go into `docs` as a list.
 
 ```python
@@ -65,11 +65,33 @@ for i, (score, idx) in enumerate(zip(top_results[0], top_results[1])):
 
 ## Evaluation Results
 
-<!--- Describe how your model was evaluated -->
-For an automated evaluation of this model, see the *Sentence Embeddings Benchmark*: [https://seb.sbert.net](https://seb.sbert.net?model_name={MODEL_NAME})
-Sentence similarity has a maximum value of 1; the closer it is to 0, the less semantically similar the sentences are.
+Running the Usage example above produces output like the following. The closer the score is to 1, the more similar the sentences.
+
+```
+Input sentence: To celebrate a birthday, I started preparing food at 8:30 a.m., intending to make breakfast
+
+<Top 10 sentences similar to the input sentence>
+
+1: To celebrate the birthday, I started preparing food at 8:30 a.m., intending to make breakfast. The main menu was steak, stir-fried octopus, seaweed soup, japchae, and soya (similarity: 0.6687)
+
+2: Every year when my wife's birthday comes around, I should prepare a birthday breakfast. I hope today is another enjoyable day (similarity: 0.6468)
+
+3: Preparations for my wife's 40th birthday were completed successfully (similarity: 0.4647)
+
+4: Since it was my wife's birthday, I wanted to grill it nicely, but an absurd situation arose (similarity: 0.4469)
+
+5: Because it's her birthday~ (similarity: 0.4218)
+
+6: Yesterday was my wife's birthday (similarity: 0.4192)
+
+7: Early in the morning I wanted to prepare the steak my wife likes and watch her enjoy eating it, but a completely unexpected situation came up... Still, I pulled myself together and switched to a different menu right away (similarity: 0.4156)
+
+8: I was also grateful to my wife for eating it with relish (similarity: 0.3093)
+
+9: I wasn't sure whether my wife would like it, but seeing the frankfurter sausages in the refrigerator, I decided to make soya right away. The food turned out successfully (similarity: 0.2259)
+
+10: My wife likes that kind of steak too. But then something unimaginable happened (similarity: 0.1967)
+```
 
 ## Training
 The model was trained with the parameters:
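
The comparison snippet referenced in the hunk header above (`for i, (score, idx) in enumerate(zip(top_results[0], top_results[1])):`) ranks the `docs` list by cosine similarity to `query`. A minimal sketch of that ranking logic, using small fixed dummy vectors in place of real model embeddings — the function name, dimensions, and values here are illustrative only, not taken from the README:

```python
# Sketch of the cosine-similarity ranking behind the README's comparison
# snippet. Dummy 4-dimensional vectors stand in for model.encode(...) output;
# the function name and top_k value are illustrative assumptions.
import numpy as np

def rank_by_cosine(query_emb, doc_embs, top_k=3):
    """Return (scores, indices) of the top_k docs most similar to the query."""
    q = query_emb / np.linalg.norm(query_emb)
    d = doc_embs / np.linalg.norm(doc_embs, axis=1, keepdims=True)
    scores = d @ q                       # cosine similarity per document
    idx = np.argsort(-scores)[:top_k]    # highest similarity first
    return scores[idx], idx

# Dummy "embeddings" standing in for encoded query/docs sentences
query_emb = np.array([1.0, 0.0, 0.0, 0.0])
doc_embs = np.array([
    [0.9, 0.1, 0.0, 0.0],   # nearly the same direction -> score near 1
    [0.0, 1.0, 0.0, 0.0],   # orthogonal -> score 0
    [0.5, 0.5, 0.0, 0.0],   # partial overlap
])

top_scores, top_idx = rank_by_cosine(query_emb, doc_embs, top_k=3)
for score, idx in zip(top_scores, top_idx):
    print(f"doc {idx}: similarity {score:.4f}")
```

Because the vectors are normalized before the dot product, every score falls in [-1, 1], which is why a score of 1 is the maximum and values near 0 indicate little semantic overlap, as the Evaluation Results section notes.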