joe32140 committed on
Commit 608f025
1 Parent(s): a6536aa

Add new SentenceTransformer model

1_Pooling/config.json ADDED
{
    "word_embedding_dimension": 768,
    "pooling_mode_cls_token": false,
    "pooling_mode_mean_tokens": true,
    "pooling_mode_max_tokens": false,
    "pooling_mode_mean_sqrt_len_tokens": false,
    "pooling_mode_weightedmean_tokens": false,
    "pooling_mode_lasttoken": false,
    "include_prompt": true
}
README.md ADDED
---
base_model: Alibaba-NLP/gte-en-mlm-base
datasets:
- sentence-transformers/msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1
language:
- en
library_name: sentence-transformers
metrics:
- cosine_accuracy
pipeline_tag: sentence-similarity
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:11662655
- loss:CachedMultipleNegativesRankingLoss
widget:
- source_sentence: what county is lyndhurst, ohio in
  sentences:
  - This article is about the song written by Kenneth Gamble, Leon Huff and Cary Gilbert. For the Tina Turner song, see Don't Leave Me This Way (Tina Turner song). Don't Leave Me This Way is a song written by Kenneth Gamble, Leon Huff and Cary Gilbert. First charting as a hit for Harold Melvin & the Blue Notes featuring Teddy Pendergrass, an act on Gamble & Huff's Philadelphia International label in 1975, Don't Leave Me This Way was later a huge disco hit for Motown artist Thelma Houston in 1977.
  - Lyndhurst is a city in Cuyahoga County, Ohio, United States. The population was 14,001 at the 2010 census. Lyndhurst is located in northeastern Ohio, and is a suburb of Cleveland. A small part of Lyndhurst was originally part of Mayfield Township. It used to be called Euclidville before Lyndhurst was chosen. Lyndhurst is located at 41°31′17″N 81°29′25″W / 41.52139°N 81.49028°W / 41.52139; -81.49028 (41.521352, -81.490141).
  - Welcome to Trumbull County... Trumbull County, the county seat, located in Warren, Ohio, consists of a combination of both urban and rural communities situated in the northeast corner of Ohio. It is situated roughly between the Youngstown, Cleveland and Akron corridors.
- source_sentence: who founded the american graphophone company
  sentences:
  - In 1886, Graham Bell and Charles Sumner Tainter founded the American Graphophone Company to distribute and sell graphophones in the US and Canada under license from the Volta Graphophone Company. In 1890, the American Graphophone Company stopped production of new phonographs due to sagging orders.
  - ShelfGenie How much does a ShelfGenie franchise cost? ShelfGenie has a franchise fee of up to $45,000, with a total initial investment range of $70,100 to $107,750. Local ShelfGenie franchise opportunities. ShelfGenie is looking to grow in a number of cities around the country. To find out if there's a franchise opportunity in your city, unlock more information.
  - A+E Networks. The technology that made the modern music business possible came into existence in the New Jersey laboratory where Thomas Alva Edison created the first device to both record sound and play it back. He was awarded U.S. Patent No. 200,521 for his invention, the phonograph, on this day in 1878.
- source_sentence: is housekeeping camp flooded?
  sentences:
  - 'What is the importance of housekeeping at work? A: Workplace housekeeping promotes sanitation, safety, organization and productivity. It also boosts morale. Daily housekeeping maintenance keeps the workplac... Full Answer >'
  - The back patio area of a cabin is partially submerged in flood water at Housekeeping Camp on Monday, Jan. 9, 2017, in Yosemite National Park. The Merced River, swollen with storm runoff, crested at 12.7 feet at 4 a.m. SILVIA FLORES sflores@fresnobee.com.
  - 1 Bake for 8 minutes, then rotate the pan and check the underside of the bagels. 2 If they're getting too dark, place another pan under the baking sheet. ( 3 Doubling the pan will insulate the first baking sheet.) Bake for another 8 to 12 minutes, until the bagels are a golden brown. 4 13.
- source_sentence: causes for infection in the nerve of tooth
  sentences:
  - If a cavity is causing the toothache, your dentist will fill the cavity or possibly extract the tooth, if necessary. A root canal might be needed if the cause of the toothache is determined to be an infection of the tooth's nerve. Bacteria that have worked their way into the inner aspects of the tooth cause such an infection. An antibiotic may be prescribed if there is fever or swelling of the jaw.
  - According to Article III, Section 1 of the Constitution, judges and justices of the Judicial Branch serve during good behavior.. This means they are appointed for life, unles … s they are impeached and removed from office. + 50 others found this useful.he term length for members of the House are two years and a staggering six years for members of the Senate.
  - Inflamed or infected pulp (pulpitis) most often causes a toothache. To relieve the pain and prevent further complications, the tooth may be extracted (surgically removed) or saved by root canal treatment.
- source_sentence: what county is hayden in
  sentences:
  - Normally, the Lead Agency is the agency with general governmental powers such as a city or a county. Agencies with limited powers or districts that provide a public service/utility such as a recreation and park district will tend to be a Responsible Agency.
  - According to the United States Census Bureau, the city has a total area of 9.61 square miles (24.89 km2), of which 9.60 square miles (24.86 km2) is land and 0.01 square miles (0.03 km2) is water. It lies at the southwestern end of Hayden Lake, and the elevation of the city is 2,287 feet (697 m) above sea level. Hayden is located on U.S. Route 95 at the junction of Route 41. It is also four miles (6 km) north of Interstate 90 and Coeur d'Alene. The Coeur d'Alene airport is northwest of Hayden.
  - Hayden is a city in Kootenai County, Idaho, United States. Located in the northern portion of the state, just north of Coeur d'Alene, its population was 13,294 at the 2010 census.
model-index:
- name: SentenceTransformer based on Alibaba-NLP/gte-en-mlm-base
  results:
  - task:
      type: triplet
      name: Triplet
    dataset:
      name: msmarco co condenser dev
      type: msmarco-co-condenser-dev
    metrics:
    - type: cosine_accuracy
      value: 0.983
      name: Cosine Accuracy
---

# SentenceTransformer based on Alibaba-NLP/gte-en-mlm-base

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [Alibaba-NLP/gte-en-mlm-base](https://huggingface.co/Alibaba-NLP/gte-en-mlm-base) on the [msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1](https://huggingface.co/datasets/sentence-transformers/msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1) dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [Alibaba-NLP/gte-en-mlm-base](https://huggingface.co/Alibaba-NLP/gte-en-mlm-base) <!-- at revision 5d8e84a2f6d9b819fd50fca613fae4f6ffcafa07 -->
- **Maximum Sequence Length:** 8192 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
    - [msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1](https://huggingface.co/datasets/sentence-transformers/msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1)
- **Language:** en
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: NewModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```

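The `Pooling` module above builds each sentence embedding by mean pooling: averaging the token embeddings while ignoring padding positions (`pooling_mode_mean_tokens: True`). A minimal NumPy sketch of masked mean pooling; the array names and toy shapes here are illustrative, not taken from the model:

```python
import numpy as np

# Toy token embeddings: 2 sentences, 4 token slots, 768 dimensions
token_embeddings = np.random.rand(2, 4, 768)
# Attention mask: sentence 0 has 3 real tokens, sentence 1 has 2; the rest is padding
attention_mask = np.array([[1, 1, 1, 0],
                           [1, 1, 0, 0]])

# Masked mean pooling: sum the real token vectors, divide by the real token count
mask = attention_mask[:, :, None]                 # (2, 4, 1), broadcasts over dims
summed = (token_embeddings * mask).sum(axis=1)    # (2, 768)
counts = mask.sum(axis=1)                         # (2, 1)
sentence_embeddings = summed / counts             # (2, 768)

print(sentence_embeddings.shape)  # (2, 768)
```

The same idea runs inside `model.encode`, so each output vector is the average of that sentence's non-padding token embeddings.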
## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.

```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("joe32140/gte-en-mlm-base-msmarco")
# Run inference
sentences = [
    'what county is hayden in',
    "Hayden is a city in Kootenai County, Idaho, United States. Located in the northern portion of the state, just north of Coeur d'Alene, its population was 13,294 at the 2010 census.",
    "According to the United States Census Bureau, the city has a total area of 9.61 square miles (24.89 km2), of which 9.60 square miles (24.86 km2) is land and 0.01 square miles (0.03 km2) is water. It lies at the southwestern end of Hayden Lake, and the elevation of the city is 2,287 feet (697 m) above sea level. Hayden is located on U.S. Route 95 at the junction of Route 41. It is also four miles (6 km) north of Interstate 90 and Coeur d'Alene. The Coeur d'Alene airport is northwest of Hayden.",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

## Evaluation

### Metrics

#### Triplet

* Dataset: `msmarco-co-condenser-dev`
* Evaluated with [<code>TripletEvaluator</code>](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.TripletEvaluator)

| Metric              | Value     |
|:--------------------|:----------|
| **cosine_accuracy** | **0.983** |

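The cosine accuracy above is the fraction of (anchor, positive, negative) triplets for which the anchor is more cosine-similar to its positive than to its negative. A self-contained sketch of that metric on random placeholder vectors (nothing here comes from the actual model or dev set):

```python
import numpy as np

def cosine_sim(a, b):
    """Row-wise cosine similarity between two (n, d) arrays."""
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return (a * b).sum(axis=1)

rng = np.random.default_rng(0)
anchors = rng.normal(size=(100, 768))
positives = anchors + 0.1 * rng.normal(size=(100, 768))  # near their anchors
negatives = rng.normal(size=(100, 768))                  # unrelated vectors

# A triplet counts as correct when sim(anchor, positive) > sim(anchor, negative)
correct = cosine_sim(anchors, positives) > cosine_sim(anchors, negatives)
accuracy = float(correct.mean())
print(accuracy)  # 1.0 on these easy synthetic triplets
```

On the real dev set the triplets are mined hard negatives, which is why the reported 0.983 is a meaningful score rather than a trivial one.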
<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Dataset

#### msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1

* Dataset: [msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1](https://huggingface.co/datasets/sentence-transformers/msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1) at [84ed2d3](https://huggingface.co/datasets/sentence-transformers/msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1/tree/84ed2d35626f617d890bd493b4d6db69a741e0e2)
* Size: 11,662,655 training samples
* Columns: <code>query</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | query | positive | negative |
  |:--------|:------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 4 tokens</li><li>mean: 8.96 tokens</li><li>max: 40 tokens</li></ul> | <ul><li>min: 15 tokens</li><li>mean: 77.58 tokens</li><li>max: 222 tokens</li></ul> | <ul><li>min: 22 tokens</li><li>mean: 78.59 tokens</li><li>max: 325 tokens</li></ul> |
* Samples:
  | query | positive | negative |
  |:------|:---------|:---------|
  | <code>what is the meaning of menu planning</code> | <code>Menu planning is the selection of a menu for an event. Such as picking out the dinner for your wedding or even a meal at a Birthday Party. Menu planning is when you are preparing a calendar of meals and you have to sit down and decide what meat and veggies you want to serve on each certain day.</code> | <code>Menu Costs. In economics, a menu cost is the cost to a firm resulting from changing its prices. The name stems from the cost of restaurants literally printing new menus, but economists use it to refer to the costs of changing nominal prices in general.</code> |
  | <code>how old is brett butler</code> | <code>Brett Butler is 59 years old. To be more precise (and nerdy), the current age as of right now is 21564 days or (even more geeky) 517536 hours. That's a lot of hours!</code> | <code>Passed in: St. John's, Newfoundland and Labrador, Canada. Passed on: 16/07/2016. Published in the St. John's Telegram. Passed away suddenly at the Health Sciences Centre surrounded by his loving family, on July 16, 2016 Robert (Bobby) Joseph Butler, age 52 years. Predeceased by his special aunt Geri Murrin and uncle Mike Mchugh; grandparents Joe and Margaret Murrin and Jack and Theresa Butler.</code> |
  | <code>when was the last navajo treaty sign?</code> | <code>In Executive Session, Senate of the United States, July 25, 1868. Resolved, (two-thirds of the senators present concurring,) That the Senate advise and consent to the ratification of the treaty between the United States and the Navajo Indians, concluded at Fort Sumner, New Mexico, on the first day of June, 1868.</code> | <code>Share Treaty of Greenville. The Treaty of Greenville was signed August 3, 1795, between the United States, represented by Gen. Anthony Wayne, and chiefs of the Indian tribes located in the Northwest Territory, including the Wyandots, Delawares, Shawnees, Ottawas, Miamis, and others.</code> |
* Loss: [<code>CachedMultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedmultiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

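`CachedMultipleNegativesRankingLoss` scores each query against every passage in the batch with scaled cosine similarity and applies cross-entropy, so query *i*'s own passage is the target and all other in-batch passages act as negatives; the "cached" variant only reduces memory, not the loss value. A NumPy sketch of the uncached computation on toy data (the function name, shapes, and data below are illustrative, not the library's implementation):

```python
import numpy as np

def mnrl_loss(query_emb, passage_emb, scale=20.0):
    """Multiple negatives ranking loss.

    passage_emb[i] is the positive for query_emb[i]; every other passage
    in the batch acts as an in-batch negative.
    """
    q = query_emb / np.linalg.norm(query_emb, axis=1, keepdims=True)
    p = passage_emb / np.linalg.norm(passage_emb, axis=1, keepdims=True)
    scores = scale * (q @ p.T)          # (n, n) scaled cosine similarities
    # Cross-entropy with target i for row i, via a numerically stable log-softmax
    row_max = scores.max(axis=1, keepdims=True)
    log_norm = row_max + np.log(np.exp(scores - row_max).sum(axis=1, keepdims=True))
    log_probs = scores - log_norm
    n = len(scores)
    return -log_probs[np.arange(n), np.arange(n)].mean()

rng = np.random.default_rng(0)
queries = rng.normal(size=(4, 64))
positives = queries + 0.05 * rng.normal(size=(4, 64))  # matched passages
loss = mnrl_loss(queries, positives)
print(loss)  # small, since each query ranks its own passage first
```

Because every other passage in the batch is a free negative, larger batches give more (and harder) negatives per query, which is one reason the 512-sample batch size is used for training.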
### Evaluation Dataset

#### msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1

* Dataset: [msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1](https://huggingface.co/datasets/sentence-transformers/msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1) at [84ed2d3](https://huggingface.co/datasets/sentence-transformers/msmarco-co-condenser-margin-mse-sym-mnrl-mean-v1/tree/84ed2d35626f617d890bd493b4d6db69a741e0e2)
* Size: 11,662,655 evaluation samples
* Columns: <code>query</code>, <code>positive</code>, and <code>negative</code>
* Approximate statistics based on the first 1000 samples:
  |         | query | positive | negative |
  |:--------|:------|:---------|:---------|
  | type    | string | string | string |
  | details | <ul><li>min: 4 tokens</li><li>mean: 8.92 tokens</li><li>max: 26 tokens</li></ul> | <ul><li>min: 22 tokens</li><li>mean: 79.14 tokens</li><li>max: 223 tokens</li></ul> | <ul><li>min: 21 tokens</li><li>mean: 78.96 tokens</li><li>max: 233 tokens</li></ul> |
* Samples:
  | query | positive | negative |
  |:------|:---------|:---------|
  | <code>what county is holly springs nc in</code> | <code>Holly Springs, North Carolina. Holly Springs is a town in Wake County, North Carolina, United States. As of the 2010 census, the town population was 24,661, over 2½ times its population in 2000. Contents.</code> | <code>The Mt. Holly Springs Park & Resort. One of the numerous trolley routes that carried people around the county at the turn of the century was the Carlisle & Mt. Holly Railway Company. The “Holly Trolley” as it came to be known was put into service by Patricio Russo and made its first run on May 14, 1901.</code> |
  | <code>how long does nyquil stay in your system</code> | <code>In order to understand exactly how long Nyquil lasts, it is absolutely vital to learn about the various ingredients in the drug. One of the ingredients found in Nyquil is Doxylamine, which is an antihistamine. This specific medication has a biological half-life or 6 to 12 hours. With this in mind, it is possible for the drug to remain in the system for a period of 12 to 24 hours. It should be known that the specifics will depend on a wide variety of different factors, including your age and metabolism.</code> | <code>I confirmed that NyQuil is about 10% alcohol, a higher content than most domestic beers. When I asked about the relatively high proof, I was told that the alcohol dilutes the active ingredients. The alcohol free version is there for customers with addiction issues.. also found that in that version there is twice the amount of DXM. When I asked if I could speak to a chemist or scientist, I was told they didn't have anyone who fit that description there. It’s been eight years since I kicked NyQuil. I've been sober from alcohol for four years.</code> |
  | <code>what are mineral water</code> | <code>1 Mineral water – water from a mineral spring that contains various minerals, such as salts and sulfur compounds. 2 It comes from a source tapped at one or more bore holes or spring, and originates from a geologically and physically protected underground water source. Mineral water – water from a mineral spring that contains various minerals, such as salts and sulfur compounds. 2 It comes from a source tapped at one or more bore holes or spring, and originates from a geologically and physically protected underground water source.</code> | <code>Minerals for Your Body. Drinking mineral water is beneficial to health and well-being. But it is not only the amount of water you drink that is important-what the water contains is even more essential.inerals for Your Body. Drinking mineral water is beneficial to health and well-being. But it is not only the amount of water you drink that is important-what the water contains is even more essential.</code> |
* Loss: [<code>CachedMultipleNegativesRankingLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cachedmultiplenegativesrankingloss) with these parameters:
  ```json
  {
      "scale": 20.0,
      "similarity_fct": "cos_sim"
  }
  ```

### Training Hyperparameters
#### Non-Default Hyperparameters

- `per_device_train_batch_size`: 512
- `per_device_eval_batch_size`: 512
- `num_train_epochs`: 1
- `warmup_ratio`: 0.05
- `bf16`: True
- `batch_sampler`: no_duplicates

#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: no
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 512
- `per_device_eval_batch_size`: 512
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `torch_empty_cache_steps`: None
- `learning_rate`: 5e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 1
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.05
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: True
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: None
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `include_for_metrics`: []
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`:
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `eval_on_start`: False
- `use_liger_kernel`: False
- `eval_use_gather_object`: False
- `average_tokens_across_devices`: False
- `prompts`: None
- `batch_sampler`: no_duplicates
- `multi_dataset_batch_sampler`: proportional

</details>

### Training Logs
<details><summary>Click to expand</summary>

| Epoch  | Step | Training Loss | msmarco-co-condenser-dev_cosine_accuracy |
|:------:|:----:|:-------------:|:----------------------------------------:|
| 0      | 0    | -             | 0.649                                    |
| 0.0041 | 10   | 6.3669        | -                                        |
| 0.0082 | 20   | 5.3993        | -                                        |
| 0.0123 | 30   | 3.4256        | -                                        |
| 0.0164 | 40   | 2.0704        | -                                        |
| 0.0205 | 50   | 1.0275        | -                                        |
| 0.0246 | 60   | 0.6803        | -                                        |
| 0.0287 | 70   | 0.5813        | -                                        |
| 0.0328 | 80   | 0.5144        | -                                        |
| 0.0369 | 90   | 0.4714        | -                                        |
| 0.0410 | 100  | 0.4089        | -                                        |
| 0.0450 | 110  | 0.3974        | -                                        |
| 0.0491 | 120  | 0.363         | -                                        |
| 0.0532 | 130  | 0.348         | -                                        |
| 0.0573 | 140  | 0.3307        | -                                        |
| 0.0614 | 150  | 0.3171        | -                                        |
| 0.0655 | 160  | 0.3188        | -                                        |
| 0.0696 | 170  | 0.3024        | -                                        |
| 0.0737 | 180  | 0.2971        | -                                        |
| 0.0778 | 190  | 0.2786        | -                                        |
| 0.0819 | 200  | 0.2851        | -                                        |
| 0.0860 | 210  | 0.2798        | -                                        |
| 0.0901 | 220  | 0.2796        | -                                        |
| 0.0942 | 230  | 0.2683        | -                                        |
| 0.0983 | 240  | 0.2591        | -                                        |
| 0.1024 | 250  | 0.265         | -                                        |
| 0.1065 | 260  | 0.2703        | -                                        |
| 0.1106 | 270  | 0.2547        | -                                        |
| 0.1147 | 280  | 0.257         | -                                        |
| 0.1188 | 290  | 0.2437        | -                                        |
| 0.1229 | 300  | 0.2417        | -                                        |
| 0.1269 | 310  | 0.2444        | -                                        |
| 0.1310 | 320  | 0.2358        | -                                        |
| 0.1351 | 330  | 0.2336        | -                                        |
| 0.1392 | 340  | 0.2297        | -                                        |
| 0.1433 | 350  | 0.224         | -                                        |
| 0.1474 | 360  | 0.221         | -                                        |
| 0.1515 | 370  | 0.227         | -                                        |
| 0.1556 | 380  | 0.224         | -                                        |
| 0.1597 | 390  | 0.2226        | -                                        |
| 0.1638 | 400  | 0.2101        | -                                        |
| 0.1679 | 410  | 0.2222        | -                                        |
| 0.1720 | 420  | 0.217         | -                                        |
| 0.1761 | 430  | 0.2093        | -                                        |
| 0.1802 | 440  | 0.2077        | -                                        |
| 0.1843 | 450  | 0.2073        | -                                        |
| 0.1884 | 460  | 0.2061        | -                                        |
| 0.1925 | 470  | 0.2074        | -                                        |
| 0.1966 | 480  | 0.2019        | -                                        |
| 0.2007 | 490  | 0.1978        | -                                        |
| 0.2048 | 500  | 0.2115        | -                                        |
| 0.2088 | 510  | 0.2005        | -                                        |
| 0.2129 | 520  | 0.2059        | -                                        |
| 0.2170 | 530  | 0.1925        | -                                        |
| 0.2211 | 540  | 0.1943        | -                                        |
| 0.2252 | 550  | 0.1969        | -                                        |
| 0.2293 | 560  | 0.1899        | -                                        |
| 0.2334 | 570  | 0.2122        | -                                        |
| 0.2375 | 580  | 0.188         | -                                        |
| 0.2416 | 590  | 0.1921        | -                                        |
| 0.2457 | 600  | 0.1803        | -                                        |
| 0.2498 | 610  | 0.1983        | -                                        |
| 0.2539 | 620  | 0.1889        | -                                        |
| 0.2580 | 630  | 0.1887        | -                                        |
| 0.2621 | 640  | 0.1833        | -                                        |
| 0.2662 | 650  | 0.1843        | -                                        |
| 0.2703 | 660  | 0.1844        | -                                        |
| 0.2744 | 670  | 0.1843        | -                                        |
| 0.2785 | 680  | 0.1837        | -                                        |
| 0.2826 | 690  | 0.173         | -                                        |
| 0.2867 | 700  | 0.1785        | -                                        |
| 0.2907 | 710  | 0.1704        | -                                        |
| 0.2948 | 720  | 0.1703        | -                                        |
| 0.2989 | 730  | 0.1782        | -                                        |
| 0.3030 | 740  | 0.1623        | -                                        |
| 0.3071 | 750  | 0.1688        | -                                        |
| 0.3112 | 760  | 0.1603        | -                                        |
| 0.3153 | 770  | 0.1518        | -                                        |
| 0.3194 | 780  | 0.1605        | -                                        |
| 0.3235 | 790  | 0.1661        | -                                        |
| 0.3276 | 800  | 0.1678        | -                                        |
| 0.3317 | 810  | 0.1656        | -                                        |
| 0.3358 | 820  | 0.1582        | -                                        |
| 0.3399 | 830  | 0.1551        | -                                        |
| 0.3440 | 840  | 0.1587        | -                                        |
| 0.3481 | 850  | 0.1526        | -                                        |
| 0.3522 | 860  | 0.1601        | -                                        |
| 0.3563 | 870  | 0.1557        | -                                        |
| 0.3604 | 880  | 0.1576        | -                                        |
| 0.3645 | 890  | 0.1655        | -                                        |
| 0.3686 | 900  | 0.1595        | -                                        |
| 0.3726 | 910  | 0.1575        | -                                        |
| 0.3767 | 920  | 0.1544        | -                                        |
| 0.3808 | 930  | 0.1432        | -                                        |
| 0.3849 | 940  | 0.1484        | -                                        |
| 0.3890 | 950  | 0.1556        | -                                        |
| 0.3931 | 960  | 0.1552        | -                                        |
| 0.3972 | 970  | 0.1462        | -                                        |
| 0.4013 | 980  | 0.1562        | -                                        |
| 0.4054 | 990  | 0.1461        | -                                        |
| 0.4095 | 1000 | 0.1597        | -                                        |
| 0.4136 | 1010 | 0.1466        | -                                        |
| 0.4177 | 1020 | 0.143         | -                                        |
| 0.4218 | 1030 | 0.1515        | -                                        |
| 0.4259 | 1040 | 0.1317        | -                                        |
| 0.4300 | 1050 | 0.1414        | -                                        |
| 0.4341 | 1060 | 0.1554        | -                                        |
| 0.4382 | 1070 | 0.1484        | -                                        |
| 0.4423 | 1080 | 0.1487        | -                                        |
| 0.4464 | 1090 | 0.1533        | -                                        |
| 0.4505 | 1100 | 0.1494        | -                                        |
| 0.4545 | 1110 | 0.1381        | -                                        |
| 0.4586 | 1120 | 0.1495        | -                                        |
| 0.4627 | 1130 | 0.1422        | -                                        |
| 0.4668 | 1140 | 0.1424        | -                                        |
| 0.4709 | 1150 | 0.1422        | -                                        |
| 0.4750 | 1160 | 0.1429        | -                                        |
| 0.4791 | 1170 | 0.1297        | -                                        |
| 0.4832 | 1180 | 0.135         | -                                        |
| 0.4873 | 1190 | 0.1431        | -                                        |
| 0.4914 | 1200 | 0.143         | -                                        |
| 0.4955 | 1210 | 0.1399        | -                                        |
| 0.4996 | 1220 | 0.1339        | -                                        |
| 0.5037 | 1230 | 0.1309        | -                                        |
| 0.5078 | 1240 | 0.1377        | -                                        |
| 0.5119 | 1250 | 0.1361        | -                                        |
| 0.5160 | 1260 | 0.1311        | -                                        |
| 0.5201 | 1270 | 0.1363        | -                                        |
| 0.5242 | 1280 | 0.1368        | -                                        |
| 0.5283 | 1290 | 0.1376        | -                                        |
| 0.5324 | 1300 | 0.1323        | -                                        |
| 0.5364 | 1310 | 0.1302        | -                                        |
| 0.5405 | 1320 | 0.1322        | -                                        |
| 0.5446 | 1330 | 0.1294        | -                                        |
| 0.5487 | 1340 | 0.1295        | -                                        |
| 0.5528 | 1350 | 0.1341        | -                                        |
| 0.5569 | 1360 | 0.1244        | -                                        |
| 0.5610 | 1370 | 0.1287        | -                                        |
| 0.5651 | 1380 | 0.1247        | -                                        |
| 0.5692 | 1390 | 0.1265        | -                                        |
| 0.5733 | 1400 | 0.1221        | -                                        |
| 0.5774 | 1410 | 0.1245        | -                                        |
| 0.5815 | 1420 | 0.1252        | -                                        |
| 0.5856 | 1430 | 0.1275        | -                                        |
| 0.5897 | 1440 | 0.1211        | -                                        |
| 0.5938 | 1450 | 0.1256        | -                                        |
| 0.5979 | 1460 | 0.1208        | -                                        |
| 0.6020 | 1470 | 0.1203        | -                                        |
| 0.6061 | 1480 | 0.1243        | -                                        |
| 0.6102 | 1490 | 0.1201        | -                                        |
| 0.6143 | 1500 | 0.1233        | -                                        |
| 0.6183 | 1510 | 0.1325        | -                                        |
| 0.6224 | 1520 | 0.127         | -                                        |
| 0.6265 | 1530 | 0.1195        | -                                        |
| 0.6306 | 1540 | 0.1272        | -                                        |
| 0.6347 | 1550 | 0.1176        | -                                        |
| 0.6388 | 1560 | 0.1189        | -                                        |
| 0.6429 | 1570 | 0.1231        | -                                        |
| 0.6470 | 1580 | 0.1159        | -                                        |
| 0.6511 | 1590 | 0.1233        | -                                        |
574
+ | 0.6552 | 1600 | 0.1178 | - |
575
+ | 0.6593 | 1610 | 0.119 | - |
576
+ | 0.6634 | 1620 | 0.119 | - |
577
+ | 0.6675 | 1630 | 0.121 | - |
578
+ | 0.6716 | 1640 | 0.1185 | - |
579
+ | 0.6757 | 1650 | 0.117 | - |
580
+ | 0.6798 | 1660 | 0.1171 | - |
581
+ | 0.6839 | 1670 | 0.1198 | - |
582
+ | 0.6880 | 1680 | 0.1175 | - |
583
+ | 0.6921 | 1690 | 0.1173 | - |
584
+ | 0.6962 | 1700 | 0.1211 | - |
585
+ | 0.7002 | 1710 | 0.1154 | - |
586
+ | 0.7043 | 1720 | 0.1155 | - |
587
+ | 0.7084 | 1730 | 0.124 | - |
588
+ | 0.7125 | 1740 | 0.1147 | - |
589
+ | 0.7166 | 1750 | 0.1185 | - |
590
+ | 0.7207 | 1760 | 0.109 | - |
591
+ | 0.7248 | 1770 | 0.1119 | - |
592
+ | 0.7289 | 1780 | 0.1134 | - |
593
+ | 0.7330 | 1790 | 0.1163 | - |
594
+ | 0.7371 | 1800 | 0.1109 | - |
595
+ | 0.7412 | 1810 | 0.1223 | - |
596
+ | 0.7453 | 1820 | 0.1192 | - |
597
+ | 0.7494 | 1830 | 0.1142 | - |
598
+ | 0.7535 | 1840 | 0.1133 | - |
599
+ | 0.7576 | 1850 | 0.1148 | - |
600
+ | 0.7617 | 1860 | 0.1111 | - |
601
+ | 0.7658 | 1870 | 0.1128 | - |
602
+ | 0.7699 | 1880 | 0.1114 | - |
603
+ | 0.7740 | 1890 | 0.1111 | - |
604
+ | 0.7781 | 1900 | 0.1128 | - |
605
+ | 0.7821 | 1910 | 0.1128 | - |
606
+ | 0.7862 | 1920 | 0.1144 | - |
607
+ | 0.7903 | 1930 | 0.1102 | - |
608
+ | 0.7944 | 1940 | 0.107 | - |
609
+ | 0.7985 | 1950 | 0.1104 | - |
610
+ | 0.8026 | 1960 | 0.1074 | - |
611
+ | 0.8067 | 1970 | 0.1084 | - |
612
+ | 0.8108 | 1980 | 0.1091 | - |
613
+ | 0.8149 | 1990 | 0.1161 | - |
614
+ | 0.8190 | 2000 | 0.1077 | - |
615
+ | 0.8231 | 2010 | 0.1088 | - |
616
+ | 0.8272 | 2020 | 0.1099 | - |
617
+ | 0.8313 | 2030 | 0.11 | - |
618
+ | 0.8354 | 2040 | 0.1102 | - |
619
+ | 0.8395 | 2050 | 0.1098 | - |
620
+ | 0.8436 | 2060 | 0.1076 | - |
621
+ | 0.8477 | 2070 | 0.1062 | - |
622
+ | 0.8518 | 2080 | 0.1078 | - |
623
+ | 0.8559 | 2090 | 0.1058 | - |
624
+ | 0.8600 | 2100 | 0.1067 | - |
625
+ | 0.8640 | 2110 | 0.1037 | - |
626
+ | 0.8681 | 2120 | 0.1147 | - |
627
+ | 0.8722 | 2130 | 0.1169 | - |
628
+ | 0.8763 | 2140 | 0.1054 | - |
629
+ | 0.8804 | 2150 | 0.101 | - |
630
+ | 0.8845 | 2160 | 0.1026 | - |
631
+ | 0.8886 | 2170 | 0.1028 | - |
632
+ | 0.8927 | 2180 | 0.1084 | - |
633
+ | 0.8968 | 2190 | 0.1091 | - |
634
+ | 0.9009 | 2200 | 0.1045 | - |
635
+ | 0.9050 | 2210 | 0.1076 | - |
636
+ | 0.9091 | 2220 | 0.1129 | - |
637
+ | 0.9132 | 2230 | 0.1099 | - |
638
+ | 0.9173 | 2240 | 0.0969 | - |
639
+ | 0.9214 | 2250 | 0.1101 | - |
640
+ | 0.9255 | 2260 | 0.107 | - |
641
+ | 0.9296 | 2270 | 0.1042 | - |
642
+ | 0.9337 | 2280 | 0.1073 | - |
643
+ | 0.9378 | 2290 | 0.1035 | - |
644
+ | 0.9419 | 2300 | 0.1056 | - |
645
+ | 0.9459 | 2310 | 0.1026 | - |
646
+ | 0.9500 | 2320 | 0.1044 | - |
647
+ | 0.9541 | 2330 | 0.106 | - |
648
+ | 0.9582 | 2340 | 0.1054 | - |
649
+ | 0.9623 | 2350 | 0.1032 | - |
650
+ | 0.9664 | 2360 | 0.1019 | - |
651
+ | 0.9705 | 2370 | 0.1106 | - |
652
+ | 0.9746 | 2380 | 0.1076 | - |
653
+ | 0.9787 | 2390 | 0.1018 | - |
654
+ | 0.9828 | 2400 | 0.1026 | - |
655
+ | 0.9869 | 2410 | 0.1015 | - |
656
+ | 0.9910 | 2420 | 0.1036 | - |
657
+ | 0.9951 | 2430 | 0.1104 | - |
658
+ | 0.9992 | 2440 | 0.097 | - |
659
+ | 1.0 | 2442 | - | 0.983 |
+ 
+ </details>
+ 
+ ### Framework Versions
+ - Python: 3.11.9
+ - Sentence Transformers: 3.3.0
+ - Transformers: 4.48.0.dev0
+ - PyTorch: 2.4.0
+ - Accelerate: 1.2.1
+ - Datasets: 2.21.0
+ - Tokenizers: 0.21.0
+ 
+ ## Citation
+ 
+ ### BibTeX
+ 
+ #### Sentence Transformers
+ ```bibtex
+ @inproceedings{reimers-2019-sentence-bert,
+     title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
+     author = "Reimers, Nils and Gurevych, Iryna",
+     booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
+     month = "11",
+     year = "2019",
+     publisher = "Association for Computational Linguistics",
+     url = "https://arxiv.org/abs/1908.10084",
+ }
+ ```
+ 
+ #### CachedMultipleNegativesRankingLoss
+ ```bibtex
+ @misc{gao2021scaling,
+     title={Scaling Deep Contrastive Learning Batch Size under Memory Limited Setup},
+     author={Luyu Gao and Yunyi Zhang and Jiawei Han and Jamie Callan},
+     year={2021},
+     eprint={2101.06983},
+     archivePrefix={arXiv},
+     primaryClass={cs.LG}
+ }
+ ```
+ 
+ <!--
+ ## Glossary
+ 
+ *Clearly define terms in order to be accessible across audiences.*
+ -->
+ 
+ <!--
+ ## Model Card Authors
+ 
+ *Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
+ -->
+ 
+ <!--
+ ## Model Card Contact
+ 
+ *Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
+ -->
config.json ADDED
@@ -0,0 +1,41 @@
+ {
+   "_name_or_path": "Alibaba-NLP/gte-en-mlm-base",
+   "architectures": [
+     "NewModel"
+   ],
+   "attention_probs_dropout_prob": 0.0,
+   "auto_map": {
+     "AutoConfig": "Alibaba-NLP/new-impl--configuration.NewConfig",
+     "AutoModel": "Alibaba-NLP/new-impl--modeling.NewModel",
+     "AutoModelForMaskedLM": "Alibaba-NLP/new-impl--modeling.NewForMaskedLM",
+     "AutoModelForMultipleChoice": "Alibaba-NLP/new-impl--modeling.NewForMultipleChoice",
+     "AutoModelForQuestionAnswering": "Alibaba-NLP/new-impl--modeling.NewForQuestionAnswering",
+     "AutoModelForSequenceClassification": "Alibaba-NLP/new-impl--modeling.NewForSequenceClassification",
+     "AutoModelForTokenClassification": "Alibaba-NLP/new-impl--modeling.NewForTokenClassification"
+   },
+   "classifier_dropout": 0.1,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "initializer_range": 0.02,
+   "intermediate_size": 3072,
+   "layer_norm_eps": 1e-12,
+   "layer_norm_type": "layer_norm",
+   "logn_attention_clip1": false,
+   "logn_attention_scale": false,
+   "max_position_embeddings": 8192,
+   "model_type": "new",
+   "num_attention_heads": 12,
+   "num_hidden_layers": 12,
+   "pack_qkv": true,
+   "pad_token_id": 0,
+   "position_embedding_type": "rope",
+   "rope_scaling": null,
+   "rope_theta": 500000,
+   "torch_dtype": "float32",
+   "transformers_version": "4.48.0.dev0",
+   "type_vocab_size": 0,
+   "unpad_inputs": false,
+   "use_memory_efficient_attention": false,
+   "vocab_size": 30528
+ }
config_sentence_transformers.json ADDED
@@ -0,0 +1,10 @@
+ {
+   "__version__": {
+     "sentence_transformers": "3.3.0",
+     "transformers": "4.48.0.dev0",
+     "pytorch": "2.4.0"
+   },
+   "prompts": {},
+   "default_prompt_name": null,
+   "similarity_fn_name": "cosine"
+ }
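
The `"similarity_fn_name": "cosine"` setting above means downstream comparisons between this model's embeddings use cosine similarity. A minimal illustrative sketch in plain NumPy (not code from this repository):

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two embedding vectors."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Parallel vectors score 1.0; orthogonal vectors score 0.0.
print(cosine_similarity([1.0, 2.0], [2.0, 4.0]))  # 1.0 (up to float error)
```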
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:5fdb868a01bacbed5083024d5de5a1ecea444dd45b3bbce13e35196e0ad008f5
+ size 547119128
modules.json ADDED
@@ -0,0 +1,14 @@
+ [
+   {
+     "idx": 0,
+     "name": "0",
+     "path": "",
+     "type": "sentence_transformers.models.Transformer"
+   },
+   {
+     "idx": 1,
+     "name": "1",
+     "path": "1_Pooling",
+     "type": "sentence_transformers.models.Pooling"
+   }
+ ]
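
`modules.json` above chains a Transformer module with a Pooling module, and the `1_Pooling/config.json` in this commit sets `pooling_mode_mean_tokens` to `true`: sentence embeddings are the mean of the token embeddings over non-padding positions. A self-contained NumPy sketch of masked mean pooling (illustrative only, not the library's implementation):

```python
import numpy as np

def mean_pool(token_embeddings, attention_mask):
    """Average token embeddings over positions where attention_mask == 1."""
    mask = np.asarray(attention_mask, dtype=float)[..., None]        # (batch, seq, 1)
    summed = (np.asarray(token_embeddings, dtype=float) * mask).sum(axis=1)
    counts = np.clip(mask.sum(axis=1), 1e-9, None)                   # avoid divide-by-zero
    return summed / counts                                           # (batch, dim)

# One sequence of three tokens; the third is padding and is ignored.
emb = [[[1.0, 2.0], [3.0, 4.0], [9.0, 9.0]]]
mask = [[1, 1, 0]]
print(mean_pool(emb, mask))  # [[2. 3.]]
```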
sentence_bert_config.json ADDED
@@ -0,0 +1,4 @@
+ {
+   "max_seq_length": 8192,
+   "do_lower_case": false
+ }
special_tokens_map.json ADDED
@@ -0,0 +1,37 @@
+ {
+   "cls_token": {
+     "content": "[CLS]",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "mask_token": {
+     "content": "[MASK]",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "pad_token": {
+     "content": "[PAD]",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "sep_token": {
+     "content": "[SEP]",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "unk_token": {
+     "content": "[UNK]",
+     "lstrip": false,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   }
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,56 @@
+ {
+   "added_tokens_decoder": {
+     "0": {
+       "content": "[PAD]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "100": {
+       "content": "[UNK]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "101": {
+       "content": "[CLS]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "102": {
+       "content": "[SEP]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     },
+     "103": {
+       "content": "[MASK]",
+       "lstrip": false,
+       "normalized": false,
+       "rstrip": false,
+       "single_word": false,
+       "special": true
+     }
+   },
+   "clean_up_tokenization_spaces": true,
+   "cls_token": "[CLS]",
+   "do_lower_case": true,
+   "extra_special_tokens": {},
+   "mask_token": "[MASK]",
+   "model_max_length": 32768,
+   "pad_token": "[PAD]",
+   "sep_token": "[SEP]",
+   "strip_accents": null,
+   "tokenize_chinese_chars": true,
+   "tokenizer_class": "BertTokenizer",
+   "unk_token": "[UNK]"
+ }
vocab.txt ADDED
The diff for this file is too large to render. See raw diff