---
language: []
library_name: sentence-transformers
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- dataset_size:1K<n<10K
- loss:CosineSimilarityLoss
base_model: distilbert/distilroberta-base
widget:
- source_sentence: 'Herb Butter ["2 Tbsp. dried herbs: equal parts of parsley, tarragon,
    chives and/or basil", "1/2 c. margarine"] ["Blend all together and chill overnight."]'
  sentences:
  - Salad Dressing ["2 Tbsp. lemon juice or wine vinegar", "1 Tbsp. honey", "1 clove
    garlic, diced", "1 Tbsp. rosemary", "2 Tbsp. water", "1 small diced onion", "1
    Tbsp. flax seed", "1 tsp. parsley"] ["Place in blender until smooth."]
  - Fried Sweet Potato Strips ["1 large sweet potato, peeled and grated into long
    strips", "1 c. vegetable oil"] ["Fry potato in hot oil in a small skillet until
    lightly browned (watch carefully, they brown quickly).", "Remove with a slotted
    spoon and drain on paper towels.", "(Strips will be crisp when cooled.)", "Yield:",
    "about 1 cup."]
  - Chocolate Chip Pecan Pie ["1/2 c. semi-sweet chocolate chips", "4 eggs", "1/3
    c. granulated sugar", "1 1/4 c. Karo syrup (lite or dark)", "3 Tbsp. melted butter",
    "1 1/2 tsp. vanilla", "3/4 c. chopped pecans"] ["Beat eggs; add sugar, corn syrup
    and vanilla.", "Mix well.", "Stir in nuts and chips.", "Pour into 9-inch unbaked
    pie shell. Bake at 325\u00b0 to 350\u00b0", "for 25 minutes.", "Yields 1 pie."]
- source_sentence: Snicker Bars ["1 c. milk chocolate chips", "1/4 c. butterscotch
    chips", "1/4 c. peanut butter"] ["Melt together; pour into 9 x 13-inch greased
    pan and cool."]
  sentences:
  - Reeses Cups(Candy)   ["1 c. peanut butter", "3/4 c. graham cracker crumbs", "1
    c. melted butter", "1 lb. (3 1/2 c.) powdered sugar", "1 large pkg. chocolate
    chips"] ["Combine first four ingredients and press in 13 x 9-inch ungreased pan.",
    "Melt chocolate chips and spread over mixture. Refrigerate for about 20 minutes
    and cut into pieces before chocolate gets hard.", "Keep in refrigerator."]
  - Heavenly Potatoes ["1 (24 oz.) pkg. frozen hash browns, thaw to use", "1 3/4 c.
    grated Cheddar cheese", "1 can cream of chicken soup", "8 oz. carton sour cream",
    "1 stick melted butter or margarine", "1 tsp. salt", "1 medium onion, chopped"]
    ["Mix all together and place in a casserole dish.", "Bake 45 minutes to 1 hour
    at 350\u00b0.", "Serves 12."]
  - Summer Spaghetti ["1 lb. very thin spaghetti", "1/2 bottle McCormick Salad Supreme
    (seasoning)", "1 bottle Zesty Italian dressing"] ["Prepare spaghetti per package.",
    "Drain.", "Melt a little butter through it.", "Marinate overnight in Salad Supreme
    and Zesty Italian dressing.", "Just before serving, add cucumbers, tomatoes, green
    peppers, mushrooms, olives or whatever your taste may want."]
- source_sentence: Foil Packs ["boneless pork chops (or other meat)", "potatoes, quartered",
    "carrots, quartered", "onions, quartered"] ["You will also need 1 large piece
    of aluminum foil."]
  sentences:
  - Pork Sausage ["12 lb. pork meat, cut in pieces, ready for grinding", "5 Tbsp.
    salt", "3 Tbsp. black pepper", "2 Tbsp. pulverized sage leaves"] ["Sprinkle meat
    well with the remaining ingredients.", "Grind all together and it will need no
    further mixing."]
  - Shepherd'S Pie ["1 lb. hamburg", "1/4 c. chopped onion", "1/4 tsp. salt", "1/8
    tsp. pepper", "1 c. mashed potatoes"] ["Fry hamburg and onion until brown.", "Drain
    off liquid.", "Add salt and pepper.", "Spoon into 1-quart casserole and place
    potatoes on top.", "Put butter and paprika over potatoes.", "Bake in a 425\u00b0
    oven for 15 minutes."]
  - Homemade Vanilla Ice Cream ["4 eggs", "2 c. sugar", "4 c. milk", "1 can Eagle
    Brand milk", "2 Tbsp. vanilla", "1/2 tsp. salt"] ["Mix milk, Eagle Brand, vanilla
    and salt in small mixing bowl. In large mixing bowl, beat eggs until light; add
    sugar gradually beating constantly.", "Beat in mixture from small bowl.", "Pour
    into freezer and freeze according to freezer directions."]
- source_sentence: Orange Julius ["couple of oranges", "2 Tbsp. honey"] ["Put in blender.",
    "Add crushed ice until desired thickness.", "Add enough milk to fill blender,
    approximately 1 cup."]
  sentences:
  - Ambrosia ["8 to 10 juicy oranges, peeled and diced", "1 c. moist coconut", "1/2
    c. pecans, chopped", "1/2 c. cherries, halved", "1/4 c. sugar", "1 c. orange juice"]
    ["Combine all ingredients. Chill overnight.", "Yields 4 to 6 servings."]
  - Toffee Refrigerator Dessert ["1 1/2 c. graham cracker crumbs, finely crushed",
    "1/2 c. soda cracker crumbs", "1/2 c. oleo, melted", "2 pkg. vanilla instant pudding",
    "2 c. milk", "1 qt. vanilla ice cream, softened", "1 (4 1/2 oz.) tub Cool Whip",
    "2 Butterfinger candy bars, crushed"] ["Mix the first 3 ingredients and pat into
    a 9 x 13-inch dish."]
  - Mediterranean Orzo ["1 1/2 c. orzo pasta", "1 Tbsp. olive oil", "3 Tbsp. sun-dried
    tomato paste", "1 Tbsp. white balsamic vinegar"] ["Cook orzo according to directions.
    Drain. Add remaining ingredients."]
- source_sentence: Sour Cream Coconut Cake ["2 c. sugar", "2 (8 oz.) carton sour cream",
    "2 pkg. frozen coconut", "1 (3-layer) cake, baked"] ["Bake cake; split the 3 layers
    into 6 layers."]
  sentences:
  - Milk Chocolate Bar Cake ["1 (18 oz.) pkg. Swiss chocolate cake mix", "1 (8 oz.)
    pkg. cream cheese, softened", "1 c. powdered sugar", "1/2 c. granulated sugar",
    "10 (15 oz.) milk chocolate candy bars with almonds, divided", "1 (12 oz.) carton
    thawed Cool Whip"] ["Prepare cake batter according to directions on box.", "Pour
    into 2 greased and floured 8-inch round cake pans.", "Bake at 325\u00b0 for 20
    to 25 minutes.", "Cool and divide to make 4 layers."]
  - Chili Sauce ["12 ripe tomatoes", "4 onions", "2 green peppers", "1 red pepper",
    "4 Tbsp. sugar", "2 Tbsp. salt", "2 tsp. cinnamon", "2 tsp. cloves", "2 tsp. allspice",
    "1 tsp. ginger", "1 qt. vinegar"] ["Peel onions and tomatoes, seed peppers and
    chop all fine, add the spices and put over the fire. Boil steadily for two hours;
    cool, bottle and seal."]
  - Creamed Onions(Makes 8 Servings)   ["4 c. small white onions, peeled (1 1/2 lb.)",
    "2 Tbsp. plus 2 tsp. reduced calorie margarine", "1 1/2 Tbsp. all-purpose flour",
    "1 c. skim milk", "1/2 tsp. thyme", "1/4 tsp. salt", "pinch of nutmeg", "pinch
    of ground white pepper"] ["In a medium saucepan of boiling water, cook onion for
    15 to 20 minutes, until tender.", "Drain."]
pipeline_tag: sentence-similarity
---

# SentenceTransformer based on distilbert/distilroberta-base

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [distilbert/distilroberta-base](https://huggingface.co/distilbert/distilroberta-base). It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

## Model Details

### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [distilbert/distilroberta-base](https://huggingface.co/distilbert/distilroberta-base) <!-- at revision fb53ab8802853c8e4fbdbcd0529f21fc6f459b2b -->
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
<!-- - **Training Dataset:** Unknown -->
<!-- - **Language:** Unknown -->
<!-- - **License:** Unknown -->

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: RobertaModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```
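
These settings can also be confirmed programmatically. Below is a minimal inspection sketch, assuming the model is loaded with the Hub ID shown in the Usage section that follows:

```python
from sentence_transformers import SentenceTransformer

# Load the model from the Hub (same ID as in the Usage section below)
model = SentenceTransformer("jeevansai93/Jeevan_cv_run2_roberta_5_epoc")

print(model.max_seq_length)                      # 512
print(model.get_sentence_embedding_dimension())  # 768
print(model[1].get_pooling_mode_str())           # 'mean' pooling over token embeddings
```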

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("jeevansai93/Jeevan_cv_run2_roberta_5_epoc")
# Run inference
sentences = [
    'Sour Cream Coconut Cake ["2 c. sugar", "2 (8 oz.) carton sour cream", "2 pkg. frozen coconut", "1 (3-layer) cake, baked"] ["Bake cake; split the 3 layers into 6 layers."]',
    'Milk Chocolate Bar Cake ["1 (18 oz.) pkg. Swiss chocolate cake mix", "1 (8 oz.) pkg. cream cheese, softened", "1 c. powdered sugar", "1/2 c. granulated sugar", "10 (15 oz.) milk chocolate candy bars with almonds, divided", "1 (12 oz.) carton thawed Cool Whip"] ["Prepare cake batter according to directions on box.", "Pour into 2 greased and floured 8-inch round cake pans.", "Bake at 325\\u00b0 for 20 to 25 minutes.", "Cool and divide to make 4 layers."]',
    'Chili Sauce ["12 ripe tomatoes", "4 onions", "2 green peppers", "1 red pepper", "4 Tbsp. sugar", "2 Tbsp. salt", "2 tsp. cinnamon", "2 tsp. cloves", "2 tsp. allspice", "1 tsp. ginger", "1 qt. vinegar"] ["Peel onions and tomatoes, seed peppers and chop all fine, add the spices and put over the fire. Boil steadily for two hours; cool, bottle and seal."]',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```
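
Since the training pairs are recipe texts scored for similarity (see Training Details below), a natural follow-up is a small semantic-search loop over a recipe corpus. The sketch below is illustrative only; the corpus entries are truncated adaptations of the widget examples above and the query is made up, all in the same `title [ingredients] [directions]` format:

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("jeevansai93/Jeevan_cv_run2_roberta_5_epoc")

# Illustrative recipe corpus and query (placeholder strings, not real training data)
corpus = [
    'Herb Butter ["2 Tbsp. dried herbs", "1/2 c. margarine"] ["Blend all together and chill overnight."]',
    'Chili Sauce ["12 ripe tomatoes", "4 onions", "1 qt. vinegar"] ["Chop all fine, add the spices and boil for two hours."]',
]
query = 'Garlic Spread ["1/2 c. butter", "2 cloves garlic, minced"] ["Mix well and refrigerate."]'

corpus_embeddings = model.encode(corpus)
query_embedding = model.encode([query])

# Cosine similarities between the query and each corpus entry, shape [1, len(corpus)]
scores = model.similarity(query_embedding, corpus_embeddings)
best_idx = int(scores[0].argmax())
print(corpus[best_idx], float(scores[0][best_idx]))
```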

<!--
### Direct Usage (Transformers)

<details><summary>Click to see the direct usage in Transformers</summary>

</details>
-->

<!--
### Downstream Usage (Sentence Transformers)

You can finetune this model on your own dataset.

<details><summary>Click to expand</summary>

</details>
-->

<!--
### Out-of-Scope Use

*List how the model may foreseeably be misused and address what users ought not to do with the model.*
-->

<!--
## Bias, Risks and Limitations

*What are the known or foreseeable issues stemming from this model? You could also flag here known failure cases or weaknesses of the model.*
-->

<!--
### Recommendations

*What are recommendations with respect to the foreseeable issues? For example, filtering explicit content.*
-->

## Training Details

### Training Dataset

#### Unnamed Dataset
* Size: 4,149 training samples
* Columns: <code>sentence_0</code>, <code>sentence_1</code>, and <code>label</code>
* Approximate statistics based on the first 1000 samples:
  |         | sentence_0                                                                           | sentence_1                                                                           | label                                                          |
  |:--------|:-------------------------------------------------------------------------------------|:-------------------------------------------------------------------------------------|:---------------------------------------------------------------|
  | type    | string                                                                               | string                                                                               | float                                                          |
  | details | <ul><li>min: 42 tokens</li><li>mean: 136.35 tokens</li><li>max: 326 tokens</li></ul> | <ul><li>min: 34 tokens</li><li>mean: 137.99 tokens</li><li>max: 358 tokens</li></ul> | <ul><li>min: 0.0</li><li>mean: 0.24</li><li>max: 1.0</li></ul> |
* Samples:
  | sentence_0                                                                                                                                                                                                                                                                                                                                                                       | sentence_1                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                                         | label                            |
  |:---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|:---------------------------------|
  | <code>Quick Barbecue Wings ["chicken wings (as many as you need for dinner)", "flour", "barbecue sauce (your choice)"] ["Clean wings.", "Flour and fry until done.", "Place fried chicken wings in microwave bowl.", "Stir in barbecue sauce.", "Microwave on High (stir once) for 4 minutes."]</code>                                                                           | <code>Spaghetti Sauce To Can ["1/2 bushel tomatoes", "1 c. oil", "1/4 c. minced garlic", "6 cans tomato paste", "3 peppers (2 sweet and 1 hot)", "1 1/2 c. sugar", "1/2 c. salt", "1 Tbsp. sweet basil", "2 Tbsp. oregano", "1 tsp. Italian seasoning"] ["Cook ground or chopped peppers and onions in oil for 1/2 hour. Cook tomatoes and garlic as for juice.", "Put through the mill.", "(I use a food processor and do my tomatoes uncooked.", "I then add the garlic right to the juice.)", "Add peppers and onions to juice and remainder of ingredients.", "Cook approximately 1 hour.", "Put in jars and seal.", "Yields 7 quarts."]</code>                                | <code>0.15000000000000002</code> |
  | <code>Grandma Mary'S Butter Cookies ["1 c. sweet butter", "1 c. granulated sugar", "3 egg yolks", "2 1/2 c. sifted flour", "1 tsp. vanilla"] ["Cream butter.", "Beat into sugar.", "Add egg yolks and vanilla. Beat well after adding each yolk.", "Add flour and beat after each 1/2 cup is added.", "Chill about 1 hour."]</code>                                              | <code>Magic Cookie Bars ["1/2 c. butter", "1 1/2 c. graham cracker crumbs", "1 (14 oz.) can Eagle Brand milk", "6 oz. semi-sweet chocolate chips", "1 (3 1/2 oz.) can flaked coconut (1 1/2 c.)", "1 c. chopped nuts"] ["Preheat oven to 350\u00b0 (325\u00b0 for glass dish).", "In 13 x 9-inch pan, melt butter in oven.", "Sprinkle with crumbs.", "Top with Eagle Brand milk evenly.", "Top with remaining ingredients.", "Press down. Bake 25 to 30 minutes until lightly brown.", "Cool or chill.", "Cut into bars; store, loosely covered, at room temperature."]</code>                                                                                                    | <code>0.65</code>                |
  | <code>Angel Biscuits ["5 c. flour", "3 Tbsp. sugar", "4 tsp. baking powder", "1 1/2 pkg. dry yeast", "2 c. buttermilk", "1 tsp. soda", "1 1/2 sticks margarine", "1/2 c. warm water"] ["Mix flour, sugar, baking powder, soda and salt together.", "Cut in margarine, dissolve yeast in warm water.", "Stir into buttermilk and add to dry mixture.", "Cover and chill."]</code> | <code>Mexican Cookie Rings ["1 1/2 c. sifted flour", "1/2 tsp. baking powder", "1/2 tsp. salt", "1/2 c. butter", "2/3 c. sugar", "3 egg yolks", "1 tsp. vanilla", "multi-colored candies"] ["Sift flour, baking powder and salt together.", "Cream together butter and sugar.", "Add egg yolks and vanilla.", "Beat until light and fluffy.", "Mix in sifted dry ingredients.", "Shape into 1-inch balls.", "Push wooden spoon handle through center (twist).", "Shape into rings.", "Dip each cookie into candies.", "Place on lightly greased baking sheets.", "Bake in 375\u00b0 oven for 10 to 12 minutes or until golden brown.", "Cool on racks.", "Serves 2 dozen."]</code> | <code>0.1</code>                 |
* Loss: [<code>CosineSimilarityLoss</code>](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#cosinesimilarityloss) with these parameters:
  ```json
  {
      "loss_fct": "torch.nn.modules.loss.MSELoss"
  }
  ```
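
For reference, here is a minimal sketch of how a dataset with these columns and this loss could be assembled. The two rows reuse (truncated) samples from the table above; nothing here is the exact original training script:

```python
from datasets import Dataset
from sentence_transformers import SentenceTransformer, losses

model = SentenceTransformer("distilbert/distilroberta-base")

# Columns mirror the training data: two recipe texts plus a float similarity label in [0, 1].
# Recipe texts are truncated here for brevity; see the samples table above for the full strings.
train_dataset = Dataset.from_dict({
    "sentence_0": ['Quick Barbecue Wings ["chicken wings ..."] [...]',
                   'Grandma Mary\'S Butter Cookies ["1 c. sweet butter ..."] [...]'],
    "sentence_1": ['Spaghetti Sauce To Can ["1/2 bushel tomatoes ..."] [...]',
                   'Magic Cookie Bars ["1/2 c. butter ..."] [...]'],
    "label": [0.15, 0.65],
})

# CosineSimilarityLoss pushes cosine(sentence_0, sentence_1) towards the label via an MSE objective
loss = losses.CosineSimilarityLoss(model)
```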

### Training Hyperparameters
#### Non-Default Hyperparameters

- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `num_train_epochs`: 1
- `multi_dataset_batch_sampler`: round_robin

#### All Hyperparameters
<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: no
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 16
- `per_device_eval_batch_size`: 16
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `learning_rate`: 5e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1
- `num_train_epochs`: 1
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.0
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: False
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: round_robin

</details>
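
A minimal sketch of how these non-default values map onto the Sentence Transformers v3 trainer API follows. The output directory and the two-row toy dataset are placeholders, not the original training setup:

```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
    losses,
)

model = SentenceTransformer("distilbert/distilroberta-base")

# Placeholder dataset with the same column layout as the real one (see Training Dataset above)
train_dataset = Dataset.from_dict({
    "sentence_0": ["recipe text a", "recipe text b"],
    "sentence_1": ["recipe text c", "recipe text d"],
    "label": [0.15, 0.65],
})
loss = losses.CosineSimilarityLoss(model)

# Non-default hyperparameters from the list above; everything else keeps its default value
args = SentenceTransformerTrainingArguments(
    output_dir="output/recipe-similarity",  # placeholder path
    num_train_epochs=1,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    multi_dataset_batch_sampler="round_robin",  # only relevant when training on multiple datasets
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=loss,
)
trainer.train()
```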

### Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.0.0
- Transformers: 4.41.1
- PyTorch: 2.3.0+cu121
- Accelerate: 0.30.0
- Datasets: 2.19.1
- Tokenizers: 0.19.1

## Citation

### BibTeX

#### Sentence Transformers
```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```

<!--
## Glossary

*Clearly define terms in order to be accessible across audiences.*
-->

<!--
## Model Card Authors

*Lists the people who create the model card, providing recognition and accountability for the detailed work that goes into its construction.*
-->

<!--
## Model Card Contact

*Provides a way for people who have updates to the Model Card, suggestions, or questions, to contact the Model Card authors.*
-->