SentenceTransformer based on Snowflake/snowflake-arctic-embed-m-v1.5
This is a sentence-transformers model finetuned from Snowflake/snowflake-arctic-embed-m-v1.5. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: Snowflake/snowflake-arctic-embed-m-v1.5
- Maximum Sequence Length: 512 tokens
- Output Dimensionality: 768 dimensions
- Similarity Function: Cosine Similarity
Model Sources
- Documentation: Sentence Transformers Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Sentence Transformers on Hugging Face
Full Model Architecture
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
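Because the final Normalize() module L2-normalizes the CLS-pooled output, cosine similarity and dot product give identical scores for these embeddings. A minimal check of that property (using the model id from the Usage section below):

```python
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("fjavigv/snoweu_v3")
embeddings = model.encode(["energy efficiency obligations", "renewable energy integration"])

print(np.allclose(np.linalg.norm(embeddings, axis=1), 1.0))  # True: embeddings are unit length
print(float(embeddings[0] @ embeddings[1]))                  # dot product equals cosine similarity
```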
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("fjavigv/snoweu_v3")
# Run inference
sentences = [
'How does the operational efficiency of energy systems impact the integration of renewable energy sources into the grid, and what role does this play in improving overall energy efficiency?',
'(58) Directive 2010/75/EU of the European Parliament and of the Council (21) lays down rules on installations that contribute to energy production or use energy for production purposes, and provides that information on the energy used in or generated by the installation is to be included in applications for integrated permits in accordance with Article 12(1), point (b) of that Directive. Moreover, Article 11 of that Directive provides that efficient use of energy is one of the general principles governing the basic obligations of the operator and one of the criteria for determining best available techniques pursuant to Annex III to that Directive. The operational efficiency of energy systems at any given moment is influenced by the ability to feed power generated from different sources with different degrees of inertia and start-up times into the grid smoothly and flexibly. Improving efficiency will enable better use to be made of renewable energy.\n\n(59) Improvement in energy efficiency can contribute to higher economic output. Member States and the Union should aim to decrease energy consumption regardless of levels of economic growth.\n\n(60) The energy savings obligation established by this Directive should be increased and should also apply after 2030. That ensures stability for investors and thus encourages long-term investments and long-term energy efficiency measures, such as the deep renovation of buildings with the long-term objective of facilitating the cost effective transformation of existing buildings into nearly zero-energy buildings. The energy savings obligation plays an important role in the creation of local growth, jobs, competitiveness and alleviating energy poverty. It should ensure that the Union can achieve its energy and climate objectives by creating further opportunities and by breaking the link between energy consumption and growth. Cooperation with the private sector is important to assess the conditions on which private investment for energy efficiency projects can be unlocked and to develop new revenue models for innovation in the field of energy efficiency.',
'where:\n\n‘distance’ means the great circle distance between the aerodrome of departure and the aerodrome of arrival plus an additional fixed factor of 95 km; and\n\n‘payload’ means the total mass of freight, mail and passengers carried.\n\nFor the purposes of calculating the payload:\n\nthe number of passengers shall be the number of persons on-board excluding crew members,\n\nan aircraft operator may choose to apply either the actual or standard mass for passengers and checked baggage contained in its mass and balance documentation for the relevant flights or a default value of 100 kg for each passenger and his checked baggage.\n\nReporting of tonne-kilometre data for the purpose of Articles 3e and 3f',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
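The same model can be used directly for semantic search by ranking candidate passages against a query with model.similarity. A minimal sketch (the query and passages are illustrative, drawn from the regulatory excerpts shown elsewhere in this card):

```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("fjavigv/snoweu_v3")

query = "What are the energy savings obligations for Member States after 2030?"
passages = [
    "The energy savings obligation established by this Directive should be increased and should also apply after 2030.",
    "Reporting of tonne-kilometre data for the purpose of Articles 3e and 3f.",
]

query_embedding = model.encode([query])
passage_embeddings = model.encode(passages)

# Cosine similarity between the query and every passage; higher is more relevant.
scores = model.similarity(query_embedding, passage_embeddings)[0]
for score, passage in sorted(zip(scores.tolist(), passages), reverse=True):
    print(f"{score:.3f}  {passage}")
```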
Evaluation
Metrics
Information Retrieval
- Evaluated with InformationRetrievalEvaluator
Metric | Value |
---|---|
cosine_accuracy@1 | 0.811 |
cosine_accuracy@3 | 0.943 |
cosine_accuracy@5 | 0.9704 |
cosine_accuracy@10 | 0.9849 |
cosine_precision@1 | 0.811 |
cosine_precision@3 | 0.3143 |
cosine_precision@5 | 0.1941 |
cosine_precision@10 | 0.0985 |
cosine_recall@1 | 0.811 |
cosine_recall@3 | 0.943 |
cosine_recall@5 | 0.9704 |
cosine_recall@10 | 0.9849 |
cosine_ndcg@10 | 0.9068 |
cosine_mrr@10 | 0.8808 |
cosine_map@100 | 0.8816 |
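These figures come from the sentence-transformers InformationRetrievalEvaluator. The evaluation corpus itself is not published with this card, but the setup can be reproduced on your own query–passage pairs along the following lines (all ids and texts below are placeholders):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("fjavigv/snoweu_v3")

# Placeholder data: query id -> text, document id -> text, query id -> set of relevant document ids.
queries = {"q1": "What responsibilities do Member States have regarding penalties?"}
corpus = {
    "d1": "Member States shall lay down the rules on penalties applicable to infringements ...",
    "d2": "Reporting of tonne-kilometre data for the purpose of Articles 3e and 3f.",
    "d3": "Improvement in energy efficiency can contribute to higher economic output.",
}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(
    queries=queries,
    corpus=corpus,
    relevant_docs=relevant_docs,
    name="example-eval",
)
results = evaluator(model)  # dict of accuracy@k, precision@k, recall@k, ndcg@10, mrr@10, map@100
print(results)
```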
Training Details
Training Dataset
Unnamed Dataset
- Size: 32,897 training samples
- Columns: sentence_0 and sentence_1
- Approximate statistics based on the first 1000 samples:
 | sentence_0 | sentence_1 |
---|---|---|
type | string | string |
details | min: 12 tokens, mean: 41.21 tokens, max: 184 tokens | min: 3 tokens, mean: 234.18 tokens, max: 512 tokens |
- Samples:
sentence_0 | sentence_1 |
---|---|
What is the maximum allowable concentration of Dimethylfumarate in articles or parts thereof, and what are the implications for market placement if this concentration is exceeded? | "32011R0366: INSERTED") 60. Acrylamide CAS No 79-06-1 Shall not be placed on the market or used as a substance or constituent of mixtures in a concentration, equal to or greater than 0,1 % by weight for grouting applications after 5 November 2012. ▼M16 61. Dimethylfumarate (DMF) CAS No 624-49-7 EC 210-849-0 Shall not be used in articles or any parts thereof in concentrations greater than 0,1 mg/kg. Articles or any parts thereof containing DMF in concentrations greater than 0,1 mg/kg shall not be placed on the market. ▼M20 62. (a) Phenylmercury acetate EC No: 200-532-5 CAS No: |
The passage highlights the importance of ensuring that hygiene products made partly or wholly from plastics remain safe by avoiding harmful chemical substances. By calling for the assessment of additional restrictions under chemical regulations, it underscores a precautionary stance rooted in safeguarding consumer well-being, particularly for products intimately connected to human health. | (19) |
What responsibilities do Member States have regarding penalties for infringements of national provisions adopted under the directive? | Member States shall lay down the rules on penalties applicable to infringements of national provisions adopted pursuant to this Directive and shall take all measures necessary to ensure that they are implemented. The penalties provided for shall be effective, proportionate and dissuasive. Member States shall, by 3 July 2021, notify the Commission of those rules and those measures and shall notify it of any subsequent amendment affecting them. |
- Loss: MatryoshkaLoss with these parameters: { "loss": "MultipleNegativesRankingLoss", "matryoshka_dims": [768, 512, 256, 128, 64], "matryoshka_weights": [1, 1, 1, 1, 1], "n_dims_per_step": -1 }
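Because the model was trained with MatryoshkaLoss over the dimensions 768, 512, 256, 128 and 64, its embeddings can be truncated to any of those sizes for cheaper storage and search, usually at a modest quality cost. A sketch using the truncate_dim option of sentence-transformers:

```python
from sentence_transformers import SentenceTransformer

# Any of the trained Matryoshka dimensions (768, 512, 256, 128, 64) can be requested here.
model = SentenceTransformer("fjavigv/snoweu_v3", truncate_dim=256)

embeddings = model.encode(["deep renovation of buildings", "tonne-kilometre reporting"])
print(embeddings.shape)  # (2, 256)
```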
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: steps
- per_device_train_batch_size: 4
- per_device_eval_batch_size: 4
- num_train_epochs: 4
- multi_dataset_batch_sampler: round_robin
All Hyperparameters
Click to expand
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 4
- per_device_eval_batch_size: 4
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 5e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1
- num_train_epochs: 4
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.0
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: False
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters: 
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: round_robin
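These settings map onto the sentence-transformers v3 training API. A hedged reproduction sketch (the two training pairs stand in for the 32,897-pair dataset, which is not published with this card, and the output directory name is hypothetical):

```python
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("Snowflake/snowflake-arctic-embed-m-v1.5")

# Placeholder pairs with the same column names as the training dataset.
train_dataset = Dataset.from_dict({
    "sentence_0": [
        "What responsibilities do Member States have regarding penalties?",
        "What is the maximum allowable concentration of Dimethylfumarate in articles?",
    ],
    "sentence_1": [
        "Member States shall lay down the rules on penalties applicable to infringements ...",
        "Dimethylfumarate (DMF) shall not be used in articles in concentrations greater than 0,1 mg/kg.",
    ],
})

# MultipleNegativesRankingLoss wrapped in MatryoshkaLoss, as reported above.
loss = MatryoshkaLoss(
    model,
    MultipleNegativesRankingLoss(model),
    matryoshka_dims=[768, 512, 256, 128, 64],
)

args = SentenceTransformerTrainingArguments(
    output_dir="snoweu_v3",            # hypothetical output directory
    num_train_epochs=4,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    multi_dataset_batch_sampler="round_robin",
    # The original run also used eval_strategy="steps", which additionally
    # requires an eval_dataset or evaluator.
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    loss=loss,
)
trainer.train()
```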
Training Logs
Click to expand
Epoch | Step | Training Loss | cosine_ndcg@10 |
---|---|---|---|
0.0122 | 100 | - | 0.6297 |
0.0243 | 200 | - | 0.7126 |
0.0365 | 300 | - | 0.7753 |
0.0486 | 400 | - | 0.8207 |
0.0608 | 500 | 0.3802 | 0.8440 |
0.0729 | 600 | - | 0.8541 |
0.0851 | 700 | - | 0.8600 |
0.0973 | 800 | - | 0.8646 |
0.1094 | 900 | - | 0.8693 |
0.1216 | 1000 | 0.0994 | 0.8703 |
0.1337 | 1100 | - | 0.8769 |
0.1459 | 1200 | - | 0.8739 |
0.1581 | 1300 | - | 0.8716 |
0.1702 | 1400 | - | 0.8766 |
0.1824 | 1500 | 0.0749 | 0.8791 |
0.1945 | 1600 | - | 0.8791 |
0.2067 | 1700 | - | 0.8793 |
0.2188 | 1800 | - | 0.8810 |
0.2310 | 1900 | - | 0.8801 |
0.2432 | 2000 | 0.0834 | 0.8821 |
0.2553 | 2100 | - | 0.8851 |
0.2675 | 2200 | - | 0.8810 |
0.2796 | 2300 | - | 0.8790 |
0.2918 | 2400 | - | 0.8799 |
0.3040 | 2500 | 0.0686 | 0.8762 |
0.3161 | 2600 | - | 0.8785 |
0.3283 | 2700 | - | 0.8801 |
0.3404 | 2800 | - | 0.8754 |
0.3526 | 2900 | - | 0.8767 |
0.3647 | 3000 | 0.0402 | 0.8748 |
0.3769 | 3100 | - | 0.8742 |
0.3891 | 3200 | - | 0.8733 |
0.4012 | 3300 | - | 0.8801 |
0.4134 | 3400 | - | 0.8824 |
0.4255 | 3500 | 0.065 | 0.8796 |
0.4377 | 3600 | - | 0.8783 |
0.4498 | 3700 | - | 0.8707 |
0.4620 | 3800 | - | 0.8768 |
0.4742 | 3900 | - | 0.8761 |
0.4863 | 4000 | 0.0543 | 0.8797 |
0.4985 | 4100 | - | 0.8787 |
0.5106 | 4200 | - | 0.8796 |
0.5228 | 4300 | - | 0.8788 |
0.5350 | 4400 | - | 0.8808 |
0.5471 | 4500 | 0.0817 | 0.8813 |
0.5593 | 4600 | - | 0.8778 |
0.5714 | 4700 | - | 0.8794 |
0.5836 | 4800 | - | 0.8824 |
0.5957 | 4900 | - | 0.8820 |
0.6079 | 5000 | 0.0425 | 0.8809 |
0.6201 | 5100 | - | 0.8786 |
0.6322 | 5200 | - | 0.8794 |
0.6444 | 5300 | - | 0.8760 |
0.6565 | 5400 | - | 0.8765 |
0.6687 | 5500 | 0.0509 | 0.8743 |
0.6809 | 5600 | - | 0.8792 |
0.6930 | 5700 | - | 0.8835 |
0.7052 | 5800 | - | 0.8825 |
0.7173 | 5900 | - | 0.8826 |
0.7295 | 6000 | 0.0538 | 0.8817 |
0.7416 | 6100 | - | 0.8769 |
0.7538 | 6200 | - | 0.8793 |
0.7660 | 6300 | - | 0.8785 |
0.7781 | 6400 | - | 0.8775 |
0.7903 | 6500 | 0.0193 | 0.8790 |
0.8024 | 6600 | - | 0.8827 |
0.8146 | 6700 | - | 0.8782 |
0.8267 | 6800 | - | 0.8727 |
0.8389 | 6900 | - | 0.8807 |
0.8511 | 7000 | 0.0441 | 0.8819 |
0.8632 | 7100 | - | 0.8825 |
0.8754 | 7200 | - | 0.8822 |
0.8875 | 7300 | - | 0.8781 |
0.8997 | 7400 | - | 0.8794 |
0.9119 | 7500 | 0.0518 | 0.8799 |
0.9240 | 7600 | - | 0.8801 |
0.9362 | 7700 | - | 0.8792 |
0.9483 | 7800 | - | 0.8773 |
0.9605 | 7900 | - | 0.8759 |
0.9726 | 8000 | 0.03 | 0.8768 |
0.9848 | 8100 | - | 0.8748 |
0.9970 | 8200 | - | 0.8736 |
1.0 | 8225 | - | 0.8802 |
1.0091 | 8300 | - | 0.8779 |
1.0213 | 8400 | - | 0.8747 |
1.0334 | 8500 | 0.028 | 0.8798 |
1.0456 | 8600 | - | 0.8793 |
1.0578 | 8700 | - | 0.8773 |
1.0699 | 8800 | - | 0.8810 |
1.0821 | 8900 | - | 0.8823 |
1.0942 | 9000 | 0.0232 | 0.8782 |
1.1064 | 9100 | - | 0.8844 |
1.1185 | 9200 | - | 0.8772 |
1.1307 | 9300 | - | 0.8746 |
1.1429 | 9400 | - | 0.8739 |
1.1550 | 9500 | 0.0316 | 0.8806 |
1.1672 | 9600 | - | 0.8789 |
1.1793 | 9700 | - | 0.8818 |
1.1915 | 9800 | - | 0.8812 |
1.2036 | 9900 | - | 0.8871 |
1.2158 | 10000 | 0.0251 | 0.8876 |
1.2280 | 10100 | - | 0.8884 |
1.2401 | 10200 | - | 0.8859 |
1.2523 | 10300 | - | 0.8844 |
1.2644 | 10400 | - | 0.8840 |
1.2766 | 10500 | 0.0244 | 0.8844 |
1.2888 | 10600 | - | 0.8842 |
1.3009 | 10700 | - | 0.8874 |
1.3131 | 10800 | - | 0.8875 |
1.3252 | 10900 | - | 0.8827 |
1.3374 | 11000 | 0.0258 | 0.8831 |
1.3495 | 11100 | - | 0.8830 |
1.3617 | 11200 | - | 0.8831 |
1.3739 | 11300 | - | 0.8845 |
1.3860 | 11400 | - | 0.8868 |
1.3982 | 11500 | 0.0334 | 0.8871 |
1.4103 | 11600 | - | 0.8872 |
1.4225 | 11700 | - | 0.8886 |
1.4347 | 11800 | - | 0.8894 |
1.4468 | 11900 | - | 0.8876 |
1.4590 | 12000 | 0.0167 | 0.8906 |
1.4711 | 12100 | - | 0.8915 |
1.4833 | 12200 | - | 0.8910 |
1.4954 | 12300 | - | 0.8892 |
1.5076 | 12400 | - | 0.8894 |
1.5198 | 12500 | 0.0216 | 0.8877 |
1.5319 | 12600 | - | 0.8899 |
1.5441 | 12700 | - | 0.8881 |
1.5562 | 12800 | - | 0.8889 |
1.5684 | 12900 | - | 0.8869 |
1.5805 | 13000 | 0.0181 | 0.8879 |
1.5927 | 13100 | - | 0.8883 |
1.6049 | 13200 | - | 0.8876 |
1.6170 | 13300 | - | 0.8873 |
1.6292 | 13400 | - | 0.8895 |
1.6413 | 13500 | 0.0236 | 0.8913 |
1.6535 | 13600 | - | 0.8918 |
1.6657 | 13700 | - | 0.8933 |
1.6778 | 13800 | - | 0.8913 |
1.6900 | 13900 | - | 0.8937 |
1.7021 | 14000 | 0.0291 | 0.8941 |
1.7143 | 14100 | - | 0.8950 |
1.7264 | 14200 | - | 0.8957 |
1.7386 | 14300 | - | 0.8912 |
1.7508 | 14400 | - | 0.8937 |
1.7629 | 14500 | 0.0266 | 0.8922 |
1.7751 | 14600 | - | 0.8903 |
1.7872 | 14700 | - | 0.8930 |
1.7994 | 14800 | - | 0.8913 |
1.8116 | 14900 | - | 0.8913 |
1.8237 | 15000 | 0.018 | 0.8925 |
1.8359 | 15100 | - | 0.8928 |
1.8480 | 15200 | - | 0.8889 |
1.8602 | 15300 | - | 0.8884 |
1.8723 | 15400 | - | 0.8876 |
1.8845 | 15500 | 0.018 | 0.8891 |
1.8967 | 15600 | - | 0.8887 |
1.9088 | 15700 | - | 0.8891 |
1.9210 | 15800 | - | 0.8924 |
1.9331 | 15900 | - | 0.8891 |
1.9453 | 16000 | 0.0223 | 0.8902 |
1.9574 | 16100 | - | 0.8943 |
1.9696 | 16200 | - | 0.8925 |
1.9818 | 16300 | - | 0.8905 |
1.9939 | 16400 | - | 0.8892 |
2.0 | 16450 | - | 0.8897 |
2.0061 | 16500 | 0.0274 | 0.8889 |
2.0182 | 16600 | - | 0.8914 |
2.0304 | 16700 | - | 0.8871 |
2.0426 | 16800 | - | 0.8878 |
2.0547 | 16900 | - | 0.8882 |
2.0669 | 17000 | 0.0178 | 0.8879 |
2.0790 | 17100 | - | 0.8892 |
2.0912 | 17200 | - | 0.8896 |
2.1033 | 17300 | - | 0.8933 |
2.1155 | 17400 | - | 0.8936 |
2.1277 | 17500 | 0.0079 | 0.8942 |
2.1398 | 17600 | - | 0.8927 |
2.1520 | 17700 | - | 0.8914 |
2.1641 | 17800 | - | 0.8932 |
2.1763 | 17900 | - | 0.8917 |
2.1884 | 18000 | 0.0031 | 0.8921 |
2.2006 | 18100 | - | 0.8912 |
2.2128 | 18200 | - | 0.8879 |
2.2249 | 18300 | - | 0.8865 |
2.2371 | 18400 | - | 0.8847 |
2.2492 | 18500 | 0.0127 | 0.8869 |
2.2614 | 18600 | - | 0.8880 |
2.2736 | 18700 | - | 0.8885 |
2.2857 | 18800 | - | 0.8901 |
2.2979 | 18900 | - | 0.8880 |
2.3100 | 19000 | 0.019 | 0.8897 |
2.3222 | 19100 | - | 0.8918 |
2.3343 | 19200 | - | 0.8895 |
2.3465 | 19300 | - | 0.8918 |
2.3587 | 19400 | - | 0.8933 |
2.3708 | 19500 | 0.0177 | 0.8936 |
2.3830 | 19600 | - | 0.8919 |
2.3951 | 19700 | - | 0.8922 |
2.4073 | 19800 | - | 0.8923 |
2.4195 | 19900 | - | 0.8946 |
2.4316 | 20000 | 0.0085 | 0.8935 |
2.4438 | 20100 | - | 0.8944 |
2.4559 | 20200 | - | 0.8918 |
2.4681 | 20300 | - | 0.8948 |
2.4802 | 20400 | - | 0.8937 |
2.4924 | 20500 | 0.0073 | 0.8935 |
2.5046 | 20600 | - | 0.8942 |
2.5167 | 20700 | - | 0.8939 |
2.5289 | 20800 | - | 0.8950 |
2.5410 | 20900 | - | 0.8974 |
2.5532 | 21000 | 0.01 | 0.8961 |
2.5653 | 21100 | - | 0.8977 |
2.5775 | 21200 | - | 0.8985 |
2.5897 | 21300 | - | 0.8962 |
2.6018 | 21400 | - | 0.8981 |
2.6140 | 21500 | 0.0148 | 0.8978 |
2.6261 | 21600 | - | 0.8967 |
2.6383 | 21700 | - | 0.8978 |
2.6505 | 21800 | - | 0.8975 |
2.6626 | 21900 | - | 0.9010 |
2.6748 | 22000 | 0.0271 | 0.9000 |
2.6869 | 22100 | - | 0.8973 |
2.6991 | 22200 | - | 0.8987 |
2.7112 | 22300 | - | 0.9005 |
2.7234 | 22400 | - | 0.8990 |
2.7356 | 22500 | 0.0108 | 0.8993 |
2.7477 | 22600 | - | 0.9011 |
2.7599 | 22700 | - | 0.8998 |
2.7720 | 22800 | - | 0.8981 |
2.7842 | 22900 | - | 0.9006 |
2.7964 | 23000 | 0.0067 | 0.9010 |
2.8085 | 23100 | - | 0.9028 |
2.8207 | 23200 | - | 0.9024 |
2.8328 | 23300 | - | 0.9027 |
2.8450 | 23400 | - | 0.9024 |
2.8571 | 23500 | 0.01 | 0.9031 |
2.8693 | 23600 | - | 0.9029 |
2.8815 | 23700 | - | 0.9022 |
2.8936 | 23800 | - | 0.8992 |
2.9058 | 23900 | - | 0.9007 |
2.9179 | 24000 | 0.0081 | 0.9005 |
2.9301 | 24100 | - | 0.8990 |
2.9422 | 24200 | - | 0.8992 |
2.9544 | 24300 | - | 0.9021 |
2.9666 | 24400 | - | 0.9003 |
2.9787 | 24500 | 0.0142 | 0.9016 |
2.9909 | 24600 | - | 0.9018 |
3.0 | 24675 | - | 0.9014 |
3.0030 | 24700 | - | 0.9014 |
3.0152 | 24800 | - | 0.9012 |
3.0274 | 24900 | - | 0.9006 |
3.0395 | 25000 | 0.0033 | 0.9002 |
3.0517 | 25100 | - | 0.9013 |
3.0638 | 25200 | - | 0.9025 |
3.0760 | 25300 | - | 0.9022 |
3.0881 | 25400 | - | 0.9013 |
3.1003 | 25500 | 0.0072 | 0.9008 |
3.1125 | 25600 | - | 0.9007 |
3.1246 | 25700 | - | 0.8997 |
3.1368 | 25800 | - | 0.8982 |
3.1489 | 25900 | - | 0.8992 |
3.1611 | 26000 | 0.0147 | 0.8999 |
3.1733 | 26100 | - | 0.9010 |
3.1854 | 26200 | - | 0.9002 |
3.1976 | 26300 | - | 0.9024 |
3.2097 | 26400 | - | 0.9017 |
3.2219 | 26500 | 0.0154 | 0.9038 |
3.2340 | 26600 | - | 0.9039 |
3.2462 | 26700 | - | 0.9040 |
3.2584 | 26800 | - | 0.9031 |
3.2705 | 26900 | - | 0.9035 |
3.2827 | 27000 | 0.0078 | 0.9035 |
3.2948 | 27100 | - | 0.9038 |
3.3070 | 27200 | - | 0.9041 |
3.3191 | 27300 | - | 0.9039 |
3.3313 | 27400 | - | 0.9026 |
3.3435 | 27500 | 0.0041 | 0.9025 |
3.3556 | 27600 | - | 0.9027 |
3.3678 | 27700 | - | 0.9037 |
3.3799 | 27800 | - | 0.9029 |
3.3921 | 27900 | - | 0.9030 |
3.4043 | 28000 | 0.0043 | 0.9028 |
3.4164 | 28100 | - | 0.9027 |
3.4286 | 28200 | - | 0.9027 |
3.4407 | 28300 | - | 0.9037 |
3.4529 | 28400 | - | 0.9048 |
3.4650 | 28500 | 0.0102 | 0.9040 |
3.4772 | 28600 | - | 0.9042 |
3.4894 | 28700 | - | 0.9050 |
3.5015 | 28800 | - | 0.9046 |
3.5137 | 28900 | - | 0.9057 |
3.5258 | 29000 | 0.0075 | 0.9048 |
3.5380 | 29100 | - | 0.9045 |
3.5502 | 29200 | - | 0.9052 |
3.5623 | 29300 | - | 0.9053 |
3.5745 | 29400 | - | 0.9068 |
Framework Versions
- Python: 3.10.11
- Sentence Transformers: 3.4.1
- Transformers: 4.48.1
- PyTorch: 2.4.0+cu121
- Accelerate: 1.4.0
- Datasets: 3.3.2
- Tokenizers: 0.21.0
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
MatryoshkaLoss
@misc{kusupati2024matryoshka,
title={Matryoshka Representation Learning},
author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
year={2024},
eprint={2205.13147},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
MultipleNegativesRankingLoss
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}