SentenceTransformer based on nomic-ai/nomic-embed-text-v1.5

This is a sentence-transformers model finetuned from nomic-ai/nomic-embed-text-v1.5. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: nomic-ai/nomic-embed-text-v1.5
  • Maximum Sequence Length: 8192 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity
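
These properties can be checked directly on a loaded model; a minimal sketch (see the Usage section below for loading details):

from sentence_transformers import SentenceTransformer

# trust_remote_code is needed because the base model uses the custom NomicBertModel class
model = SentenceTransformer("lv12/esci-nomic-embed-text-v1_5_1", trust_remote_code=True)
print(model.max_seq_length)                      # 8192
print(model.get_sentence_embedding_dimension())  # 768
print(model.similarity_fn_name)                  # cosine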

Model Sources

  • Documentation: https://www.sbert.net
  • Repository: https://github.com/UKPLab/sentence-transformers
  • Hugging Face: https://huggingface.co/models?library=sentence-transformers

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: NomicBertModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub; trust_remote_code is required for the custom NomicBERT architecture
model = SentenceTransformer("lv12/esci-nomic-embed-text-v1_5_1", trust_remote_code=True)
# Run inference
sentences = [
    'search_query: dab rig',
    'search_query: volcano weed vaporizer',
    'search_query: 22 gold chain for men',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 768)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
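
Because both the base model and the training data use Nomic-style task prefixes, queries should be embedded with a search_query: prefix and documents with a search_document: prefix at inference time. A minimal retrieval sketch (the product titles are illustrative):

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("lv12/esci-nomic-embed-text-v1_5_1", trust_remote_code=True)

query = "search_query: foosball table"
documents = [
    'search_document: KICK Vanquish 55" in Foosball Table, KICK, Blue/Gray',
    "search_document: Luxury Bath Mat Floor Towel Set, White",
]

# Embed the query and candidate documents, then rank by cosine similarity
query_embedding = model.encode([query])
document_embeddings = model.encode(documents)
scores = model.similarity(query_embedding, document_embeddings)  # shape: [1, 2]
print(documents[scores.argmax().item()])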

Evaluation

Metrics

Triplet

Evaluated with TripletEvaluator on the triplet-esci evaluation set:

| Metric             | Value  |
|:-------------------|:-------|
| cosine_accuracy    | 0.7405 |
| dot_accuracy       | 0.269  |
| manhattan_accuracy | 0.7432 |
| euclidean_accuracy | 0.7457 |
| max_accuracy       | 0.7457 |
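
These figures come from triplet evaluation: a triplet counts as correct when the anchor embedding is closer to the positive than to the negative under the given distance. A minimal sketch of reproducing them with TripletEvaluator, using one sample triplet from the datasets below (in practice you would pass the full 10,000-sample evaluation set):

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import TripletEvaluator

model = SentenceTransformer("lv12/esci-nomic-embed-text-v1_5_1", trust_remote_code=True)

evaluator = TripletEvaluator(
    anchors=["search_query: foos ball coffee table"],
    positives=['search_document: KICK Vanquish 55" in Foosball Table, KICK, Blue/Gray'],
    negatives=['search_document: KICK Legend 55" Foosball Table (Black), KICK, Black'],
    name="triplet-esci",
)
results = evaluator(model)  # dict with cosine/dot/manhattan/euclidean accuracies
print(results)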

Training Details

Training Dataset

Unnamed Dataset

  • Size: 167,039 training samples
  • Columns: anchor, positive, and negative
  • Approximate statistics based on the first 1000 samples:
    |         | anchor                                           | positive                                            | negative                                            |
    |:--------|:-------------------------------------------------|:----------------------------------------------------|:----------------------------------------------------|
    | type    | string                                           | string                                              | string                                              |
    | details | min: 7 tokens, mean: 11.1 tokens, max: 38 tokens | min: 14 tokens, mean: 43.23 tokens, max: 124 tokens | min: 16 tokens, mean: 43.16 tokens, max: 97 tokens  |
  • Samples:
    | anchor | positive | negative |
    |:--|:--|:--|
    | search_query: foos ball coffee table | search_document: KICK Vanquish 55" in Foosball Table, KICK, Blue/Gray | search_document: KICK Legend 55" Foosball Table (Black), KICK, Black |
    | search_query: bathroom rugs white washable | search_document: Luxury Bath Mat Floor Towel Set - Absorbent Cotton Hotel Spa Shower/Bathtub Mats [Not a Bathroom Rug] 22"x34" White | |
    | search_query: kids gloves | search_document: EvridWear Boys Girls Magic Stretch Gripper Gloves 3 Pair Pack Assortment, Kids One Size Winter Warm Gloves Children (8-14Years, 3 Pairs Camo), Evridwear, 3 Pairs Camo | search_document: Body Glove Little Boys 2-Piece UPF 50+ Rash Guard Swimsuit Set (2 Piece), All Black, Size 5, Body Glove, All Black |
  • Loss: CachedMultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
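
This loss can be constructed directly; a minimal sketch (mini_batch_size is an assumption not recorded in this card; it controls the gradient-caching chunk size and affects only memory use, not results):

from sentence_transformers import SentenceTransformer, util
from sentence_transformers.losses import CachedMultipleNegativesRankingLoss

model = SentenceTransformer("nomic-ai/nomic-embed-text-v1.5", trust_remote_code=True)
loss = CachedMultipleNegativesRankingLoss(
    model,
    scale=20.0,                  # matches the parameters above
    similarity_fct=util.cos_sim,
    mini_batch_size=32,          # assumed; tune to available GPU memory
)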
    

Evaluation Dataset

Unnamed Dataset

  • Size: 10,000 evaluation samples
  • Columns: anchor, positive, and negative
  • Approximate statistics based on the first 1000 samples:
    |         | anchor                                            | positive                                           | negative                                            |
    |:--------|:--------------------------------------------------|:---------------------------------------------------|:----------------------------------------------------|
    | type    | string                                            | string                                             | string                                              |
    | details | min: 7 tokens, mean: 11.44 tokens, max: 31 tokens | min: 16 tokens, mean: 42.26 tokens, max: 92 tokens | min: 16 tokens, mean: 42.28 tokens, max: 105 tokens |
  • Samples:
    | anchor | positive | negative |
    |:--|:--|:--|
    | search_query: defender series iphone 8 | search_document: Hand-e Muscle Series Belt Clip Case for Apple iPhone 7 / iPhone 8 / iPhone SE “2020” (4.7”) 2-in-1 Protective Defender w Screen Protector & Holster & Kickstand/Shock & Drop Proof – Camouflage/Orange, Hand-e, Camouflage / Orange | search_document: OtterBox Defender Series Rugged Case for iPhone 8 PLUS & iPhone 7 PLUS - Case Only - Non-Retail Packaging - Dark Lake - With Microbial Defense, OtterBox, Dark Lake |
    | search_query: joy mangano | search_document: Joy by Joy Mangano 11-Piece Complete Luxury Towel Set, Ivory, Joy Mangano, Ivory | search_document: BAGSMART Jewelry Organizer Case Travel Jewelry Storage Bag for Necklace, Earrings, Rings, Bracelet, Soft Pink, BAGSMART, Soft Pink |
    | search_query: cashel fly masks for horses without ears | search_document: Cashel Crusader Designer Horse Fly Mask, Leopard, Weanling, Cashel, Leopard | search_document: Cashel Crusader Designer Horse Fly Mask with Ears, Teal Tribal, Weanling, Cashel, Teal Tribal |
  • Loss: CachedMultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
    

Training Hyperparameters

Non-Default Hyperparameters

  • per_device_train_batch_size: 4
  • per_device_eval_batch_size: 4
  • gradient_accumulation_steps: 4
  • learning_rate: 1e-06
  • num_train_epochs: 5
  • lr_scheduler_type: cosine_with_restarts
  • warmup_ratio: 0.1
  • dataloader_drop_last: True
  • dataloader_num_workers: 4
  • dataloader_prefetch_factor: 2
  • load_best_model_at_end: True
  • batch_sampler: no_duplicates
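
With Sentence Transformers 3.0 these map directly onto SentenceTransformerTrainingArguments; a minimal sketch (output_dir is a placeholder, and the steps-based eval/save cadence is inferred from the 1000-step evaluation interval visible in the training logs):

from sentence_transformers.training_args import (
    BatchSamplers,
    SentenceTransformerTrainingArguments,
)

args = SentenceTransformerTrainingArguments(
    output_dir="output/esci-nomic-embed-text-v1_5",  # placeholder
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=4,
    learning_rate=1e-6,
    num_train_epochs=5,
    lr_scheduler_type="cosine_with_restarts",
    warmup_ratio=0.1,
    dataloader_drop_last=True,
    dataloader_num_workers=4,
    dataloader_prefetch_factor=2,
    load_best_model_at_end=True,
    evaluation_strategy="steps",  # inferred from the training logs
    eval_steps=1000,
    save_strategy="steps",
    save_steps=1000,
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)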

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • prediction_loss_only: True
  • per_device_train_batch_size: 4
  • per_device_eval_batch_size: 4
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 4
  • eval_accumulation_steps: None
  • learning_rate: 1e-06
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 5
  • max_steps: -1
  • lr_scheduler_type: cosine_with_restarts
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: True
  • dataloader_num_workers: 4
  • dataloader_prefetch_factor: 2
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: True
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss Validation Loss triplet-esci_cosine_accuracy
0.0096 100 0.6669 - -
0.0192 200 0.6633 - -
0.0287 300 0.6575 - -
0.0383 400 0.6638 - -
0.0479 500 0.6191 - -
0.0575 600 0.6464 - -
0.0671 700 0.6291 - -
0.0766 800 0.5973 - -
0.0862 900 0.605 - -
0.0958 1000 0.6278 0.6525 0.7269
0.1054 1100 0.6041 - -
0.1149 1200 0.6077 - -
0.1245 1300 0.589 - -
0.1341 1400 0.5811 - -
0.1437 1500 0.5512 - -
0.1533 1600 0.5907 - -
0.1628 1700 0.5718 - -
0.1724 1800 0.5446 - -
0.1820 1900 0.546 - -
0.1916 2000 0.5141 0.6105 0.7386
0.2012 2100 0.5359 - -
0.2107 2200 0.5093 - -
0.2203 2300 0.5384 - -
0.2299 2400 0.5582 - -
0.2395 2500 0.5038 - -
0.2490 2600 0.5031 - -
0.2586 2700 0.5393 - -
0.2682 2800 0.4979 - -
0.2778 2900 0.5221 - -
0.2874 3000 0.4956 0.5852 0.7495
0.2969 3100 0.506 - -
0.3065 3200 0.4962 - -
0.3161 3300 0.4713 - -
0.3257 3400 0.5016 - -
0.3353 3500 0.4749 - -
0.3448 3600 0.4732 - -
0.3544 3700 0.4789 - -
0.3640 3800 0.4825 - -
0.3736 3900 0.4803 - -
0.3832 4000 0.4471 0.5743 0.7546
0.3927 4100 0.4593 - -
0.4023 4200 0.4481 - -
0.4119 4300 0.4603 - -
0.4215 4400 0.4569 - -
0.4310 4500 0.4807 - -
0.4406 4600 0.4368 - -
0.4502 4700 0.4532 - -
0.4598 4800 0.4432 - -
0.4694 4900 0.4802 - -
0.4789 5000 0.4643 0.5663 0.7593
0.4885 5100 0.4154 - -
0.4981 5200 0.4441 - -
0.5077 5300 0.4156 - -
0.5173 5400 0.4273 - -
0.5268 5500 0.3988 - -
0.5364 5600 0.3942 - -
0.5460 5700 0.4186 - -
0.5556 5800 0.423 - -
0.5651 5900 0.434 - -
0.5747 6000 0.4136 0.5704 0.7616
0.5843 6100 0.3968 - -
0.5939 6200 0.4045 - -
0.6035 6300 0.4122 - -
0.6130 6400 0.3618 - -
0.6226 6500 0.341 - -
0.6322 6600 0.3689 - -
0.6418 6700 0.3621 - -
0.6514 6800 0.3774 - -
0.6609 6900 0.3519 - -
0.6705 7000 0.3974 0.5729 0.7644
0.6801 7100 0.3443 - -
0.6897 7200 0.3665 - -
0.6993 7300 0.3683 - -
0.7088 7400 0.3593 - -
0.7184 7500 0.3419 - -
0.7280 7600 0.3587 - -
0.7376 7700 0.3463 - -
0.7471 7800 0.3417 - -
0.7567 7900 0.32 - -
0.7663 8000 0.32 0.5735 0.7677
0.7759 8100 0.3296 - -
0.7855 8200 0.3492 - -
0.7950 8300 0.3022 - -
0.8046 8400 0.3159 - -
0.8142 8500 0.3172 - -
0.8238 8600 0.3157 - -
0.8334 8700 0.3271 - -
0.8429 8800 0.337 - -
0.8525 8900 0.322 - -
0.8621 9000 0.3187 0.5803 0.7652
0.8717 9100 0.307 - -
0.8812 9200 0.2984 - -
0.8908 9300 0.2727 - -
0.9004 9400 0.304 - -
0.9100 9500 0.321 - -
0.9196 9600 0.304 - -
0.9291 9700 0.3302 - -
0.9387 9800 0.3302 - -
0.9483 9900 0.3134 - -
0.9579 10000 0.2936 0.5858 0.7671
0.9675 10100 0.2953 - -
0.9770 10200 0.3035 - -
0.9866 10300 0.303 - -
0.9962 10400 0.2606 - -
1.0058 10500 0.2615 - -
1.0153 10600 0.2703 - -
1.0249 10700 0.2761 - -
1.0345 10800 0.2559 - -
1.0441 10900 0.2672 - -
1.0537 11000 0.2656 0.5933 0.7676
1.0632 11100 0.2825 - -
1.0728 11200 0.2484 - -
1.0824 11300 0.2472 - -
1.0920 11400 0.2678 - -
1.1016 11500 0.2443 - -
1.1111 11600 0.2685 - -
1.1207 11700 0.2504 - -
1.1303 11800 0.2431 - -
1.1399 11900 0.2248 - -
1.1495 12000 0.2229 0.5958 0.7688
1.1590 12100 0.228 - -
1.1686 12200 0.2304 - -
1.1782 12300 0.2193 - -
1.1878 12400 0.2238 - -
1.1973 12500 0.1957 - -
1.2069 12600 0.2075 - -
1.2165 12700 0.2014 - -
1.2261 12800 0.2222 - -
1.2357 12900 0.2059 - -
1.2452 13000 0.2051 0.6077 0.7651
1.2548 13100 0.2076 - -
1.2644 13200 0.226 - -
1.2740 13300 0.1941 - -
1.2836 13400 0.2053 - -
1.2931 13500 0.2003 - -
1.3027 13600 0.1947 - -
1.3123 13700 0.1914 - -
1.3219 13800 0.1956 - -
1.3314 13900 0.1862 - -
1.3410 14000 0.1873 0.6110 0.7646
1.3506 14100 0.1812 - -
1.3602 14200 0.1828 - -
1.3698 14300 0.1696 - -
1.3793 14400 0.1705 - -
1.3889 14500 0.1746 - -
1.3985 14600 0.1756 - -
1.4081 14700 0.1682 - -
1.4177 14800 0.1769 - -
1.4272 14900 0.1795 - -
1.4368 15000 0.1736 0.6278 0.7616
1.4464 15100 0.1546 - -
1.4560 15200 0.1643 - -
1.4656 15300 0.1903 - -
1.4751 15400 0.1902 - -
1.4847 15500 0.1531 - -
1.4943 15600 0.1711 - -
1.5039 15700 0.1546 - -
1.5134 15800 0.1503 - -
1.5230 15900 0.1429 - -
1.5326 16000 0.147 0.6306 0.7623
1.5422 16100 0.1507 - -
1.5518 16200 0.152 - -
1.5613 16300 0.1602 - -
1.5709 16400 0.1541 - -
1.5805 16500 0.1491 - -
1.5901 16600 0.1378 - -
1.5997 16700 0.1505 - -
1.6092 16800 0.1334 - -
1.6188 16900 0.1288 - -
1.6284 17000 0.1168 0.6372 0.7629
1.6380 17100 0.135 - -
1.6475 17200 0.1239 - -
1.6571 17300 0.1398 - -
1.6667 17400 0.1292 - -
1.6763 17500 0.1414 - -
1.6859 17600 0.116 - -
1.6954 17700 0.1302 - -
1.7050 17800 0.1194 - -
1.7146 17900 0.1394 - -
1.7242 18000 0.1316 0.6561 0.7592
1.7338 18100 0.1246 - -
1.7433 18200 0.1277 - -
1.7529 18300 0.1055 - -
1.7625 18400 0.1211 - -
1.7721 18500 0.1107 - -
1.7817 18600 0.1145 - -
1.7912 18700 0.1162 - -
1.8008 18800 0.1114 - -
1.8104 18900 0.1182 - -
1.8200 19000 0.1152 0.6567 0.7591
1.8295 19100 0.1212 - -
1.8391 19200 0.1253 - -
1.8487 19300 0.115 - -
1.8583 19400 0.1292 - -
1.8679 19500 0.1151 - -
1.8774 19600 0.1005 - -
1.8870 19700 0.1079 - -
1.8966 19800 0.0954 - -
1.9062 19900 0.1045 - -
1.9158 20000 0.1086 0.6727 0.7554
1.9253 20100 0.1174 - -
1.9349 20200 0.1108 - -
1.9445 20300 0.0992 - -
1.9541 20400 0.1168 - -
1.9636 20500 0.1028 - -
1.9732 20600 0.1126 - -
1.9828 20700 0.1113 - -
1.9924 20800 0.1065 - -
2.0020 20900 0.078 - -
2.0115 21000 0.0921 0.6727 0.7568
2.0211 21100 0.0866 - -
2.0307 21200 0.0918 - -
2.0403 21300 0.0893 - -
2.0499 21400 0.0882 - -
2.0594 21500 0.0986 - -
2.0690 21600 0.0923 - -
2.0786 21700 0.0805 - -
2.0882 21800 0.0887 - -
2.0978 21900 0.1 - -
2.1073 22000 0.0957 0.6854 0.7539
2.1169 22100 0.0921 - -
2.1265 22200 0.0892 - -
2.1361 22300 0.0805 - -
2.1456 22400 0.0767 - -
2.1552 22500 0.0715 - -
2.1648 22600 0.083 - -
2.1744 22700 0.0755 - -
2.1840 22800 0.075 - -
2.1935 22900 0.0724 - -
2.2031 23000 0.0822 0.6913 0.7534
2.2127 23100 0.0623 - -
2.2223 23200 0.0765 - -
2.2319 23300 0.0755 - -
2.2414 23400 0.0786 - -
2.2510 23500 0.0651 - -
2.2606 23600 0.081 - -
2.2702 23700 0.0664 - -
2.2797 23800 0.0906 - -
2.2893 23900 0.0714 - -
2.2989 24000 0.0703 0.6971 0.7536
2.3085 24100 0.0672 - -
2.3181 24200 0.0754 - -
2.3276 24300 0.0687 - -
2.3372 24400 0.0668 - -
2.3468 24500 0.0616 - -
2.3564 24600 0.0693 - -
2.3660 24700 0.0587 - -
2.3755 24800 0.0612 - -
2.3851 24900 0.0559 - -
2.3947 25000 0.0676 0.7128 0.7497
2.4043 25100 0.0607 - -
2.4139 25200 0.0727 - -
2.4234 25300 0.0573 - -
2.4330 25400 0.0717 - -
2.4426 25500 0.0493 - -
2.4522 25600 0.0558 - -
2.4617 25700 0.0676 - -
2.4713 25800 0.0757 - -
2.4809 25900 0.0735 - -
2.4905 26000 0.056 0.7044 0.7513
2.5001 26100 0.0687 - -
2.5096 26200 0.0592 - -
2.5192 26300 0.057 - -
2.5288 26400 0.0444 - -
2.5384 26500 0.0547 - -
2.5480 26600 0.0605 - -
2.5575 26700 0.066 - -
2.5671 26800 0.0631 - -
2.5767 26900 0.0634 - -
2.5863 27000 0.0537 0.7127 0.7512
2.5958 27100 0.0535 - -
2.6054 27200 0.0572 - -
2.6150 27300 0.0473 - -
2.6246 27400 0.0418 - -
2.6342 27500 0.0585 - -
2.6437 27600 0.0475 - -
2.6533 27700 0.0549 - -
2.6629 27800 0.0452 - -
2.6725 27900 0.0514 - -
2.6821 28000 0.0449 0.7337 0.7482
2.6916 28100 0.0544 - -
2.7012 28200 0.041 - -
2.7108 28300 0.0599 - -
2.7204 28400 0.057 - -
2.7300 28500 0.0503 - -
2.7395 28600 0.0487 - -
2.7491 28700 0.0503 - -
2.7587 28800 0.0446 - -
2.7683 28900 0.042 - -
2.7778 29000 0.0501 0.7422 0.7469
2.7874 29100 0.0494 - -
2.7970 29200 0.0423 - -
2.8066 29300 0.0508 - -
2.8162 29400 0.0459 - -
2.8257 29500 0.0514 - -
2.8353 29600 0.0484 - -
2.8449 29700 0.0571 - -
2.8545 29800 0.0558 - -
2.8641 29900 0.0466 - -
2.8736 30000 0.0465 0.7478 0.7447
2.8832 30100 0.0463 - -
2.8928 30200 0.0362 - -
2.9024 30300 0.0435 - -
2.9119 30400 0.0419 - -
2.9215 30500 0.046 - -
2.9311 30600 0.0451 - -
2.9407 30700 0.0458 - -
2.9503 30800 0.052 - -
2.9598 30900 0.0454 - -
2.9694 31000 0.0433 0.7580 0.745
2.9790 31100 0.0438 - -
2.9886 31200 0.0537 - -
2.9982 31300 0.033 - -
3.0077 31400 0.0384 - -
3.0173 31500 0.0349 - -
3.0269 31600 0.0365 - -
3.0365 31700 0.0397 - -
3.0460 31800 0.0396 - -
3.0556 31900 0.0358 - -
3.0652 32000 0.0443 0.7592 0.7454
3.0748 32100 0.0323 - -
3.0844 32200 0.0418 - -
3.0939 32300 0.0463 - -
3.1035 32400 0.0397 - -
3.1131 32500 0.0425 - -
3.1227 32600 0.0406 - -
3.1323 32700 0.0454 - -
3.1418 32800 0.0287 - -
3.1514 32900 0.0267 - -
3.1610 33000 0.0341 0.7672 0.7431
3.1706 33100 0.0357 - -
3.1802 33200 0.0322 - -
3.1897 33300 0.0367 - -
3.1993 33400 0.0419 - -
3.2089 33500 0.0349 - -
3.2185 33600 0.0327 - -
3.2280 33700 0.0377 - -
3.2376 33800 0.0353 - -
3.2472 33900 0.0305 - -
3.2568 34000 0.0362 0.7668 0.7463
3.2664 34100 0.0311 - -
3.2759 34200 0.0405 - -
3.2855 34300 0.0401 - -
3.2951 34400 0.0361 - -
3.3047 34500 0.0302 - -
3.3143 34600 0.0379 - -
3.3238 34700 0.03 - -
3.3334 34800 0.039 - -
3.3430 34900 0.0288 - -
3.3526 35000 0.0318 0.7782 0.7436
3.3621 35100 0.0283 - -
3.3717 35200 0.029 - -
3.3813 35300 0.0287 - -
3.3909 35400 0.0343 - -
3.4005 35500 0.0326 - -
3.4100 35600 0.031 - -
3.4196 35700 0.0304 - -
3.4292 35800 0.0314 - -
3.4388 35900 0.0286 - -
3.4484 36000 0.0229 0.7978 0.7428
3.4579 36100 0.0258 - -
3.4675 36200 0.043 - -
3.4771 36300 0.042 - -
3.4867 36400 0.029 - -
3.4963 36500 0.0343 - -
3.5058 36600 0.0317 - -
3.5154 36700 0.0307 - -
3.5250 36800 0.0251 - -
3.5346 36900 0.025 - -
3.5441 37000 0.0309 0.8002 0.7446
3.5537 37100 0.031 - -
3.5633 37200 0.0345 - -
3.5729 37300 0.0332 - -
3.5825 37400 0.0346 - -
3.5920 37500 0.026 - -
3.6016 37600 0.0293 - -
3.6112 37700 0.0268 - -
3.6208 37800 0.0264 - -
3.6304 37900 0.0259 - -
3.6399 38000 0.032 0.7896 0.7438
3.6495 38100 0.0246 - -
3.6591 38200 0.0279 - -
3.6687 38300 0.0274 - -
3.6782 38400 0.0241 - -
3.6878 38500 0.027 - -
3.6974 38600 0.022 - -
3.7070 38700 0.0305 - -
3.7166 38800 0.0368 - -
3.7261 38900 0.0304 - -
3.7357 39000 0.0249 0.7978 0.7437
3.7453 39100 0.0312 - -
3.7549 39200 0.0257 - -
3.7645 39300 0.0273 - -
3.7740 39400 0.0209 - -
3.7836 39500 0.0298 - -
3.7932 39600 0.0282 - -
3.8028 39700 0.028 - -
3.8124 39800 0.0279 - -
3.8219 39900 0.0283 - -
3.8315 40000 0.0239 0.7982 0.7424
3.8411 40100 0.0378 - -
3.8507 40200 0.028 - -
3.8602 40300 0.0321 - -
3.8698 40400 0.0289 - -
3.8794 40500 0.027 - -
3.8890 40600 0.0224 - -
3.8986 40700 0.0236 - -
3.9081 40800 0.0267 - -
3.9177 40900 0.0228 - -
3.9273 41000 0.0322 0.8101 0.7415
3.9369 41100 0.0262 - -
3.9465 41200 0.0276 - -
3.9560 41300 0.0292 - -
3.9656 41400 0.0278 - -
3.9752 41500 0.0262 - -
3.9848 41600 0.0306 - -
3.9943 41700 0.0238 - -
4.0039 41800 0.0165 - -
4.0135 41900 0.0241 - -
4.0231 42000 0.0211 0.8092 0.742
4.0327 42100 0.0257 - -
4.0422 42200 0.0236 - -
4.0518 42300 0.0254 - -
4.0614 42400 0.0248 - -
4.0710 42500 0.026 - -
4.0806 42600 0.0245 - -
4.0901 42700 0.0325 - -
4.0997 42800 0.0209 - -
4.1093 42900 0.033 - -
4.1189 43000 0.0265 0.8105 0.7412
4.1285 43100 0.027 - -
4.1380 43200 0.0208 - -
4.1476 43300 0.0179 - -
4.1572 43400 0.0194 - -
4.1668 43500 0.0217 - -
4.1763 43600 0.0212 - -
4.1859 43700 0.0226 - -
4.1955 43800 0.0252 - -
4.2051 43900 0.0293 - -
4.2147 44000 0.0216 0.8029 0.7414
4.2242 44100 0.029 - -
4.2338 44200 0.0216 - -
4.2434 44300 0.0251 - -
4.2530 44400 0.018 - -
4.2626 44500 0.025 - -
4.2721 44600 0.0225 - -
4.2817 44700 0.0303 - -
4.2913 44800 0.028 - -
4.3009 44900 0.0203 - -
4.3104 45000 0.026 0.8081 0.7405

Framework Versions

  • Python: 3.10.12
  • Sentence Transformers: 3.0.0
  • Transformers: 4.38.2
  • PyTorch: 2.1.2+cu121
  • Accelerate: 0.27.2
  • Datasets: 2.19.1
  • Tokenizers: 0.15.2
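
To reproduce this environment, pin the versions above when installing; PyTorch 2.1.2+cu121 is installed separately from the PyTorch CUDA 12.1 index.

pip install sentence-transformers==3.0.0 transformers==4.38.2 accelerate==0.27.2 datasets==2.19.1 tokenizers==0.15.2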

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

CachedMultipleNegativesRankingLoss

@misc{gao2021scaling,
    title={Scaling Deep Contrastive Learning Batch Size under Memory Limited Setup}, 
    author={Luyu Gao and Yunyi Zhang and Jiawei Han and Jamie Callan},
    year={2021},
    eprint={2101.06983},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}