---
language:
  - en
license: apache-2.0
library_name: sentence-transformers
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - generated_from_trainer
  - dataset_size:3877
  - loss:MatryoshkaLoss
  - loss:MultipleNegativesRankingLoss
base_model: BAAI/bge-base-en-v1.5
datasets: []
metrics:
  - cosine_accuracy@1
  - cosine_accuracy@3
  - cosine_accuracy@5
  - cosine_accuracy@10
  - cosine_precision@1
  - cosine_precision@3
  - cosine_precision@5
  - cosine_precision@10
  - cosine_recall@1
  - cosine_recall@3
  - cosine_recall@5
  - cosine_recall@10
  - cosine_ndcg@10
  - cosine_mrr@10
  - cosine_map@100
widget:
  - source_sentence: >-
      <summary>The "_fastdds_statistics_sample_datas" topic tracks the number of
      data messages or fragments sent by a DataWriter to deliver a single
      sample, excluding built-in and statistics DataWriters.</summary>
    sentences:
      - |2-
          If several new data changes are received at once, the callbacks may
          be triggered just once, instead of once per change. The application
          must keep *reading* or *taking* until no new changes are available.
      - |-
        The "_fastdds_statistics_sample_datas" statistics topic collects the
        number of user's data messages (or data fragments in case that the
        message size is large enough to require RTPS fragmentation) that have
        been sent by the user's DataWriter to completely deliver a single
        sample. This topic does not apply to builtin (related to Discovery)
        and statistics DataWriters.
      - >-
        +------------------------------------------------+-----------------------------------------+------------+-------------+

        | Name                                           |
        Description                             | Values     | Default     |

        |================================================|=========================================|============|=============|

        | "<disable_heartbeat_piggyback>"                | See
        DisableHeartbeatPiggyback.          | "bool"     | "false"     |

        +------------------------------------------------+-----------------------------------------+------------+-------------+
  - source_sentence: >-
      The "enable_statistics_datawriter_with_profile()" method enables a
      DataWriter by searching a specific XML profile, requiring two parameters:
      the name of the XML profile and the name of the statistics topic to be
      enabled.
    sentences:
      - |-
        "enable_statistics_datawriter_with_profile()" method requires as
        parameters:
      - |-
        * **FIELDNAME**: is a reference to a field in the data-structure. The
          dot "." is used to navigate through nested structures. The number of
          dots that may be used in a FIELDNAME is unlimited. The FIELDNAME can
          refer to fields at any depth in the data structure. The names of the
          field are those specified in the IDL definition of the corresponding
          structure.
      - |2-
           * The TopicQos describing the behavior of the Topic. If the
             provided value is "TOPIC_QOS_DEFAULT", the value of the Default
             TopicQos is used.
  - source_sentence: >-
      <summary> ParticipantResourceLimitsQos configures allocation limits and
      physical memory usage for internal resources, including locators,
      participants, readers, writers, send buffers, data limits, and content
      filter discovery information. 
    sentences:
      - |-
        * "max_properties": Defines the maximum size, in octets, of the
          properties data in the local or remote participant.
      - |-
        Log entries can be filtered upon consumption according to their
        Category component using regular expressions. Each time an entry is
        ready to be consumed, the category filter is applied using
        "std::regex_search()". To set a category filter, member function
        "Log::SetCategoryFilter()" is used:
      - |-
        "create_datawriter_with_profile()" will return a null pointer if there
        was an error during the operation, e.g. if the provided QoS is not
        compatible or is not supported. It is advisable to check that the
        returned value is a valid pointer.
  - source_sentence: >-
      <summary>The Fast DDS Statistics module enables data collection and
      publication using DDS topics, which can be activated by setting
      "-DFASTDDS_STATISTICS=ON" during CMake configuration.>
    sentences:
      - |-

        "set_default_subscriber_qos()" member function also accepts the
        special value "SUBSCRIBER_QOS_DEFAULT" as input argument. This will
        reset the current default SubscriberQos to default constructed value
        "SubscriberQos()".
      - >-
        +------------------------------------------------------------------------------+-------------------------------------------+

        | Data Member
        Name                                                             |
        Type                                      |

        |==============================================================================|===========================================|

        |
        "last_instance_handle"                                                      
        | "InstanceHandle_t"                        |

        +------------------------------------------------------------------------------+-------------------------------------------+
      - |
        Note:  Please refer to Statistics QoS Troubleshooting for any problems
          related to the statistics module.
  - source_sentence: >-
      The transport layer provides communication services between DDS entities,
      using UDPv4, UDPv6, TCPv4, TCPv6, and SHM transports.
    sentences:
      - '* **TCPv4**: TCP communication over IPv4 (see TCP Transport).'
      - |-
        The following table shows the supported primitive types and their
        corresponding "TypeKind". The "TypeKind" is used to query the
        DynamicTypeBuilderFactory for the specific primitive DynamicType.
      - |2-
           @annotation MyAnnotation
           {
               long value;
               string name;
           };
pipeline_tag: sentence-similarity
model-index:
  - name: BGE base Fast-DDS summaries
    results:
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 768
          type: dim_768
        metrics:
          - type: cosine_accuracy@1
            value: 0.33410672853828305
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.44547563805104406
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.5034802784222738
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.5661252900232019
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.33410672853828305
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.14849187935034802
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.10069605568445474
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.05661252900232018
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.33410672853828305
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.44547563805104406
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.5034802784222738
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.5661252900232019
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.4437291164486755
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.40535023754281285
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.4159956670067687
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 512
          type: dim_512
        metrics:
          - type: cosine_accuracy@1
            value: 0.33642691415313225
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.44779582366589327
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.4965197215777262
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.5777262180974478
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.33642691415313225
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.14926527455529776
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.09930394431554523
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.057772621809744774
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.33642691415313225
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.44779582366589327
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.4965197215777262
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.5777262180974478
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.44632006141530195
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.4056724855448751
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.4154320968121733
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 256
          type: dim_256
        metrics:
          - type: cosine_accuracy@1
            value: 0.3271461716937355
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.44779582366589327
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.4988399071925754
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.5754060324825986
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.3271461716937355
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.14926527455529776
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.09976798143851506
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.05754060324825985
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.3271461716937355
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.44779582366589327
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.4988399071925754
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.5754060324825986
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.44144646221433803
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.3997293116782675
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.41051122365814446
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 128
          type: dim_128
        metrics:
          - type: cosine_accuracy@1
            value: 0.31554524361948955
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.42923433874709976
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.4802784222737819
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.5754060324825986
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.31554524361948955
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.1430781129156999
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.09605568445475636
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.05754060324825985
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.31554524361948955
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.42923433874709976
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.4802784222737819
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.5754060324825986
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.4328383223462609
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.38895517990645573
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.39937008449735967
            name: Cosine Map@100
      - task:
          type: information-retrieval
          name: Information Retrieval
        dataset:
          name: dim 64
          type: dim_64
        metrics:
          - type: cosine_accuracy@1
            value: 0.2853828306264501
            name: Cosine Accuracy@1
          - type: cosine_accuracy@3
            value: 0.41531322505800466
            name: Cosine Accuracy@3
          - type: cosine_accuracy@5
            value: 0.46867749419953597
            name: Cosine Accuracy@5
          - type: cosine_accuracy@10
            value: 0.5568445475638051
            name: Cosine Accuracy@10
          - type: cosine_precision@1
            value: 0.2853828306264501
            name: Cosine Precision@1
          - type: cosine_precision@3
            value: 0.13843774168600154
            name: Cosine Precision@3
          - type: cosine_precision@5
            value: 0.09373549883990717
            name: Cosine Precision@5
          - type: cosine_precision@10
            value: 0.0556844547563805
            name: Cosine Precision@10
          - type: cosine_recall@1
            value: 0.2853828306264501
            name: Cosine Recall@1
          - type: cosine_recall@3
            value: 0.41531322505800466
            name: Cosine Recall@3
          - type: cosine_recall@5
            value: 0.46867749419953597
            name: Cosine Recall@5
          - type: cosine_recall@10
            value: 0.5568445475638051
            name: Cosine Recall@10
          - type: cosine_ndcg@10
            value: 0.4098284836140229
            name: Cosine Ndcg@10
          - type: cosine_mrr@10
            value: 0.36409144477589944
            name: Cosine Mrr@10
          - type: cosine_map@100
            value: 0.37437465138771003
            name: Cosine Map@100
---

BGE base Fast-DDS summaries

This is a sentence-transformers model finetuned from BAAI/bge-base-en-v1.5. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: BAAI/bge-base-en-v1.5
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity
  • Language: en
  • License: apache-2.0

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("juanlofer/bge-base-fastdds-summaries-20epochs-666seed")
# Run inference
sentences = [
    'The transport layer provides communication services between DDS entities, using UDPv4, UDPv6, TCPv4, TCPv6, and SHM transports.',
    '* **TCPv4**: TCP communication over IPv4 (see TCP Transport).',
    'The following table shows the supported primitive types and their\ncorresponding "TypeKind". The "TypeKind" is used to query the\nDynamicTypeBuilderFactory for the specific primitive DynamicType.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
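
Because the model was trained with MatryoshkaLoss, its embeddings can also be truncated to the smaller dimensions evaluated below (512, 256, 128, 64) with only a modest drop in retrieval quality. A minimal sketch, assuming a sentence-transformers release that supports the truncate_dim argument (>= 2.7); the query and documents are made up for illustration:

from sentence_transformers import SentenceTransformer

# Load the model with a reduced (Matryoshka) embedding size.
model = SentenceTransformer(
    "juanlofer/bge-base-fastdds-summaries-20epochs-666seed",
    truncate_dim=256,  # any of the evaluated sizes: 768, 512, 256, 128, 64
)

query = ["How is the Fast DDS Statistics module enabled?"]  # illustrative query
docs = [
    'The Fast DDS Statistics module is activated by setting "-DFASTDDS_STATISTICS=ON" during CMake configuration.',
    "The transport layer provides communication services between DDS entities.",
]

# Encode and rank the documents against the query by cosine similarity.
query_emb = model.encode(query)
doc_embs = model.encode(docs)
scores = model.similarity(query_emb, doc_embs)
print(scores.shape)
# [1, 2]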

Evaluation

Metrics

Information Retrieval (dim_768)

Metric Value
cosine_accuracy@1 0.3341
cosine_accuracy@3 0.4455
cosine_accuracy@5 0.5035
cosine_accuracy@10 0.5661
cosine_precision@1 0.3341
cosine_precision@3 0.1485
cosine_precision@5 0.1007
cosine_precision@10 0.0566
cosine_recall@1 0.3341
cosine_recall@3 0.4455
cosine_recall@5 0.5035
cosine_recall@10 0.5661
cosine_ndcg@10 0.4437
cosine_mrr@10 0.4054
cosine_map@100 0.416

Information Retrieval (dim_512)

Metric Value
cosine_accuracy@1 0.3364
cosine_accuracy@3 0.4478
cosine_accuracy@5 0.4965
cosine_accuracy@10 0.5777
cosine_precision@1 0.3364
cosine_precision@3 0.1493
cosine_precision@5 0.0993
cosine_precision@10 0.0578
cosine_recall@1 0.3364
cosine_recall@3 0.4478
cosine_recall@5 0.4965
cosine_recall@10 0.5777
cosine_ndcg@10 0.4463
cosine_mrr@10 0.4057
cosine_map@100 0.4154

Information Retrieval (dim_256)

Metric Value
cosine_accuracy@1 0.3271
cosine_accuracy@3 0.4478
cosine_accuracy@5 0.4988
cosine_accuracy@10 0.5754
cosine_precision@1 0.3271
cosine_precision@3 0.1493
cosine_precision@5 0.0998
cosine_precision@10 0.0575
cosine_recall@1 0.3271
cosine_recall@3 0.4478
cosine_recall@5 0.4988
cosine_recall@10 0.5754
cosine_ndcg@10 0.4414
cosine_mrr@10 0.3997
cosine_map@100 0.4105

Information Retrieval (dim_128)

Metric Value
cosine_accuracy@1 0.3155
cosine_accuracy@3 0.4292
cosine_accuracy@5 0.4803
cosine_accuracy@10 0.5754
cosine_precision@1 0.3155
cosine_precision@3 0.1431
cosine_precision@5 0.0961
cosine_precision@10 0.0575
cosine_recall@1 0.3155
cosine_recall@3 0.4292
cosine_recall@5 0.4803
cosine_recall@10 0.5754
cosine_ndcg@10 0.4328
cosine_mrr@10 0.389
cosine_map@100 0.3994

Information Retrieval (dim_64)

Metric Value
cosine_accuracy@1 0.2854
cosine_accuracy@3 0.4153
cosine_accuracy@5 0.4687
cosine_accuracy@10 0.5568
cosine_precision@1 0.2854
cosine_precision@3 0.1384
cosine_precision@5 0.0937
cosine_precision@10 0.0557
cosine_recall@1 0.2854
cosine_recall@3 0.4153
cosine_recall@5 0.4687
cosine_recall@10 0.5568
cosine_ndcg@10 0.4098
cosine_mrr@10 0.3641
cosine_map@100 0.3744
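
Each table above reports the retrieval metrics at one Matryoshka dimension. A sketch of how such metrics are typically produced with the sentence-transformers InformationRetrievalEvaluator; the queries, corpus, and relevance judgements below are toy placeholders, not the actual evaluation data:

from sentence_transformers.evaluation import InformationRetrievalEvaluator

# Toy data for illustration only.
queries = {"q1": "How is the Fast DDS Statistics module enabled?"}
corpus = {
    "d1": 'Set "-DFASTDDS_STATISTICS=ON" during CMake configuration.',
    "d2": "The transport layer provides communication services between DDS entities.",
}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(
    queries=queries,
    corpus=corpus,
    relevant_docs=relevant_docs,
    name="dim_768",
    truncate_dim=768,  # repeat with 512, 256, 128 and 64 for the other tables
)
results = evaluator(model)  # `model` is the SentenceTransformer loaded in the Usage section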

Training Details

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: epoch
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • gradient_accumulation_steps: 16
  • learning_rate: 2e-05
  • num_train_epochs: 20
  • lr_scheduler_type: cosine
  • warmup_ratio: 0.1
  • fp16: True
  • tf32: False
  • load_best_model_at_end: True
  • optim: adamw_torch_fused
  • batch_sampler: no_duplicates
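
A sketch of how the non-default hyperparameters listed above map onto the sentence-transformers 3.x training arguments API; output_dir is a hypothetical placeholder, not the original training path:

from sentence_transformers.training_args import (
    SentenceTransformerTrainingArguments,
    BatchSamplers,
)

args = SentenceTransformerTrainingArguments(
    output_dir="bge-base-fastdds-summaries",  # placeholder
    eval_strategy="epoch",
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=16,
    learning_rate=2e-5,
    num_train_epochs=20,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    fp16=True,
    tf32=False,
    load_best_model_at_end=True,
    optim="adamw_torch_fused",
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)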

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: epoch
  • prediction_loss_only: True
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 16
  • eval_accumulation_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 20
  • max_steps: -1
  • lr_scheduler_type: cosine
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: False
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: True
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch_fused
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional
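
Putting the pieces together, a hedged sketch of how the MatryoshkaLoss / MultipleNegativesRankingLoss combination named in the tags is typically wired into a trainer; train_dataset is a placeholder for the 3,877-pair training set (not published here), and args refers to the training arguments sketched above:

from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("BAAI/bge-base-en-v1.5")

# MultipleNegativesRankingLoss treats the other in-batch passages as negatives;
# MatryoshkaLoss applies it at every truncated embedding dimension.
base_loss = MultipleNegativesRankingLoss(model)
loss = MatryoshkaLoss(model, base_loss, matryoshka_dims=[768, 512, 256, 128, 64])

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,                    # see the sketch above
    train_dataset=train_dataset,  # placeholder: (summary, passage) pairs
    loss=loss,
)
trainer.train()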

Training Logs

Epoch Step Training Loss dim_128_cosine_map@100 dim_256_cosine_map@100 dim_512_cosine_map@100 dim_64_cosine_map@100 dim_768_cosine_map@100
0.6584 10 5.9441 - - - - -
0.9877 15 - 0.3686 0.3792 0.3819 0.3414 0.3795
1.3128 20 4.7953 - - - - -
1.9712 30 3.77 0.3854 0.3963 0.3962 0.3682 0.3995
2.6255 40 2.9211 - - - - -
2.9547 45 - 0.3866 0.3919 0.3958 0.3759 0.3963
3.2798 50 2.4548 - - - - -
3.9383 60 2.0513 - - - - -
4.0041 61 - 0.3808 0.4018 0.3980 0.3647 0.3962
4.5926 70 1.5898 - - - - -
4.9877 76 - 0.3829 0.4029 0.4035 0.3625 0.4014
5.2469 80 1.4677 - - - - -
5.9053 90 1.1974 - - - - -
5.9712 91 - 0.3918 0.4006 0.4041 0.3654 0.4033
6.5597 100 0.9285 - - - - -
6.9547 106 - 0.3914 0.4019 0.4033 0.3678 0.4014
7.2140 110 0.9214 - - - - -
7.8724 120 0.8141 - - - - -
8.0041 122 - 0.3914 0.3993 0.4071 0.3670 0.4027
8.5267 130 0.6706 - - - - -
8.9877 137 - 0.3903 0.4033 0.4060 0.3721 0.4060
9.1811 140 0.6388 - - - - -
9.8395 150 0.5466 - - - - -
9.9712 152 - 0.3915 0.4020 0.4079 0.3673 0.4046
10.4938 160 0.466 - - - - -
10.9547 167 - 0.3963 0.4069 0.4112 0.3697 0.4078
11.1481 170 0.4709 - - - - -
11.8066 180 0.437 - - - - -
12.0041 183 - 0.4003 0.4051 0.4096 0.3701 0.4059
12.4609 190 0.3678 - - - - -
12.9877 198 - 0.3976 0.4075 0.4088 0.3713 0.4080
13.1152 200 0.3944 - - - - -
13.7737 210 0.361 - - - - -
13.9712 213 - 0.3966 0.4091 0.4096 0.3724 0.4107
14.4280 220 0.2977 - - - - -
14.9547 228 - 0.3979 0.4102 0.4149 0.3744 0.4143
15.0823 230 0.3306 - - - - -
15.7407 240 0.3075 - - - - -
16.0041 244 - 0.3991 0.4102 0.4156 0.3726 0.4148
16.3951 250 0.2777 - - - - -
16.9877 259 - 0.3990 0.4101 0.4154 0.3743 0.4167
17.0494 260 0.3044 - - - - -
17.7078 270 0.2885 - - - - -
17.9712 274 - 0.3991 0.4099 0.4153 0.3746 0.4167
18.3621 280 0.2862 - - - - -
18.9547 289 - 0.3994 0.4105 0.4154 0.3743 0.4156
19.0165 290 0.2974 - - - - -
19.6749 300 0.2648 0.3994 0.4105 0.4154 0.3744 0.4160
  • The last row (epoch 19.6749, step 300) denotes the saved checkpoint; its metrics match those reported above.

Framework Versions

  • Python: 3.10.13
  • Sentence Transformers: 3.0.1
  • Transformers: 4.41.2
  • PyTorch: 2.1.2
  • Accelerate: 0.30.1
  • Datasets: 2.19.1
  • Tokenizers: 0.19.1

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MatryoshkaLoss

@misc{kusupati2024matryoshka,
    title={Matryoshka Representation Learning}, 
    author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
    year={2024},
    eprint={2205.13147},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}

MultipleNegativesRankingLoss

@misc{henderson2017efficient,
    title={Efficient Natural Language Response Suggestion for Smart Reply}, 
    author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
    year={2017},
    eprint={1705.00652},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}