---
language:
- id
library_name: sentence-transformers
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:10000
- loss:SoftmaxLoss
base_model: indobenchmark/indobert-base-p2
datasets:
- afaji/indonli
metrics:
- pearson_cosine
- spearman_cosine
- pearson_manhattan
- spearman_manhattan
- pearson_euclidean
- spearman_euclidean
- pearson_dot
- spearman_dot
- pearson_max
- spearman_max
widget:
- source_sentence: '"Berbagai macam jenis minuman sehat untuk mengembalikan ion ataupun mengandung vitamin, dapat kita temui dengan mudah di sekitar."'
  sentences:
  - Moody's tidak memiliki metrik peringkat untuk penerbit sekuritas yang dikenai pajak.
  - Lupa olahraga adalah alasan yang selalu digunakan untuk tak berolahraga.
  - Minuman sehat sulit ditemui.
- source_sentence: Mayweather menepis anggapan bahwa McGregor yang merupakan petarung kidal mungkin menyebabkan masalah baginya.
  sentences:
  - Cimahi Selatan merupakan sebuah Kecamatan di Kota Cimahi.
  - Masyarakat umum dilibatkan untuk memberikan respon dalam acara dengar pendapat CRTC.
  - McGregor dan Mayweather pernah bertarung dengan sengit.
- source_sentence: Wonosobo adalah salah satu kabupaten yang terdapat di Provinsi Jawa Tengah.
  sentences:
  - Tidak terdapat kabupaten di Provinsi Jawa Tengah.
  - Nogizaka46 sekarang sudah merilis 25 singel.
  - Joko Driyono adalah Wakil Ketua Umum PSSI.
- source_sentence: Bangunan ini digunakan untuk penjualan berbagai material.
  sentences:
  - Istri bisa mengidamkan makanan yang mudah dicari.
  - Saluran telepon tidak digunakan oleh FastNet dalam menyediakan akses internet.
  - Bangunan ini digunakan untuk penjualan.
- source_sentence: Set album musik pengiring seri film Harry Potter akan dirilis dalam versi baru.
  sentences:
  - Seri film Harry Potter memiliki set album musik pengiring.
  - Daya tahan tubuh bayi tidak terjaga walaupun diberi ASI.
  - Laga dan kolosal adalah genre film.
pipeline_tag: sentence-similarity
model-index:
- name: SentenceTransformer based on indobenchmark/indobert-base-p2
  results:
  - task:
      type: semantic-similarity
      name: Semantic Similarity
    dataset:
      name: sts dev
      type: sts-dev
    metrics:
    - type: pearson_cosine
      value: 0.3021139089985203
      name: Pearson Cosine
    - type: spearman_cosine
      value: 0.30301169986128346
      name: Spearman Cosine
    - type: pearson_manhattan
      value: 0.2767840491173264
      name: Pearson Manhattan
    - type: spearman_manhattan
      value: 0.2725949754810958
      name: Spearman Manhattan
    - type: pearson_euclidean
      value: 0.3071661849384816
      name: Pearson Euclidean
    - type: spearman_euclidean
      value: 0.3044966278223258
      name: Spearman Euclidean
    - type: pearson_dot
      value: 0.3039090779569512
      name: Pearson Dot
    - type: spearman_dot
      value: 0.3047234168200123
      name: Spearman Dot
    - type: pearson_max
      value: 0.3071661849384816
      name: Pearson Max
    - type: spearman_max
      value: 0.3047234168200123
      name: Spearman Max
  - task:
      type: semantic-similarity
      name: Semantic Similarity
    dataset:
      name: sts test
      type: sts-test
    metrics:
    - type: pearson_cosine
      value: 0.10382066164158449
      name: Pearson Cosine
    - type: spearman_cosine
      value: 0.09693567465932618
      name: Spearman Cosine
    - type: pearson_manhattan
      value: 0.07492996229311771
      name: Pearson Manhattan
    - type: spearman_manhattan
      value: 0.07823414156216839
      name: Spearman Manhattan
    - type: pearson_euclidean
      value: 0.09422022261567607
      name: Pearson Euclidean
    - type: spearman_euclidean
      value: 0.09902189422521299
      name: Spearman Euclidean
    - type: pearson_dot
      value: 0.10695495102872325
      name: Pearson Dot
    - type: spearman_dot
      value: 0.09978448101169902
      name: Spearman Dot
    - type: pearson_max
      value: 0.10695495102872325
      name: Pearson Max
    - type: spearman_max
      value: 0.09978448101169902
      name: Spearman Max
---

# SentenceTransformer based on indobenchmark/indobert-base-p2

This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [indobenchmark/indobert-base-p2](https://huggingface.co/indobenchmark/indobert-base-p2) on the [afaji/indonli](https://huggingface.co/datasets/afaji/indonli) dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details

### Model Description

- **Model Type:** Sentence Transformer
- **Base model:** [indobenchmark/indobert-base-p2](https://huggingface.co/indobenchmark/indobert-base-p2)
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
    - [afaji/indonli](https://huggingface.co/datasets/afaji/indonli)
- **Language:** id

### Model Sources

- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)

### Full Model Architecture

```
SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
```

## Usage

### Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

```bash
pip install -U sentence-transformers
```

Then you can load this model and run inference.

```python
from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("cassador/indobert-base-p2-nli-v2")
# Run inference
sentences = [
    'Set album musik pengiring seri film Harry Potter akan dirilis dalam versi baru.',
    'Seri film Harry Potter memiliki set album musik pengiring.',
    'Laga dan kolosal adalah genre film.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
```

## Evaluation

### Metrics

#### Semantic Similarity

* Dataset: `sts-dev`
* Evaluated with [EmbeddingSimilarityEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)

| Metric              | Value     |
|:--------------------|:----------|
| pearson_cosine      | 0.3021    |
| **spearman_cosine** | **0.303** |
| pearson_manhattan   | 0.2768    |
| spearman_manhattan  | 0.2726    |
| pearson_euclidean   | 0.3072    |
| spearman_euclidean  | 0.3045    |
| pearson_dot         | 0.3039    |
| spearman_dot        | 0.3047    |
| pearson_max         | 0.3072    |
| spearman_max        | 0.3047    |

#### Semantic Similarity

* Dataset: `sts-test`
* Evaluated with [EmbeddingSimilarityEvaluator](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.EmbeddingSimilarityEvaluator)

| Metric              | Value      |
|:--------------------|:-----------|
| pearson_cosine      | 0.1038     |
| **spearman_cosine** | **0.0969** |
| pearson_manhattan   | 0.0749     |
| spearman_manhattan  | 0.0782     |
| pearson_euclidean   | 0.0942     |
| spearman_euclidean  | 0.099      |
| pearson_dot         | 0.107      |
| spearman_dot        | 0.0998     |
| pearson_max         | 0.107      |
| spearman_max        | 0.0998     |
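The scores above come from the standard `EmbeddingSimilarityEvaluator` workflow. As a rough sketch of how such an evaluation can be reproduced (the sentence pairs and gold scores below are illustrative placeholders, not the actual sts-dev data):

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer("cassador/indobert-base-p2-nli-v2")

# Placeholder evaluation data: two parallel lists of sentences plus gold
# similarity scores in [0, 1]. Substitute the real sts-dev pairs to
# reproduce the table above.
sentences1 = ["Wonosobo adalah salah satu kabupaten yang terdapat di Provinsi Jawa Tengah."]
sentences2 = ["Tidak terdapat kabupaten di Provinsi Jawa Tengah."]
gold_scores = [0.0]

evaluator = EmbeddingSimilarityEvaluator(sentences1, sentences2, gold_scores, name="sts-dev")
print(evaluator(model))  # dict of Pearson/Spearman correlations per similarity function
```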
## Training Details

### Training Dataset

#### afaji/indonli

* Dataset: [afaji/indonli](https://huggingface.co/datasets/afaji/indonli)
* Size: 10,000 training samples
* Columns: premise, hypothesis, and label
* Approximate statistics based on the first 1000 samples:

| | premise | hypothesis | label |
|:--------|:--------|:-----------|:------|
| type | string | string | int |
| details | | | |

* Samples:

| premise | hypothesis | label |
|:--------|:-----------|:------|
| Presiden Joko Widodo (Jokowi) menyampaikan prediksi bahwa wabah virus Corona (COVID-19) di Indonesia akan selesai akhir tahun ini. | Prediksi akhir wabah tidak disampaikan Jokowi. | 0 |
| Meski biasanya hanya digunakan di fasilitas kesehatan, saat ini masker dan sarung tangan sekali pakai banyak dipakai di tingkat rumah tangga. | Masker sekali pakai banyak dipakai di tingkat rumah tangga. | 1 |
| Data dari Nielsen Music mencatat, "Joanne" telah terjual 201 ribu kopi di akhir minggu ini, seperti dilansir aceshowbiz.com. | Nielsen Music mencatat pada akhir minggu ini. | 0 |

* Loss: [SoftmaxLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#softmaxloss)

### Evaluation Dataset

#### afaji/indonli

* Dataset: [afaji/indonli](https://huggingface.co/datasets/afaji/indonli)
* Size: 2,000 evaluation samples
* Columns: premise, hypothesis, and label
* Approximate statistics based on the first 1000 samples:

| | premise | hypothesis | label |
|:--------|:--------|:-----------|:------|
| type | string | string | int |
| details | | | |

* Samples:

| premise | hypothesis | label |
|:--------|:-----------|:------|
| Manuskrip tersebut berisi tiga catatan yang menceritakan bagaimana peristiwa jatuhnya meteorit serta laporan kematian akibat kejadian tersebut seperti dilansir dari Science Alert, Sabtu (25/4/2020). | Manuskrip tersebut tidak mencatat laporan kematian. | 0 |
| Dilansir dari Business Insider, menurut observasi dari Mauna Loa Observatory di Hawaii pada karbon dioksida (CO2) di level mencapai 410 ppm tidak langsung memberikan efek pada pernapasan, karena tubuh manusia juga masih membutuhkan CO2 dalam kadar tertentu. | Tidak ada observasi yang pernah dilansir oleh Business Insider. | 0 |
| Perekonomian Jakarta terutama ditunjang oleh sektor perdagangan, jasa, properti, industri kreatif, dan keuangan. | Sektor jasa memberi pengaruh lebih besar daripada industri kreatif dalam perekonomian Jakarta. | 0 |

* Loss: [SoftmaxLoss](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#softmaxloss)
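For orientation, this is roughly how `SoftmaxLoss` is set up over premise/hypothesis/label triplets in Sentence Transformers; a minimal sketch assuming training followed the library's standard recipe, not a verbatim copy of the training script:

```python
from datasets import load_dataset
from sentence_transformers import SentenceTransformer, losses

model = SentenceTransformer("indobenchmark/indobert-base-p2")

# afaji/indonli yields premise/hypothesis pairs with an integer NLI label.
# SoftmaxLoss concatenates the two sentence embeddings as (u, v, |u-v|) and
# trains a classifier head over the three NLI classes.
train_dataset = load_dataset("afaji/indonli", split="train").select(range(10_000))

loss = losses.SoftmaxLoss(
    model=model,
    sentence_embedding_dimension=model.get_sentence_embedding_dimension(),
    num_labels=3,
)
```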
### Training Hyperparameters

#### Non-Default Hyperparameters

- `eval_strategy`: epoch
- `learning_rate`: 1e-05
- `num_train_epochs`: 10
- `warmup_ratio`: 0.001
- `fp16`: True

#### All Hyperparameters

<details><summary>Click to expand</summary>

- `overwrite_output_dir`: False
- `do_predict`: False
- `eval_strategy`: epoch
- `prediction_loss_only`: True
- `per_device_train_batch_size`: 8
- `per_device_eval_batch_size`: 8
- `per_gpu_train_batch_size`: None
- `per_gpu_eval_batch_size`: None
- `gradient_accumulation_steps`: 1
- `eval_accumulation_steps`: None
- `learning_rate`: 1e-05
- `weight_decay`: 0.0
- `adam_beta1`: 0.9
- `adam_beta2`: 0.999
- `adam_epsilon`: 1e-08
- `max_grad_norm`: 1.0
- `num_train_epochs`: 10
- `max_steps`: -1
- `lr_scheduler_type`: linear
- `lr_scheduler_kwargs`: {}
- `warmup_ratio`: 0.001
- `warmup_steps`: 0
- `log_level`: passive
- `log_level_replica`: warning
- `log_on_each_node`: True
- `logging_nan_inf_filter`: True
- `save_safetensors`: True
- `save_on_each_node`: False
- `save_only_model`: False
- `restore_callback_states_from_checkpoint`: False
- `no_cuda`: False
- `use_cpu`: False
- `use_mps_device`: False
- `seed`: 42
- `data_seed`: None
- `jit_mode_eval`: False
- `use_ipex`: False
- `bf16`: False
- `fp16`: True
- `fp16_opt_level`: O1
- `half_precision_backend`: auto
- `bf16_full_eval`: False
- `fp16_full_eval`: False
- `tf32`: None
- `local_rank`: 0
- `ddp_backend`: None
- `tpu_num_cores`: None
- `tpu_metrics_debug`: False
- `debug`: []
- `dataloader_drop_last`: False
- `dataloader_num_workers`: 0
- `dataloader_prefetch_factor`: None
- `past_index`: -1
- `disable_tqdm`: False
- `remove_unused_columns`: True
- `label_names`: None
- `load_best_model_at_end`: False
- `ignore_data_skip`: False
- `fsdp`: []
- `fsdp_min_num_params`: 0
- `fsdp_config`: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- `fsdp_transformer_layer_cls_to_wrap`: None
- `accelerator_config`: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- `deepspeed`: None
- `label_smoothing_factor`: 0.0
- `optim`: adamw_torch
- `optim_args`: None
- `adafactor`: False
- `group_by_length`: False
- `length_column_name`: length
- `ddp_find_unused_parameters`: None
- `ddp_bucket_cap_mb`: None
- `ddp_broadcast_buffers`: False
- `dataloader_pin_memory`: True
- `dataloader_persistent_workers`: False
- `skip_memory_metrics`: True
- `use_legacy_prediction_loop`: False
- `push_to_hub`: False
- `resume_from_checkpoint`: None
- `hub_model_id`: None
- `hub_strategy`: every_save
- `hub_private_repo`: False
- `hub_always_push`: False
- `gradient_checkpointing`: False
- `gradient_checkpointing_kwargs`: None
- `include_inputs_for_metrics`: False
- `eval_do_concat_batches`: True
- `fp16_backend`: auto
- `push_to_hub_model_id`: None
- `push_to_hub_organization`: None
- `mp_parameters`: 
- `auto_find_batch_size`: False
- `full_determinism`: False
- `torchdynamo`: None
- `ray_scope`: last
- `ddp_timeout`: 1800
- `torch_compile`: False
- `torch_compile_backend`: None
- `torch_compile_mode`: None
- `dispatch_batches`: None
- `split_batches`: None
- `include_tokens_per_second`: False
- `include_num_input_tokens_seen`: False
- `neftune_noise_alpha`: None
- `optim_target_modules`: None
- `batch_eval_metrics`: False
- `batch_sampler`: batch_sampler
- `multi_dataset_batch_sampler`: proportional

</details>
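The non-default values above translate directly into `SentenceTransformerTrainingArguments`; a minimal sketch (`output_dir` is a placeholder):

```python
from sentence_transformers import SentenceTransformerTrainingArguments

# Only the notable hyperparameters from this card are set explicitly;
# every other field keeps its library default. "output/" is a placeholder.
args = SentenceTransformerTrainingArguments(
    output_dir="output/",
    eval_strategy="epoch",
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    learning_rate=1e-5,
    num_train_epochs=10,
    warmup_ratio=0.001,
    fp16=True,
)
```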
### Training Logs
<details><summary>Click to expand</summary>

| Epoch | Step | Training Loss | Validation Loss | sts-dev_spearman_cosine | sts-test_spearman_cosine |
|:------:|:-----:|:-------------:|:---------------:|:-----------------------:|:------------------------:|
| 0 | 0 | - | - | 0.1928 | - |
| 0.04 | 100 | 1.1407 | - | - | - |
| 0.08 | 200 | 0.7456 | - | - | - |
| 0.12 | 300 | 0.6991 | - | - | - |
| 0.16 | 400 | 0.6653 | - | - | - |
| 0.2 | 500 | 0.6317 | - | - | - |
| 0.24 | 600 | 0.5975 | - | - | - |
| 0.28 | 700 | 0.5955 | - | - | - |
| 0.32 | 800 | 0.6168 | - | - | - |
| 0.36 | 900 | 0.5851 | - | - | - |
| 0.4 | 1000 | 0.591 | - | - | - |
| 0.44 | 1100 | 0.6063 | - | - | - |
| 0.48 | 1200 | 0.6122 | - | - | - |
| 0.52 | 1300 | 0.5881 | - | - | - |
| 0.56 | 1400 | 0.59 | - | - | - |
| 0.6 | 1500 | 0.5715 | - | - | - |
| 0.64 | 1600 | 0.5725 | - | - | - |
| 0.68 | 1700 | 0.5771 | - | - | - |
| 0.72 | 1800 | 0.5935 | - | - | - |
| 0.76 | 1900 | 0.584 | - | - | - |
| 0.8 | 2000 | 0.5829 | - | - | - |
| 0.84 | 2100 | 0.5507 | - | - | - |
| 0.88 | 2200 | 0.5447 | - | - | - |
| 0.92 | 2300 | 0.6059 | - | - | - |
| 0.96 | 2400 | 0.5389 | - | - | - |
| 1.0 | 2500 | 0.639 | 0.5432 | 0.4007 | - |
| 1.04 | 2600 | 0.463 | - | - | - |
| 1.08 | 2700 | 0.4936 | - | - | - |
| 1.12 | 2800 | 0.4966 | - | - | - |
| 1.16 | 2900 | 0.4588 | - | - | - |
| 1.2 | 3000 | 0.5148 | - | - | - |
| 1.24 | 3100 | 0.5043 | - | - | - |
| 1.28 | 3200 | 0.5048 | - | - | - |
| 1.32 | 3300 | 0.4803 | - | - | - |
| 1.3600 | 3400 | 0.465 | - | - | - |
| 1.4 | 3500 | 0.5133 | - | - | - |
| 1.44 | 3600 | 0.5505 | - | - | - |
| 1.48 | 3700 | 0.4498 | - | - | - |
| 1.52 | 3800 | 0.5418 | - | - | - |
| 1.56 | 3900 | 0.5268 | - | - | - |
| 1.6 | 4000 | 0.4546 | - | - | - |
| 1.6400 | 4100 | 0.5279 | - | - | - |
| 1.6800 | 4200 | 0.5309 | - | - | - |
| 1.72 | 4300 | 0.487 | - | - | - |
| 1.76 | 4400 | 0.5371 | - | - | - |
| 1.8 | 4500 | 0.5097 | - | - | - |
| 1.8400 | 4600 | 0.5242 | - | - | - |
| 1.88 | 4700 | 0.4583 | - | - | - |
| 1.92 | 4800 | 0.4923 | - | - | - |
| 1.96 | 4900 | 0.5028 | - | - | - |
| 2.0 | 5000 | 0.5139 | 0.6274 | 0.4335 | - |
| 2.04 | 5100 | 0.322 | - | - | - |
| 2.08 | 5200 | 0.389 | - | - | - |
| 2.12 | 5300 | 0.3633 | - | - | - |
| 2.16 | 5400 | 0.3868 | - | - | - |
| 2.2 | 5500 | 0.3798 | - | - | - |
| 2.24 | 5600 | 0.4385 | - | - | - |
| 2.2800 | 5700 | 0.3965 | - | - | - |
| 2.32 | 5800 | 0.3895 | - | - | - |
| 2.36 | 5900 | 0.4484 | - | - | - |
| 2.4 | 6000 | 0.3452 | - | - | - |
| 2.44 | 6100 | 0.3905 | - | - | - |
| 2.48 | 6200 | 0.376 | - | - | - |
| 2.52 | 6300 | 0.4986 | - | - | - |
| 2.56 | 6400 | 0.3732 | - | - | - |
| 2.6 | 6500 | 0.3632 | - | - | - |
| 2.64 | 6600 | 0.3915 | - | - | - |
| 2.68 | 6700 | 0.4394 | - | - | - |
| 2.7200 | 6800 | 0.3852 | - | - | - |
| 2.76 | 6900 | 0.3984 | - | - | - |
| 2.8 | 7000 | 0.426 | - | - | - |
| 2.84 | 7100 | 0.3274 | - | - | - |
| 2.88 | 7200 | 0.4673 | - | - | - |
| 2.92 | 7300 | 0.4599 | - | - | - |
| 2.96 | 7400 | 0.4304 | - | - | - |
| 3.0 | 7500 | 0.4151 | 0.8967 | 0.4007 | - |
| 3.04 | 7600 | 0.2345 | - | - | - |
| 3.08 | 7700 | 0.1807 | - | - | - |
| 3.12 | 7800 | 0.2984 | - | - | - |
| 3.16 | 7900 | 0.2357 | - | - | - |
| 3.2 | 8000 | 0.4506 | - | - | - |
| 3.24 | 8100 | 0.2178 | - | - | - |
| 3.2800 | 8200 | 0.2654 | - | - | - |
| 3.32 | 8300 | 0.2863 | - | - | - |
| 3.36 | 8400 | 0.2626 | - | - | - |
| 3.4 | 8500 | 0.3281 | - | - | - |
| 3.44 | 8600 | 0.2555 | - | - | - |
| 3.48 | 8700 | 0.4245 | - | - | - |
| 3.52 | 8800 | 0.2368 | - | - | - |
| 3.56 | 8900 | 0.3288 | - | - | - |
| 3.6 | 9000 | 0.3417 | - | - | - |
| 3.64 | 9100 | 0.3249 | - | - | - |
| 3.68 | 9200 | 0.3378 | - | - | - |
| 3.7200 | 9300 | 0.233 | - | - | - |
| 3.76 | 9400 | 0.3215 | - | - | - |
| 3.8 | 9500 | 0.251 | - | - | - |
| 3.84 | 9600 | 0.3138 | - | - | - |
| 3.88 | 9700 | 0.3081 | - | - | - |
| 3.92 | 9800 | 0.3875 | - | - | - |
| 3.96 | 9900 | 0.3231 | - | - | - |
| 4.0 | 10000 | 0.2119 | 1.4983 | 0.4129 | - |
| 4.04 | 10100 | 0.1323 | - | - | - |
| 4.08 | 10200 | 0.2222 | - | - | - |
| 4.12 | 10300 | 0.2005 | - | - | - |
| 4.16 | 10400 | 0.127 | - | - | - |
| 4.2 | 10500 | 0.1052 | - | - | - |
| 4.24 | 10600 | 0.1657 | - | - | - |
| 4.28 | 10700 | 0.2305 | - | - | - |
| 4.32 | 10800 | 0.1048 | - | - | - |
| 4.36 | 10900 | 0.2081 | - | - | - |
| 4.4 | 11000 | 0.201 | - | - | - |
| 4.44 | 11100 | 0.1515 | - | - | - |
| 4.48 | 11200 | 0.2112 | - | - | - |
| 4.52 | 11300 | 0.1936 | - | - | - |
| 4.5600 | 11400 | 0.1578 | - | - | - |
| 4.6 | 11500 | 0.2551 | - | - | - |
| 4.64 | 11600 | 0.2888 | - | - | - |
| 4.68 | 11700 | 0.128 | - | - | - |
| 4.72 | 11800 | 0.2172 | - | - | - |
| 4.76 | 11900 | 0.114 | - | - | - |
| 4.8 | 12000 | 0.2135 | - | - | - |
| 4.84 | 12100 | 0.2421 | - | - | - |
| 4.88 | 12200 | 0.2392 | - | - | - |
| 4.92 | 12300 | 0.1478 | - | - | - |
| 4.96 | 12400 | 0.1901 | - | - | - |
| 5.0 | 12500 | 0.2219 | 1.9582 | 0.3469 | - |
| 5.04 | 12600 | 0.1586 | - | - | - |
| 5.08 | 12700 | 0.1587 | - | - | - |
| 5.12 | 12800 | 0.0663 | - | - | - |
| 5.16 | 12900 | 0.0703 | - | - | - |
| 5.2 | 13000 | 0.0783 | - | - | - |
| 5.24 | 13100 | 0.1143 | - | - | - |
| 5.28 | 13200 | 0.1155 | - | - | - |
| 5.32 | 13300 | 0.0661 | - | - | - |
| 5.36 | 13400 | 0.0935 | - | - | - |
| 5.4 | 13500 | 0.1344 | - | - | - |
| 5.44 | 13600 | 0.1031 | - | - | - |
| 5.48 | 13700 | 0.1294 | - | - | - |
| 5.52 | 13800 | 0.103 | - | - | - |
| 5.5600 | 13900 | 0.0739 | - | - | - |
| 5.6 | 14000 | 0.1477 | - | - | - |
| 5.64 | 14100 | 0.1171 | - | - | - |
| 5.68 | 14200 | 0.1504 | - | - | - |
| 5.72 | 14300 | 0.1122 | - | - | - |
| 5.76 | 14400 | 0.1279 | - | - | - |
| 5.8 | 14500 | 0.0813 | - | - | - |
| 5.84 | 14600 | 0.1372 | - | - | - |
| 5.88 | 14700 | 0.1615 | - | - | - |
| 5.92 | 14800 | 0.1944 | - | - | - |
| 5.96 | 14900 | 0.0436 | - | - | - |
| 6.0 | 15000 | 0.1195 | 2.2220 | 0.3559 | - |
| 0.08 | 100 | 0.0844 | - | - | - |
| 0.16 | 200 | 0.1357 | - | - | - |
| 0.24 | 300 | 0.1382 | - | - | - |
| 0.32 | 400 | 0.2091 | - | - | - |
| 0.4 | 500 | 0.2351 | - | - | - |
| 0.48 | 600 | 0.2976 | - | - | - |
| 0.56 | 700 | 0.3408 | - | - | - |
| 0.64 | 800 | 0.2656 | - | - | - |
| 0.72 | 900 | 0.3183 | - | - | - |
| 0.8 | 1000 | 0.2513 | - | - | - |
| 0.88 | 1100 | 0.2293 | - | - | - |
| 0.96 | 1200 | 0.3241 | - | - | - |
| 1.0 | 1250 | - | 1.1813 | 0.3495 | - |
| 0.3195 | 100 | 0.6132 | - | - | - |
| 0.6390 | 200 | 0.1554 | - | - | - |
| 0.9585 | 300 | 0.1366 | - | - | - |
| 1.0 | 313 | - | 1.2867 | 0.3839 | - |
| 0.08 | 100 | 0.2713 | - | - | - |
| 0.16 | 200 | 0.1273 | - | - | - |
| 0.24 | 300 | 0.0883 | - | - | - |
| 0.32 | 400 | 0.0749 | - | - | - |
| 0.08 | 100 | 0.0653 | - | - | - |
| 0.16 | 200 | 0.0311 | - | - | - |
| 0.24 | 300 | 0.0368 | - | - | - |
| 0.32 | 400 | 0.0259 | - | - | - |
| 0.4 | 500 | 0.059 | - | - | - |
| 0.48 | 600 | 0.046 | - | - | - |
| 0.56 | 700 | 0.1266 | - | - | - |
| 0.64 | 800 | 0.0661 | - | - | - |
| 0.72 | 900 | 0.0676 | - | - | - |
| 0.8 | 1000 | 0.0759 | - | - | - |
| 0.88 | 1100 | 0.0527 | - | - | - |
| 0.96 | 1200 | 0.1038 | - | - | - |
| 1.0 | 1250 | - | 2.2411 | 0.3892 | - |
| 1.04 | 1300 | 0.0456 | - | - | - |
| 1.12 | 1400 | 0.1363 | - | - | - |
| 1.2 | 1500 | 0.1398 | - | - | - |
| 1.28 | 1600 | 0.1237 | - | - | - |
| 1.3600 | 1700 | 0.123 | - | - | - |
| 1.44 | 1800 | 0.1893 | - | - | - |
| 1.52 | 1900 | 0.1192 | - | - | - |
| 1.6 | 2000 | 0.1347 | - | - | - |
| 1.6800 | 2100 | 0.0937 | - | - | - |
| 1.76 | 2200 | 0.1506 | - | - | - |
| 1.8400 | 2300 | 0.1366 | - | - | - |
| 1.92 | 2400 | 0.1194 | - | - | - |
| 2.0 | 2500 | 0.1485 | 2.1340 | 0.3245 | - |
| 2.08 | 2600 | 0.0485 | - | - | - |
| 2.16 | 2700 | 0.0579 | - | - | - |
| 2.24 | 2800 | 0.0932 | - | - | - |
| 2.32 | 2900 | 0.0743 | - | - | - |
| 2.4 | 3000 | 0.0783 | - | - | - |
| 2.48 | 3100 | 0.0918 | - | - | - |
| 2.56 | 3200 | 0.0973 | - | - | - |
| 2.64 | 3300 | 0.0623 | - | - | - |
| 2.7200 | 3400 | 0.1284 | - | - | - |
| 2.8 | 3500 | 0.1247 | - | - | - |
| 2.88 | 3600 | 0.0648 | - | - | - |
| 2.96 | 3700 | 0.0921 | - | - | - |
| 3.0 | 3750 | - | 2.4354 | 0.2824 | - |
| 3.04 | 3800 | 0.04 | - | - | - |
| 3.12 | 3900 | 0.0417 | - | - | - |
| 3.2 | 4000 | 0.0414 | - | - | - |
| 3.2800 | 4100 | 0.0485 | - | - | - |
| 3.36 | 4200 | 0.0255 | - | - | - |
| 3.44 | 4300 | 0.0688 | - | - | - |
| 3.52 | 4400 | 0.0574 | - | - | - |
| 3.6 | 4500 | 0.0766 | - | - | - |
| 3.68 | 4600 | 0.0481 | - | - | - |
| 3.76 | 4700 | 0.06 | - | - | - |
| 3.84 | 4800 | 0.0528 | - | - | - |
| 3.92 | 4900 | 0.0426 | - | - | - |
| 4.0 | 5000 | 0.092 | 2.5427 | 0.3284 | - |
| 4.08 | 5100 | 0.0349 | - | - | - |
| 4.16 | 5200 | 0.0107 | - | - | - |
| 4.24 | 5300 | 0.0608 | - | - | - |
| 4.32 | 5400 | 0.0473 | - | - | - |
| 4.4 | 5500 | 0.0452 | - | - | - |
| 4.48 | 5600 | 0.0316 | - | - | - |
| 4.5600 | 5700 | 0.0096 | - | - | - |
| 4.64 | 5800 | 0.0511 | - | - | - |
| 4.72 | 5900 | 0.0207 | - | - | - |
| 4.8 | 6000 | 0.0061 | - | - | - |
| 4.88 | 6100 | 0.0381 | - | - | - |
| 4.96 | 6200 | 0.0378 | - | - | - |
| 5.0 | 6250 | - | 2.6061 | 0.3061 | - |
| 5.04 | 6300 | 0.0326 | - | - | - |
| 5.12 | 6400 | 0.0349 | - | - | - |
| 5.2 | 6500 | 0.0128 | - | - | - |
| 5.28 | 6600 | 0.0185 | - | - | - |
| 5.36 | 6700 | 0.0145 | - | - | - |
| 5.44 | 6800 | 0.0521 | - | - | - |
| 5.52 | 6900 | 0.0427 | - | - | - |
| 5.6 | 7000 | 0.0215 | - | - | - |
| 5.68 | 7100 | 0.0195 | - | - | - |
| 5.76 | 7200 | 0.0426 | - | - | - |
| 5.84 | 7300 | 0.057 | - | - | - |
| 5.92 | 7400 | 0.0106 | - | - | - |
| 6.0 | 7500 | 0.0284 | 2.8348 | 0.3291 | - |
| 6.08 | 7600 | 0.0286 | - | - | - |
| 6.16 | 7700 | 0.018 | - | - | - |
| 6.24 | 7800 | 0.0224 | - | - | - |
| 6.32 | 7900 | 0.0102 | - | - | - |
| 6.4 | 8000 | 0.0287 | - | - | - |
| 6.48 | 8100 | 0.0078 | - | - | - |
| 6.5600 | 8200 | 0.0237 | - | - | - |
| 6.64 | 8300 | 0.0148 | - | - | - |
| 6.72 | 8400 | 0.0271 | - | - | - |
| 6.8 | 8500 | 0.015 | - | - | - |
| 6.88 | 8600 | 0.0278 | - | - | - |
| 6.96 | 8700 | 0.0237 | - | - | - |
| 7.0 | 8750 | - | 2.8785 | 0.3188 | - |
| 7.04 | 8800 | 0.0203 | - | - | - |
| 7.12 | 8900 | 0.0089 | - | - | - |
| 7.2 | 9000 | 0.0121 | - | - | - |
| 7.28 | 9100 | 0.0185 | - | - | - |
| 7.36 | 9200 | 0.0127 | - | - | - |
| 7.44 | 9300 | 0.017 | - | - | - |
| 7.52 | 9400 | 0.0117 | - | - | - |
| 7.6 | 9500 | 0.006 | - | - | - |
| 7.68 | 9600 | 0.0061 | - | - | - |
| 7.76 | 9700 | 0.0141 | - | - | - |
| 7.84 | 9800 | 0.0091 | - | - | - |
| 7.92 | 9900 | 0.0164 | - | - | - |
| 8.0 | 10000 | 0.0244 | 2.8054 | 0.3040 | - |
| 8.08 | 10100 | 0.0001 | - | - | - |
| 8.16 | 10200 | 0.0187 | - | - | - |
| 8.24 | 10300 | 0.0098 | - | - | - |
| 8.32 | 10400 | 0.0114 | - | - | - |
| 8.4 | 10500 | 0.004 | - | - | - |
| 8.48 | 10600 | 0.0017 | - | - | - |
| 8.56 | 10700 | 0.0018 | - | - | - |
| 8.64 | 10800 | 0.009 | - | - | - |
| 8.72 | 10900 | 0.0047 | - | - | - |
| 8.8 | 11000 | 0.0014 | - | - | - |
| 8.88 | 11100 | 0.0049 | - | - | - |
| 8.96 | 11200 | 0.006 | - | - | - |
| 9.0 | 11250 | - | 2.9460 | 0.2967 | - |
| 9.04 | 11300 | 0.0057 | - | - | - |
| 9.12 | 11400 | 0.0051 | - | - | - |
| 9.2 | 11500 | 0.0067 | - | - | - |
| 9.28 | 11600 | 0.0009 | - | - | - |
| 9.36 | 11700 | 0.0046 | - | - | - |
| 9.44 | 11800 | 0.0138 | - | - | - |
| 9.52 | 11900 | 0.0067 | - | - | - |
| 9.6 | 12000 | 0.0043 | - | - | - |
| 9.68 | 12100 | 0.001 | - | - | - |
| 9.76 | 12200 | 0.0004 | - | - | - |
| 9.84 | 12300 | 0.0044 | - | - | - |
| 9.92 | 12400 | 0.003 | - | - | - |
| 10.0 | 12500 | 0.0055 | 2.9714 | 0.3030 | 0.0969 |

</details>
### Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.0.1
- Transformers: 4.41.2
- PyTorch: 2.3.0+cu121
- Accelerate: 0.31.0
- Datasets: 2.20.0
- Tokenizers: 0.19.1

## Citation

### BibTeX

#### Sentence Transformers and SoftmaxLoss

```bibtex
@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}
```