SentenceTransformer based on sentence-transformers/all-MiniLM-L6-v2

This is a sentence-transformers model fine-tuned from sentence-transformers/all-MiniLM-L6-v2 on the en-ja parallel dataset, yielding a bilingual English-Japanese embedding model. It maps sentences and paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: sentence-transformers/all-MiniLM-L6-v2
  • Maximum Sequence Length: 128 tokens
  • Output Dimensionality: 384 dimensions
  • Similarity Function: Cosine Similarity
  • Model Size: 41.7M parameters
  • Languages: English, Japanese
  • Training Dataset:
    • en-ja

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
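
The three modules map onto a simple encode pipeline: contextual token embeddings from BertModel, mean pooling over non-padding tokens, then L2 normalization. Below is a minimal sketch of the equivalent computation with the transformers library (illustrative only, not the library's internals; the repo id is taken from this card):

import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("jvanhoof/all-miniLM-L6-en-ja")
bert = AutoModel.from_pretrained("jvanhoof/all-miniLM-L6-en-ja")

def encode(sentences):
    # (0) Transformer: tokenize with max_seq_length=128 and run BertModel
    batch = tokenizer(sentences, padding=True, truncation=True,
                      max_length=128, return_tensors="pt")
    with torch.no_grad():
        token_embeddings = bert(**batch).last_hidden_state  # (B, T, 384)
    # (1) Pooling: mean over non-padding tokens (pooling_mode_mean_tokens=True)
    mask = batch["attention_mask"].unsqueeze(-1).float()
    pooled = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1).clamp(min=1e-9)
    # (2) Normalize: unit-length vectors, so dot product equals cosine similarity
    return F.normalize(pooled, p=2, dim=1)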

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("jvanhoof/all-miniLM-L6-en-ja")
# Run inference
sentences = [
    "We'll apply that throughout our global supply chain regardless of ownership or control.",
    # Japanese translation of the sentence above
    '私たちはこれを グローバル・サプライチェーンにおいて 所有権や支配権に関係なく適用します。',
    # An unrelated Japanese sentence: "It hasn't caught on."
    '浸透していません',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 384)

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
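
Because English and Japanese sentences share a single vector space, the model can also be used directly for cross-lingual semantic search. A minimal sketch under the same setup (the query and corpus strings below are illustrative, not from the training data):

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("jvanhoof/all-miniLM-L6-en-ja")

# English query against a small Japanese corpus (illustrative strings)
query = "Where is the train station?"
corpus = [
    '駅はどこですか。',        # "Where is the station?"
    '今日はいい天気ですね。',  # "Nice weather today, isn't it?"
]

query_emb = model.encode([query])
corpus_emb = model.encode(corpus)

# Embeddings are L2-normalized (see the Normalize module above),
# so cosine similarity ranks the corpus directly
scores = model.similarity(query_emb, corpus_emb)  # shape (1, 2)
best = scores.argmax().item()
print(corpus[best], scores[0, best].item())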

Evaluation

Metrics

Knowledge Distillation

Metric        Value
negative_mse  -0.1723

Translation

Metric            Value
src2trg_accuracy  0.6883
trg2src_accuracy  0.6617
mean_accuracy     0.6750
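
For reference: negative_mse is conventionally reported as -100 times the mean squared error between the student's embeddings and the teacher's target embeddings (values closer to 0 are better), and src2trg_accuracy / trg2src_accuracy measure how often a sentence's nearest neighbour by cosine similarity on the other side of the parallel corpus is its own translation. A small numpy sketch of these definitions (mirroring the sentence-transformers evaluator conventions as understood here, not their exact code):

import numpy as np

def negative_mse(student_emb, teacher_emb):
    # -100 * mean squared error; higher (closer to 0) is better
    return -100.0 * np.mean((student_emb - teacher_emb) ** 2)

def translation_accuracies(src_emb, trg_emb):
    # Normalize rows so dot products are cosine similarities
    src = src_emb / np.linalg.norm(src_emb, axis=1, keepdims=True)
    trg = trg_emb / np.linalg.norm(trg_emb, axis=1, keepdims=True)
    sims = src @ trg.T  # (N, N); row i should peak at column i
    src2trg = float(np.mean(sims.argmax(axis=1) == np.arange(len(sims))))
    trg2src = float(np.mean(sims.argmax(axis=0) == np.arange(len(sims))))
    return src2trg, trg2src, (src2trg + trg2src) / 2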

Training Details

Training Dataset

en-ja

  • Dataset: en-ja
  • Size: 8,692,806 training samples
  • Columns: english, non_english, and label
  • Approximate statistics based on the first 1000 samples:
    Column       Type    Details
    english      string  min: 8 tokens, mean: 72.37 tokens, max: 128 tokens
    non_english  string  min: 4 tokens, mean: 37.76 tokens, max: 128 tokens
    label        list    size: 384 elements
  • Samples:
    • english: Are the basics of driving a car learning how to service it, or design it for that matter?
      non_english: 車の運転の基礎は 点検の仕方?それともデザインの仕方?
      label: [0.04097634181380272, 0.015725482255220413, 0.04093917831778526, 0.005089071579277515, -0.008469141088426113, ...]
    • english: Are the basics of writing learning how to sharpen a quill?
      non_english: 執筆の基礎は羽ペンの削り方?
      label: [-0.0036177693400532007, 0.010684962384402752, -0.014135013334453106, -0.05535861477255821, -0.08116177469491959, ...]
    • english: I don't think so.
      non_english: 違うと思います
      label: [-0.07840073108673096, -0.0229326281696558, -0.012929541990160942, -0.007635382004082203, -0.04817994683980942, ...]
  • Loss: MSELoss (a knowledge-distillation training sketch follows below)
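
Each label is a 384-dimensional teacher embedding, so training follows the multilingual knowledge-distillation recipe of Reimers & Gurevych (2020), cited below: both the English sentence and its Japanese translation are regressed onto the teacher's embedding of the English sentence via MSELoss. A hedged sketch with the sentence-transformers v3 trainer (the teacher model and the toy dataset below are assumptions; the card does not state which teacher produced the labels):

from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import MSELoss

student = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
# Assumption: an English teacher with 384-dim output, plausibly
# all-MiniLM-L6-v2 itself; this card does not name the teacher.
teacher = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

english = ["I don't think so."]
non_english = ["違うと思います"]  # "I don't think so."
labels = teacher.encode(english)  # 384-dim targets, like the label column

train_dataset = Dataset.from_dict({
    "english": english,
    "non_english": non_english,
    "label": [vec.tolist() for vec in labels],
})

# MSELoss pulls every text column toward the teacher vector in "label"
loss = MSELoss(student)
trainer = SentenceTransformerTrainer(model=student, train_dataset=train_dataset, loss=loss)
trainer.train()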

Evaluation Dataset

en-ja

  • Dataset: en-ja
  • Size: 7,244 evaluation samples
  • Columns: english, non_english, and label
  • Approximate statistics based on the first 1000 samples:
    Column       Type    Details
    english      string  min: 5 tokens, mean: 77.14 tokens, max: 128 tokens
    non_english  string  min: 4 tokens, mean: 42.63 tokens, max: 128 tokens
    label        list    size: 384 elements
  • Samples:
    • english: Thank you so much, Chris.
      non_english: どうもありがとう クリス このステージに立てる機会を
      label: [0.02692059800028801, 0.05314800143241882, 0.14048902690410614, -0.10380179435014725, -0.04118778929114342, ...]
    • english: And it's truly a great honor to have the opportunity to come to this stage twice; I'm extremely grateful.
      non_english: 2度もいただけるというのは実に光栄なことで とてもうれしく思っています
      label: [0.024387270212173462, 0.09500126540660858, 0.12180333584547043, -0.07149269431829453, -0.018444567918777466, ...]
    • english: I have been blown away by this conference, and I want to thank all of you for the many nice comments about what I had to say the other night.
      non_english: このカンファレンスには圧倒されっぱなしです 皆さんから― 前回の講演に対していただいた温かいコメントにお礼を申し上げたい
      label: [0.015005433931946754, 0.014678305946290493, 0.13112004101276398, 0.03133269399404526, 0.06942533701658249, ...]
  • Loss: MSELoss

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 256
  • per_device_eval_batch_size: 256
  • gradient_accumulation_steps: 4
  • learning_rate: 0.0003
  • num_train_epochs: 10
  • warmup_ratio: 0.15
  • bf16: True
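
With per_device_train_batch_size=256 and gradient_accumulation_steps=4, the effective batch size is 1,024 sentence pairs per optimizer step (per device). These values map directly onto SentenceTransformerTrainingArguments; a sketch (output_dir is a placeholder):

from sentence_transformers import SentenceTransformerTrainingArguments

args = SentenceTransformerTrainingArguments(
    output_dir="output/all-miniLM-L6-en-ja",  # placeholder path
    eval_strategy="steps",
    per_device_train_batch_size=256,
    per_device_eval_batch_size=256,
    gradient_accumulation_steps=4,  # effective batch size: 256 * 4 = 1024
    learning_rate=3e-4,
    num_train_epochs=10,
    warmup_ratio=0.15,
    bf16=True,  # bfloat16 mixed precision
)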

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 256
  • per_device_eval_batch_size: 256
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 4
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 0.0003
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 10
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.15
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: True
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: None
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • include_for_metrics: []
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • use_liger_kernel: False
  • eval_use_gather_object: False
  • average_tokens_across_devices: False
  • prompts: None
  • batch_sampler: batch_sampler
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss en-ja loss en-ja_negative_mse en-ja_mean_accuracy
0.0118 100 0.0049 - - -
0.0236 200 0.004 - - -
0.0353 300 0.0038 - - -
0.0471 400 0.0037 - - -
0.0589 500 0.0037 - - -
0.0707 600 0.0037 - - -
0.0825 700 0.0036 - - -
0.0942 800 0.0036 - - -
0.1060 900 0.0035 - - -
0.1178 1000 0.0035 0.0034 -0.34472314 0.0121
0.1296 1100 0.0035 - - -
0.1414 1200 0.0034 - - -
0.1531 1300 0.0034 - - -
0.1649 1400 0.0034 - - -
0.1767 1500 0.0033 - - -
0.1885 1600 0.0033 - - -
0.2003 1700 0.0032 - - -
0.2120 1800 0.0032 - - -
0.2238 1900 0.0032 - - -
0.2356 2000 0.0031 0.0031 -0.3021978 0.0279
0.2474 2100 0.0031 - - -
0.2592 2200 0.0031 - - -
0.2709 2300 0.0031 - - -
0.2827 2400 0.003 - - -
0.2945 2500 0.003 - - -
0.3063 2600 0.003 - - -
0.3180 2700 0.0029 - - -
0.3298 2800 0.0029 - - -
0.3416 2900 0.0029 - - -
0.3534 3000 0.0028 0.0027 -0.27366027 0.0888
0.3652 3100 0.0028 - - -
0.3769 3200 0.0028 - - -
0.3887 3300 0.0027 - - -
0.4005 3400 0.0027 - - -
0.4123 3500 0.0027 - - -
0.4241 3600 0.0026 - - -
0.4358 3700 0.0026 - - -
0.4476 3800 0.0026 - - -
0.4594 3900 0.0025 - - -
0.4712 4000 0.0025 0.0024 -0.25388333 0.2314
0.4830 4100 0.0025 - - -
0.4947 4200 0.0024 - - -
0.5065 4300 0.0024 - - -
0.5183 4400 0.0024 - - -
0.5301 4500 0.0023 - - -
0.5419 4600 0.0023 - - -
0.5536 4700 0.0023 - - -
0.5654 4800 0.0023 - - -
0.5772 4900 0.0022 - - -
0.5890 5000 0.0022 0.0021 -0.23925437 0.3782
0.6008 5100 0.0022 - - -
0.6125 5200 0.0022 - - -
0.6243 5300 0.0021 - - -
0.6361 5400 0.0021 - - -
0.6479 5500 0.0021 - - -
0.6597 5600 0.0021 - - -
0.6714 5700 0.0021 - - -
0.6832 5800 0.002 - - -
0.6950 5900 0.002 - - -
0.7068 6000 0.002 0.0020 -0.22806446 0.4660
0.7186 6100 0.002 - - -
0.7303 6200 0.002 - - -
0.7421 6300 0.002 - - -
0.7539 6400 0.0019 - - -
0.7657 6500 0.0019 - - -
0.7775 6600 0.0019 - - -
0.7892 6700 0.0019 - - -
0.8010 6800 0.0019 - - -
0.8128 6900 0.0019 - - -
0.8246 7000 0.0018 0.0018 -0.2197086 0.5271
0.8364 7100 0.0018 - - -
0.8481 7200 0.0018 - - -
0.8599 7300 0.0018 - - -
0.8717 7400 0.0018 - - -
0.8835 7500 0.0018 - - -
0.8952 7600 0.0018 - - -
0.9070 7700 0.0018 - - -
0.9188 7800 0.0018 - - -
0.9306 7900 0.0018 - - -
0.9424 8000 0.0017 0.0017 -0.21387897 0.5574
0.9541 8100 0.0017 - - -
0.9659 8200 0.0017 - - -
0.9777 8300 0.0017 - - -
0.9895 8400 0.0017 - - -
1.0012 8500 0.0017 - - -
1.0130 8600 0.0017 - - -
1.0247 8700 0.0017 - - -
1.0365 8800 0.0017 - - -
1.0483 8900 0.0017 - - -
1.0601 9000 0.0017 0.0016 -0.2085446 0.5752
1.0719 9100 0.0016 - - -
1.0836 9200 0.0016 - - -
1.0954 9300 0.0016 - - -
1.1072 9400 0.0016 - - -
1.1190 9500 0.0016 - - -
1.1308 9600 0.0016 - - -
1.1425 9700 0.0016 - - -
1.1543 9800 0.0016 - - -
1.1661 9900 0.0016 - - -
1.1779 10000 0.0016 0.0016 -0.20458691 0.5946
1.1897 10100 0.0016 - - -
1.2014 10200 0.0016 - - -
1.2132 10300 0.0016 - - -
1.2250 10400 0.0016 - - -
1.2368 10500 0.0016 - - -
1.2485 10600 0.0016 - - -
1.2603 10700 0.0015 - - -
1.2721 10800 0.0015 - - -
1.2839 10900 0.0015 - - -
1.2957 11000 0.0015 0.0015 -0.20193738 0.6010
1.3074 11100 0.0015 - - -
1.3192 11200 0.0015 - - -
1.3310 11300 0.0015 - - -
1.3428 11400 0.0015 - - -
1.3546 11500 0.0015 - - -
1.3663 11600 0.0015 - - -
1.3781 11700 0.0015 - - -
1.3899 11800 0.0015 - - -
1.4017 11900 0.0015 - - -
1.4135 12000 0.0015 0.0015 -0.19939835 0.6132
1.4252 12100 0.0015 - - -
1.4370 12200 0.0015 - - -
1.4488 12300 0.0015 - - -
1.4606 12400 0.0015 - - -
1.4724 12500 0.0015 - - -
1.4841 12600 0.0015 - - -
1.4959 12700 0.0015 - - -
1.5077 12800 0.0015 - - -
1.5195 12900 0.0015 - - -
1.5313 13000 0.0015 0.0014 -0.19621891 0.6252
1.5430 13100 0.0014 - - -
1.5548 13200 0.0014 - - -
1.5666 13300 0.0014 - - -
1.5784 13400 0.0014 - - -
1.5902 13500 0.0014 - - -
1.6019 13600 0.0014 - - -
1.6137 13700 0.0014 - - -
1.6255 13800 0.0014 - - -
1.6373 13900 0.0014 - - -
1.6491 14000 0.0014 0.0014 -0.19430138 0.6285
1.6608 14100 0.0014 - - -
1.6726 14200 0.0014 - - -
1.6844 14300 0.0014 - - -
1.6962 14400 0.0014 - - -
1.7080 14500 0.0014 - - -
1.7197 14600 0.0014 - - -
1.7315 14700 0.0014 - - -
1.7433 14800 0.0014 - - -
1.7551 14900 0.0014 - - -
1.7669 15000 0.0014 0.0014 -0.19249634 0.6341
1.7786 15100 0.0014 - - -
1.7904 15200 0.0014 - - -
1.8022 15300 0.0014 - - -
1.8140 15400 0.0014 - - -
1.8258 15500 0.0014 - - -
1.8375 15600 0.0014 - - -
1.8493 15700 0.0014 - - -
1.8611 15800 0.0014 - - -
1.8729 15900 0.0014 - - -
1.8846 16000 0.0014 0.0014 -0.19071046 0.6386
1.8964 16100 0.0014 - - -
1.9082 16200 0.0014 - - -
1.9200 16300 0.0014 - - -
1.9318 16400 0.0014 - - -
1.9435 16500 0.0014 - - -
1.9553 16600 0.0014 - - -
1.9671 16700 0.0014 - - -
1.9789 16800 0.0014 - - -
1.9907 16900 0.0014 - - -
2.0024 17000 0.0013 0.0014 -0.1893078 0.6416
2.0141 17100 0.0013 - - -
2.0259 17200 0.0013 - - -
2.0377 17300 0.0013 - - -
2.0495 17400 0.0013 - - -
2.0613 17500 0.0013 - - -
2.0730 17600 0.0013 - - -
2.0848 17700 0.0013 - - -
2.0966 17800 0.0013 - - -
2.1084 17900 0.0013 - - -
2.1202 18000 0.0013 0.0013 -0.188142 0.6453
2.1319 18100 0.0013 - - -
2.1437 18200 0.0013 - - -
2.1555 18300 0.0013 - - -
2.1673 18400 0.0013 - - -
2.1790 18500 0.0013 - - -
2.1908 18600 0.0013 - - -
2.2026 18700 0.0013 - - -
2.2144 18800 0.0013 - - -
2.2262 18900 0.0013 - - -
2.2379 19000 0.0013 0.0013 -0.18757328 0.6469
2.2497 19100 0.0013 - - -
2.2615 19200 0.0013 - - -
2.2733 19300 0.0013 - - -
2.2851 19400 0.0013 - - -
2.2968 19500 0.0013 - - -
2.3086 19600 0.0013 - - -
2.3204 19700 0.0013 - - -
2.3322 19800 0.0013 - - -
2.3440 19900 0.0013 - - -
2.3557 20000 0.0013 0.0013 -0.18630134 0.6434
2.3675 20100 0.0013 - - -
2.3793 20200 0.0013 - - -
2.3911 20300 0.0013 - - -
2.4029 20400 0.0013 - - -
2.4146 20500 0.0013 - - -
2.4264 20600 0.0013 - - -
2.4382 20700 0.0013 - - -
2.4500 20800 0.0013 - - -
2.4618 20900 0.0013 - - -
2.4735 21000 0.0013 0.0013 -0.18556643 0.6488
2.4853 21100 0.0013 - - -
2.4971 21200 0.0013 - - -
2.5089 21300 0.0013 - - -
2.5207 21400 0.0013 - - -
2.5324 21500 0.0013 - - -
2.5442 21600 0.0013 - - -
2.5560 21700 0.0013 - - -
2.5678 21800 0.0013 - - -
2.5796 21900 0.0013 - - -
2.5913 22000 0.0013 0.0013 -0.18505765 0.6485
2.6031 22100 0.0013 - - -
2.6149 22200 0.0013 - - -
2.6267 22300 0.0013 - - -
2.6385 22400 0.0013 - - -
2.6502 22500 0.0013 - - -
2.6620 22600 0.0013 - - -
2.6738 22700 0.0013 - - -
2.6856 22800 0.0013 - - -
2.6974 22900 0.0013 - - -
2.7091 23000 0.0013 0.0013 -0.18397436 0.6518
2.7209 23100 0.0013 - - -
2.7327 23200 0.0013 - - -
2.7445 23300 0.0013 - - -
2.7563 23400 0.0013 - - -
2.7680 23500 0.0013 - - -
2.7798 23600 0.0013 - - -
2.7916 23700 0.0013 - - -
2.8034 23800 0.0013 - - -
2.8151 23900 0.0013 - - -
2.8269 24000 0.0013 0.0013 -0.18347077 0.6547
2.8387 24100 0.0013 - - -
2.8505 24200 0.0013 - - -
2.8623 24300 0.0013 - - -
2.8740 24400 0.0013 - - -
2.8858 24500 0.0013 - - -
2.8976 24600 0.0013 - - -
2.9094 24700 0.0013 - - -
2.9212 24800 0.0013 - - -
2.9329 24900 0.0013 - - -
2.9447 25000 0.0013 0.0013 -0.18285994 0.6518
2.9565 25100 0.0013 - - -
2.9683 25200 0.0013 - - -
2.9801 25300 0.0013 - - -
2.9918 25400 0.0013 - - -
3.0035 25500 0.0012 - - -
3.0153 25600 0.0013 - - -
3.0271 25700 0.0012 - - -
3.0389 25800 0.0013 - - -
3.0507 25900 0.0012 - - -
3.0624 26000 0.0012 0.0013 -0.18248549 0.6536
3.0742 26100 0.0012 - - -
3.0860 26200 0.0012 - - -
3.0978 26300 0.0012 - - -
3.1096 26400 0.0012 - - -
3.1213 26500 0.0012 - - -
3.1331 26600 0.0012 - - -
3.1449 26700 0.0012 - - -
3.1567 26800 0.0012 - - -
3.1684 26900 0.0012 - - -
3.1802 27000 0.0012 0.0013 -0.1820667 0.6569
3.1920 27100 0.0012 - - -
3.2038 27200 0.0012 - - -
3.2156 27300 0.0012 - - -
3.2273 27400 0.0012 - - -
3.2391 27500 0.0012 - - -
3.2509 27600 0.0012 - - -
3.2627 27700 0.0012 - - -
3.2745 27800 0.0012 - - -
3.2862 27900 0.0012 - - -
3.2980 28000 0.0012 0.0013 -0.18136427 0.6596
3.3098 28100 0.0012 - - -
3.3216 28200 0.0012 - - -
3.3334 28300 0.0012 - - -
3.3451 28400 0.0012 - - -
3.3569 28500 0.0012 - - -
3.3687 28600 0.0012 - - -
3.3805 28700 0.0012 - - -
3.3923 28800 0.0012 - - -
3.4040 28900 0.0012 - - -
3.4158 29000 0.0012 0.0012 -0.18090644 0.6586
3.4276 29100 0.0012 - - -
3.4394 29200 0.0012 - - -
3.4512 29300 0.0012 - - -
3.4629 29400 0.0012 - - -
3.4747 29500 0.0012 - - -
3.4865 29600 0.0012 - - -
3.4983 29700 0.0012 - - -
3.5101 29800 0.0012 - - -
3.5218 29900 0.0012 - - -
3.5336 30000 0.0012 0.0012 -0.18087307 0.6602
3.5454 30100 0.0012 - - -
3.5572 30200 0.0012 - - -
3.5690 30300 0.0012 - - -
3.5807 30400 0.0012 - - -
3.5925 30500 0.0012 - - -
3.6043 30600 0.0012 - - -
3.6161 30700 0.0012 - - -
3.6279 30800 0.0012 - - -
3.6396 30900 0.0012 - - -
3.6514 31000 0.0012 0.0012 -0.17998679 0.6612
3.6632 31100 0.0012 - - -
3.6750 31200 0.0012 - - -
3.6868 31300 0.0012 - - -
3.6985 31400 0.0012 - - -
3.7103 31500 0.0012 - - -
3.7221 31600 0.0012 - - -
3.7339 31700 0.0012 - - -
3.7456 31800 0.0012 - - -
3.7574 31900 0.0012 - - -
3.7692 32000 0.0012 0.0012 -0.17991492 0.6630
3.7810 32100 0.0012 - - -
3.7928 32200 0.0012 - - -
3.8045 32300 0.0012 - - -
3.8163 32400 0.0012 - - -
3.8281 32500 0.0012 - - -
3.8399 32600 0.0012 - - -
3.8517 32700 0.0012 - - -
3.8634 32800 0.0012 - - -
3.8752 32900 0.0012 - - -
3.8870 33000 0.0012 0.0012 -0.1794728 0.6632
3.8988 33100 0.0012 - - -
3.9106 33200 0.0012 - - -
3.9223 33300 0.0012 - - -
3.9341 33400 0.0012 - - -
3.9459 33500 0.0012 - - -
3.9577 33600 0.0012 - - -
3.9695 33700 0.0012 - - -
3.9812 33800 0.0012 - - -
3.9930 33900 0.0012 - - -
4.0047 34000 0.0012 0.0012 -0.17887509 0.6643
4.0165 34100 0.0012 - - -
4.0283 34200 0.0012 - - -
4.0401 34300 0.0012 - - -
4.0518 34400 0.0012 - - -
4.0636 34500 0.0012 - - -
4.0754 34600 0.0012 - - -
4.0872 34700 0.0012 - - -
4.0989 34800 0.0012 - - -
4.1107 34900 0.0012 - - -
4.1225 35000 0.0012 0.0012 -0.17876983 0.6648
4.1343 35100 0.0012 - - -
4.1461 35200 0.0012 - - -
4.1578 35300 0.0012 - - -
4.1696 35400 0.0012 - - -
4.1814 35500 0.0012 - - -
4.1932 35600 0.0012 - - -
4.2050 35700 0.0012 - - -
4.2167 35800 0.0012 - - -
4.2285 35900 0.0012 - - -
4.2403 36000 0.0012 0.0012 -0.17866188 0.6629
4.2521 36100 0.0012 - - -
4.2639 36200 0.0012 - - -
4.2756 36300 0.0012 - - -
4.2874 36400 0.0012 - - -
4.2992 36500 0.0012 - - -
4.3110 36600 0.0012 - - -
4.3228 36700 0.0012 - - -
4.3345 36800 0.0012 - - -
4.3463 36900 0.0012 - - -
4.3581 37000 0.0012 0.0012 -0.17826374 0.6642
4.3699 37100 0.0012 - - -
4.3817 37200 0.0012 - - -
4.3934 37300 0.0012 - - -
4.4052 37400 0.0012 - - -
4.4170 37500 0.0012 - - -
4.4288 37600 0.0012 - - -
4.4406 37700 0.0012 - - -
4.4523 37800 0.0012 - - -
4.4641 37900 0.0012 - - -
4.4759 38000 0.0012 0.0012 -0.17813009 0.6654
4.4877 38100 0.0012 - - -
4.4995 38200 0.0012 - - -
4.5112 38300 0.0012 - - -
4.5230 38400 0.0012 - - -
4.5348 38500 0.0012 - - -
4.5466 38600 0.0012 - - -
4.5584 38700 0.0012 - - -
4.5701 38800 0.0012 - - -
4.5819 38900 0.0012 - - -
4.5937 39000 0.0012 0.0012 -0.17787422 0.6677
4.6055 39100 0.0012 - - -
4.6173 39200 0.0012 - - -
4.6290 39300 0.0012 - - -
4.6408 39400 0.0012 - - -
4.6526 39500 0.0012 - - -
4.6644 39600 0.0012 - - -
4.6761 39700 0.0012 - - -
4.6879 39800 0.0012 - - -
4.6997 39900 0.0012 - - -
4.7115 40000 0.0012 0.0012 -0.17763866 0.6631
4.7233 40100 0.0012 - - -
4.7350 40200 0.0012 - - -
4.7468 40300 0.0012 - - -
4.7586 40400 0.0012 - - -
4.7704 40500 0.0012 - - -
4.7822 40600 0.0012 - - -
4.7939 40700 0.0012 - - -
4.8057 40800 0.0012 - - -
4.8175 40900 0.0012 - - -
4.8293 41000 0.0012 0.0012 -0.17747028 0.6683
4.8411 41100 0.0012 - - -
4.8528 41200 0.0012 - - -
4.8646 41300 0.0012 - - -
4.8764 41400 0.0012 - - -
4.8882 41500 0.0012 - - -
4.9000 41600 0.0012 - - -
4.9117 41700 0.0012 - - -
4.9235 41800 0.0012 - - -
4.9353 41900 0.0012 - - -
4.9471 42000 0.0012 0.0012 -0.1769735 0.6677
4.9589 42100 0.0012 - - -
4.9706 42200 0.0012 - - -
4.9824 42300 0.0012 - - -
4.9942 42400 0.0012 - - -
5.0059 42500 0.0012 - - -
5.0177 42600 0.0012 - - -
5.0294 42700 0.0012 - - -
5.0412 42800 0.0012 - - -
5.0530 42900 0.0012 - - -
5.0648 43000 0.0012 0.0012 -0.17719762 0.6653
5.0766 43100 0.0012 - - -
5.0883 43200 0.0012 - - -
5.1001 43300 0.0012 - - -
5.1119 43400 0.0012 - - -
5.1237 43500 0.0012 - - -
5.1355 43600 0.0012 - - -
5.1472 43700 0.0012 - - -
5.1590 43800 0.0012 - - -
5.1708 43900 0.0012 - - -
5.1826 44000 0.0012 0.0012 -0.17653656 0.6693
5.1944 44100 0.0012 - - -
5.2061 44200 0.0012 - - -
5.2179 44300 0.0012 - - -
5.2297 44400 0.0012 - - -
5.2415 44500 0.0012 - - -
5.2533 44600 0.0012 - - -
5.2650 44700 0.0012 - - -
5.2768 44800 0.0012 - - -
5.2886 44900 0.0012 - - -
5.3004 45000 0.0012 0.0012 -0.17657608 0.6683
5.3122 45100 0.0012 - - -
5.3239 45200 0.0012 - - -
5.3357 45300 0.0012 - - -
5.3475 45400 0.0012 - - -
5.3593 45500 0.0012 - - -
5.3711 45600 0.0012 - - -
5.3828 45700 0.0012 - - -
5.3946 45800 0.0012 - - -
5.4064 45900 0.0012 - - -
5.4182 46000 0.0012 0.0012 -0.17643414 0.6694
5.4300 46100 0.0012 - - -
5.4417 46200 0.0012 - - -
5.4535 46300 0.0012 - - -
5.4653 46400 0.0012 - - -
5.4771 46500 0.0012 - - -
5.4889 46600 0.0012 - - -
5.5006 46700 0.0012 - - -
5.5124 46800 0.0012 - - -
5.5242 46900 0.0012 - - -
5.5360 47000 0.0012 0.0012 -0.17619923 0.6679
5.5478 47100 0.0012 - - -
5.5595 47200 0.0012 - - -
5.5713 47300 0.0012 - - -
5.5831 47400 0.0012 - - -
5.5949 47500 0.0012 - - -
5.6066 47600 0.0012 - - -
5.6184 47700 0.0012 - - -
5.6302 47800 0.0012 - - -
5.6420 47900 0.0012 - - -
5.6538 48000 0.0012 0.0012 -0.17586452 0.6694
5.6655 48100 0.0012 - - -
5.6773 48200 0.0012 - - -
5.6891 48300 0.0012 - - -
5.7009 48400 0.0012 - - -
5.7127 48500 0.0012 - - -
5.7244 48600 0.0012 - - -
5.7362 48700 0.0012 - - -
5.7480 48800 0.0012 - - -
5.7598 48900 0.0012 - - -
5.7716 49000 0.0012 0.0012 -0.1760546 0.6687
5.7833 49100 0.0012 - - -
5.7951 49200 0.0012 - - -
5.8069 49300 0.0012 - - -
5.8187 49400 0.0012 - - -
5.8305 49500 0.0012 - - -
5.8422 49600 0.0012 - - -
5.8540 49700 0.0012 - - -
5.8658 49800 0.0012 - - -
5.8776 49900 0.0012 - - -
5.8894 50000 0.0012 0.0012 -0.17587146 0.6677
5.9011 50100 0.0012 - - -
5.9129 50200 0.0012 - - -
5.9247 50300 0.0012 - - -
5.9365 50400 0.0012 - - -
5.9483 50500 0.0012 - - -
5.9600 50600 0.0012 - - -
5.9718 50700 0.0012 - - -
5.9836 50800 0.0012 - - -
5.9954 50900 0.0012 - - -
6.0071 51000 0.0011 0.0012 -0.17524663 0.6717
6.0188 51100 0.0012 - - -
6.0306 51200 0.0012 - - -
6.0424 51300 0.0012 - - -
6.0542 51400 0.0012 - - -
6.0660 51500 0.0012 - - -
6.0777 51600 0.0012 - - -
6.0895 51700 0.0012 - - -
6.1013 51800 0.0012 - - -
6.1131 51900 0.0012 - - -
6.1249 52000 0.0012 0.0012 -0.17546739 0.6696
6.1366 52100 0.0012 - - -
6.1484 52200 0.0012 - - -
6.1602 52300 0.0012 - - -
6.1720 52400 0.0012 - - -
6.1838 52500 0.0012 - - -
6.1955 52600 0.0012 - - -
6.2073 52700 0.0012 - - -
6.2191 52800 0.0012 - - -
6.2309 52900 0.0012 - - -
6.2427 53000 0.0012 0.0012 -0.17500012 0.6687
6.2544 53100 0.0012 - - -
6.2662 53200 0.0012 - - -
6.2780 53300 0.0012 - - -
6.2898 53400 0.0012 - - -
6.3016 53500 0.0011 - - -
6.3133 53600 0.0012 - - -
6.3251 53700 0.0012 - - -
6.3369 53800 0.0012 - - -
6.3487 53900 0.0012 - - -
6.3605 54000 0.0012 0.0012 -0.1750529 0.6728
6.3722 54100 0.0012 - - -
6.3840 54200 0.0012 - - -
6.3958 54300 0.0012 - - -
6.4076 54400 0.0012 - - -
6.4194 54500 0.0012 - - -
6.4311 54600 0.0011 - - -
6.4429 54700 0.0012 - - -
6.4547 54800 0.0012 - - -
6.4665 54900 0.0011 - - -
6.4783 55000 0.0011 0.0012 -0.17491975 0.6687
6.4900 55100 0.0012 - - -
6.5018 55200 0.0012 - - -
6.5136 55300 0.0011 - - -
6.5254 55400 0.0011 - - -
6.5371 55500 0.0011 - - -
6.5489 55600 0.0011 - - -
6.5607 55700 0.0011 - - -
6.5725 55800 0.0011 - - -
6.5843 55900 0.0012 - - -
6.5960 56000 0.0012 0.0012 -0.17456226 0.6726
6.6078 56100 0.0012 - - -
6.6196 56200 0.0011 - - -
6.6314 56300 0.0012 - - -
6.6432 56400 0.0011 - - -
6.6549 56500 0.0011 - - -
6.6667 56600 0.0011 - - -
6.6785 56700 0.0011 - - -
6.6903 56800 0.0011 - - -
6.7021 56900 0.0012 - - -
6.7138 57000 0.0011 0.0012 -0.17479357 0.6721
6.7256 57100 0.0011 - - -
6.7374 57200 0.0012 - - -
6.7492 57300 0.0011 - - -
6.7610 57400 0.0011 - - -
6.7727 57500 0.0011 - - -
6.7845 57600 0.0011 - - -
6.7963 57700 0.0011 - - -
6.8081 57800 0.0011 - - -
6.8199 57900 0.0011 - - -
6.8316 58000 0.0011 0.0012 -0.17439803 0.6710
6.8434 58100 0.0011 - - -
6.8552 58200 0.0011 - - -
6.8670 58300 0.0011 - - -
6.8788 58400 0.0011 - - -
6.8905 58500 0.0011 - - -
6.9023 58600 0.0011 - - -
6.9141 58700 0.0011 - - -
6.9259 58800 0.0011 - - -
6.9377 58900 0.0011 - - -
6.9494 59000 0.0011 0.0012 -0.17475043 0.6718
6.9612 59100 0.0011 - - -
6.9730 59200 0.0011 - - -
6.9848 59300 0.0011 - - -
6.9966 59400 0.0011 - - -
7.0082 59500 0.0011 - - -
7.0200 59600 0.0011 - - -
7.0318 59700 0.0011 - - -
7.0436 59800 0.0011 - - -
7.0554 59900 0.0011 - - -
7.0671 60000 0.0011 0.0012 -0.17430946 0.6722
7.0789 60100 0.0011 - - -
7.0907 60200 0.0011 - - -
7.1025 60300 0.0011 - - -
7.1143 60400 0.0011 - - -
7.1260 60500 0.0011 - - -
7.1378 60600 0.0011 - - -
7.1496 60700 0.0011 - - -
7.1614 60800 0.0011 - - -
7.1732 60900 0.0011 - - -
7.1849 61000 0.0011 0.0012 -0.17427135 0.6732
7.1967 61100 0.0011 - - -
7.2085 61200 0.0011 - - -
7.2203 61300 0.0011 - - -
7.2321 61400 0.0011 - - -
7.2438 61500 0.0011 - - -
7.2556 61600 0.0011 - - -
7.2674 61700 0.0011 - - -
7.2792 61800 0.0011 - - -
7.2910 61900 0.0011 - - -
7.3027 62000 0.0011 0.0012 -0.17400834 0.6719
7.3145 62100 0.0011 - - -
7.3263 62200 0.0011 - - -
7.3381 62300 0.0011 - - -
7.3499 62400 0.0011 - - -
7.3616 62500 0.0011 - - -
7.3734 62600 0.0011 - - -
7.3852 62700 0.0011 - - -
7.3970 62800 0.0011 - - -
7.4088 62900 0.0011 - - -
7.4205 63000 0.0011 0.0012 -0.1740251 0.6726
7.4323 63100 0.0011 - - -
7.4441 63200 0.0011 - - -
7.4559 63300 0.0011 - - -
7.4677 63400 0.0011 - - -
7.4794 63500 0.0011 - - -
7.4912 63600 0.0011 - - -
7.5030 63700 0.0011 - - -
7.5148 63800 0.0011 - - -
7.5265 63900 0.0011 - - -
7.5383 64000 0.0011 0.0012 -0.17380536 0.6712
7.5501 64100 0.0011 - - -
7.5619 64200 0.0011 - - -
7.5737 64300 0.0011 - - -
7.5854 64400 0.0011 - - -
7.5972 64500 0.0011 - - -
7.6090 64600 0.0011 - - -
7.6208 64700 0.0011 - - -
7.6326 64800 0.0011 - - -
7.6443 64900 0.0011 - - -
7.6561 65000 0.0011 0.0012 -0.17366579 0.6723
7.6679 65100 0.0011 - - -
7.6797 65200 0.0011 - - -
7.6915 65300 0.0011 - - -
7.7032 65400 0.0011 - - -
7.7150 65500 0.0011 - - -
7.7268 65600 0.0011 - - -
7.7386 65700 0.0011 - - -
7.7504 65800 0.0011 - - -
7.7621 65900 0.0011 - - -
7.7739 66000 0.0011 0.0012 -0.17369126 0.6746
7.7857 66100 0.0011 - - -
7.7975 66200 0.0011 - - -
7.8093 66300 0.0011 - - -
7.8210 66400 0.0011 - - -
7.8328 66500 0.0011 - - -
7.8446 66600 0.0011 - - -
7.8564 66700 0.0011 - - -
7.8682 66800 0.0011 - - -
7.8799 66900 0.0011 - - -
7.8917 67000 0.0011 0.0012 -0.17330332 0.6740
7.9035 67100 0.0011 - - -
7.9153 67200 0.0011 - - -
7.9271 67300 0.0011 - - -
7.9388 67400 0.0011 - - -
7.9506 67500 0.0011 - - -
7.9624 67600 0.0011 - - -
7.9742 67700 0.0011 - - -
7.9860 67800 0.0011 - - -
7.9977 67900 0.0011 - - -
8.0094 68000 0.0011 0.0012 -0.17326406 0.6732
8.0212 68100 0.0011 - - -
8.0330 68200 0.0011 - - -
8.0448 68300 0.0011 - - -
8.0565 68400 0.0011 - - -
8.0683 68500 0.0011 - - -
8.0801 68600 0.0011 - - -
8.0919 68700 0.0011 - - -
8.1037 68800 0.0011 - - -
8.1154 68900 0.0011 - - -
8.1272 69000 0.0011 0.0012 -0.17333928 0.6730
8.1390 69100 0.0011 - - -
8.1508 69200 0.0011 - - -
8.1626 69300 0.0011 - - -
8.1743 69400 0.0011 - - -
8.1861 69500 0.0011 - - -
8.1979 69600 0.0011 - - -
8.2097 69700 0.0011 - - -
8.2215 69800 0.0011 - - -
8.2332 69900 0.0011 - - -
8.2450 70000 0.0011 0.0012 -0.17320158 0.6726
8.2568 70100 0.0011 - - -
8.2686 70200 0.0011 - - -
8.2804 70300 0.0011 - - -
8.2921 70400 0.0011 - - -
8.3039 70500 0.0011 - - -
8.3157 70600 0.0011 - - -
8.3275 70700 0.0011 - - -
8.3393 70800 0.0011 - - -
8.3510 70900 0.0011 - - -
8.3628 71000 0.0011 0.0012 -0.17346236 0.6726
8.3746 71100 0.0011 - - -
8.3864 71200 0.0011 - - -
8.3982 71300 0.0011 - - -
8.4099 71400 0.0011 - - -
8.4217 71500 0.0011 - - -
8.4335 71600 0.0011 - - -
8.4453 71700 0.0011 - - -
8.4570 71800 0.0011 - - -
8.4688 71900 0.0011 - - -
8.4806 72000 0.0011 0.0012 -0.17322065 0.6749
8.4924 72100 0.0011 - - -
8.5042 72200 0.0011 - - -
8.5159 72300 0.0011 - - -
8.5277 72400 0.0011 - - -
8.5395 72500 0.0011 - - -
8.5513 72600 0.0011 - - -
8.5631 72700 0.0011 - - -
8.5748 72800 0.0011 - - -
8.5866 72900 0.0011 - - -
8.5984 73000 0.0011 0.0011 -0.1729941 0.6725
8.6102 73100 0.0011 - - -
8.6220 73200 0.0011 - - -
8.6337 73300 0.0011 - - -
8.6455 73400 0.0011 - - -
8.6573 73500 0.0011 - - -
8.6691 73600 0.0011 - - -
8.6809 73700 0.0011 - - -
8.6926 73800 0.0011 - - -
8.7044 73900 0.0011 - - -
8.7162 74000 0.0011 0.0011 -0.17297848 0.6719
8.7280 74100 0.0011 - - -
8.7398 74200 0.0011 - - -
8.7515 74300 0.0011 - - -
8.7633 74400 0.0011 - - -
8.7751 74500 0.0011 - - -
8.7869 74600 0.0011 - - -
8.7987 74700 0.0011 - - -
8.8104 74800 0.0011 - - -
8.8222 74900 0.0011 - - -
8.8340 75000 0.0011 0.0011 -0.17291391 0.6728
8.8458 75100 0.0011 - - -
8.8576 75200 0.0011 - - -
8.8693 75300 0.0011 - - -
8.8811 75400 0.0011 - - -
8.8929 75500 0.0011 - - -
8.9047 75600 0.0011 - - -
8.9165 75700 0.0011 - - -
8.9282 75800 0.0011 - - -
8.9400 75900 0.0011 - - -
8.9518 76000 0.0011 0.0011 -0.1728144 0.6741
8.9636 76100 0.0011 - - -
8.9754 76200 0.0011 - - -
8.9871 76300 0.0011 - - -
8.9989 76400 0.0011 - - -
9.0106 76500 0.0011 - - -
9.0224 76600 0.0011 - - -
9.0342 76700 0.0011 - - -
9.0459 76800 0.0011 - - -
9.0577 76900 0.0011 - - -
9.0695 77000 0.0011 0.0011 -0.1725926 0.6728
9.0813 77100 0.0011 - - -
9.0931 77200 0.0011 - - -
9.1048 77300 0.0011 - - -
9.1166 77400 0.0011 - - -
9.1284 77500 0.0011 - - -
9.1402 77600 0.0011 - - -
9.1520 77700 0.0011 - - -
9.1637 77800 0.0011 - - -
9.1755 77900 0.0011 - - -
9.1873 78000 0.0011 0.0011 -0.17272119 0.6735
9.1991 78100 0.0011 - - -
9.2109 78200 0.0011 - - -
9.2226 78300 0.0011 - - -
9.2344 78400 0.0011 - - -
9.2462 78500 0.0011 - - -
9.2580 78600 0.0011 - - -
9.2698 78700 0.0011 - - -
9.2815 78800 0.0011 - - -
9.2933 78900 0.0011 - - -
9.3051 79000 0.0011 0.0011 -0.1726715 0.6730
9.3169 79100 0.0011 - - -
9.3287 79200 0.0011 - - -
9.3404 79300 0.0011 - - -
9.3522 79400 0.0011 - - -
9.3640 79500 0.0011 - - -
9.3758 79600 0.0011 - - -
9.3875 79700 0.0011 - - -
9.3993 79800 0.0011 - - -
9.4111 79900 0.0011 - - -
9.4229 80000 0.0011 0.0011 -0.172526 0.6737
9.4347 80100 0.0011 - - -
9.4464 80200 0.0011 - - -
9.4582 80300 0.0011 - - -
9.4700 80400 0.0011 - - -
9.4818 80500 0.0011 - - -
9.4936 80600 0.0011 - - -
9.5053 80700 0.0011 - - -
9.5171 80800 0.0011 - - -
9.5289 80900 0.0011 - - -
9.5407 81000 0.0011 0.0011 -0.1724438 0.6749
9.5525 81100 0.0011 - - -
9.5642 81200 0.0011 - - -
9.5760 81300 0.0011 - - -
9.5878 81400 0.0011 - - -
9.5996 81500 0.0011 - - -
9.6114 81600 0.0011 - - -
9.6231 81700 0.0011 - - -
9.6349 81800 0.0011 - - -
9.6467 81900 0.0011 - - -
9.6585 82000 0.0011 0.0011 -0.17250738 0.6729
9.6703 82100 0.0011 - - -
9.6820 82200 0.0011 - - -
9.6938 82300 0.0011 - - -
9.7056 82400 0.0011 - - -
9.7174 82500 0.0011 - - -
9.7292 82600 0.0011 - - -
9.7409 82700 0.0011 - - -
9.7527 82800 0.0011 - - -
9.7645 82900 0.0011 - - -
9.7763 83000 0.0011 0.0011 -0.1723447 0.6728
9.7881 83100 0.0011 - - -
9.7998 83200 0.0011 - - -
9.8116 83300 0.0011 - - -
9.8234 83400 0.0011 - - -
9.8352 83500 0.0011 - - -
9.8470 83600 0.0011 - - -
9.8587 83700 0.0011 - - -
9.8705 83800 0.0011 - - -
9.8823 83900 0.0011 - - -
9.8941 84000 0.0011 0.0011 -0.17230988 0.6750
9.9059 84100 0.0011 - - -
9.9176 84200 0.0011 - - -
9.9294 84300 0.0011 - - -
9.9412 84400 0.0011 - - -
9.9530 84500 0.0011 - - -
9.9647 84600 0.0011 - - -
9.9765 84700 0.0011 - - -
9.9883 84800 0.0011 - - -

Framework Versions

  • Python: 3.10.16
  • Sentence Transformers: 3.3.1
  • Transformers: 4.48.0
  • PyTorch: 2.5.1+cu124
  • Accelerate: 1.2.1
  • Datasets: 3.2.0
  • Tokenizers: 0.21.0
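
To reproduce this environment, the listed versions can be pinned at install time (a convenience command, not a strict requirement):

pip install sentence-transformers==3.3.1 transformers==4.48.0 torch==2.5.1 accelerate==1.2.1 datasets==3.2.0 tokenizers==0.21.0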

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

MSELoss

@inproceedings{reimers-2020-multilingual-sentence-bert,
    title = "Making Monolingual Sentence Embeddings Multilingual using Knowledge Distillation",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2020",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/2004.09813",
}