---
language: []
library_name: sentence-transformers
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - dataset_size:100K<n<1M
  - loss:CachedMultipleNegativesRankingLoss
  - loss:AnglELoss
base_model: nomic-ai/nomic-embed-text-v1.5
metrics:
  - cosine_accuracy
  - dot_accuracy
  - manhattan_accuracy
  - euclidean_accuracy
  - max_accuracy
  - pearson_cosine
  - spearman_cosine
  - pearson_manhattan
  - spearman_manhattan
  - pearson_euclidean
  - spearman_euclidean
  - pearson_dot
  - spearman_dot
  - pearson_max
  - spearman_max
widget:
  - source_sentence: mw-59
    sentences:
      - i9
      - beats headphones
      - tablet stands for 7in
  - source_sentence: totod
    sentences:
      - torxh
      - massage warehouse
      - ferry boats scrub cap
  - source_sentence: 'search_query: ecloth'
    sentences:
      - 'search_query: friend wine stopper'
      - 'search_query: pants for teen girls'
      - 'search_query: 11x14 frame without mat'
  - source_sentence: skull
    sentences:
      - dog kennel
      - duct tape colors
      - mustard tie
  - source_sentence: 'search_query: dab rig'
    sentences:
      - 'search_query: aga stove'
      - 'search_query: jerky slicer machine'
      - 'search_query: womens wallet phone'
pipeline_tag: sentence-similarity
model-index:
  - name: SentenceTransformer based on nomic-ai/nomic-embed-text-v1.5
    results:
      - task:
          type: triplet
          name: Triplet
        dataset:
          name: Unknown
          type: unknown
        metrics:
          - type: cosine_accuracy
            value: 0.702
            name: Cosine Accuracy
          - type: dot_accuracy
            value: 0.3047
            name: Dot Accuracy
          - type: manhattan_accuracy
            value: 0.7038
            name: Manhattan Accuracy
          - type: euclidean_accuracy
            value: 0.7034
            name: Euclidean Accuracy
          - type: max_accuracy
            value: 0.7038
            name: Max Accuracy
      - task:
          type: semantic-similarity
          name: Semantic Similarity
        dataset:
          name: Unknown
          type: unknown
        metrics:
          - type: pearson_cosine
            value: 0.44005266442024726
            name: Pearson Cosine
          - type: spearman_cosine
            value: 0.42992442611334314
            name: Spearman Cosine
          - type: pearson_manhattan
            value: 0.40023272026373946
            name: Pearson Manhattan
          - type: spearman_manhattan
            value: 0.4006937930339286
            name: Spearman Manhattan
          - type: pearson_euclidean
            value: 0.400264197728783
            name: Pearson Euclidean
          - type: spearman_euclidean
            value: 0.4007885190533924
            name: Spearman Euclidean
          - type: pearson_dot
            value: 0.443162859807234
            name: Pearson Dot
          - type: spearman_dot
            value: 0.435512515703368
            name: Spearman Dot
          - type: pearson_max
            value: 0.443162859807234
            name: Pearson Max
          - type: spearman_max
            value: 0.435512515703368
            name: Spearman Max
---

SentenceTransformer based on nomic-ai/nomic-embed-text-v1.5

This is a sentence-transformers model finetuned from nomic-ai/nomic-embed-text-v1.5 on the triplets and pairs datasets (built from ESCI product-search data). It maps sentences and paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: nomic-ai/nomic-embed-text-v1.5
  • Maximum Sequence Length: 8192 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity
  • Training Datasets:
    • triplets
    • pairs

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 8192, 'do_lower_case': False}) with Transformer model: NomicBertModel 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
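
The pooling module averages token embeddings (mean pooling) into the single 768-dimensional sentence vector. A quick way to check the loaded modules, using the same placeholder model id as in the Usage section below:

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("sentence_transformers_model_id", trust_remote_code=True)
print(model[0].max_seq_length)          # 8192
print(model[1].get_pooling_mode_str())  # "mean"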

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("sentence_transformers_model_id", trust_remote_code=True)  # nomic-based models need trust_remote_code=True
# Run inference
sentences = [
    'search_query: dab rig',
    'search_query: aga stove',
    'search_query: jerky slicer machine',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
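
The training data uses nomic's task prefixes ("search_query: " for queries, "search_document: " for product texts), so for retrieval-style use you will likely get better results by keeping those prefixes at inference time. A minimal sketch (the document strings below are illustrative):

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("sentence_transformers_model_id", trust_remote_code=True)

query = "search_query: ear warmers women north face"
documents = [
    "search_document: The North Face Women's Oh-Mega Fur Pom Beanie, TNF Black, OS",
    "search_document: The North Face Shinsky Beanie, TNF Light Grey Heather, OS",
]

# Rank the documents by cosine similarity to the query
query_embedding = model.encode([query])
document_embeddings = model.encode(documents)
scores = model.similarity(query_embedding, document_embeddings)  # shape [1, 2]
print(scores)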

Evaluation

Metrics

Triplet

Metric Value
cosine_accuracy 0.702
dot_accuracy 0.3047
manhattan_accuracy 0.7038
euclidean_accuracy 0.7034
max_accuracy 0.7038

Semantic Similarity

Metric Value
pearson_cosine 0.4401
spearman_cosine 0.4299
pearson_manhattan 0.4002
spearman_manhattan 0.4007
pearson_euclidean 0.4003
spearman_euclidean 0.4008
pearson_dot 0.4432
spearman_dot 0.4355
pearson_max 0.4432
spearman_max 0.4355
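
These numbers come from sentence-transformers' TripletEvaluator and EmbeddingSimilarityEvaluator. The evaluation sets themselves are not published ("Unknown" above), but a minimal sketch of running the same evaluators on your own held-out data looks like this (the example rows are placeholders):

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import TripletEvaluator, EmbeddingSimilarityEvaluator

model = SentenceTransformer("sentence_transformers_model_id", trust_remote_code=True)

# Placeholder rows; substitute your own anchors/positives/negatives and scored pairs
triplet_eval = TripletEvaluator(
    anchors=["search_query: dab rig"],
    positives=["search_document: some relevant product title"],
    negatives=["search_document: some irrelevant product title"],
)
sts_eval = EmbeddingSimilarityEvaluator(
    sentences1=["fitbit charge 3", "bathroom cabinet"],
    sentences2=["TreasureMax Compatible with Fitbit Charge 2 Bands for Women/Men", "Homfa Bathroom Floor Cabinet Free Standing with Single Door"],
    scores=[0.2, 1.0],
)
print(triplet_eval(model))  # accuracy under each distance metric
print(sts_eval(model))      # Pearson/Spearman correlations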

Training Details

Training Datasets

triplets

  • Dataset: triplets
  • Size: 261,250 training samples
  • Columns: anchor, positive, and negative
  • Approximate statistics based on the first 1000 samples:
      anchor:   string; min 7, mean 11.57, max 40 tokens
      positive: string; min 17, mean 43.83, max 119 tokens
      negative: string; min 15, mean 43.31, max 112 tokens
  • Samples:
      anchor:   search_query: ear warmers women north face
      positive: search_document: The North Face Women's Oh-Mega Fur Pom Beanie, TNF Black, OS, The North Face, Tnf Black
      negative: search_document: The North Face Shinsky Beanie, TNF Light Grey Heather, OS, The North Face, Tnf Light Grey Heather

      anchor:   search_query: natural braided hairstyles without weave for black women
      positive: search_document: Baseball Cap Wig Long Ombre Braids Cap Wig Hat with Synthetic Small Box Braiding Hair for Women Girls(B-53), Yunkang, B-53
      negative: search_document: K'ryssma Dark Brown Synthetic Wigs for women - Natural Looking Long Wavy Right Side Parting NONE Lace Heat Resistant Replacement Wig Full Machine Made 24 inch (#2), K'ryssma, Dark Brown

      anchor:   search_query: boy siracha shirt
      positive: search_document: Sriracha Distressed Label Graphic T-Shirt, Sriracha, Red
      negative: search_document: Pho Sho
  • Loss: CachedMultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
    

pairs

  • Dataset: pairs
  • Size: 261,250 training samples
  • Columns: sentence1, sentence2, and score
  • Approximate statistics based on the first 1000 samples:
      sentence1: string; min 3, mean 6.73, max 33 tokens
      sentence2: string; min 10, mean 40.14, max 98 tokens
      score:     float; min 0.0, mean 0.77, max 1.0
  • Samples:
      sentence1: I would choose a medium weight waterproof fabric, hip length jacket or longer, long sleeves, zip front, with a hood and deep pockets with zips
      sentence2: ZSHOW Men's Winter Hooded Packable Down Jacket(Blue, XX-Large), ZSHOW, Blue
      score:     1.0

      sentence1: sequin dance costume girls
      sentence2: Yeahdor Big Girls' Lyrical Latin Ballet Dance Costumes Dresses Halter Sequins Irregular Tutu Skirted Leotard Dancewear Pink 12-14, Yeahdor, Pink
      score:     1.0

      sentence1: paint easel bulk
      sentence2: Artecho Artist Easel Display Easel Stand, 2 Pack Metal Tripod Stand Easel for Painting, Hold Canvas from 21" to 66", Floor and Tabletop Displaying, Painting with Portable Bag, Artecho, Black
      score:     1.0
  • Loss: AnglELoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "pairwise_angle_sim"
    }
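
Each dataset is paired with its own loss. In sentence-transformers v3 this multi-dataset, multi-loss setup is expressed by passing matching dicts to the trainer; a minimal sketch with toy stand-ins for the real 261,250-sample datasets:

from datasets import Dataset
from sentence_transformers import SentenceTransformer, SentenceTransformerTrainer
from sentence_transformers.losses import CachedMultipleNegativesRankingLoss, AnglELoss

model = SentenceTransformer("nomic-ai/nomic-embed-text-v1.5", trust_remote_code=True)

# Toy stand-ins for the real training data
triplets = Dataset.from_dict({
    "anchor": ["search_query: boy siracha shirt"],
    "positive": ["search_document: Sriracha Distressed Label Graphic T-Shirt, Sriracha, Red"],
    "negative": ["search_document: Pho Sho"],
})
pairs = Dataset.from_dict({
    "sentence1": ["paint easel bulk"],
    "sentence2": ["Artecho Artist Easel Display Easel Stand, 2 Pack Metal Tripod Stand Easel"],
    "score": [1.0],
})

# Dataset names key into the loss dict
trainer = SentenceTransformerTrainer(
    model=model,
    train_dataset={"triplets": triplets, "pairs": pairs},
    loss={
        "triplets": CachedMultipleNegativesRankingLoss(model, scale=20.0),
        "pairs": AnglELoss(model, scale=20.0),
    },
)
trainer.train()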
    

Evaluation Datasets

triplets

  • Dataset: triplets
  • Size: 10,000 evaluation samples
  • Columns: anchor, positive, and negative
  • Approximate statistics based on the first 1000 samples:
      anchor:   string; min 7, mean 11.52, max 33 tokens
      positive: string; min 18, mean 42.45, max 113 tokens
      negative: string; min 15, mean 42.7, max 116 tokens
  • Samples:
      anchor:   search_query: non damaging eyelash glue
      positive: search_document: Professional Eyelash Extension Remover Gel - Quickly And Easily Remove Individual Semi Permanent False Lashes - Works With Even The Strongest Fake Eyelash Glue or Adhesive, BEAU LASHES, Clear
      negative: search_document: Premade Volume Eyelash Extensions 4D-D-0.10-14 Long Stem Premade Fans Eyelash Extensions C D Curl Volume Lash Extensions Pre made Lash Fans(4D-D-0.10, 14mm), B&Qaugen, 4D-0.10-D

      anchor:   search_query: christmas tablecloths for rectangle tables 60 x 120 gold
      positive: search_document: Aquazolax Damask Tablecloth for Rectangle Table 60 x 120 Damask Foliate Pattern Jacquard Heavy Weight Fabric Table Overlay, Gold, Aquazolax, 02 - Gold
      negative: search_document: Benson Mills Harmony Scroll Woven Damask Fabric Tablecloth (60" X 104" Rectangular, Gold), Benson Mills, Gold

      anchor:   search_query: #10 standard no tint no window not self seal
      positive: search_document: #10 Security Tinted Self-Seal Envelopes - No Window - EnveGuard, Size 4-1/8 X 9-1/2 Inches - White - 24 LB - 100 Count (34100), Aimoh, White
      negative: search_document: Chalktastic Liquid Chalk Markers for Kids - Set of 8 Washable, Dry Erase Pens for School, Menu Board & Car Window Glass - Neon, Erasable Chalkboard Pen Pack - Gifts for Artists, Chalktastic, Classic
  • Loss: CachedMultipleNegativesRankingLoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "cos_sim"
    }
    

pairs

  • Dataset: pairs
  • Size: 10,000 evaluation samples
  • Columns: sentence1, sentence2, and score
  • Approximate statistics based on the first 1000 samples:
      sentence1: string; min 3, mean 6.8, max 34 tokens
      sentence2: string; min 9, mean 39.7, max 101 tokens
      score:     float; min 0.0, mean 0.73, max 1.0
  • Samples:
      sentence1: outdoor ceiling fans without light
      sentence2: 44" Plaza Industrial Indoor Outdoor Ceiling Fan with Remote Control Oil Rubbed Bronze Damp Rated for Patio Porch - Casa Vieja, Casa Vieja, No Light Kit - Bronze
      score:     1.0

      sentence1: bathroom cabinet
      sentence2: Homfa Bathroom Floor Cabinet Free Standing with Single Door Multifunctional Bathroom Storage Organizer Toiletries(Ivory White), Homfa, White
      score:     1.0

      sentence1: fitbit charge 3
      sentence2: TreasureMax Compatible with Fitbit Charge 2 Bands for Women/Men,Silicone Fadeless Pattern Printed Replacement Floral Bands for Fitbit Charge 2 HR Wristbands, TreasureMax, Paw 2
      score:     0.2
  • Loss: AnglELoss with these parameters:
    {
        "scale": 20.0,
        "similarity_fct": "pairwise_angle_sim"
    }
    

Training Hyperparameters

Non-Default Hyperparameters

  • per_device_train_batch_size: 4
  • per_device_eval_batch_size: 4
  • gradient_accumulation_steps: 4
  • learning_rate: 1e-05
  • num_train_epochs: 5
  • lr_scheduler_type: cosine_with_restarts
  • warmup_ratio: 0.1
  • dataloader_drop_last: True
  • dataloader_num_workers: 4
  • dataloader_prefetch_factor: 2
  • load_best_model_at_end: True
  • batch_sampler: no_duplicates
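
These map directly onto SentenceTransformerTrainingArguments; a sketch under that assumption (output_dir is a placeholder):

from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="output",  # placeholder
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    gradient_accumulation_steps=4,
    learning_rate=1e-5,
    num_train_epochs=5,
    lr_scheduler_type="cosine_with_restarts",
    warmup_ratio=0.1,
    dataloader_drop_last=True,
    dataloader_num_workers=4,
    dataloader_prefetch_factor=2,
    load_best_model_at_end=True,  # also requires matching eval/save strategies
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)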

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • prediction_loss_only: True
  • per_device_train_batch_size: 4
  • per_device_eval_batch_size: 4
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 4
  • eval_accumulation_steps: None
  • learning_rate: 1e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 5
  • max_steps: -1
  • lr_scheduler_type: cosine_with_restarts
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: False
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: True
  • dataloader_num_workers: 4
  • dataloader_prefetch_factor: 2
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: True
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch Step Training Loss pairs loss triplets loss cosine_accuracy spearman_cosine
0.0031 100 0.9224 - - - -
0.0061 200 0.9823 - - - -
0.0092 300 0.906 - - - -
0.0122 400 0.9692 - - - -
0.0153 500 1.0174 - - - -
0.0184 600 0.9488 - - - -
0.0214 700 0.9094 - - - -
0.0245 800 1.086 - - - -
0.0276 900 0.9104 - - - -
0.0306 1000 0.8288 1.3267 0.7466 0.6776 0.3661
0.0337 1100 0.9905 - - - -
0.0367 1200 0.9511 - - - -
0.0398 1300 0.894 - - - -
0.0429 1400 0.7935 - - - -
0.0459 1500 0.9212 - - - -
0.0490 1600 0.846 - - - -
0.0521 1700 0.9323 - - - -
0.0551 1800 0.8216 - - - -
0.0582 1900 0.7616 - - - -
0.0612 2000 0.8028 1.1716 0.6940 0.6942 0.4072
0.0643 2100 0.8196 - - - -
0.0674 2200 0.8022 - - - -
0.0704 2300 0.814 - - - -
0.0735 2400 0.8388 - - - -
0.0766 2500 0.7658 - - - -
0.0796 2600 0.7226 - - - -
0.0827 2700 0.7802 - - - -
0.0857 2800 0.8148 - - - -
0.0888 2900 0.7444 - - - -
0.0919 3000 0.7463 1.0475 0.6718 0.7019 0.4410
0.0949 3100 0.7129 - - - -
0.0980 3200 0.6884 - - - -
0.1011 3300 0.7072 - - - -
0.1041 3400 0.7956 - - - -
0.1072 3500 0.7932 - - - -
0.1102 3600 0.6843 - - - -
0.1133 3700 0.8722 - - - -
0.1164 3800 0.6767 - - - -
0.1194 3900 0.6905 - - - -
0.1225 4000 0.7022 1.0538 0.6663 0.706 0.4501
0.1256 4100 0.6574 - - - -
0.1286 4200 0.8011 - - - -
0.1317 4300 0.6902 - - - -
0.1347 4400 0.836 - - - -
0.1378 4500 0.6457 - - - -
0.1409 4600 0.6786 - - - -
0.1439 4700 0.7356 - - - -
0.1470 4800 0.8078 - - - -
0.1500 4900 0.7157 - - - -
0.1531 5000 0.6629 1.0507 0.6669 0.7108 0.4493
0.1562 5100 0.7387 - - - -
0.1592 5200 0.7108 - - - -
0.1623 5300 0.6361 - - - -
0.1654 5400 0.6931 - - - -
0.1684 5500 0.7409 - - - -
0.1715 5600 0.7645 - - - -
0.1745 5700 0.6577 - - - -
0.1776 5800 0.7284 - - - -
0.1807 5900 0.6774 - - - -
0.1837 6000 0.7187 1.0089 0.6612 0.7112 0.4569
0.1868 6100 0.6003 - - - -
0.1899 6200 0.7028 - - - -
0.1929 6300 0.7195 - - - -
0.1960 6400 0.6823 - - - -
0.1990 6500 0.6665 - - - -
0.2021 6600 0.6206 - - - -
0.2052 6700 0.6442 - - - -
0.2082 6800 0.7191 - - - -
0.2113 6900 0.6074 - - - -
0.2144 7000 0.6311 1.0315 0.6657 0.7109 0.4451
0.2174 7100 0.6444 - - - -
0.2205 7200 0.6475 - - - -
0.2235 7300 0.5911 - - - -
0.2266 7400 0.6709 - - - -
0.2297 7500 0.6306 - - - -
0.2327 7600 0.7122 - - - -
0.2358 7700 0.6461 - - - -
0.2389 7800 0.6899 - - - -
0.2419 7900 0.6413 - - - -
0.2450 8000 0.691 1.0058 0.6705 0.7158 0.4520
0.2480 8100 0.609 - - - -
0.2511 8200 0.6054 - - - -
0.2542 8300 0.61 - - - -
0.2572 8400 0.596 - - - -
0.2603 8500 0.6999 - - - -
0.2634 8600 0.5909 - - - -
0.2664 8700 0.5965 - - - -
0.2695 8800 0.5951 - - - -
0.2725 8900 0.6058 - - - -
0.2756 9000 0.5979 1.0287 0.6904 0.7142 0.4628
0.2787 9100 0.6249 - - - -
0.2817 9200 0.6261 - - - -
0.2848 9300 0.6365 - - - -
0.2878 9400 0.5699 - - - -
0.2909 9500 0.6675 - - - -
0.2940 9600 0.5806 - - - -
0.2970 9700 0.5832 - - - -
0.3001 9800 0.6135 - - - -
0.3032 9900 0.6005 - - - -
0.3062 10000 0.6079 1.0232 0.7137 0.7163 0.4686
0.3093 10100 0.6452 - - - -
0.3123 10200 0.5765 - - - -
0.3154 10300 0.619 - - - -
0.3185 10400 0.5154 - - - -
0.3215 10500 0.6142 - - - -
0.3246 10600 0.574 - - - -
0.3277 10700 0.5569 - - - -
0.3307 10800 0.6233 - - - -
0.3338 10900 0.6183 - - - -
0.3368 11000 0.5953 1.0279 0.7040 0.71 0.4571
0.3399 11100 0.5146 - - - -
0.3430 11200 0.6029 - - - -
0.3460 11300 0.6054 - - - -
0.3491 11400 0.6324 - - - -
0.3522 11500 0.5459 - - - -
0.3552 11600 0.5721 - - - -
0.3583 11700 0.5224 - - - -
0.3613 11800 0.5979 - - - -
0.3644 11900 0.5832 - - - -
0.3675 12000 0.5638 1.0372 0.7184 0.7127 0.4475
0.3705 12100 0.4945 - - - -
0.3736 12200 0.6368 - - - -
0.3767 12300 0.5071 - - - -
0.3797 12400 0.626 - - - -
0.3828 12500 0.5986 - - - -
0.3858 12600 0.539 - - - -
0.3889 12700 0.5167 - - - -
0.3920 12800 0.657 - - - -
0.3950 12900 0.5264 - - - -
0.3981 13000 0.4996 1.0362 0.7276 0.7162 0.4534
0.4012 13100 0.4991 - - - -
0.4042 13200 0.5725 - - - -
0.4073 13300 0.5788 - - - -
0.4103 13400 0.5293 - - - -
0.4134 13500 0.5192 - - - -
0.4165 13600 0.5477 - - - -
0.4195 13700 0.5151 - - - -
0.4226 13800 0.5121 - - - -
0.4256 13900 0.6849 - - - -
0.4287 14000 0.5508 1.0062 0.7397 0.7103 0.4571
0.4318 14100 0.55 - - - -
0.4348 14200 0.5041 - - - -
0.4379 14300 0.5041 - - - -
0.4410 14400 0.5198 - - - -
0.4440 14500 0.5354 - - - -
0.4471 14600 0.5535 - - - -
0.4501 14700 0.5368 - - - -
0.4532 14800 0.5379 - - - -
0.4563 14900 0.47 - - - -
0.4593 15000 0.5567 1.0441 0.7531 0.715 0.4516
0.4624 15100 0.5157 - - - -
0.4655 15200 0.5698 - - - -
0.4685 15300 0.5436 - - - -
0.4716 15400 0.6344 - - - -
0.4746 15500 0.4351 - - - -
0.4777 15600 0.5286 - - - -
0.4808 15700 0.552 - - - -
0.4838 15800 0.508 - - - -
0.4869 15900 0.5111 - - - -
0.4900 16000 0.5411 1.0264 0.7570 0.7106 0.4506
0.4930 16100 0.5363 - - - -
0.4961 16200 0.5259 - - - -
0.4991 16300 0.5722 - - - -
0.5022 16400 0.5059 - - - -
0.5053 16500 0.5194 - - - -
0.5083 16600 0.5099 - - - -
0.5114 16700 0.4857 - - - -
0.5145 16800 0.4585 - - - -
0.5175 16900 0.5366 - - - -
0.5206 17000 0.4825 1.0299 0.7740 0.7048 0.4504
0.5236 17100 0.543 - - - -
0.5267 17200 0.5022 - - - -
0.5298 17300 0.4399 - - - -
0.5328 17400 0.5342 - - - -
0.5359 17500 0.5064 - - - -
0.5390 17600 0.5978 - - - -
0.5420 17700 0.4947 - - - -
0.5451 17800 0.4974 - - - -
0.5481 17900 0.5555 - - - -
0.5512 18000 0.5397 1.0885 0.7564 0.7143 0.4493
0.5543 18100 0.4415 - - - -
0.5573 18200 0.3887 - - - -
0.5604 18300 0.4956 - - - -
0.5634 18400 0.471 - - - -
0.5665 18500 0.4671 - - - -
0.5696 18600 0.4279 - - - -
0.5726 18700 0.5509 - - - -
0.5757 18800 0.5135 - - - -
0.5788 18900 0.595 - - - -
0.5818 19000 0.4531 1.0569 0.7628 0.708 0.4528
0.5849 19100 0.4926 - - - -
0.5879 19200 0.5718 - - - -
0.5910 19300 0.4963 - - - -
0.5941 19400 0.5222 - - - -
0.5971 19500 0.4079 - - - -
0.6002 19600 0.4662 - - - -
0.6033 19700 0.4838 - - - -
0.6063 19800 0.5238 - - - -
0.6094 19900 0.5475 - - - -
0.6124 20000 0.4 1.0716 0.7965 0.709 0.4691
0.6155 20100 0.5323 - - - -
0.6186 20200 0.4544 - - - -
0.6216 20300 0.4556 - - - -
0.6247 20400 0.5716 - - - -
0.6278 20500 0.5538 - - - -
0.6308 20600 0.4546 - - - -
0.6339 20700 0.4146 - - - -
0.6369 20800 0.4811 - - - -
0.6400 20900 0.4577 - - - -
0.6431 21000 0.4901 1.0721 0.7903 0.7084 0.4594
0.6461 21100 0.4999 - - - -
0.6492 21200 0.3999 - - - -
0.6523 21300 0.4587 - - - -
0.6553 21400 0.4737 - - - -
0.6584 21500 0.4913 - - - -
0.6614 21600 0.4612 - - - -
0.6645 21700 0.432 - - - -
0.6676 21800 0.4627 - - - -
0.6706 21900 0.5023 - - - -
0.6737 22000 0.4486 1.0888 0.7913 0.7056 0.4618
0.6768 22100 0.5068 - - - -
0.6798 22200 0.4843 - - - -
0.6829 22300 0.4687 - - - -
0.6859 22400 0.5123 - - - -
0.6890 22500 0.3802 - - - -
0.6921 22600 0.4883 - - - -
0.6951 22700 0.5069 - - - -
0.6982 22800 0.4859 - - - -
0.7012 22900 0.3931 - - - -
0.7043 23000 0.4675 1.1026 0.8213 0.7051 0.4513
0.7074 23100 0.4948 - - - -
0.7104 23200 0.4561 - - - -
0.7135 23300 0.3874 - - - -
0.7166 23400 0.4909 - - - -
0.7196 23500 0.521 - - - -
0.7227 23600 0.4997 - - - -
0.7257 23700 0.4104 - - - -
0.7288 23800 0.4801 - - - -
0.7319 23900 0.5237 - - - -
0.7349 24000 0.3782 1.0715 0.7962 0.7098 0.4583
0.7380 24100 0.493 - - - -
0.7411 24200 0.489 - - - -
0.7441 24300 0.4797 - - - -
0.7472 24400 0.4636 - - - -
0.7502 24500 0.406 - - - -
0.7533 24600 0.3765 - - - -
0.7564 24700 0.4746 - - - -
0.7594 24800 0.447 - - - -
0.7625 24900 0.5286 - - - -
0.7656 25000 0.4814 1.0794 0.7977 0.7134 0.4614
0.7686 25100 0.505 - - - -
0.7717 25200 0.4508 - - - -
0.7747 25300 0.4317 - - - -
0.7778 25400 0.5088 - - - -
0.7809 25500 0.3931 - - - -
0.7839 25600 0.4516 - - - -
0.7870 25700 0.4394 - - - -
0.7901 25800 0.4825 - - - -
0.7931 25900 0.4248 - - - -
0.7962 26000 0.4215 1.0887 0.8159 0.7065 0.4719
0.7992 26100 0.4674 - - - -
0.8023 26200 0.4634 - - - -
0.8054 26300 0.3975 - - - -
0.8084 26400 0.402 - - - -
0.8115 26500 0.4652 - - - -
0.8146 26600 0.487 - - - -
0.8176 26700 0.4677 - - - -
0.8207 26800 0.4662 - - - -
0.8237 26900 0.4658 - - - -
0.8268 27000 0.4922 1.0792 0.8019 0.7126 0.4544
0.8299 27100 0.4551 - - - -
0.8329 27200 0.4052 - - - -
0.8360 27300 0.3713 - - - -
0.8390 27400 0.4247 - - - -
0.8421 27500 0.4167 - - - -
0.8452 27600 0.4035 - - - -
0.8482 27700 0.5203 - - - -
0.8513 27800 0.4768 - - - -
0.8544 27900 0.4085 - - - -
0.8574 28000 0.3793 1.0920 0.7942 0.7146 0.4630
0.8605 28100 0.4188 - - - -
0.8635 28200 0.4492 - - - -
0.8666 28300 0.4534 - - - -
0.8697 28400 0.4188 - - - -
0.8727 28500 0.5298 - - - -
0.8758 28600 0.4907 - - - -
0.8789 28700 0.4415 - - - -
0.8819 28800 0.4436 - - - -
0.8850 28900 0.4105 - - - -
0.8880 29000 0.5498 1.0937 0.8023 0.7127 0.4492
0.8911 29100 0.4478 - - - -
0.8942 29200 0.4467 - - - -
0.8972 29300 0.3691 - - - -
0.9003 29400 0.358 - - - -
0.9034 29500 0.4101 - - - -
0.9064 29600 0.4568 - - - -
0.9095 29700 0.4776 - - - -
0.9125 29800 0.3909 - - - -
0.9156 29900 0.4731 - - - -
0.9187 30000 0.4407 1.1511 0.8187 0.7131 0.4423
0.9217 30100 0.5712 - - - -
0.9248 30200 0.457 - - - -
0.9279 30300 0.4141 - - - -
0.9309 30400 0.4779 - - - -
0.9340 30500 0.418 - - - -
0.9370 30600 0.4377 - - - -
0.9401 30700 0.3997 - - - -
0.9432 30800 0.3443 - - - -
0.9462 30900 0.5006 - - - -
0.9493 31000 0.4728 1.1302 0.8141 0.7137 0.4555
0.9524 31100 0.5103 - - - -
0.9554 31200 0.3898 - - - -
0.9585 31300 0.4132 - - - -
0.9615 31400 0.4567 - - - -
0.9646 31500 0.4226 - - - -
0.9677 31600 0.3669 - - - -
0.9707 31700 0.4707 - - - -
0.9738 31800 0.5012 - - - -
0.9768 31900 0.4114 - - - -
0.9799 32000 0.3666 1.1309 0.8225 0.7102 0.4632
0.9830 32100 0.4514 - - - -
0.9860 32200 0.4329 - - - -
0.9891 32300 0.4559 - - - -
0.9922 32400 0.412 - - - -
0.9952 32500 0.3883 - - - -
0.9983 32600 0.3854 - - - -
1.0013 32700 0.3886 - - - -
1.0044 32800 0.41 - - - -
1.0075 32900 0.4494 - - - -
1.0105 33000 0.4862 1.1124 0.8362 0.7079 0.4494
1.0136 33100 0.3951 - - - -
1.0167 33200 0.4714 - - - -
1.0197 33300 0.4037 - - - -
1.0228 33400 0.4534 - - - -
1.0258 33500 0.5265 - - - -
1.0289 33600 0.4432 - - - -
1.0320 33700 0.3665 - - - -
1.0350 33800 0.4235 - - - -
1.0381 33900 0.3905 - - - -
1.0412 34000 0.3532 1.1693 0.8203 0.7142 0.4420
1.0442 34100 0.3472 - - - -
1.0473 34200 0.4316 - - - -
1.0503 34300 0.3811 - - - -
1.0534 34400 0.4753 - - - -
1.0565 34500 0.3757 - - - -
1.0595 34600 0.417 - - - -
1.0626 34700 0.3727 - - - -
1.0657 34800 0.4127 - - - -
1.0687 34900 0.4487 - - - -
1.0718 35000 0.3786 1.1073 0.8310 0.7159 0.4678
1.0748 35100 0.4043 - - - -
1.0779 35200 0.4226 - - - -
1.0810 35300 0.3585 - - - -
1.0840 35400 0.407 - - - -
1.0871 35500 0.4682 - - - -
1.0902 35600 0.3273 - - - -
1.0932 35700 0.3594 - - - -
1.0963 35800 0.3795 - - - -
1.0993 35900 0.3633 - - - -
1.1024 36000 0.3729 1.1356 0.8248 0.7169 0.4525
1.1055 36100 0.4179 - - - -
1.1085 36200 0.3907 - - - -
1.1116 36300 0.4495 - - - -
1.1146 36400 0.4093 - - - -
1.1177 36500 0.327 - - - -
1.1208 36600 0.2868 - - - -
1.1238 36700 0.2917 - - - -
1.1269 36800 0.3753 - - - -
1.1300 36900 0.3508 - - - -
1.1330 37000 0.4483 1.1865 0.8488 0.7035 0.4559
1.1361 37100 0.4439 - - - -
1.1391 37200 0.3225 - - - -
1.1422 37300 0.401 - - - -
1.1453 37400 0.3858 - - - -
1.1483 37500 0.4877 - - - -
1.1514 37600 0.3456 - - - -
1.1545 37700 0.3827 - - - -
1.1575 37800 0.4412 - - - -
1.1606 37900 0.3679 - - - -
1.1636 38000 0.3465 1.1654 0.8383 0.7095 0.4498
1.1667 38100 0.3433 - - - -
1.1698 38200 0.3745 - - - -
1.1728 38300 0.3902 - - - -
1.1759 38400 0.2779 - - - -
1.1790 38500 0.3916 - - - -
1.1820 38600 0.346 - - - -
1.1851 38700 0.3742 - - - -
1.1881 38800 0.3424 - - - -
1.1912 38900 0.4042 - - - -
1.1943 39000 0.2993 1.2051 0.8313 0.7106 0.4571
1.1973 39100 0.3167 - - - -
1.2004 39200 0.3291 - - - -
1.2035 39300 0.245 - - - -
1.2065 39400 0.3289 - - - -
1.2096 39500 0.3969 - - - -
1.2126 39600 0.2511 - - - -
1.2157 39700 0.2972 - - - -
1.2188 39800 0.3434 - - - -
1.2218 39900 0.324 - - - -
1.2249 40000 0.2837 1.2372 0.8453 0.7121 0.4562
1.2280 40100 0.2727 - - - -
1.2310 40200 0.3327 - - - -
1.2341 40300 0.3468 - - - -
1.2371 40400 0.3029 - - - -
1.2402 40500 0.3583 - - - -
1.2433 40600 0.3664 - - - -
1.2463 40700 0.2661 - - - -
1.2494 40800 0.2768 - - - -
1.2524 40900 0.3065 - - - -
1.2555 41000 0.309 1.2609 0.8644 0.704 0.4467
1.2586 41100 0.377 - - - -
1.2616 41200 0.3031 - - - -
1.2647 41300 0.2317 - - - -
1.2678 41400 0.2504 - - - -
1.2708 41500 0.2546 - - - -
1.2739 41600 0.2859 - - - -
1.2769 41700 0.3507 - - - -
1.2800 41800 0.2578 - - - -
1.2831 41900 0.297 - - - -
1.2861 42000 0.3016 1.2546 0.8479 0.7115 0.4578
1.2892 42100 0.2067 - - - -
1.2923 42200 0.3729 - - - -
1.2953 42300 0.2365 - - - -
1.2984 42400 0.2855 - - - -
1.3014 42500 0.2272 - - - -
1.3045 42600 0.2688 - - - -
1.3076 42700 0.2285 - - - -
1.3106 42800 0.2615 - - - -
1.3137 42900 0.2599 - - - -
1.3168 43000 0.2968 1.2860 0.8800 0.7071 0.4599
1.3198 43100 0.2464 - - - -
1.3229 43200 0.2673 - - - -
1.3259 43300 0.2108 - - - -
1.3290 43400 0.2353 - - - -
1.3321 43500 0.2396 - - - -
1.3351 43600 0.237 - - - -
1.3382 43700 0.2083 - - - -
1.3413 43800 0.2638 - - - -
1.3443 43900 0.2888 - - - -
1.3474 44000 0.3166 1.2710 0.8729 0.7052 0.4501
1.3504 44100 0.1949 - - - -
1.3535 44200 0.2285 - - - -
1.3566 44300 0.1923 - - - -
1.3596 44400 0.1875 - - - -
1.3627 44500 0.2736 - - - -
1.3658 44600 0.2154 - - - -
1.3688 44700 0.1975 - - - -
1.3719 44800 0.1799 - - - -
1.3749 44900 0.2417 - - - -
1.3780 45000 0.3224 1.3032 0.8788 0.7072 0.4521
1.3811 45100 0.2433 - - - -
1.3841 45200 0.269 - - - -
1.3872 45300 0.2034 - - - -
1.3902 45400 0.236 - - - -
1.3933 45500 0.2599 - - - -
1.3964 45600 0.1798 - - - -
1.3994 45700 0.1412 - - - -
1.4025 45800 0.215 - - - -
1.4056 45900 0.2081 - - - -
1.4086 46000 0.2277 1.2555 0.8621 0.7075 0.4577
1.4117 46100 0.2005 - - - -
1.4147 46200 0.2051 - - - -
1.4178 46300 0.1588 - - - -
1.4209 46400 0.2318 - - - -
1.4239 46500 0.205 - - - -
1.4270 46600 0.2404 - - - -
1.4301 46700 0.2167 - - - -
1.4331 46800 0.1729 - - - -
1.4362 46900 0.1866 - - - -
1.4392 47000 0.2168 1.3006 0.8624 0.7094 0.4562
1.4423 47100 0.1615 - - - -
1.4454 47200 0.2104 - - - -
1.4484 47300 0.2051 - - - -
1.4515 47400 0.1904 - - - -
1.4546 47500 0.1773 - - - -
1.4576 47600 0.1494 - - - -
1.4607 47700 0.1668 - - - -
1.4637 47800 0.1527 - - - -
1.4668 47900 0.1724 - - - -
1.4699 48000 0.1707 1.3098 0.8911 0.7093 0.4421
1.4729 48100 0.2147 - - - -
1.4760 48200 0.1513 - - - -
1.4791 48300 0.2049 - - - -
1.4821 48400 0.171 - - - -
1.4852 48500 0.1283 - - - -
1.4882 48600 0.1768 - - - -
1.4913 48700 0.172 - - - -
1.4944 48800 0.2131 - - - -
1.4974 48900 0.1621 - - - -
1.5005 49000 0.1941 1.3623 0.8859 0.7091 0.4501
1.5036 49100 0.1493 - - - -
1.5066 49200 0.1544 - - - -
1.5097 49300 0.1524 - - - -
1.5127 49400 0.1137 - - - -
1.5158 49500 0.1611 - - - -
1.5189 49600 0.1396 - - - -
1.5219 49700 0.1462 - - - -
1.5250 49800 0.1261 - - - -
1.5280 49900 0.122 - - - -
1.5311 50000 0.1478 1.3027 0.8766 0.7093 0.4555
1.5342 50100 0.1324 - - - -
1.5372 50200 0.1468 - - - -
1.5403 50300 0.1795 - - - -
1.5434 50400 0.1308 - - - -
1.5464 50500 0.1796 - - - -
1.5495 50600 0.2207 - - - -
1.5525 50700 0.1383 - - - -
1.5556 50800 0.0884 - - - -
1.5587 50900 0.1208 - - - -
1.5617 51000 0.1139 1.4073 0.9156 0.7061 0.4502
1.5648 51100 0.169 - - - -
1.5679 51200 0.1142 - - - -
1.5709 51300 0.1269 - - - -
1.5740 51400 0.1664 - - - -
1.5770 51500 0.1191 - - - -
1.5801 51600 0.2078 - - - -
1.5832 51700 0.1045 - - - -
1.5862 51800 0.1564 - - - -
1.5893 51900 0.219 - - - -
1.5924 52000 0.1308 1.3284 0.9085 0.7003 0.4439
1.5954 52100 0.1002 - - - -
1.5985 52200 0.1133 - - - -
1.6015 52300 0.1612 - - - -
1.6046 52400 0.1216 - - - -
1.6077 52500 0.1767 - - - -
1.6107 52600 0.1198 - - - -
1.6138 52700 0.1426 - - - -
1.6169 52800 0.1505 - - - -
1.6199 52900 0.1503 - - - -
1.6230 53000 0.161 1.3557 0.9038 0.7073 0.4517
1.6260 53100 0.1799 - - - -
1.6291 53200 0.1794 - - - -
1.6322 53300 0.1527 - - - -
1.6352 53400 0.1093 - - - -
1.6383 53500 0.1338 - - - -
1.6414 53600 0.1515 - - - -
1.6444 53700 0.1415 - - - -
1.6475 53800 0.1083 - - - -
1.6505 53900 0.0896 - - - -
1.6536 54000 0.1524 1.4412 0.9069 0.7047 0.4428
1.6567 54100 0.1153 - - - -
1.6597 54200 0.1643 - - - -
1.6628 54300 0.0891 - - - -
1.6659 54400 0.1331 - - - -
1.6689 54500 0.14 - - - -
1.6720 54600 0.2027 - - - -
1.6750 54700 0.112 - - - -
1.6781 54800 0.1932 - - - -
1.6812 54900 0.1298 - - - -
1.6842 55000 0.1509 1.3844 0.8949 0.7094 0.4458
1.6873 55100 0.113 - - - -
1.6903 55200 0.1516 - - - -
1.6934 55300 0.1523 - - - -
1.6965 55400 0.1627 - - - -
1.6995 55500 0.1142 - - - -
1.7026 55600 0.1054 - - - -
1.7057 55700 0.1438 - - - -
1.7087 55800 0.0908 - - - -
1.7118 55900 0.1311 - - - -
1.7148 56000 0.0691 1.4079 0.9229 0.7051 0.4484
1.7179 56100 0.1617 - - - -
1.7210 56200 0.1709 - - - -
1.7240 56300 0.102 - - - -
1.7271 56400 0.1384 - - - -
1.7302 56500 0.1339 - - - -
1.7332 56600 0.1961 - - - -
1.7363 56700 0.1549 - - - -
1.7393 56800 0.1545 - - - -
1.7424 56900 0.1175 - - - -
1.7455 57000 0.1447 1.4055 0.9385 0.7006 0.4433
1.7485 57100 0.1392 - - - -
1.7516 57200 0.0765 - - - -
1.7547 57300 0.1444 - - - -
1.7577 57400 0.1617 - - - -
1.7608 57500 0.164 - - - -
1.7638 57600 0.1584 - - - -
1.7669 57700 0.1613 - - - -
1.7700 57800 0.1381 - - - -
1.7730 57900 0.132 - - - -
1.7761 58000 0.1373 1.4008 0.9141 0.7088 0.4456
1.7792 58100 0.1018 - - - -
1.7822 58200 0.0882 - - - -
1.7853 58300 0.1232 - - - -
1.7883 58400 0.1111 - - - -
1.7914 58500 0.0985 - - - -
1.7945 58600 0.1063 - - - -
1.7975 58700 0.0696 - - - -
1.8006 58800 0.113 - - - -
1.8037 58900 0.1048 - - - -
1.8067 59000 0.1305 1.4202 0.9253 0.7046 0.4450
1.8098 59100 0.1203 - - - -
1.8128 59200 0.0975 - - - -
1.8159 59300 0.1163 - - - -
1.8190 59400 0.163 - - - -
1.8220 59500 0.1438 - - - -
1.8251 59600 0.1465 - - - -
1.8281 59700 0.1345 - - - -
1.8312 59800 0.1726 - - - -
1.8343 59900 0.1268 - - - -
1.8373 60000 0.0755 1.4523 0.9355 0.7059 0.4424
1.8404 60100 0.1033 - - - -
1.8435 60200 0.1231 - - - -
1.8465 60300 0.1272 - - - -
1.8496 60400 0.1233 - - - -
1.8526 60500 0.1144 - - - -
1.8557 60600 0.1158 - - - -
1.8588 60700 0.1266 - - - -
1.8618 60800 0.0837 - - - -
1.8649 60900 0.1247 - - - -
1.8680 61000 0.1297 1.4443 0.9315 0.7037 0.4498
1.8710 61100 0.1014 - - - -
1.8741 61200 0.127 - - - -
1.8771 61300 0.128 - - - -
1.8802 61400 0.1021 - - - -
1.8833 61500 0.1625 - - - -
1.8863 61600 0.1177 - - - -
1.8894 61700 0.1241 - - - -
1.8925 61800 0.1289 - - - -
1.8955 61900 0.1144 - - - -
1.8986 62000 0.0968 1.4650 0.9320 0.7012 0.4421
1.9016 62100 0.0951 - - - -
1.9047 62200 0.1262 - - - -
1.9078 62300 0.1387 - - - -
1.9108 62400 0.129 - - - -
1.9139 62500 0.088 - - - -
1.9170 62600 0.1166 - - - -
1.9200 62700 0.1536 - - - -
1.9231 62800 0.1216 - - - -
1.9261 62900 0.1326 - - - -
1.9292 63000 0.1014 1.4315 0.9462 0.6982 0.4377
1.9323 63100 0.1152 - - - -
1.9353 63200 0.0821 - - - -
1.9384 63300 0.1374 - - - -
1.9415 63400 0.0827 - - - -
1.9445 63500 0.1104 - - - -
1.9476 63600 0.1578 - - - -
1.9506 63700 0.1232 - - - -
1.9537 63800 0.1482 - - - -
1.9568 63900 0.1156 - - - -
1.9598 64000 0.1177 1.4263 0.9434 0.706 0.4420
1.9629 64100 0.1074 - - - -
1.9659 64200 0.1385 - - - -
1.9690 64300 0.1083 - - - -
1.9721 64400 0.1138 - - - -
1.9751 64500 0.1383 - - - -
1.9782 64600 0.0786 - - - -
1.9813 64700 0.1043 - - - -
1.9843 64800 0.1112 - - - -
1.9874 64900 0.1237 - - - -
1.9904 65000 0.1073 1.3901 0.9587 0.7002 0.4438
1.9935 65100 0.1174 - - - -
1.9966 65200 0.1091 - - - -
1.9996 65300 0.1143 - - - -
2.0027 65400 0.1044 - - - -
2.0058 65500 0.1279 - - - -
2.0088 65600 0.13 - - - -
2.0119 65700 0.1299 - - - -
2.0149 65800 0.1017 - - - -
2.0180 65900 0.124 - - - -
2.0211 66000 0.1062 1.4390 0.9290 0.704 0.4440
2.0241 66100 0.1634 - - - -
2.0272 66200 0.1149 - - - -
2.0303 66300 0.0682 - - - -
2.0333 66400 0.1386 - - - -
2.0364 66500 0.0861 - - - -
2.0394 66600 0.0669 - - - -
2.0425 66700 0.0944 - - - -
2.0456 66800 0.1332 - - - -
2.0486 66900 0.0884 - - - -
2.0517 67000 0.122 1.5088 0.9513 0.7063 0.4492
2.0548 67100 0.0934 - - - -
2.0578 67200 0.102 - - - -
2.0609 67300 0.1402 - - - -
2.0639 67400 0.1394 - - - -
2.0670 67500 0.1067 - - - -
2.0701 67600 0.1052 - - - -
2.0731 67700 0.1267 - - - -
2.0762 67800 0.1048 - - - -
2.0793 67900 0.0962 - - - -
2.0823 68000 0.0929 1.4530 0.9247 0.7109 0.4491
2.0854 68100 0.1298 - - - -
2.0884 68200 0.1332 - - - -
2.0915 68300 0.0913 - - - -
2.0946 68400 0.0843 - - - -
2.0976 68500 0.0846 - - - -
2.1007 68600 0.1142 - - - -
2.1037 68700 0.1403 - - - -
2.1068 68800 0.0961 - - - -
2.1099 68900 0.0984 - - - -
2.1129 69000 0.1509 1.4343 0.9552 0.7039 0.4382
2.1160 69100 0.0947 - - - -
2.1191 69200 0.0877 - - - -
2.1221 69300 0.0786 - - - -
2.1252 69400 0.0754 - - - -
2.1282 69500 0.0765 - - - -
2.1313 69600 0.0632 - - - -
2.1344 69700 0.1792 - - - -
2.1374 69800 0.0666 - - - -
2.1405 69900 0.1225 - - - -
2.1436 70000 0.0922 1.4291 0.9393 0.7053 0.4357
2.1466 70100 0.126 - - - -
2.1497 70200 0.0991 - - - -
2.1527 70300 0.0759 - - - -
2.1558 70400 0.1024 - - - -
2.1589 70500 0.0894 - - - -
2.1619 70600 0.113 - - - -
2.1650 70700 0.1084 - - - -
2.1681 70800 0.1013 - - - -
2.1711 70900 0.111 - - - -
2.1742 71000 0.0965 1.3915 0.9477 0.7052 0.4437
2.1772 71100 0.0837 - - - -
2.1803 71200 0.0347 - - - -
2.1834 71300 0.1215 - - - -
2.1864 71400 0.0799 - - - -
2.1895 71500 0.1173 - - - -
2.1926 71600 0.0964 - - - -
2.1956 71700 0.1036 - - - -
2.1987 71800 0.0952 - - - -
2.2017 71900 0.0752 - - - -
2.2048 72000 0.0824 1.4657 0.9593 0.7011 0.4397
2.2079 72100 0.1081 - - - -
2.2109 72200 0.0718 - - - -
2.2140 72300 0.0644 - - - -
2.2171 72400 0.0919 - - - -
2.2201 72500 0.1099 - - - -
2.2232 72600 0.072 - - - -
2.2262 72700 0.0675 - - - -
2.2293 72800 0.0568 - - - -
2.2324 72900 0.0664 - - - -
2.2354 73000 0.0926 1.4526 0.9607 0.701 0.4383
2.2385 73100 0.1089 - - - -
2.2415 73200 0.1208 - - - -
2.2446 73300 0.0583 - - - -
2.2477 73400 0.0546 - - - -
2.2507 73500 0.086 - - - -
2.2538 73600 0.1029 - - - -
2.2569 73700 0.0803 - - - -
2.2599 73800 0.114 - - - -
2.2630 73900 0.0542 - - - -
2.2660 74000 0.0732 1.4112 0.9411 0.7028 0.4454
2.2691 74100 0.0641 - - - -
2.2722 74200 0.072 - - - -
2.2752 74300 0.0806 - - - -
2.2783 74400 0.0845 - - - -
2.2814 74500 0.0599 - - - -
2.2844 74600 0.069 - - - -
2.2875 74700 0.0808 - - - -
2.2905 74800 0.0903 - - - -
2.2936 74900 0.0693 - - - -
2.2967 75000 0.074 1.4487 0.9686 0.6945 0.4495
2.2997 75100 0.1261 - - - -
2.3028 75200 0.055 - - - -
2.3059 75300 0.0828 - - - -
2.3089 75400 0.0735 - - - -
2.3120 75500 0.0539 - - - -
2.3150 75600 0.0763 - - - -
2.3181 75700 0.0598 - - - -
2.3212 75800 0.0782 - - - -
2.3242 75900 0.0681 - - - -
2.3273 76000 0.0497 1.4493 0.9589 0.6972 0.4354
2.3304 76100 0.0353 - - - -
2.3334 76200 0.0669 - - - -
2.3365 76300 0.064 - - - -
2.3395 76400 0.0814 - - - -
2.3426 76500 0.0786 - - - -
2.3457 76600 0.091 - - - -
2.3487 76700 0.0861 - - - -
2.3518 76800 0.0445 - - - -
2.3549 76900 0.0589 - - - -
2.3579 77000 0.0318 1.4455 0.9647 0.7012 0.4390
2.3610 77100 0.0425 - - - -
2.3640 77200 0.0605 - - - -
2.3671 77300 0.0523 - - - -
2.3702 77400 0.0715 - - - -
2.3732 77500 0.0756 - - - -
2.3763 77600 0.0911 - - - -
2.3793 77700 0.1023 - - - -
2.3824 77800 0.0538 - - - -
2.3855 77900 0.0571 - - - -
2.3885 78000 0.0505 1.4434 0.9554 0.7048 0.4411
2.3916 78100 0.1114 - - - -
2.3947 78200 0.0368 - - - -
2.3977 78300 0.0636 - - - -
2.4008 78400 0.0419 - - - -
2.4038 78500 0.0691 - - - -
2.4069 78600 0.0814 - - - -
2.4100 78700 0.0644 - - - -
2.4130 78800 0.0584 - - - -
2.4161 78900 0.0745 - - - -
2.4192 79000 0.0558 1.4218 0.9631 0.7024 0.4399
2.4222 79100 0.0478 - - - -
2.4253 79200 0.1116 - - - -
2.4283 79300 0.0487 - - - -
2.4314 79400 0.0457 - - - -
2.4345 79500 0.0441 - - - -
2.4375 79600 0.037 - - - -
2.4406 79700 0.0382 - - - -
2.4437 79800 0.0453 - - - -
2.4467 79900 0.0625 - - - -
2.4498 80000 0.0649 1.4950 0.9400 0.7053 0.4516
2.4528 80100 0.0417 - - - -
2.4559 80200 0.03 - - - -
2.4590 80300 0.0281 - - - -
2.4620 80400 0.0637 - - - -
2.4651 80500 0.0415 - - - -
2.4682 80600 0.048 - - - -
2.4712 80700 0.0653 - - - -
2.4743 80800 0.0382 - - - -
2.4773 80900 0.0524 - - - -
2.4804 81000 0.0699 1.4317 0.9833 0.7024 0.4487
2.4835 81100 0.0728 - - - -
2.4865 81200 0.0346 - - - -
2.4896 81300 0.0448 - - - -
2.4927 81400 0.0425 - - - -
2.4957 81500 0.0941 - - - -
2.4988 81600 0.0385 - - - -
2.5018 81700 0.0802 - - - -
2.5049 81800 0.033 - - - -
2.5080 81900 0.0653 - - - -
2.5110 82000 0.0435 1.4630 0.9643 0.7006 0.4467
2.5141 82100 0.0301 - - - -
2.5171 82200 0.0392 - - - -
2.5202 82300 0.0402 - - - -
2.5233 82400 0.0584 - - - -
2.5263 82500 0.0334 - - - -
2.5294 82600 0.0409 - - - -
2.5325 82700 0.0587 - - - -
2.5355 82800 0.0412 - - - -
2.5386 82900 0.0477 - - - -
2.5416 83000 0.0485 1.4052 0.9665 0.7021 0.4382
2.5447 83100 0.0501 - - - -
2.5478 83200 0.0492 - - - -
2.5508 83300 0.0829 - - - -
2.5539 83400 0.0571 - - - -
2.5570 83500 0.0353 - - - -
2.5600 83600 0.0439 - - - -
2.5631 83700 0.0264 - - - -
2.5661 83800 0.0743 - - - -
2.5692 83900 0.0467 - - - -
2.5723 84000 0.0442 1.4663 0.9854 0.7022 0.4380
2.5753 84100 0.0461 - - - -
2.5784 84200 0.0542 - - - -
2.5815 84300 0.0842 - - - -
2.5845 84400 0.0361 - - - -
2.5876 84500 0.0577 - - - -
2.5906 84600 0.0345 - - - -
2.5937 84700 0.0296 - - - -
2.5968 84800 0.025 - - - -
2.5998 84900 0.0575 - - - -
2.6029 85000 0.0567 1.5179 0.9760 0.7008 0.4469
2.6060 85100 0.0528 - - - -
2.6090 85200 0.0437 - - - -
2.6121 85300 0.0291 - - - -
2.6151 85400 0.0689 - - - -
2.6182 85500 0.0328 - - - -
2.6213 85600 0.0473 - - - -
2.6243 85700 0.0682 - - - -
2.6274 85800 0.0544 - - - -
2.6305 85900 0.0641 - - - -
2.6335 86000 0.029 1.4558 0.9576 0.7072 0.4448
2.6366 86100 0.0364 - - - -
2.6396 86200 0.0365 - - - -
2.6427 86300 0.0514 - - - -
2.6458 86400 0.0417 - - - -
2.6488 86500 0.0309 - - - -
2.6519 86600 0.035 - - - -
2.6549 86700 0.044 - - - -
2.6580 86800 0.0694 - - - -
2.6611 86900 0.0194 - - - -
2.6641 87000 0.0373 1.5556 0.9819 0.7052 0.4339
2.6672 87100 0.0349 - - - -
2.6703 87200 0.0561 - - - -
2.6733 87300 0.0487 - - - -
2.6764 87400 0.0722 - - - -
2.6794 87500 0.0501 - - - -
2.6825 87600 0.0404 - - - -
2.6856 87700 0.0533 - - - -
2.6886 87800 0.0371 - - - -
2.6917 87900 0.0585 - - - -
2.6948 88000 0.0482 1.4305 0.9550 0.7053 0.4397
2.6978 88100 0.0731 - - - -
2.7009 88200 0.0312 - - - -
2.7039 88300 0.0339 - - - -
2.7070 88400 0.0348 - - - -
2.7101 88500 0.0509 - - - -
2.7131 88600 0.0343 - - - -
2.7162 88700 0.0282 - - - -
2.7193 88800 0.0518 - - - -
2.7223 88900 0.0569 - - - -
2.7254 89000 0.0427 1.4783 0.9810 0.7066 0.4263
2.7284 89100 0.0554 - - - -
2.7315 89200 0.0368 - - - -
2.7346 89300 0.0301 - - - -
2.7376 89400 0.0469 - - - -
2.7407 89500 0.0479 - - - -
2.7438 89600 0.0586 - - - -
2.7468 89700 0.0687 - - - -
2.7499 89800 0.0427 - - - -
2.7529 89900 0.0551 - - - -
2.7560 90000 0.0255 1.4759 0.9758 0.7011 0.4434
2.7591 90100 0.0348 - - - -
2.7621 90200 0.0536 - - - -
2.7652 90300 0.0554 - - - -
2.7683 90400 0.0367 - - - -
2.7713 90500 0.0185 - - - -
2.7744 90600 0.0498 - - - -
2.7774 90700 0.0296 - - - -
2.7805 90800 0.0132 - - - -
2.7836 90900 0.0607 - - - -
2.7866 91000 0.0303 1.4869 0.9817 0.704 0.4442
2.7897 91100 0.0463 - - - -
2.7927 91200 0.0302 - - - -
2.7958 91300 0.04 - - - -
2.7989 91400 0.0406 - - - -
2.8019 91500 0.0202 - - - -
2.8050 91600 0.0397 - - - -
2.8081 91700 0.0313 - - - -
2.8111 91800 0.0419 - - - -
2.8142 91900 0.055 - - - -
2.8172 92000 0.0377 1.5091 0.9752 0.7071 0.4336
2.8203 92100 0.0386 - - - -
2.8234 92200 0.0497 - - - -
2.8264 92300 0.061 - - - -
2.8295 92400 0.0683 - - - -
2.8326 92500 0.0846 - - - -
2.8356 92600 0.0121 - - - -
2.8387 92700 0.0268 - - - -
2.8417 92800 0.0205 - - - -
2.8448 92900 0.0414 - - - -
2.8479 93000 0.0443 1.5421 0.9798 0.708 0.4298
2.8509 93100 0.0365 - - - -
2.8540 93200 0.0431 - - - -
2.8571 93300 0.0254 - - - -
2.8601 93400 0.0348 - - - -
2.8632 93500 0.0408 - - - -
2.8662 93600 0.0481 - - - -
2.8693 93700 0.0303 - - - -
2.8724 93800 0.0512 - - - -
2.8754 93900 0.0563 - - - -
2.8785 94000 0.0506 1.6061 0.9909 0.7043 0.4392
2.8816 94100 0.0224 - - - -
2.8846 94200 0.0652 - - - -
2.8877 94300 0.0313 - - - -
2.8907 94400 0.0657 - - - -
2.8938 94500 0.0605 - - - -
2.8969 94600 0.0332 - - - -
2.8999 94700 0.0126 - - - -
2.9030 94800 0.0374 - - - -
2.9061 94900 0.051 - - - -
2.9091 95000 0.0477 1.5915 1.0124 0.702 0.4299

Framework Versions

  • Python: 3.10.12
  • Sentence Transformers: 3.0.0
  • Transformers: 4.38.2
  • PyTorch: 2.1.2+cu121
  • Accelerate: 0.27.2
  • Datasets: 2.19.1
  • Tokenizers: 0.15.2
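
To reproduce this environment, pinning the listed versions is the safest route, for example:

pip install sentence-transformers==3.0.0 transformers==4.38.2 torch==2.1.2 accelerate==0.27.2 datasets==2.19.1 tokenizers==0.15.2

(The exact CUDA build of PyTorch, 2.1.2+cu121 here, may require the matching extra index URL.)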

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}

CachedMultipleNegativesRankingLoss

@misc{gao2021scaling,
    title={Scaling Deep Contrastive Learning Batch Size under Memory Limited Setup}, 
    author={Luyu Gao and Yunyi Zhang and Jiawei Han and Jamie Callan},
    year={2021},
    eprint={2101.06983},
    archivePrefix={arXiv},
    primaryClass={cs.LG}
}

AnglELoss

@misc{li2023angleoptimized,
    title={AnglE-optimized Text Embeddings}, 
    author={Xianming Li and Jing Li},
    year={2023},
    eprint={2309.12871},
    archivePrefix={arXiv},
    primaryClass={cs.CL}
}