---
base_model: bobox/DeBERTa-small-ST-v1-test-step3
datasets: []
language: []
library_name: sentence-transformers
metrics:
  - pearson_cosine
  - spearman_cosine
  - pearson_manhattan
  - spearman_manhattan
  - pearson_euclidean
  - spearman_euclidean
  - pearson_dot
  - spearman_dot
  - pearson_max
  - spearman_max
pipeline_tag: sentence-similarity
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - generated_from_trainer
  - dataset_size:260034
  - loss:CachedGISTEmbedLoss
widget:
  - source_sentence: who used to present one man and his dog
    sentences:
      - >-
        One Man and His Dog One Man and His Dog is a BBC television series in
        the United Kingdom featuring sheepdog trials, originally presented by
        Phil Drabble, with commentary by Eric Halsall and, later, by Ray
        Ollerenshaw. It was first aired on 17 February 1976 and continues today
        (since 2013) as a special annual edition of Countryfile. In 1994, Robin
        Page replaced Drabble as the main presenter. Gus Dermody took over as
        commentator until 2012.
      - "animal adjectives [was: ratto, Ratte, raton] - Google Groups animal adjectives [was: ratto, Ratte, raton] Showing 1-9 of 9 messages While trying find the pronunciation of the word \"munger\", I encountered the nearby word \_ \_murine [MYOO-ryn] = relating to mice or rats \_ \_[from Latin _murinus_, which derives from _mus_, \_ \_mouse, whose genetive form is _muris_] So if you need an adjective to refer to lab rodents like _ratto_ or _mausu_, \"murine\" it is. (I would never have discovered this except in an alphabetically arranged dictionary.) There are a lot of animal adjectives of this type, such as ovine (sheep), equine (horse), bovine (bull, cow, calf), aquiline (eagle), murine (rats and mice). \_ But what is needed is a way to lookup an animal and find what the proper adjective is. \_For example, is there an adjective form for \"goat\"? for \"seal\"? for \"elephant\"? for \"whale\"? for \"walrus\"? By the way, I never did find out how \"munger\" is pronounced; the answer is not found in"
      - >-
        A boat is docked and filled with bicycles next to a grassy area on a
        body of water.
  - source_sentence: There were 29 Muslims fatalities in the Cave of the Patriarchs massacre .
    sentences:
      - >-
        Urban Dictionary: Dog and Bone Dog and Bone Cockney rhyming slang for
        phone - the telephone. ''Pick up the dog and bone now'' by Brendan April
        05, 2003 Create a mug The Urban Dictionary Mug One side has the word,
        one side has the definition. Microwave and dishwasher safe. Lotsa space
        for your liquids. Buy the t-shirt The Urban Dictionary T-Shirt Smooth,
        soft, slim fit American Apparel shirt. Custom printed. 100% fine jersey
        cotton, except for heather grey (90% cotton). ^Same as above except can
        be shortened further to 'Dogs' or just 'dog' Get on the dogs and give us
        a bell when your ready. by Phaze October 14, 2004
      - >-
        RAF College Cranwell - Local Area Information RAF College Cranwell Local
        Area Information Local Area Information RAF College Cranwell is situated
        in the North Kesteven District Council area in the heart of rural
        Lincolnshire, 5 miles from Sleaford and 14 miles from the City of
        Lincoln, surrounded by bustling market towns, picturesque villages and
        landscapes steeped in aviation history. Lincolnshire is currently home
        to several operational RAF airfields and was a key location during WWII
        for bomber stations. Museums, memorials, former airfields, heritage and
        visitor centres bear witness to the bravery of the men and women of this
        time. The ancient City of Lincoln dates back at least to Roman times and
        boasts a spectacular Cathedral and Castle area, whilst Sleaford is the
        home to the National Centre for Craft & Design. Please click on the Logo
        to access website
      - >-
        29 Muslims were killed and more than 100 others wounded . [   Settlers
        remember gunman Goldstein ; Hebron riots continue ] .
  - source_sentence: What requires energy for growth?
    sentences:
      - >-
        an organism requires energy for growth. Fish Fish are the ultimate
        aquatic organism. 
         a fish require energy for growth
      - >-
        In August , after the end of the war in June 1902 , Higgins Southampton
        left the `` SSBavarian '' and returned to Cape Town the following month
        .
      - >-
        Rhinestone Cowboy "Rhinestone Cowboy" is a song written by Larry Weiss
        and most famously recorded by American country music singer Glen
        Campbell. The song enjoyed huge popularity with both country and pop
        audiences when it was released in 1975.
  - source_sentence: Burning wood is used to produce what type of energy?
    sentences:
      - >-
        Shawnee Trails Council was formed from the merger of the Four Rivers
        Council and the Audubon Council .
      - A Mercedes parked next to a parking meter on a street.
      - |-
        burning wood is used to produce heat. Heat is kinetic energy. 
         burning wood is used to produce kinetic energy.
  - source_sentence: >-
      As of March , more than 413,000 cases have been confirmed in more than 190
      countries with more than 107,000 recoveries .
    sentences:
      - >-
        As of 24 March , more than 414,000 cases of COVID-19 have been reported
        in more than 190 countries and territories , resulting in more than
        18,500 deaths and more than 108,000 recoveries .
      - >-
        Pope Francis makes first visit as head of state to Italy's president -
        YouTube Pope Francis makes first visit as head of state to Italy's
        president Want to watch this again later? Sign in to add this video to a
        playlist. Need to report the video? Sign in to report inappropriate
        content. The interactive transcript could not be loaded. Loading...
        Rating is available when the video has been rented. This feature is not
        available right now. Please try again later. Published on Nov 14, 2013
        Pope Francis stepped out of the Vatican, several hundred feet into the
        heart of Rome, to meet with Italian President Giorgio Napolitano, and
        the country's Council of Ministers. . --------------------- Suscríbete
        al canal: http://smarturl.it/RomeReports Visita nuestra web:
        http://www.romereports.com/ ROME REPORTS, www.romereports.com, is an
        independent international TV News Agency based in Rome covering the
        activity of the Pope, the life of the Vatican and current social,
        cultural and religious debates. Reporting on the Catholic Church
        requires proximity to the source, in-depth knowledge of the Institution,
        and a high standard of creativity and technical excellence. As few
        broadcasters have a permanent correspondent in Rome, ROME REPORTS is
        geared to inform the public and meet the needs of television
        broadcasting companies around the world through daily news packages,
        weekly newsprograms and documentaries. ---------------------
      - >-
        German shepherds and retrievers are commonly used, but the Belgian
        Malinois has proven to be one of the most outstanding working dogs used
        in military service. Around 85 percent of military working dogs are
        purchased in Germany or the Netherlands, where they have been breeding
        dogs for military purposes for hundreds of years. In addition, the Air
        Force Security Forces Center, Army Veterinary Corps and the 341st
        Training Squadron combine efforts to raise their own dogs; nearly 15
        percent of all military working dogs are now bred here.
model-index:
  - name: SentenceTransformer based on bobox/DeBERTa-small-ST-v1-test-step3
    results:
      - task:
          type: semantic-similarity
          name: Semantic Similarity
        dataset:
          name: sts test
          type: sts-test
        metrics:
          - type: pearson_cosine
            value: 0.875643593885091
            name: Pearson Cosine
          - type: spearman_cosine
            value: 0.9063415240472948
            name: Spearman Cosine
          - type: pearson_manhattan
            value: 0.9077403211524888
            name: Pearson Manhattan
          - type: spearman_manhattan
            value: 0.9055112293832712
            name: Spearman Manhattan
          - type: pearson_euclidean
            value: 0.9077080621981075
            name: Pearson Euclidean
          - type: spearman_euclidean
            value: 0.9061498543947556
            name: Spearman Euclidean
          - type: pearson_dot
            value: 0.8591462310934479
            name: Pearson Dot
          - type: spearman_dot
            value: 0.8674279304506193
            name: Spearman Dot
          - type: pearson_max
            value: 0.9077403211524888
            name: Pearson Max
          - type: spearman_max
            value: 0.9063415240472948
            name: Spearman Max
---

SentenceTransformer based on bobox/DeBERTa-small-ST-v1-test-step3

This is a sentence-transformers model finetuned from bobox/DeBERTa-small-ST-v1-test-step3 on the bobox/enhanced_nli-50_k dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: bobox/DeBERTa-small-ST-v1-test-step3
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 768 dimensions
  • Similarity Function: Cosine Similarity
  • Training Dataset:
    • bobox/enhanced_nli-50_k

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: DebertaV2Model 
  (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
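
The pooling block above performs mean pooling over token embeddings (pooling_mode_mean_tokens: True). As a quick sanity check, the same configuration can be read off the loaded model; a minimal sketch:

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("bobox/DeBERTa-small-ST-v1-test-UnifiedDatasets-Ft2")
print(model.max_seq_length)                      # 512
print(model.get_sentence_embedding_dimension())  # 768
print(model[1].get_pooling_mode_str())           # 'mean'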

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("bobox/DeBERTa-small-ST-v1-test-UnifiedDatasets-Ft2")
# Run inference
sentences = [
    'As of March , more than 413,000 cases have been confirmed in more than 190 countries with more than 107,000 recoveries .',
    'As of 24 March , more than 414,000 cases of COVID-19 have been reported in more than 190 countries and territories , resulting in more than 18,500 deaths and more than 108,000 recoveries .',
    'German shepherds and retrievers are commonly used, but the Belgian Malinois has proven to be one of the most outstanding working dogs used in military service. Around 85 percent of military working dogs are purchased in Germany or the Netherlands, where they have been breeding dogs for military purposes for hundreds of years. In addition, the Air Force Security Forces Center, Army Veterinary Corps and the 341st Training Squadron combine efforts to raise their own dogs; nearly 15 percent of all military working dogs are now bred here.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
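
The embeddings also support the semantic search and paraphrase mining use cases mentioned in the introduction, via sentence_transformers.util. A minimal sketch; the query and mini-corpus below are illustrative placeholders:

from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("bobox/DeBERTa-small-ST-v1-test-UnifiedDatasets-Ft2")

# Illustrative mini-corpus; any list of strings works.
corpus = [
    "Burning wood is used to produce heat, which is kinetic energy.",
    "A boat is docked and filled with bicycles next to a grassy area.",
    "An organism requires energy for growth.",
]
corpus_embeddings = model.encode(corpus, convert_to_tensor=True)
query_embedding = model.encode("What requires energy for growth?", convert_to_tensor=True)

# Rank the corpus against the query by cosine similarity.
for hit in util.semantic_search(query_embedding, corpus_embeddings, top_k=2)[0]:
    print(f"{hit['score']:.3f}  {corpus[hit['corpus_id']]}")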

Evaluation

Metrics

Semantic Similarity

| Metric             |  Value |
|:-------------------|-------:|
| pearson_cosine     | 0.8756 |
| spearman_cosine    | 0.9063 |
| pearson_manhattan  | 0.9077 |
| spearman_manhattan | 0.9055 |
| pearson_euclidean  | 0.9077 |
| spearman_euclidean | 0.9061 |
| pearson_dot        | 0.8591 |
| spearman_dot       | 0.8674 |
| pearson_max        | 0.9077 |
| spearman_max       | 0.9063 |
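
These scores come from Sentence Transformers' EmbeddingSimilarityEvaluator. A sketch of how comparable numbers can be computed, assuming the "sts test" set is the standard STS Benchmark test split (sentence-transformers/stsb); the card does not name the exact evaluation data, so treat that as an assumption:

from datasets import load_dataset
from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import EmbeddingSimilarityEvaluator

model = SentenceTransformer("bobox/DeBERTa-small-ST-v1-test-UnifiedDatasets-Ft2")

# Assumption: "sts test" refers to the STS Benchmark test split.
stsb = load_dataset("sentence-transformers/stsb", split="test")
evaluator = EmbeddingSimilarityEvaluator(
    sentences1=stsb["sentence1"],
    sentences2=stsb["sentence2"],
    scores=stsb["score"],
    name="sts-test",
)
print(evaluator(model))  # dict of Pearson/Spearman scores per similarity function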

Training Details

Training Dataset

bobox/enhanced_nli-50_k

  • Dataset: bobox/enhanced_nli-50_k
  • Size: 260,034 training samples
  • Columns: sentence1 and sentence2
  • Approximate statistics based on the first 1000 samples:
    |         | sentence1                                          | sentence2                                          |
    |:--------|:---------------------------------------------------|:---------------------------------------------------|
    | type    | string                                             | string                                             |
    | details | min: 4 tokens, mean: 39.12 tokens, max: 344 tokens | min: 2 tokens, mean: 60.17 tokens, max: 442 tokens |
  • Samples:
    | sentence1 | sentence2 |
    |:----------|:----------|
    | Temple Meads Railway Station is in which English city? | Bristol Temple Meads station roof to be replaced - BBC News BBC News Bristol Temple Meads station roof to be replaced 17 October 2013 Image caption Bristol Temple Meads was designed by Isambard Kingdom Brunel Image caption It will cost Network Rail £15m to replace the station's roof Image caption A pact has been signed to redevelop the station over the next 25 years The entire roof on Bristol Temple Meads railway station is to be replaced. Network Rail says it has secured £15m to carry out maintenance of the roof and install new lighting and cables. The announcement was made as a pact was signed to "significantly transform" the station over the next 25 years. Network Rail, Bristol City Council, the West of England Local Enterprise Partnership, Homes and Communities Agency and English Heritage are supporting the plan. Each has signed the 25-year memorandum of understanding to redevelop the station. Patrick Hallgate, of Network Rail Western, said: "Our plans for Bristol will see the railway significantly transformed by the end of the decade, with more seats, better connections and more frequent services." The railway station was designed by Isambard Kingdom Brunel and opened in 1840. |
    | Where do most of the digestion reactions occur? | Most of the digestion reactions occur in the small intestine. |
    | Sacko, 22, joined Sporting from French top-flight side Bordeaux in 2014, but has so far been limited to playing for the Portuguese club's B team. The former France Under-20 player joined Ligue 2 side Sochaux on loan in February and scored twice in 14 games. He is Leeds' third signing of the transfer window, following the arrivals of Marcus Antonsson and Kyle Bartley. Find all the latest football transfers on our dedicated page. | Leeds have signed Sporting Lisbon forward Hadi Sacko on a season-long loan with a view to a permanent deal. |
  • Loss: CachedGISTEmbedLoss with these parameters:
    {'guide': SentenceTransformer(
      (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
      (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
      (2): Normalize()
    ), 'temperature': 0.025}
    

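CachedGISTEmbedLoss pairs GIST-style guide-model filtering of in-batch negatives with gradient caching, which is what makes the per-device batch size of 320 (see the hyperparameters below) practical. A minimal construction sketch; the guide checkpoint here is a stand-in, since the card prints the guide's architecture but not its name:

from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import CachedGISTEmbedLoss

model = SentenceTransformer("bobox/DeBERTa-small-ST-v1-test-step3")
# Stand-in guide model: the card shows a CLS-pooled BertModel guide
# but does not name the checkpoint.
guide = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")

loss = CachedGISTEmbedLoss(
    model,
    guide,
    temperature=0.025,   # matches the value shown above
    mini_batch_size=32,  # gradient-caching chunk size; trades speed for GPU memory
)
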
Evaluation Dataset

bobox/enhanced_nli-50_k

  • Dataset: bobox/enhanced_nli-50_k
  • Size: 1,506 evaluation samples
  • Columns: sentence1 and sentence2
  • Approximate statistics based on the first 1000 samples:
    |         | sentence1                                          | sentence2                                          |
    |:--------|:---------------------------------------------------|:---------------------------------------------------|
    | type    | string                                             | string                                             |
    | details | min: 3 tokens, mean: 31.16 tokens, max: 340 tokens | min: 2 tokens, mean: 62.3 tokens, max: 455 tokens  |
  • Samples:
    | sentence1 | sentence2 |
    |:----------|:----------|
    | Interestingly, snakes use their forked tongues to smell. | Snakes use their tongue to smell things. |
    | A voltaic cell generates an electric current through a reaction known as a(n) spontaneous redox. | A voltaic cell uses what type of reaction to generate an electric current |
    | As of March 22 , there were more than 321,000 cases with over 13,600 deaths and more than 96,000 recoveries reported worldwide . | As of 22 March , more than 321,000 cases of COVID-19 have been reported in over 180 countries and territories , resulting in more than 13,600 deaths and 96,000 recoveries . |
  • Loss: CachedGISTEmbedLoss with these parameters:
    {'guide': SentenceTransformer(
      (0): Transformer({'max_seq_length': 512, 'do_lower_case': False}) with Transformer model: BertModel 
      (1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
      (2): Normalize()
    ), 'temperature': 0.025}
    

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 320
  • per_device_eval_batch_size: 128
  • learning_rate: 2e-05
  • weight_decay: 0.0001
  • num_train_epochs: 1
  • lr_scheduler_type: cosine_with_restarts
  • lr_scheduler_kwargs: {'num_cycles': 3}
  • warmup_ratio: 0.25
  • save_safetensors: False
  • fp16: True
  • push_to_hub: True
  • hub_model_id: bobox/DeBERTa-small-ST-v1-test-UnifiedDatasets-Ft2-checkpoints-tmp
  • hub_strategy: all_checkpoints
  • batch_sampler: no_duplicates
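
In Sentence Transformers v3 these map directly onto SentenceTransformerTrainingArguments. A sketch of the equivalent configuration (output_dir is a placeholder, not the author's path):

from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="output",  # placeholder
    eval_strategy="steps",
    per_device_train_batch_size=320,
    per_device_eval_batch_size=128,
    learning_rate=2e-5,
    weight_decay=1e-4,
    num_train_epochs=1,
    lr_scheduler_type="cosine_with_restarts",
    lr_scheduler_kwargs={"num_cycles": 3},
    warmup_ratio=0.25,
    save_safetensors=False,
    fp16=True,
    push_to_hub=True,
    hub_model_id="bobox/DeBERTa-small-ST-v1-test-UnifiedDatasets-Ft2-checkpoints-tmp",
    hub_strategy="all_checkpoints",
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)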

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 320
  • per_device_eval_batch_size: 128
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • torch_empty_cache_steps: None
  • learning_rate: 2e-05
  • weight_decay: 0.0001
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 1
  • max_steps: -1
  • lr_scheduler_type: cosine_with_restarts
  • lr_scheduler_kwargs: {'num_cycles': 3}
  • warmup_ratio: 0.25
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: False
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: True
  • resume_from_checkpoint: None
  • hub_model_id: bobox/DeBERTa-small-ST-v1-test-UnifiedDatasets-Ft2-checkpoints-tmp
  • hub_strategy: all_checkpoints
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • eval_on_start: False
  • eval_use_gather_object: False
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch  Step  Training Loss  Validation Loss  sts-test_spearman_cosine
0.0012 1 0.3208 - -
0.0025 2 0.1703 - -
0.0037 3 0.3362 - -
0.0049 4 0.3346 - -
0.0062 5 0.2484 - -
0.0074 6 0.2249 - -
0.0086 7 0.2724 - -
0.0098 8 0.251 - -
0.0111 9 0.2413 - -
0.0123 10 0.382 - -
0.0135 11 0.2695 - -
0.0148 12 0.2392 - -
0.0160 13 0.3603 - -
0.0172 14 0.3282 - -
0.0185 15 0.2878 - -
0.0197 16 0.3046 - -
0.0209 17 0.3946 - -
0.0221 18 0.2038 - -
0.0234 19 0.3542 - -
0.0246 20 0.2369 - -
0.0258 21 0.1967 0.1451 0.9081
0.0271 22 0.2368 - -
0.0283 23 0.263 - -
0.0295 24 0.3595 - -
0.0308 25 0.3073 - -
0.0320 26 0.2232 - -
0.0332 27 0.1822 - -
0.0344 28 0.251 - -
0.0357 29 0.2677 - -
0.0369 30 0.3252 - -
0.0381 31 0.2058 - -
0.0394 32 0.3083 - -
0.0406 33 0.2109 - -
0.0418 34 0.2751 - -
0.0431 35 0.2269 - -
0.0443 36 0.2333 - -
0.0455 37 0.2747 - -
0.0467 38 0.1285 - -
0.0480 39 0.3659 - -
0.0492 40 0.3991 - -
0.0504 41 0.2647 - -
0.0517 42 0.3627 0.1373 0.9084
0.0529 43 0.2026 - -
0.0541 44 0.1923 - -
0.0554 45 0.2369 - -
0.0566 46 0.2268 - -
0.0578 47 0.2975 - -
0.0590 48 0.1922 - -
0.0603 49 0.1906 - -
0.0615 50 0.2379 - -
0.0627 51 0.3796 - -
0.0640 52 0.1821 - -
0.0652 53 0.1257 - -
0.0664 54 0.2368 - -
0.0677 55 0.294 - -
0.0689 56 0.2594 - -
0.0701 57 0.2972 - -
0.0713 58 0.2297 - -
0.0726 59 0.1487 - -
0.0738 60 0.182 - -
0.0750 61 0.2516 - -
0.0763 62 0.2809 - -
0.0775 63 0.1371 0.1308 0.9068
0.0787 64 0.2149 - -
0.0800 65 0.1806 - -
0.0812 66 0.1458 - -
0.0824 67 0.249 - -
0.0836 68 0.2787 - -
0.0849 69 0.288 - -
0.0861 70 0.1461 - -
0.0873 71 0.2304 - -
0.0886 72 0.3505 - -
0.0898 73 0.2227 - -
0.0910 74 0.1746 - -
0.0923 75 0.1484 - -
0.0935 76 0.1346 - -
0.0947 77 0.2112 - -
0.0959 78 0.3138 - -
0.0972 79 0.2675 - -
0.0984 80 0.2849 - -
0.0996 81 0.1719 - -
0.1009 82 0.2749 - -
0.1021 83 0.3097 - -
0.1033 84 0.2068 0.1260 0.9045
0.1046 85 0.22 - -
0.1058 86 0.2977 - -
0.1070 87 0.209 - -
0.1082 88 0.2215 - -
0.1095 89 0.1948 - -
0.1107 90 0.2084 - -
0.1119 91 0.1823 - -
0.1132 92 0.255 - -
0.1144 93 0.2675 - -
0.1156 94 0.18 - -
0.1169 95 0.2891 - -
0.1181 96 0.253 - -
0.1193 97 0.3481 - -
0.1205 98 0.1688 - -
0.1218 99 0.1808 - -
0.1230 100 0.2821 - -
0.1242 101 0.1856 - -
0.1255 102 0.1441 - -
0.1267 103 0.226 - -
0.1279 104 0.1662 - -
0.1292 105 0.2043 0.1187 0.9051
0.1304 106 0.3907 - -
0.1316 107 0.1332 - -
0.1328 108 0.2243 - -
0.1341 109 0.162 - -
0.1353 110 0.1481 - -
0.1365 111 0.2163 - -
0.1378 112 0.24 - -
0.1390 113 0.1406 - -
0.1402 114 0.1522 - -
0.1415 115 0.2593 - -
0.1427 116 0.2426 - -
0.1439 117 0.1781 - -
0.1451 118 0.264 - -
0.1464 119 0.1944 - -
0.1476 120 0.1341 - -
0.1488 121 0.155 - -
0.1501 122 0.2052 - -
0.1513 123 0.2023 - -
0.1525 124 0.1519 - -
0.1538 125 0.2118 - -
0.1550 126 0.2489 0.1147 0.9058
0.1562 127 0.1988 - -
0.1574 128 0.1541 - -
0.1587 129 0.1819 - -
0.1599 130 0.1582 - -
0.1611 131 0.2866 - -
0.1624 132 0.2766 - -
0.1636 133 0.1299 - -
0.1648 134 0.2558 - -
0.1661 135 0.1687 - -
0.1673 136 0.173 - -
0.1685 137 0.2276 - -
0.1697 138 0.2174 - -
0.1710 139 0.2666 - -
0.1722 140 0.1524 - -
0.1734 141 0.1179 - -
0.1747 142 0.2475 - -
0.1759 143 0.2662 - -
0.1771 144 0.1596 - -
0.1784 145 0.2331 - -
0.1796 146 0.2905 - -
0.1808 147 0.1342 0.1088 0.9051
0.1820 148 0.0839 - -
0.1833 149 0.2055 - -
0.1845 150 0.2196 - -
0.1857 151 0.2283 - -
0.1870 152 0.2105 - -
0.1882 153 0.1534 - -
0.1894 154 0.1954 - -
0.1907 155 0.1332 - -
0.1919 156 0.19 - -
0.1931 157 0.1878 - -
0.1943 158 0.1518 - -
0.1956 159 0.1906 - -
0.1968 160 0.155 - -
0.1980 161 0.1519 - -
0.1993 162 0.1726 - -
0.2005 163 0.1618 - -
0.2017 164 0.2767 - -
0.2030 165 0.1996 - -
0.2042 166 0.1907 - -
0.2054 167 0.1928 - -
0.2066 168 0.1507 0.1082 0.9045
0.2079 169 0.1637 - -
0.2091 170 0.1687 - -
0.2103 171 0.2181 - -
0.2116 172 0.1496 - -
0.2128 173 0.1749 - -
0.2140 174 0.2374 - -
0.2153 175 0.2122 - -
0.2165 176 0.1617 - -
0.2177 177 0.168 - -
0.2189 178 0.263 - -
0.2202 179 0.1328 - -
0.2214 180 0.3157 - -
0.2226 181 0.2164 - -
0.2239 182 0.1255 - -
0.2251 183 0.2863 - -
0.2263 184 0.155 - -
0.2276 185 0.1271 - -
0.2288 186 0.216 - -
0.2300 187 0.205 - -
0.2312 188 0.1575 - -
0.2325 189 0.1939 0.1057 0.9046
0.2337 190 0.2209 - -
0.2349 191 0.153 - -
0.2362 192 0.2187 - -
0.2374 193 0.1593 - -
0.2386 194 0.173 - -
0.2399 195 0.2377 - -
0.2411 196 0.2281 - -
0.2423 197 0.2651 - -
0.2435 198 0.118 - -
0.2448 199 0.1728 - -
0.2460 200 0.2299 - -
0.2472 201 0.2342 - -
0.2485 202 0.2413 - -
0.2497 203 0.168 - -
0.2509 204 0.1474 - -
0.2522 205 0.1102 - -
0.2534 206 0.2326 - -
0.2546 207 0.1787 - -
0.2558 208 0.1423 - -
0.2571 209 0.2069 - -
0.2583 210 0.136 0.1040 0.9056
0.2595 211 0.2407 - -
0.2608 212 0.212 - -
0.2620 213 0.1361 - -
0.2632 214 0.2356 - -
0.2645 215 0.1059 - -
0.2657 216 0.2501 - -
0.2669 217 0.1817 - -
0.2681 218 0.2022 - -
0.2694 219 0.2235 - -
0.2706 220 0.2437 - -
0.2718 221 0.1859 - -
0.2731 222 0.2167 - -
0.2743 223 0.1495 - -
0.2755 224 0.2876 - -
0.2768 225 0.1842 - -
0.2780 226 0.144 - -
0.2792 227 0.1571 - -
0.2804 228 0.209 - -
0.2817 229 0.2075 - -
0.2829 230 0.1722 - -
0.2841 231 0.1464 0.1039 0.9087
0.2854 232 0.2675 - -
0.2866 233 0.2585 - -
0.2878 234 0.134 - -
0.2891 235 0.1765 - -
0.2903 236 0.1826 - -
0.2915 237 0.222 - -
0.2927 238 0.134 - -
0.2940 239 0.1902 - -
0.2952 240 0.2461 - -
0.2964 241 0.3094 - -
0.2977 242 0.2252 - -
0.2989 243 0.2466 - -
0.3001 244 0.139 - -
0.3014 245 0.154 - -
0.3026 246 0.1979 - -
0.3038 247 0.1121 - -
0.3050 248 0.1361 - -
0.3063 249 0.2492 - -
0.3075 250 0.1903 - -
0.3087 251 0.2333 - -
0.3100 252 0.1805 0.1030 0.9099
0.3112 253 0.1929 - -
0.3124 254 0.1424 - -
0.3137 255 0.2318 - -
0.3149 256 0.1524 - -
0.3161 257 0.2195 - -
0.3173 258 0.1338 - -
0.3186 259 0.2543 - -
0.3198 260 0.202 - -
0.3210 261 0.1489 - -
0.3223 262 0.1937 - -
0.3235 263 0.2334 - -
0.3247 264 0.1942 - -
0.3260 265 0.2013 - -
0.3272 266 0.2954 - -
0.3284 267 0.188 - -
0.3296 268 0.1688 - -
0.3309 269 0.1415 - -
0.3321 270 0.2249 - -
0.3333 271 0.2606 - -
0.3346 272 0.2559 - -
0.3358 273 0.2673 0.1039 0.9078
0.3370 274 0.1618 - -
0.3383 275 0.2602 - -
0.3395 276 0.2339 - -
0.3407 277 0.1843 - -
0.3419 278 0.133 - -
0.3432 279 0.2345 - -
0.3444 280 0.2808 - -
0.3456 281 0.1044 - -
0.3469 282 0.1622 - -
0.3481 283 0.1303 - -
0.3493 284 0.1453 - -
0.3506 285 0.237 - -
0.3518 286 0.1726 - -
0.3530 287 0.2195 - -
0.3542 288 0.3016 - -
0.3555 289 0.1626 - -
0.3567 290 0.1902 - -
0.3579 291 0.1387 - -
0.3592 292 0.1047 - -
0.3604 293 0.1954 - -
0.3616 294 0.2089 0.1029 0.9083
0.3629 295 0.1485 - -
0.3641 296 0.1724 - -
0.3653 297 0.2017 - -
0.3665 298 0.1591 - -
0.3678 299 0.2396 - -
0.3690 300 0.1395 - -
0.3702 301 0.1806 - -
0.3715 302 0.1882 - -
0.3727 303 0.1188 - -
0.3739 304 0.1564 - -
0.3752 305 0.313 - -
0.3764 306 0.1455 - -
0.3776 307 0.1535 - -
0.3788 308 0.099 - -
0.3801 309 0.1733 - -
0.3813 310 0.1891 - -
0.3825 311 0.2128 - -
0.3838 312 0.2042 - -
0.3850 313 0.203 - -
0.3862 314 0.2249 - -
0.3875 315 0.1597 0.1014 0.9074
0.3887 316 0.1358 - -
0.3899 317 0.207 - -
0.3911 318 0.193 - -
0.3924 319 0.1141 - -
0.3936 320 0.2835 - -
0.3948 321 0.2589 - -
0.3961 322 0.088 - -
0.3973 323 0.1675 - -
0.3985 324 0.1525 - -
0.3998 325 0.1401 - -
0.4010 326 0.2109 - -
0.4022 327 0.1382 - -
0.4034 328 0.1724 - -
0.4047 329 0.1668 - -
0.4059 330 0.1606 - -
0.4071 331 0.2102 - -
0.4084 332 0.1737 - -
0.4096 333 0.1641 - -
0.4108 334 0.1984 - -
0.4121 335 0.1395 - -
0.4133 336 0.1236 0.1008 0.9066
0.4145 337 0.1405 - -
0.4157 338 0.1461 - -
0.4170 339 0.1151 - -
0.4182 340 0.1282 - -
0.4194 341 0.2155 - -
0.4207 342 0.1344 - -
0.4219 343 0.1854 - -
0.4231 344 0.1766 - -
0.4244 345 0.122 - -
0.4256 346 0.142 - -
0.4268 347 0.1434 - -
0.4280 348 0.1687 - -
0.4293 349 0.1751 - -
0.4305 350 0.1253 - -
0.4317 351 0.1387 - -
0.4330 352 0.181 - -
0.4342 353 0.101 - -
0.4354 354 0.1552 - -
0.4367 355 0.2676 - -
0.4379 356 0.1638 - -
0.4391 357 0.19 0.1008 0.9072
0.4403 358 0.1152 - -
0.4416 359 0.1639 - -
0.4428 360 0.1624 - -
0.4440 361 0.203 - -
0.4453 362 0.1856 - -
0.4465 363 0.1978 - -
0.4477 364 0.1457 - -
0.4490 365 0.176 - -
0.4502 366 0.1742 - -
0.4514 367 0.1599 - -
0.4526 368 0.2085 - -
0.4539 369 0.2255 - -
0.4551 370 0.1941 - -
0.4563 371 0.0769 - -
0.4576 372 0.2031 - -
0.4588 373 0.2151 - -
0.4600 374 0.2115 - -
0.4613 375 0.1241 - -
0.4625 376 0.1693 - -
0.4637 377 0.2086 - -
0.4649 378 0.1661 0.1004 0.9074
0.4662 379 0.1508 - -
0.4674 380 0.1802 - -
0.4686 381 0.1005 - -
0.4699 382 0.1948 - -
0.4711 383 0.1618 - -
0.4723 384 0.216 - -
0.4736 385 0.132 - -
0.4748 386 0.2461 - -
0.4760 387 0.1825 - -
0.4772 388 0.1912 - -
0.4785 389 0.1706 - -
0.4797 390 0.2599 - -
0.4809 391 0.1837 - -
0.4822 392 0.23 - -
0.4834 393 0.1523 - -
0.4846 394 0.1105 - -
0.4859 395 0.1478 - -
0.4871 396 0.2184 - -
0.4883 397 0.1977 - -
0.4895 398 0.1607 - -
0.4908 399 0.2183 0.1002 0.9077
0.4920 400 0.1155 - -
0.4932 401 0.2395 - -
0.4945 402 0.1194 - -
0.4957 403 0.1567 - -
0.4969 404 0.1037 - -
0.4982 405 0.2713 - -
0.4994 406 0.1742 - -
0.5006 407 0.221 - -
0.5018 408 0.1412 - -
0.5031 409 0.1482 - -
0.5043 410 0.1347 - -
0.5055 411 0.2345 - -
0.5068 412 0.1231 - -
0.5080 413 0.1418 - -
0.5092 414 0.152 - -
0.5105 415 0.1878 - -
0.5117 416 0.1683 - -
0.5129 417 0.1501 - -
0.5141 418 0.2589 - -
0.5154 419 0.1924 - -
0.5166 420 0.1166 0.0979 0.9078
0.5178 421 0.1509 - -
0.5191 422 0.1457 - -
0.5203 423 0.2244 - -
0.5215 424 0.1837 - -
0.5228 425 0.2649 - -
0.5240 426 0.1295 - -
0.5252 427 0.1776 - -
0.5264 428 0.1949 - -
0.5277 429 0.1262 - -
0.5289 430 0.1502 - -
0.5301 431 0.1927 - -
0.5314 432 0.2161 - -
0.5326 433 0.2082 - -
0.5338 434 0.2171 - -
0.5351 435 0.209 - -
0.5363 436 0.1841 - -
0.5375 437 0.1522 - -
0.5387 438 0.1644 - -
0.5400 439 0.1784 - -
0.5412 440 0.2041 - -
0.5424 441 0.1564 0.0968 0.9058
0.5437 442 0.2151 - -
0.5449 443 0.1797 - -
0.5461 444 0.1652 - -
0.5474 445 0.1561 - -
0.5486 446 0.1063 - -
0.5498 447 0.1584 - -
0.5510 448 0.2396 - -
0.5523 449 0.1952 - -
0.5535 450 0.1598 - -
0.5547 451 0.2093 - -
0.5560 452 0.1585 - -
0.5572 453 0.2311 - -
0.5584 454 0.1048 - -
0.5597 455 0.1571 - -
0.5609 456 0.1915 - -
0.5621 457 0.1625 - -
0.5633 458 0.1613 - -
0.5646 459 0.1845 - -
0.5658 460 0.2134 - -
0.5670 461 0.2059 - -
0.5683 462 0.1974 0.0947 0.9067
0.5695 463 0.1624 - -
0.5707 464 0.2005 - -
0.5720 465 0.1407 - -
0.5732 466 0.1175 - -
0.5744 467 0.1888 - -
0.5756 468 0.1423 - -
0.5769 469 0.1195 - -
0.5781 470 0.1525 - -
0.5793 471 0.2155 - -
0.5806 472 0.2048 - -
0.5818 473 0.2386 - -
0.5830 474 0.162 - -
0.5843 475 0.1735 - -
0.5855 476 0.2067 - -
0.5867 477 0.1395 - -
0.5879 478 0.1482 - -
0.5892 479 0.2399 - -
0.5904 480 0.1849 - -
0.5916 481 0.139 - -
0.5929 482 0.2089 - -
0.5941 483 0.2066 0.0934 0.9072
0.5953 484 0.2293 - -
0.5966 485 0.1919 - -
0.5978 486 0.1168 - -
0.5990 487 0.2057 - -
0.6002 488 0.1866 - -
0.6015 489 0.2277 - -
0.6027 490 0.1527 - -
0.6039 491 0.275 - -
0.6052 492 0.1212 - -
0.6064 493 0.1384 - -
0.6076 494 0.1611 - -
0.6089 495 0.145 - -
0.6101 496 0.1996 - -
0.6113 497 0.3 - -
0.6125 498 0.1117 - -
0.6138 499 0.1905 - -
0.6150 500 0.2221 - -
0.6162 501 0.1749 - -
0.6175 502 0.1533 - -
0.6187 503 0.2268 - -
0.6199 504 0.1879 0.0936 0.9066
0.6212 505 0.2956 - -
0.6224 506 0.1566 - -
0.6236 507 0.1612 - -
0.6248 508 0.2312 - -
0.6261 509 0.181 - -
0.6273 510 0.235 - -
0.6285 511 0.1376 - -
0.6298 512 0.1066 - -
0.6310 513 0.2235 - -
0.6322 514 0.2549 - -
0.6335 515 0.2676 - -
0.6347 516 0.1652 - -
0.6359 517 0.1573 - -
0.6371 518 0.2106 - -
0.6384 519 0.151 - -
0.6396 520 0.1491 - -
0.6408 521 0.2612 - -
0.6421 522 0.1287 - -
0.6433 523 0.2084 - -
0.6445 524 0.1545 - -
0.6458 525 0.1946 0.0931 0.9061
0.6470 526 0.1684 - -
0.6482 527 0.1974 - -
0.6494 528 0.2448 - -
0.6507 529 0.2255 - -
0.6519 530 0.2157 - -
0.6531 531 0.1948 - -
0.6544 532 0.1418 - -
0.6556 533 0.1683 - -
0.6568 534 0.193 - -
0.6581 535 0.2341 - -
0.6593 536 0.131 - -
0.6605 537 0.1733 - -
0.6617 538 0.1489 - -
0.6630 539 0.1918 - -
0.6642 540 0.1953 - -
0.6654 541 0.1421 - -
0.6667 542 0.2214 - -
0.6679 543 0.2152 - -
0.6691 544 0.209 - -
0.6704 545 0.1735 - -
0.6716 546 0.2048 0.0918 0.9060
0.6728 547 0.1721 - -
0.6740 548 0.1838 - -
0.6753 549 0.1614 - -
0.6765 550 0.1999 - -
0.6777 551 0.0984 - -
0.6790 552 0.1351 - -
0.6802 553 0.1886 - -
0.6814 554 0.1148 - -
0.6827 555 0.1766 - -
0.6839 556 0.19 - -
0.6851 557 0.2082 - -
0.6863 558 0.222 - -
0.6876 559 0.2032 - -
0.6888 560 0.1854 - -
0.6900 561 0.1473 - -
0.6913 562 0.2003 - -
0.6925 563 0.1223 - -
0.6937 564 0.2319 - -
0.6950 565 0.0761 - -
0.6962 566 0.2835 - -
0.6974 567 0.2331 0.0920 0.9061
0.6986 568 0.1698 - -
0.6999 569 0.203 - -
0.7011 570 0.2344 - -
0.7023 571 0.1823 - -
0.7036 572 0.2043 - -
0.7048 573 0.1881 - -
0.7060 574 0.1599 - -
0.7073 575 0.0829 - -
0.7085 576 0.1816 - -
0.7097 577 0.1801 - -
0.7109 578 0.1707 - -
0.7122 579 0.2306 - -
0.7134 580 0.1503 - -
0.7146 581 0.1779 - -
0.7159 582 0.1422 - -
0.7171 583 0.1358 - -
0.7183 584 0.0978 - -
0.7196 585 0.1713 - -
0.7208 586 0.1771 - -
0.7220 587 0.1241 - -
0.7232 588 0.1267 0.0918 0.9064
0.7245 589 0.1126 - -
0.7257 590 0.0858 - -
0.7269 591 0.1335 - -
0.7282 592 0.1958 - -
0.7294 593 0.1448 - -
0.7306 594 0.2679 - -
0.7319 595 0.153 - -
0.7331 596 0.1523 - -
0.7343 597 0.1988 - -
0.7355 598 0.157 - -
0.7368 599 0.146 - -
0.7380 600 0.2043 - -
0.7392 601 0.1508 - -
0.7405 602 0.1946 - -
0.7417 603 0.1481 - -
0.7429 604 0.0995 - -
0.7442 605 0.149 - -
0.7454 606 0.1686 - -
0.7466 607 0.1555 - -
0.7478 608 0.1662 - -
0.7491 609 0.1217 0.0917 0.9064
0.7503 610 0.0748 - -
0.7515 611 0.1723 - -
0.7528 612 0.2354 - -
0.7540 613 0.1315 - -
0.7552 614 0.2913 - -
0.7565 615 0.0991 - -
0.7577 616 0.1052 - -
0.7589 617 0.1496 - -
0.7601 618 0.1399 - -
0.7614 619 0.1329 - -
0.7626 620 0.2287 - -
0.7638 621 0.1085 - -
0.7651 622 0.1864 - -
0.7663 623 0.1577 - -
0.7675 624 0.143 - -
0.7688 625 0.1886 - -
0.7700 626 0.1683 - -
0.7712 627 0.212 - -
0.7724 628 0.1643 - -
0.7737 629 0.1632 - -
0.7749 630 0.1384 0.0925 0.9054
0.7761 631 0.2133 - -
0.7774 632 0.1732 - -
0.7786 633 0.1218 - -
0.7798 634 0.1581 - -
0.7811 635 0.1337 - -
0.7823 636 0.1859 - -
0.7835 637 0.1616 - -
0.7847 638 0.1799 - -
0.7860 639 0.1193 - -
0.7872 640 0.1471 - -
0.7884 641 0.1235 - -
0.7897 642 0.1221 - -
0.7909 643 0.1379 - -
0.7921 644 0.238 - -
0.7934 645 0.1671 - -
0.7946 646 0.1652 - -
0.7958 647 0.1828 - -
0.7970 648 0.2207 - -
0.7983 649 0.2109 - -
0.7995 650 0.1105 - -
0.8007 651 0.129 0.0933 0.9069
0.8020 652 0.1633 - -
0.8032 653 0.201 - -
0.8044 654 0.1041 - -
0.8057 655 0.1838 - -
0.8069 656 0.3044 - -
0.8081 657 0.1736 - -
0.8093 658 0.1909 - -
0.8106 659 0.1413 - -
0.8118 660 0.1138 - -
0.8130 661 0.1163 - -
0.8143 662 0.1725 - -
0.8155 663 0.2248 - -
0.8167 664 0.1019 - -
0.8180 665 0.1138 - -
0.8192 666 0.1652 - -
0.8204 667 0.1361 - -
0.8216 668 0.1769 - -
0.8229 669 0.1241 - -
0.8241 670 0.1683 - -
0.8253 671 0.1315 - -
0.8266 672 0.1046 0.0940 0.9055
0.8278 673 0.1984 - -
0.8290 674 0.1766 - -
0.8303 675 0.1245 - -
0.8315 676 0.1953 - -
0.8327 677 0.1506 - -
0.8339 678 0.1145 - -
0.8352 679 0.1366 - -
0.8364 680 0.1071 - -
0.8376 681 0.2142 - -
0.8389 682 0.2029 - -
0.8401 683 0.1171 - -
0.8413 684 0.176 - -
0.8426 685 0.1052 - -
0.8438 686 0.1892 - -
0.8450 687 0.1499 - -
0.8462 688 0.1414 - -
0.8475 689 0.1193 - -
0.8487 690 0.1516 - -
0.8499 691 0.1552 - -
0.8512 692 0.1168 - -
0.8524 693 0.2326 0.0932 0.9071
0.8536 694 0.2112 - -
0.8549 695 0.0835 - -
0.8561 696 0.1512 - -
0.8573 697 0.1379 - -
0.8585 698 0.1045 - -
0.8598 699 0.2045 - -
0.8610 700 0.1909 - -
0.8622 701 0.1895 - -
0.8635 702 0.2077 - -
0.8647 703 0.1199 - -
0.8659 704 0.1606 - -
0.8672 705 0.1501 - -
0.8684 706 0.1711 - -
0.8696 707 0.222 - -
0.8708 708 0.1414 - -
0.8721 709 0.1972 - -
0.8733 710 0.1074 - -
0.8745 711 0.2044 - -
0.8758 712 0.0997 - -
0.8770 713 0.1178 - -
0.8782 714 0.1376 0.0929 0.9058
0.8795 715 0.1302 - -
0.8807 716 0.1252 - -
0.8819 717 0.2365 - -
0.8831 718 0.1405 - -
0.8844 719 0.1806 - -
0.8856 720 0.1495 - -
0.8868 721 0.1987 - -
0.8881 722 0.096 - -
0.8893 723 0.1728 - -
0.8905 724 0.2104 - -
0.8918 725 0.1562 - -
0.8930 726 0.1358 - -
0.8942 727 0.1723 - -
0.8954 728 0.1947 - -
0.8967 729 0.1572 - -
0.8979 730 0.1124 - -
0.8991 731 0.2272 - -
0.9004 732 0.1356 - -
0.9016 733 0.1816 - -
0.9028 734 0.1011 - -
0.9041 735 0.124 0.0911 0.9051
0.9053 736 0.1873 - -
0.9065 737 0.0702 - -
0.9077 738 0.15 - -
0.9090 739 0.221 - -
0.9102 740 0.1511 - -
0.9114 741 0.195 - -
0.9127 742 0.1473 - -
0.9139 743 0.1311 - -
0.9151 744 0.1869 - -
0.9164 745 0.1433 - -
0.9176 746 0.1286 - -
0.9188 747 0.1316 - -
0.9200 748 0.1669 - -
0.9213 749 0.1691 - -
0.9225 750 0.1853 - -
0.9237 751 0.1813 - -
0.9250 752 0.1754 - -
0.9262 753 0.2282 - -
0.9274 754 0.1248 - -
0.9287 755 0.1182 - -
0.9299 756 0.1601 0.0903 0.9059
0.9311 757 0.2377 - -
0.9323 758 0.1799 - -
0.9336 759 0.2016 - -
0.9348 760 0.1293 - -
0.9360 761 0.2038 - -
0.9373 762 0.1384 - -
0.9385 763 0.1856 - -
0.9397 764 0.2775 - -
0.9410 765 0.1651 - -
0.9422 766 0.2072 - -
0.9434 767 0.1459 - -
0.9446 768 0.1277 - -
0.9459 769 0.1742 - -
0.9471 770 0.1978 - -
0.9483 771 0.1992 - -
0.9496 772 0.1649 - -
0.9508 773 0.2195 - -
0.9520 774 0.1348 - -
0.9533 775 0.1556 - -
0.9545 776 0.2293 - -
0.9557 777 0.1585 0.0904 0.9062
0.9569 778 0.1029 - -
0.9582 779 0.1027 - -
0.9594 780 0.1165 - -
0.9606 781 0.1654 - -
0.9619 782 0.1706 - -
0.9631 783 0.102 - -
0.9643 784 0.1697 - -
0.9656 785 0.177 - -
0.9668 786 0.1718 - -
0.9680 787 0.1542 - -
0.9692 788 0.1654 - -
0.9705 789 0.1672 - -
0.9717 790 0.1867 - -
0.9729 791 0.1717 - -
0.9742 792 0.1701 - -
0.9754 793 0.1542 - -
0.9766 794 0.2153 - -
0.9779 795 0.131 - -
0.9791 796 0.1448 - -
0.9803 797 0.1171 - -
0.9815 798 0.1585 0.0904 0.9063
0.9828 799 0.1352 - -
0.9840 800 0.1146 - -
0.9852 801 0.1366 - -
0.9865 802 0.1375 - -
0.9877 803 0.1588 - -
0.9889 804 0.1429 - -
0.9902 805 0.1541 - -
0.9914 806 0.1171 - -
0.9926 807 0.1352 - -
0.9938 808 0.1948 - -
0.9951 809 0.1628 - -
0.9963 810 0.1115 - -
0.9975 811 0.0929 - -
0.9988 812 0.0955 - -
1.0 813 0.0 0.0904 0.9063

Framework Versions

  • Python: 3.10.14
  • Sentence Transformers: 3.0.1
  • Transformers: 4.44.0
  • PyTorch: 2.4.0
  • Accelerate: 0.33.0
  • Datasets: 2.21.0
  • Tokenizers: 0.19.1
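
To match this environment, the versions can be pinned at install time (a convenience one-liner):

pip install sentence-transformers==3.0.1 transformers==4.44.0 torch==2.4.0 accelerate==0.33.0 datasets==2.21.0 tokenizers==0.19.1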

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}