---
language: []
library_name: sentence-transformers
tags:
  - sentence-transformers
  - sentence-similarity
  - feature-extraction
  - generated_from_trainer
  - dataset_size:360
  - loss:CosineSimilarityLoss
base_model: BAAI/bge-large-en
datasets: []
widget:
  - source_sentence: Deadline for submitting project schedule.
    sentences:
      - Variation
      - "The Railway shall have the right to let other contracts in connection with the works. The Contractor shall afford other Contractors reasonable opportunity for the storage of their materials and the execution of their works and shall properly connect and coordinate his work with theirs. If any part of the Contractor\x92s work depends upon proper execution or result upon the work of another Contractor(s), the Contractor shall inspect and promptly report to the Engineer any defects in such works that render it unsuitable for such proper execution and results. The Contractor's failure so-to inspect and report shall constitute an acceptance of the other Contractor's work as fit and proper for the reception of his work, except as to defects which may develop in the other Contractor's work after the execution of his work."
      - >-
        The quantities set out in the accepted Schedule of Rates with items of
        works quantified are the estimated quantities of the works
  - source_sentence: "\_What is the deadline to submit the proposed project schedule?"
    sentences:
      - "having value more than Rs 20 crore and original period of completion 12 months or more, when there is no reduction in original scope of work by more than 10%, and no extension granted on either railway or Contractor\x92s account,"
      - >-
        Can the stones/rocks/bounders obtained during excavation be used for
        construction if found technically satisfactory?
      - >-
        Chart/PERT/CPM. He shall also submit the details of organisation (in
        terms of labour and supervisors), plant and machinery that he intends to
        utilize (from time to time) for execution of the work within stipulated
        date of completion.
  - source_sentence: "Does the contract document contain a \x91third-party liability relationship\x92 provision?"
    sentences:
      - >-
        The Contractor shall indemnify and save harmless the Railway from and
        against all actions, suit, proceedings, losses, costs, damages, charges,
        claims and demands of every nature and description brought or recovered
        against the Railways by reason of any act or omission of the Contractor,
        his agents or employees, in the execution of the works or in his
        guarding of the same. All sums payable by way of compensation under any
        of these conditions shall be considered as reasonable compensation to be
        applied to the actual loss or damage sustained, and whether or not any
        damage shall have been sustained.
      - >-
        the Railway shall not in any way be liable for the supply of materials
        or for the non-supply thereof for any reasons whatsoever nor for any
        loss or damage arising in consequence of such delay or non-supply.
      - >-
        The Railway shall have the right to let other contracts in connection
        with the works.
  - source_sentence: Liquidated Damages
    sentences:
      - >-
        The Contractor shall commence the works within 15 days after the receipt
        by him of an order in writing to this effect from the Engineer and shall
        proceed with the same with due expedition and without delay
      - >-
        Any bribe, commission, gift or advantage given, promised or offered by
        or on behalf of the Contractor or his partner or agent or servant or
        anyone on his behalf
      - purpose of works either free of cost or pay the cost of the same.
  - source_sentence: What is mentioned regarding the patent errors?
    sentences:
      - >-
        the Security Deposit already with railways under the contract shall be
        forfeited.
      - >-
        This clause mentions Special Conditions, which might be additional
        documents relevant to the contract.
      - >-
        shall take upon himself and provide for the risk of any error which may
        subsequently be discovered and shall make no subsequent claim on account
        thereof.
pipeline_tag: sentence-similarity
---

SentenceTransformer based on BAAI/bge-large-en

This is a sentence-transformers model finetuned from BAAI/bge-large-en. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.

Model Details

Model Description

  • Model Type: Sentence Transformer
  • Base model: BAAI/bge-large-en
  • Maximum Sequence Length: 512 tokens
  • Output Dimensionality: 1024 dimensions
  • Similarity Function: Cosine Similarity
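
A minimal sketch (loading the published checkpoint named in the Usage section below) to confirm these properties at runtime; the similarity_fn_name check assumes Sentence Transformers 3.x:

from sentence_transformers import SentenceTransformer

# Quick check of the properties listed above.
model = SentenceTransformer("Ananthu357/Ananthus-BAAI-for-contracts")
print(model.get_sentence_embedding_dimension())  # 1024
print(model.max_seq_length)                      # 512
print(model.similarity_fn_name)                  # cosine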

Model Sources

  • Documentation: https://sbert.net
  • Repository: https://github.com/UKPLab/sentence-transformers
  • Hugging Face: https://huggingface.co/models?library=sentence-transformers

Full Model Architecture

SentenceTransformer(
  (0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel 
  (1): Pooling({'word_embedding_dimension': 1024, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
  (2): Normalize()
)
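
For illustration only, a sketch of how the same three-module stack could be assembled by hand from the sentence-transformers building blocks. This recreates the architecture on top of the base checkpoint and does not carry the fine-tuned weights, so for real use load the published model as shown in the Usage section:

from sentence_transformers import SentenceTransformer, models

# Rebuild the Transformer -> CLS pooling -> Normalize stack shown above.
word_embedding = models.Transformer("BAAI/bge-large-en", max_seq_length=512, do_lower_case=True)
pooling = models.Pooling(word_embedding.get_word_embedding_dimension(), pooling_mode="cls")
model = SentenceTransformer(modules=[word_embedding, pooling, models.Normalize()])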

Usage

Direct Usage (Sentence Transformers)

First install the Sentence Transformers library:

pip install -U sentence-transformers

Then you can load this model and run inference.

from sentence_transformers import SentenceTransformer

# Download from the 🤗 Hub
model = SentenceTransformer("Ananthu357/Ananthus-BAAI-for-contracts")
# Run inference
sentences = [
    'What is mentioned regarding the patent errors?',
    'shall take upon himself and provide for the risk of any error which may subsequently be discovered and shall make no subsequent claim on account thereof.',
    'This clause mentions Special Conditions, which might be additional documents relevant to the contract.',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 1024]

# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
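
Building on the snippet above (reusing the same model object), a sketch of a retrieval-style use: rank candidate contract clauses against a question by cosine similarity. The query and clauses below are illustrative placeholders, not the training data:

# Rank candidate clauses against a query.
query = "What is the deadline to submit the proposed project schedule?"
clauses = [
    "The Contractor shall commence the works within 15 days after the receipt of an order in writing.",
    "The Railway shall have the right to let other contracts in connection with the works.",
]
query_embedding = model.encode([query])
clause_embeddings = model.encode(clauses)
scores = model.similarity(query_embedding, clause_embeddings)[0]
for clause, score in sorted(zip(clauses, scores.tolist()), key=lambda pair: pair[1], reverse=True):
    print(f"{score:.3f}  {clause}")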

Training Details

Training Hyperparameters

Non-Default Hyperparameters

  • eval_strategy: steps
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • num_train_epochs: 40
  • warmup_ratio: 0.1
  • fp16: True
  • batch_sampler: no_duplicates
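
The sketch below shows how these non-default values would map onto a Sentence Transformers 3.x training run with the CosineSimilarityLoss named in the metadata. The dataset rows, evaluation split, and output directory are placeholders, not the actual training data (the real run used 360 pairs):

from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import CosineSimilarityLoss
from sentence_transformers.training_args import BatchSamplers

model = SentenceTransformer("BAAI/bge-large-en")

# Placeholder rows in the (sentence1, sentence2, score) layout that CosineSimilarityLoss expects.
train_dataset = Dataset.from_dict({
    "sentence1": ["Deadline for submitting project schedule."],
    "sentence2": ["The Contractor shall submit the proposed schedule within the stipulated period."],
    "score": [0.9],
})
eval_dataset = train_dataset  # placeholder; the actual run evaluated on a held-out split

args = SentenceTransformerTrainingArguments(
    output_dir="bge-large-en-contracts",  # hypothetical output path
    num_train_epochs=40,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    warmup_ratio=0.1,
    fp16=True,
    eval_strategy="steps",
    batch_sampler=BatchSamplers.NO_DUPLICATES,
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=CosineSimilarityLoss(model),
)
trainer.train()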

All Hyperparameters

  • overwrite_output_dir: False
  • do_predict: False
  • eval_strategy: steps
  • prediction_loss_only: True
  • per_device_train_batch_size: 16
  • per_device_eval_batch_size: 16
  • per_gpu_train_batch_size: None
  • per_gpu_eval_batch_size: None
  • gradient_accumulation_steps: 1
  • eval_accumulation_steps: None
  • learning_rate: 5e-05
  • weight_decay: 0.0
  • adam_beta1: 0.9
  • adam_beta2: 0.999
  • adam_epsilon: 1e-08
  • max_grad_norm: 1.0
  • num_train_epochs: 40
  • max_steps: -1
  • lr_scheduler_type: linear
  • lr_scheduler_kwargs: {}
  • warmup_ratio: 0.1
  • warmup_steps: 0
  • log_level: passive
  • log_level_replica: warning
  • log_on_each_node: True
  • logging_nan_inf_filter: True
  • save_safetensors: True
  • save_on_each_node: False
  • save_only_model: False
  • restore_callback_states_from_checkpoint: False
  • no_cuda: False
  • use_cpu: False
  • use_mps_device: False
  • seed: 42
  • data_seed: None
  • jit_mode_eval: False
  • use_ipex: False
  • bf16: False
  • fp16: True
  • fp16_opt_level: O1
  • half_precision_backend: auto
  • bf16_full_eval: False
  • fp16_full_eval: False
  • tf32: None
  • local_rank: 0
  • ddp_backend: None
  • tpu_num_cores: None
  • tpu_metrics_debug: False
  • debug: []
  • dataloader_drop_last: False
  • dataloader_num_workers: 0
  • dataloader_prefetch_factor: None
  • past_index: -1
  • disable_tqdm: False
  • remove_unused_columns: True
  • label_names: None
  • load_best_model_at_end: False
  • ignore_data_skip: False
  • fsdp: []
  • fsdp_min_num_params: 0
  • fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
  • fsdp_transformer_layer_cls_to_wrap: None
  • accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
  • deepspeed: None
  • label_smoothing_factor: 0.0
  • optim: adamw_torch
  • optim_args: None
  • adafactor: False
  • group_by_length: False
  • length_column_name: length
  • ddp_find_unused_parameters: None
  • ddp_bucket_cap_mb: None
  • ddp_broadcast_buffers: False
  • dataloader_pin_memory: True
  • dataloader_persistent_workers: False
  • skip_memory_metrics: True
  • use_legacy_prediction_loop: False
  • push_to_hub: False
  • resume_from_checkpoint: None
  • hub_model_id: None
  • hub_strategy: every_save
  • hub_private_repo: False
  • hub_always_push: False
  • gradient_checkpointing: False
  • gradient_checkpointing_kwargs: None
  • include_inputs_for_metrics: False
  • eval_do_concat_batches: True
  • fp16_backend: auto
  • push_to_hub_model_id: None
  • push_to_hub_organization: None
  • mp_parameters:
  • auto_find_batch_size: False
  • full_determinism: False
  • torchdynamo: None
  • ray_scope: last
  • ddp_timeout: 1800
  • torch_compile: False
  • torch_compile_backend: None
  • torch_compile_mode: None
  • dispatch_batches: None
  • split_batches: None
  • include_tokens_per_second: False
  • include_num_input_tokens_seen: False
  • neftune_noise_alpha: None
  • optim_target_modules: None
  • batch_eval_metrics: False
  • batch_sampler: no_duplicates
  • multi_dataset_batch_sampler: proportional

Training Logs

Epoch   | Step | Training Loss | Validation Loss
--------|------|---------------|----------------
3.5652  | 100  | 0.0564        | 0.0940
7.1304  | 200  | 0.0122        | 0.0713
10.4348 | 300  | 0.0051        | 0.0655
14.0    | 400  | 0.0026        | 0.0678
17.3043 | 500  | 0.0010        | 0.0668
20.8696 | 600  | 0.0009        | 0.0666
24.1739 | 700  | 0.0008        | 0.0671
27.7391 | 800  | 0.0007        | 0.0674
31.0435 | 900  | 0.0007        | 0.0671

Framework Versions

  • Python: 3.10.12
  • Sentence Transformers: 3.0.1
  • Transformers: 4.41.2
  • PyTorch: 2.3.0+cu121
  • Accelerate: 0.31.0
  • Datasets: 2.20.0
  • Tokenizers: 0.19.1
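
To approximate this environment, a pinned install along these lines should work (PyTorch is best installed separately, following the official instructions for your CUDA version):

pip install sentence-transformers==3.0.1 transformers==4.41.2 accelerate==0.31.0 datasets==2.20.0 tokenizers==0.19.1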

Citation

BibTeX

Sentence Transformers

@inproceedings{reimers-2019-sentence-bert,
    title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
    author = "Reimers, Nils and Gurevych, Iryna",
    booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
    month = "11",
    year = "2019",
    publisher = "Association for Computational Linguistics",
    url = "https://arxiv.org/abs/1908.10084",
}