SentenceTransformer based on sentence-transformers/all-MiniLM-L12-v2
This is a sentence-transformers model finetuned from sentence-transformers/all-MiniLM-L12-v2. It maps sentences & paragraphs to a 384-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: sentence-transformers/all-MiniLM-L12-v2
- Maximum Sequence Length: 128 tokens
- Output Dimensionality: 384 dimensions
- Similarity Function: Cosine Similarity
Model Sources
- Documentation: Sentence Transformers Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Sentence Transformers on Hugging Face
Full Model Architecture
SentenceTransformer(
(0): Transformer({'max_seq_length': 128, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 384, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
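Because the final module is Normalize(), the embeddings are unit-length, so cosine similarity and dot product give identical scores. As a quick sanity check, the modules listed above can be inspected directly on a loaded model (a minimal sketch; it assumes the model is loaded as shown in the Usage section below):
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("Trelis/all-MiniLM-L12-v2-ft-pairs-balanced-cpu")
print(model.max_seq_length)                      # 128
print(model.get_sentence_embedding_dimension())  # 384
print(model[1].pooling_mode_mean_tokens)         # True: mean pooling over token embeddings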
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("Trelis/all-MiniLM-L12-v2-ft-pairs-balanced-cpu")
# Run inference
sentences = [
'What happens if a player deliberately delays the changeover procedure after a Change of Possession?',
' Registration\n5\n03 I\nThe Ball\n6\n04 I\nPlaying Uniform\n6\n05 I\nTeam Composition\n6\n06 I\nTeam Coach and Team Officials\n7\n07\nI\nCommencement and Recommencement of Play\n7\n08\nI\nMatch Duration\n8\n09 I\nPossession\n8\n10\nI\nThe Touch\n9\n11\nI\nPassing\n10\n12\nI\nBall Touched in Flight\n10\n13\nI\nThe Rollball\n11\n14\nI\nScoring\n13\n15\nI\nOffside\n13\n16\nI\nObstruction\n14\n17\nI\nInterchange\n14\n18\nI\nPenalty\n15\n19\nI\nAdvantage\n16\n20\nI\nMisconduct\n16\n21\nI\nForced Interchange\n16\n22\nI\nSin Bin\n16\n23\nI\nDismissal\n17\n24\nI\nDrop-Off\n17\n25\nI\nMatch Officials\n18\nFIT Playing Rules - 5th Edition\nCOPYRIGHT © Touch Football Australia 2020\nFIT Playing Rules - 5th Edition\nCOPYRIGHT © Touch Football Australia 2020\n Definitions and Terminology \nUnless the contrary intention appears, the following definitions and terminology apply \nto the game of Touch:\nTERM/PHRASE\nDEFINITION/DESCRIPTION\nAdvantage\nThe period of time after an Infringement in which the non-offending \nside has the opportunity to gain Advantage either territorial, tactical \nor in the form of a Try.\nAttacking Try Line\nThe line on or over which a player has to place the ball to \nscore a Try.\nAttacking Team\nThe Team which has or is gaining Possession.\nBehind\nA position or direction towards a Team’s Defending Try Line.\nChange of Possession\nThe act of moving control of the ball from one Team to the other.\nDead/Dead Ball\nWhen the ball is out of play including the period following a Try and \nuntil the match is recommenced and when the ball goes to ground \nand/or outside the boundaries of the Field of Play prior to the \nsubsequent Rollball.\nDead Ball Line\nThe end boundaries of the Field of Play. There is one at each end of \nthe Field of Play. See Appendix 1.\nDef',
' Registration\n5\n03 I\nThe Ball\n6\n04 I\nPlaying Uniform\n6\n05 I\nTeam Composition\n6\n06 I\nTeam Coach and Team Officials\n7\n07\nI\nCommencement and Recommencement of Play\n7\n08\nI\nMatch Duration\n8\n09 I\nPossession\n8\n10\nI\nThe Touch\n9\n11\nI\nPassing\n10\n12\nI\nBall Touched in Flight\n10\n13\nI\nThe Rollball\n11\n14\nI\nScoring\n13\n15\nI\nOffside\n13\n16\nI\nObstruction\n14\n17\nI\nInterchange\n14\n18\nI\nPenalty\n15\n19\nI\nAdvantage\n16\n20\nI\nMisconduct\n16\n21\nI\nForced Interchange\n16\n22\nI\nSin Bin\n16\n23\nI\nDismissal\n17\n24\nI\nDrop-Off\n17\n25\nI\nMatch Officials\n18\nFIT Playing Rules - 5th Edition\nCOPYRIGHT © Touch Football Australia 2020\nFIT Playing Rules - 5th Edition\nCOPYRIGHT © Touch Football Australia 2020\n Definitions and Terminology \nUnless the contrary intention appears, the following definitions and terminology apply \nto the game of Touch:\nTERM/PHRASE\nDEFINITION/DESCRIPTION\nAdvantage\nThe period of time after an Infringement in which the non-offending \nside has the opportunity to gain Advantage either territorial, tactical \nor in the form of a Try.\nAttacking Try Line\nThe line on or over which a player has to place the ball to \nscore a Try.\nAttacking Team\nThe Team which has or is gaining Possession.\nBehind\nA position or direction towards a Team’s Defending Try Line.\nChange of Possession\nThe act of moving control of the ball from one Team to the other.\nDead/Dead Ball\nWhen the ball is out of play including the period following a Try and \nuntil the match is recommenced and when the ball goes to ground \nand/or outside the boundaries of the Field of Play prior to the \nsubsequent Rollball.\nDead Ball Line\nThe end boundaries of the Field of Play. There is one at each end of \nthe Field of Play. See Appendix 1.\nDef',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 384]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
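Because the first entry is a query and the remaining entries are document chunks, the same embeddings can also be used for a small semantic-search style ranking. A minimal sketch, continuing from the code above:
# Row 0 of the similarity matrix holds the query's scores against every sentence;
# columns 1 onwards correspond to the document chunks.
query_scores = similarities[0, 1:]
ranking = query_scores.argsort(descending=True)
for rank, idx in enumerate(ranking, start=1):
    print(f"rank {rank}: chunk {idx.item() + 1}, score {query_scores[idx].item():.4f}")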
Training Details
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: steps
- per_device_train_batch_size: 4
- per_device_eval_batch_size: 4
- learning_rate: 1e-05
- num_train_epochs: 1
- lr_scheduler_type: cosine
- warmup_ratio: 0.3
- bf16: True
All Hyperparameters
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 4
- per_device_eval_batch_size: 4
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- learning_rate: 1e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 1
- max_steps: -1
- lr_scheduler_type: cosine
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.3
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: True
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: False
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- batch_sampler: batch_sampler
- multi_dataset_batch_sampler: proportional
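For readers who want to reproduce a comparable run, the non-default values above map directly onto SentenceTransformerTrainingArguments in Sentence Transformers 3.x. The sketch below is illustrative only: the training data for this model is not published on this card, so the tiny train/eval datasets are placeholders, and CoSENTLoss is assumed from the Citation section rather than stated in the logs.
from datasets import Dataset
from sentence_transformers import (
    SentenceTransformer,
    SentenceTransformerTrainer,
    SentenceTransformerTrainingArguments,
)
from sentence_transformers.losses import CoSENTLoss

model = SentenceTransformer("sentence-transformers/all-MiniLM-L12-v2")

# Hypothetical pair dataset with similarity labels; the real training data is not included here.
train_dataset = Dataset.from_dict({
    "sentence1": ["What happens after a Change of Possession?"],
    "sentence2": ["Play recommences with a Rollball by the team gaining possession."],
    "label": [1.0],
})
eval_dataset = train_dataset  # placeholder; a held-out split would be used in practice

args = SentenceTransformerTrainingArguments(
    output_dir="all-MiniLM-L12-v2-ft-pairs",  # hypothetical output path
    eval_strategy="steps",
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    learning_rate=1e-5,
    num_train_epochs=1,
    lr_scheduler_type="cosine",
    warmup_ratio=0.3,
    bf16=True,  # requires bf16-capable hardware; set to False otherwise
)

trainer = SentenceTransformerTrainer(
    model=model,
    args=args,
    train_dataset=train_dataset,
    eval_dataset=eval_dataset,
    loss=CoSENTLoss(model),
)
trainer.train()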
Training Logs
Epoch | Step | Training Loss | Validation Loss |
---|---|---|---|
0.1053 | 2 | 4.6868 | - |
0.1579 | 3 | - | 2.7075 |
0.2105 | 4 | 5.703 | - |
0.3158 | 6 | 2.1691 | 2.6412 |
0.4211 | 8 | 1.705 | - |
0.4737 | 9 | - | 2.6254 |
0.5263 | 10 | 1.7985 | - |
0.6316 | 12 | 3.4822 | 2.6087 |
0.7368 | 14 | 4.2724 | - |
0.7895 | 15 | - | 2.6000 |
0.8421 | 16 | 3.1489 | - |
0.9474 | 18 | 5.7594 | 2.6032 |
Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.0.1
- Transformers: 4.41.2
- PyTorch: 2.1.1+cu121
- Accelerate: 0.31.0
- Datasets: 2.17.1
- Tokenizers: 0.19.1
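If exact reproduction of this environment matters, the library versions listed above can be pinned at install time (PyTorch is omitted here because the appropriate CUDA build depends on your platform):
pip install sentence-transformers==3.0.1 transformers==4.41.2 accelerate==0.31.0 datasets==2.17.1 tokenizers==0.19.1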
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
CoSENTLoss
@online{kexuefm-8847,
title={CoSENT: A more efficient sentence vector scheme than Sentence-BERT},
author={Su Jianlin},
year={2022},
month={Jan},
url={https://kexue.fm/archives/8847},
}