mpnet-base-financial-rag-matryoshka
This is a sentence-transformers model fine-tuned from sentence-transformers/all-mpnet-base-v2. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: sentence-transformers/all-mpnet-base-v2
- Maximum Sequence Length: 384 tokens
- Output Dimensionality: 768 dimensions
- Similarity Function: Cosine Similarity
- Language: en
- License: apache-2.0
Model Sources
- Documentation: [Sentence Transformers Documentation](https://sbert.net)
- Repository: [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- Hugging Face: [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
Full Model Architecture
SentenceTransformer(
(0): Transformer({'max_seq_length': 384, 'do_lower_case': False}) with Transformer model: MPNetModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
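In words: module (0) encodes up to 384 tokens with MPNet, module (1) mean-pools the token embeddings using the attention mask, and module (2) L2-normalizes the result so that dot product and cosine similarity coincide. For illustration only, a rough equivalent of this pipeline in plain transformers (a sketch, not the library's internal implementation):

import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

model_id = "rbhatia46/mpnet-base-financial-rag-matryoshka"
tokenizer = AutoTokenizer.from_pretrained(model_id)
encoder = AutoModel.from_pretrained(model_id)

# (0) Transformer: tokenize and encode (max_seq_length is 384)
batch = tokenizer(["how to make a budget"], padding=True, truncation=True, max_length=384, return_tensors="pt")
with torch.no_grad():
    token_embeddings = encoder(**batch).last_hidden_state  # [batch, seq, 768]

# (1) Pooling: mean over non-padding tokens, via the attention mask
mask = batch["attention_mask"].unsqueeze(-1).float()
embedding = (token_embeddings * mask).sum(dim=1) / mask.sum(dim=1)

# (2) Normalize: unit-length vectors, so dot product equals cosine similarity
embedding = F.normalize(embedding, p=2, dim=1)
print(embedding.shape)  # torch.Size([1, 768])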
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("rbhatia46/mpnet-base-financial-rag-matryoshka")
# Run inference
sentences = [
"It's simple, really: Practice. Fiscal responsibility is not a trick you can learn look up on Google, or a service you can buy from your accountant. Being responsible with your money is a skill that is learned over a lifetime. The only way to get better at it is to practice, and not get discouraged when you make mistakes.",
'why do people have to be fiscally responsible',
'how long does it take for a loan to get paid interest',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
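Because the model was trained with MatryoshkaLoss, the first dimensions of each embedding carry most of the signal, so embeddings can be truncated to 512, 256, 128, or 64 dimensions for cheaper storage and retrieval. A minimal sketch, assuming the truncate_dim option available in recent sentence-transformers releases:

from sentence_transformers import SentenceTransformer

# truncate_dim keeps only the first N embedding dimensions at encode time
model = SentenceTransformer("rbhatia46/mpnet-base-financial-rag-matryoshka", truncate_dim=256)

embeddings = model.encode([
    "why do people have to be fiscally responsible",
    "how to make a budget",
])
print(embeddings.shape)
# (2, 256)

The per-dimension retrieval metrics below quantify the quality trade-off at each size.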
Evaluation
Metrics
Information Retrieval
- Dataset: dim_768
- Evaluated with InformationRetrievalEvaluator
Metric | Value |
---|---|
cosine_accuracy@1 | 0.181 |
cosine_accuracy@3 | 0.4935 |
cosine_accuracy@5 | 0.5734 |
cosine_accuracy@10 | 0.6639 |
cosine_precision@1 | 0.181 |
cosine_precision@3 | 0.1645 |
cosine_precision@5 | 0.1147 |
cosine_precision@10 | 0.0664 |
cosine_recall@1 | 0.181 |
cosine_recall@3 | 0.4935 |
cosine_recall@5 | 0.5734 |
cosine_recall@10 | 0.6639 |
cosine_ndcg@10 | 0.4175 |
cosine_mrr@10 | 0.3385 |
cosine_map@100 | 0.3464 |
Information Retrieval
- Dataset: dim_512
- Evaluated with InformationRetrievalEvaluator
Metric | Value |
---|---|
cosine_accuracy@1 | 0.1904 |
cosine_accuracy@3 | 0.49 |
cosine_accuracy@5 | 0.5687 |
cosine_accuracy@10 | 0.6533 |
cosine_precision@1 | 0.1904 |
cosine_precision@3 | 0.1633 |
cosine_precision@5 | 0.1137 |
cosine_precision@10 | 0.0653 |
cosine_recall@1 | 0.1904 |
cosine_recall@3 | 0.49 |
cosine_recall@5 | 0.5687 |
cosine_recall@10 | 0.6533 |
cosine_ndcg@10 | 0.4174 |
cosine_mrr@10 | 0.3417 |
cosine_map@100 | 0.3504 |
Information Retrieval
- Dataset: dim_256
- Evaluated with InformationRetrievalEvaluator
Metric | Value |
---|---|
cosine_accuracy@1 | 0.1798 |
cosine_accuracy@3 | 0.4747 |
cosine_accuracy@5 | 0.5452 |
cosine_accuracy@10 | 0.6439 |
cosine_precision@1 | 0.1798 |
cosine_precision@3 | 0.1582 |
cosine_precision@5 | 0.109 |
cosine_precision@10 | 0.0644 |
cosine_recall@1 | 0.1798 |
cosine_recall@3 | 0.4747 |
cosine_recall@5 | 0.5452 |
cosine_recall@10 | 0.6439 |
cosine_ndcg@10 | 0.4068 |
cosine_mrr@10 | 0.3308 |
cosine_map@100 | 0.3395 |
Information Retrieval
- Dataset: dim_128
- Evaluated with InformationRetrievalEvaluator
Metric | Value |
---|---|
cosine_accuracy@1 | 0.1857 |
cosine_accuracy@3 | 0.4536 |
cosine_accuracy@5 | 0.5241 |
cosine_accuracy@10 | 0.6216 |
cosine_precision@1 | 0.1857 |
cosine_precision@3 | 0.1512 |
cosine_precision@5 | 0.1048 |
cosine_precision@10 | 0.0622 |
cosine_recall@1 | 0.1857 |
cosine_recall@3 | 0.4536 |
cosine_recall@5 | 0.5241 |
cosine_recall@10 | 0.6216 |
cosine_ndcg@10 | 0.396 |
cosine_mrr@10 | 0.3243 |
cosine_map@100 | 0.3333 |
Information Retrieval
- Dataset: dim_64
- Evaluated with InformationRetrievalEvaluator
Metric | Value |
---|---|
cosine_accuracy@1 | 0.1633 |
cosine_accuracy@3 | 0.4242 |
cosine_accuracy@5 | 0.4912 |
cosine_accuracy@10 | 0.5781 |
cosine_precision@1 | 0.1633 |
cosine_precision@3 | 0.1414 |
cosine_precision@5 | 0.0982 |
cosine_precision@10 | 0.0578 |
cosine_recall@1 | 0.1633 |
cosine_recall@3 | 0.4242 |
cosine_recall@5 | 0.4912 |
cosine_recall@10 | 0.5781 |
cosine_ndcg@10 | 0.3662 |
cosine_mrr@10 | 0.2984 |
cosine_map@100 | 0.3078 |
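Each of the five tables above was produced by an InformationRetrievalEvaluator run with embeddings truncated to the named dimension. A hedged sketch of that setup follows; the queries, corpus, and relevant_docs mappings are hypothetical placeholders, not the actual evaluation split:

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("rbhatia46/mpnet-base-financial-rag-matryoshka")

# Hypothetical placeholders: query id -> text, doc id -> text, query id -> relevant doc ids
queries = {"q1": "how to make a budget"}
corpus = {"d1": "My wife and I meet in the first few days of each month to create a budget ..."}
relevant_docs = {"q1": {"d1"}}

# One evaluator per Matryoshka size; truncate_dim picks the dimension under test
evaluator = InformationRetrievalEvaluator(
    queries=queries,
    corpus=corpus,
    relevant_docs=relevant_docs,
    name="dim_768",
    truncate_dim=768,
)
print(evaluator(model))  # {'dim_768_cosine_accuracy@1': ..., 'dim_768_cosine_ndcg@10': ..., ...}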
Training Details
Training Dataset
Unnamed Dataset
- Size: 169,213 training samples
- Columns: positive and anchor
- Approximate statistics based on the first 1000 samples:

 | positive | anchor |
---|---|---|
type | string | string |
details | min: 7 tokens, mean: 158.03 tokens, max: 384 tokens | min: 4 tokens, mean: 10.16 tokens, max: 30 tokens |
- Samples:

positive | anchor |
---|---|
International Trade, the exchange of goods and services between nations. “Goods” can be defined as finished products, as intermediate goods used in producing other goods, or as raw materials such as minerals, agricultural products, and other such commodities. International trade commerce enables a nation to specialize in those goods it can produce most cheaply and efficiently, and sell those that are surplus to its requirements. Trade also enables a country to consume more than it would be able to produce if it depended only on its own resources. Finally, trade encourages economic development by increasing the size of the market to which products can be sold. Trade has always been the major force behind the economic relations among nations; it is a measure of national strength. | what does international trade |
My wife and I meet in the first few days of each month to create a budget for the coming month. During that meeting we reconcile any spending for the previous month and make sure the amount money in our accounts matches the amount of money in our budget record to the penny. (We use an excel spreadsheet, how you track it matters less than the need to track it and see how much you spent in each category during the previous month.) After we have have reviewed the previous month's spending, we allocate money we made during that previous month to each of the categories. What categories you track and how granular you are is less important than regularly seeing how much you spend so that you can evaluate whether your spending is really matching your priorities. We keep a running total for each category so if we go over on groceries one month, then the following month we have to add more to bring the category back to black as well as enough for our anticipated needs in the coming month. If there is one category that we are consistently underestimating (or overestimating) we talk about why. If there are large purchases that we are planning in the coming month, or even in a few months, we talk about them, why we want them, and we talk about how much we're planning to spend. If we want a new TV or to go on a trip, we may start adding money to the category with no plans to spend in the coming month. The biggest benefit to this process has been that we don't make a lot of impulse purchases, or if we do, they are for small dollar amounts. The simple need to explain what I want and why means I have to put the thought into it myself, and I talk myself out of a lot of purchases during that train of thought. The time spent regularly evaluating what we get for our money has cut waste that wasn't really bringing much happiness. We still buy what we want, but we agree that we want it first. | how to make a budget |
I just finished my bachelor and I'm doing my masters in Computer Science at a french school in Quebec. I consider myself being in the top 5% and I have an excellent curriculum, having studied abroad, learned 4 languages, participated in student committees, etc. I'm leaning towards IT or business strategy/development...but I'm not sure yet. I guess I'm not that prepared, that's why I wanted a little help. | what school do you want to attend for a masters |
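A minimal sketch of assembling such pairs into a Hugging Face Dataset with the expected column names (the rows here are illustrative, not taken from the real 169,213-sample set):

from datasets import Dataset

# Illustrative rows only; the real dataset holds 169,213 such pairs
train_dataset = Dataset.from_dict({
    "positive": ["My wife and I meet in the first few days of each month to create a budget ..."],
    "anchor": ["how to make a budget"],
})
print(train_dataset.column_names)  # ['positive', 'anchor']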
- Loss: MatryoshkaLoss with these parameters:

{
    "loss": "MultipleNegativesRankingLoss",
    "matryoshka_dims": [768, 512, 256, 128, 64],
    "matryoshka_weights": [1, 1, 1, 1, 1],
    "n_dims_per_step": -1
}
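In sentence-transformers this corresponds to wrapping MultipleNegativesRankingLoss (in-batch negatives) in MatryoshkaLoss, which applies the inner loss at every listed embedding size; a brief sketch:

from sentence_transformers import SentenceTransformer
from sentence_transformers.losses import MatryoshkaLoss, MultipleNegativesRankingLoss

model = SentenceTransformer("sentence-transformers/all-mpnet-base-v2")

# In-batch negatives loss, applied at each Matryoshka dimension
inner_loss = MultipleNegativesRankingLoss(model)
loss = MatryoshkaLoss(
    model,
    inner_loss,
    matryoshka_dims=[768, 512, 256, 128, 64],
    matryoshka_weights=[1, 1, 1, 1, 1],
)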
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: epoch
- per_device_train_batch_size: 32
- per_device_eval_batch_size: 16
- gradient_accumulation_steps: 16
- learning_rate: 2e-05
- num_train_epochs: 10
- lr_scheduler_type: cosine
- warmup_ratio: 0.1
- bf16: True
- tf32: True
- load_best_model_at_end: True
- optim: adamw_torch_fused
- batch_sampler: no_duplicates
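These map onto the v3 trainer API roughly as follows; a sketch only, with output_dir as a placeholder and save_strategy added as an assumption so load_best_model_at_end is valid:

from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="mpnet-base-financial-rag-matryoshka",  # placeholder path
    num_train_epochs=10,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=16,
    learning_rate=2e-5,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    bf16=True,
    tf32=True,
    eval_strategy="epoch",
    save_strategy="epoch",  # assumption: required so load_best_model_at_end can compare checkpoints
    load_best_model_at_end=True,
    optim="adamw_torch_fused",
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # no duplicate texts within a batch
)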
All Hyperparameters
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: epoch
- prediction_loss_only: True
- per_device_train_batch_size: 32
- per_device_eval_batch_size: 16
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 16
- eval_accumulation_steps: None
- learning_rate: 2e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 10
- max_steps: -1
- lr_scheduler_type: cosine
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.1
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: True
- fp16: False
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: True
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: True
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch_fused
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: False
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- batch_sampler: no_duplicates
- multi_dataset_batch_sampler: proportional
Training Logs
Epoch | Step | Training Loss | dim_128_cosine_map@100 | dim_256_cosine_map@100 | dim_512_cosine_map@100 | dim_64_cosine_map@100 | dim_768_cosine_map@100 |
---|---|---|---|---|---|---|---|
0.0303 | 10 | 2.2113 | - | - | - | - | - |
0.0605 | 20 | 2.1051 | - | - | - | - | - |
0.0908 | 30 | 1.9214 | - | - | - | - | - |
0.1210 | 40 | 1.744 | - | - | - | - | - |
0.1513 | 50 | 1.5873 | - | - | - | - | - |
0.1815 | 60 | 1.3988 | - | - | - | - | - |
0.2118 | 70 | 1.263 | - | - | - | - | - |
0.2421 | 80 | 1.1082 | - | - | - | - | - |
0.2723 | 90 | 1.0061 | - | - | - | - | - |
0.3026 | 100 | 1.0127 | - | - | - | - | - |
0.3328 | 110 | 0.8644 | - | - | - | - | - |
0.3631 | 120 | 0.8006 | - | - | - | - | - |
0.3933 | 130 | 0.8067 | - | - | - | - | - |
0.4236 | 140 | 0.7624 | - | - | - | - | - |
0.4539 | 150 | 0.799 | - | - | - | - | - |
0.4841 | 160 | 0.7025 | - | - | - | - | - |
0.5144 | 170 | 0.7467 | - | - | - | - | - |
0.5446 | 180 | 0.7509 | - | - | - | - | - |
0.5749 | 190 | 0.7057 | - | - | - | - | - |
0.6051 | 200 | 0.6929 | - | - | - | - | - |
0.6354 | 210 | 0.6948 | - | - | - | - | - |
0.6657 | 220 | 0.6477 | - | - | - | - | - |
0.6959 | 230 | 0.6562 | - | - | - | - | - |
0.7262 | 240 | 0.6278 | - | - | - | - | - |
0.7564 | 250 | 0.6249 | - | - | - | - | - |
0.7867 | 260 | 0.6057 | - | - | - | - | - |
0.8169 | 270 | 0.6258 | - | - | - | - | - |
0.8472 | 280 | 0.5007 | - | - | - | - | - |
0.8775 | 290 | 0.5998 | - | - | - | - | - |
0.9077 | 300 | 0.5958 | - | - | - | - | - |
0.9380 | 310 | 0.5568 | - | - | - | - | - |
0.9682 | 320 | 0.5236 | - | - | - | - | - |
0.9985 | 330 | 0.6239 | 0.3189 | 0.3389 | 0.3645 | 0.3046 | 0.3700 |
1.0287 | 340 | 0.5106 | - | - | - | - | - |
1.0590 | 350 | 0.6022 | - | - | - | - | - |
1.0893 | 360 | 0.5822 | - | - | - | - | - |
1.1195 | 370 | 0.5094 | - | - | - | - | - |
1.1498 | 380 | 0.5037 | - | - | - | - | - |
1.1800 | 390 | 0.5415 | - | - | - | - | - |
1.2103 | 400 | 0.5011 | - | - | - | - | - |
1.2405 | 410 | 0.4571 | - | - | - | - | - |
1.2708 | 420 | 0.4587 | - | - | - | - | - |
1.3011 | 430 | 0.5065 | - | - | - | - | - |
1.3313 | 440 | 0.4589 | - | - | - | - | - |
1.3616 | 450 | 0.4165 | - | - | - | - | - |
1.3918 | 460 | 0.4215 | - | - | - | - | - |
1.4221 | 470 | 0.4302 | - | - | - | - | - |
1.4523 | 480 | 0.4556 | - | - | - | - | - |
1.4826 | 490 | 0.3793 | - | - | - | - | - |
1.5129 | 500 | 0.4586 | - | - | - | - | - |
1.5431 | 510 | 0.4327 | - | - | - | - | - |
1.5734 | 520 | 0.4207 | - | - | - | - | - |
1.6036 | 530 | 0.4042 | - | - | - | - | - |
1.6339 | 540 | 0.4019 | - | - | - | - | - |
1.6641 | 550 | 0.3804 | - | - | - | - | - |
1.6944 | 560 | 0.3796 | - | - | - | - | - |
1.7247 | 570 | 0.3476 | - | - | - | - | - |
1.7549 | 580 | 0.3871 | - | - | - | - | - |
1.7852 | 590 | 0.3602 | - | - | - | - | - |
1.8154 | 600 | 0.3711 | - | - | - | - | - |
1.8457 | 610 | 0.2879 | - | - | - | - | - |
1.8759 | 620 | 0.3497 | - | - | - | - | - |
1.9062 | 630 | 0.3346 | - | - | - | - | - |
1.9365 | 640 | 0.3426 | - | - | - | - | - |
1.9667 | 650 | 0.2977 | - | - | - | - | - |
1.9970 | 660 | 0.3783 | - | - | - | - | - |
2.0 | 661 | - | 0.3282 | 0.3485 | 0.3749 | 0.2960 | 0.3666 |
2.0272 | 670 | 0.3012 | - | - | - | - | - |
2.0575 | 680 | 0.3491 | - | - | - | - | - |
2.0877 | 690 | 0.3589 | - | - | - | - | - |
2.1180 | 700 | 0.2998 | - | - | - | - | - |
2.1483 | 710 | 0.2925 | - | - | - | - | - |
2.1785 | 720 | 0.3261 | - | - | - | - | - |
2.2088 | 730 | 0.2917 | - | - | - | - | - |
2.2390 | 740 | 0.2685 | - | - | - | - | - |
2.2693 | 750 | 0.2674 | - | - | - | - | - |
2.2995 | 760 | 0.3136 | - | - | - | - | - |
2.3298 | 770 | 0.2631 | - | - | - | - | - |
2.3601 | 780 | 0.2509 | - | - | - | - | - |
2.3903 | 790 | 0.2518 | - | - | - | - | - |
2.4206 | 800 | 0.2603 | - | - | - | - | - |
2.4508 | 810 | 0.2773 | - | - | - | - | - |
2.4811 | 820 | 0.245 | - | - | - | - | - |
2.5113 | 830 | 0.2746 | - | - | - | - | - |
2.5416 | 840 | 0.2747 | - | - | - | - | - |
2.5719 | 850 | 0.2426 | - | - | - | - | - |
2.6021 | 860 | 0.2593 | - | - | - | - | - |
2.6324 | 870 | 0.2482 | - | - | - | - | - |
2.6626 | 880 | 0.2344 | - | - | - | - | - |
2.6929 | 890 | 0.2452 | - | - | - | - | - |
2.7231 | 900 | 0.218 | - | - | - | - | - |
2.7534 | 910 | 0.2319 | - | - | - | - | - |
2.7837 | 920 | 0.2366 | - | - | - | - | - |
2.8139 | 930 | 0.2265 | - | - | - | - | - |
2.8442 | 940 | 0.1753 | - | - | - | - | - |
2.8744 | 950 | 0.2153 | - | - | - | - | - |
2.9047 | 960 | 0.201 | - | - | - | - | - |
2.9349 | 970 | 0.2205 | - | - | - | - | - |
2.9652 | 980 | 0.1933 | - | - | - | - | - |
2.9955 | 990 | 0.2301 | - | - | - | - | - |
2.9985 | 991 | - | 0.3285 | 0.3484 | 0.3636 | 0.2966 | 0.3660 |
3.0257 | 1000 | 0.1946 | - | - | - | - | - |
3.0560 | 1010 | 0.203 | - | - | - | - | - |
3.0862 | 1020 | 0.2385 | - | - | - | - | - |
3.1165 | 1030 | 0.1821 | - | - | - | - | - |
3.1467 | 1040 | 0.1858 | - | - | - | - | - |
3.1770 | 1050 | 0.2057 | - | - | - | - | - |
3.2073 | 1060 | 0.18 | - | - | - | - | - |
3.2375 | 1070 | 0.1751 | - | - | - | - | - |
3.2678 | 1080 | 0.1539 | - | - | - | - | - |
3.2980 | 1090 | 0.2153 | - | - | - | - | - |
3.3283 | 1100 | 0.1739 | - | - | - | - | - |
3.3585 | 1110 | 0.1621 | - | - | - | - | - |
3.3888 | 1120 | 0.1541 | - | - | - | - | - |
3.4191 | 1130 | 0.1642 | - | - | - | - | - |
3.4493 | 1140 | 0.1893 | - | - | - | - | - |
3.4796 | 1150 | 0.16 | - | - | - | - | - |
3.5098 | 1160 | 0.1839 | - | - | - | - | - |
3.5401 | 1170 | 0.1748 | - | - | - | - | - |
3.5703 | 1180 | 0.1499 | - | - | - | - | - |
3.6006 | 1190 | 0.1706 | - | - | - | - | - |
3.6309 | 1200 | 0.1541 | - | - | - | - | - |
3.6611 | 1210 | 0.1592 | - | - | - | - | - |
3.6914 | 1220 | 0.1683 | - | - | - | - | - |
3.7216 | 1230 | 0.1408 | - | - | - | - | - |
3.7519 | 1240 | 0.1595 | - | - | - | - | - |
3.7821 | 1250 | 0.1585 | - | - | - | - | - |
3.8124 | 1260 | 0.1521 | - | - | - | - | - |
3.8427 | 1270 | 0.1167 | - | - | - | - | - |
3.8729 | 1280 | 0.1416 | - | - | - | - | - |
3.9032 | 1290 | 0.1386 | - | - | - | - | - |
3.9334 | 1300 | 0.1513 | - | - | - | - | - |
3.9637 | 1310 | 0.1329 | - | - | - | - | - |
3.9939 | 1320 | 0.1565 | - | - | - | - | - |
4.0 | 1322 | - | 0.3270 | 0.3575 | 0.3636 | 0.3053 | 0.3660 |
4.0242 | 1330 | 0.1253 | - | - | - | - | - |
4.0545 | 1340 | 0.1325 | - | - | - | - | - |
4.0847 | 1350 | 0.1675 | - | - | - | - | - |
4.1150 | 1360 | 0.1291 | - | - | - | - | - |
4.1452 | 1370 | 0.1259 | - | - | - | - | - |
4.1755 | 1380 | 0.1359 | - | - | - | - | - |
4.2057 | 1390 | 0.1344 | - | - | - | - | - |
4.2360 | 1400 | 0.1187 | - | - | - | - | - |
4.2663 | 1410 | 0.1062 | - | - | - | - | - |
4.2965 | 1420 | 0.1653 | - | - | - | - | - |
4.3268 | 1430 | 0.1164 | - | - | - | - | - |
4.3570 | 1440 | 0.103 | - | - | - | - | - |
4.3873 | 1450 | 0.1093 | - | - | - | - | - |
4.4175 | 1460 | 0.1156 | - | - | - | - | - |
4.4478 | 1470 | 0.1195 | - | - | - | - | - |
4.4781 | 1480 | 0.1141 | - | - | - | - | - |
4.5083 | 1490 | 0.1233 | - | - | - | - | - |
4.5386 | 1500 | 0.1169 | - | - | - | - | - |
4.5688 | 1510 | 0.0957 | - | - | - | - | - |
4.5991 | 1520 | 0.1147 | - | - | - | - | - |
4.6293 | 1530 | 0.1134 | - | - | - | - | - |
4.6596 | 1540 | 0.1143 | - | - | - | - | - |
4.6899 | 1550 | 0.1125 | - | - | - | - | - |
4.7201 | 1560 | 0.0988 | - | - | - | - | - |
4.7504 | 1570 | 0.1149 | - | - | - | - | - |
4.7806 | 1580 | 0.1154 | - | - | - | - | - |
4.8109 | 1590 | 0.1043 | - | - | - | - | - |
4.8411 | 1600 | 0.0887 | - | - | - | - | - |
4.8714 | 1610 | 0.0921 | - | - | - | - | - |
4.9017 | 1620 | 0.1023 | - | - | - | - | - |
4.9319 | 1630 | 0.1078 | - | - | - | - | - |
4.9622 | 1640 | 0.1053 | - | - | - | - | - |
4.9924 | 1650 | 0.1135 | - | - | - | - | - |
4.9985 | 1652 | - | 0.3402 | 0.3620 | 0.3781 | 0.3236 | 0.3842 |
5.0227 | 1660 | 0.0908 | - | - | - | - | - |
5.0530 | 1670 | 0.0908 | - | - | - | - | - |
5.0832 | 1680 | 0.1149 | - | - | - | - | - |
5.1135 | 1690 | 0.0991 | - | - | - | - | - |
5.1437 | 1700 | 0.0864 | - | - | - | - | - |
5.1740 | 1710 | 0.0987 | - | - | - | - | - |
5.2042 | 1720 | 0.0949 | - | - | - | - | - |
5.2345 | 1730 | 0.0893 | - | - | - | - | - |
5.2648 | 1740 | 0.0806 | - | - | - | - | - |
5.2950 | 1750 | 0.1187 | - | - | - | - | - |
5.3253 | 1760 | 0.0851 | - | - | - | - | - |
5.3555 | 1770 | 0.0814 | - | - | - | - | - |
5.3858 | 1780 | 0.0803 | - | - | - | - | - |
5.4160 | 1790 | 0.0816 | - | - | - | - | - |
5.4463 | 1800 | 0.0916 | - | - | - | - | - |
5.4766 | 1810 | 0.0892 | - | - | - | - | - |
5.5068 | 1820 | 0.0935 | - | - | - | - | - |
5.5371 | 1830 | 0.0963 | - | - | - | - | - |
5.5673 | 1840 | 0.0759 | - | - | - | - | - |
5.5976 | 1850 | 0.0908 | - | - | - | - | - |
5.6278 | 1860 | 0.0896 | - | - | - | - | - |
5.6581 | 1870 | 0.0855 | - | - | - | - | - |
5.6884 | 1880 | 0.0849 | - | - | - | - | - |
5.7186 | 1890 | 0.0805 | - | - | - | - | - |
5.7489 | 1900 | 0.0872 | - | - | - | - | - |
5.7791 | 1910 | 0.0853 | - | - | - | - | - |
5.8094 | 1920 | 0.0856 | - | - | - | - | - |
5.8396 | 1930 | 0.064 | - | - | - | - | - |
5.8699 | 1940 | 0.0748 | - | - | - | - | - |
5.9002 | 1950 | 0.0769 | - | - | - | - | - |
5.9304 | 1960 | 0.0868 | - | - | - | - | - |
5.9607 | 1970 | 0.0842 | - | - | - | - | - |
5.9909 | 1980 | 0.0825 | - | - | - | - | - |
6.0 | 1983 | - | 0.3412 | 0.3542 | 0.3615 | 0.3171 | 0.3676 |
6.0212 | 1990 | 0.073 | - | - | - | - | - |
6.0514 | 2000 | 0.0708 | - | - | - | - | - |
6.0817 | 2010 | 0.0908 | - | - | - | - | - |
6.1120 | 2020 | 0.0807 | - | - | - | - | - |
6.1422 | 2030 | 0.0665 | - | - | - | - | - |
6.1725 | 2040 | 0.0773 | - | - | - | - | - |
6.2027 | 2050 | 0.0798 | - | - | - | - | - |
6.2330 | 2060 | 0.0743 | - | - | - | - | - |
6.2632 | 2070 | 0.0619 | - | - | - | - | - |
6.2935 | 2080 | 0.0954 | - | - | - | - | - |
6.3238 | 2090 | 0.0682 | - | - | - | - | - |
6.3540 | 2100 | 0.0594 | - | - | - | - | - |
6.3843 | 2110 | 0.0621 | - | - | - | - | - |
6.4145 | 2120 | 0.0674 | - | - | - | - | - |
6.4448 | 2130 | 0.069 | - | - | - | - | - |
6.4750 | 2140 | 0.0741 | - | - | - | - | - |
6.5053 | 2150 | 0.0757 | - | - | - | - | - |
6.5356 | 2160 | 0.0781 | - | - | - | - | - |
6.5658 | 2170 | 0.0632 | - | - | - | - | - |
6.5961 | 2180 | 0.07 | - | - | - | - | - |
6.6263 | 2190 | 0.0767 | - | - | - | - | - |
6.6566 | 2200 | 0.0674 | - | - | - | - | - |
6.6868 | 2210 | 0.0704 | - | - | - | - | - |
6.7171 | 2220 | 0.065 | - | - | - | - | - |
6.7474 | 2230 | 0.066 | - | - | - | - | - |
6.7776 | 2240 | 0.0752 | - | - | - | - | - |
6.8079 | 2250 | 0.07 | - | - | - | - | - |
6.8381 | 2260 | 0.0602 | - | - | - | - | - |
6.8684 | 2270 | 0.0595 | - | - | - | - | - |
6.8986 | 2280 | 0.065 | - | - | - | - | - |
6.9289 | 2290 | 0.0677 | - | - | - | - | - |
6.9592 | 2300 | 0.0708 | - | - | - | - | - |
6.9894 | 2310 | 0.0651 | - | - | - | - | - |
6.9985 | 2313 | - | 0.3484 | 0.3671 | 0.3645 | 0.3214 | 0.3773 |
7.0197 | 2320 | 0.0657 | - | - | - | - | - |
7.0499 | 2330 | 0.0588 | - | - | - | - | - |
7.0802 | 2340 | 0.0701 | - | - | - | - | - |
7.1104 | 2350 | 0.0689 | - | - | - | - | - |
7.1407 | 2360 | 0.0586 | - | - | - | - | - |
7.1710 | 2370 | 0.0626 | - | - | - | - | - |
7.2012 | 2380 | 0.0723 | - | - | - | - | - |
7.2315 | 2390 | 0.0602 | - | - | - | - | - |
7.2617 | 2400 | 0.0541 | - | - | - | - | - |
7.2920 | 2410 | 0.0823 | - | - | - | - | - |
7.3222 | 2420 | 0.0592 | - | - | - | - | - |
7.3525 | 2430 | 0.0535 | - | - | - | - | - |
7.3828 | 2440 | 0.0548 | - | - | - | - | - |
7.4130 | 2450 | 0.0598 | - | - | - | - | - |
7.4433 | 2460 | 0.0554 | - | - | - | - | - |
7.4735 | 2470 | 0.0663 | - | - | - | - | - |
7.5038 | 2480 | 0.0645 | - | - | - | - | - |
7.5340 | 2490 | 0.0638 | - | - | - | - | - |
7.5643 | 2500 | 0.0574 | - | - | - | - | - |
7.5946 | 2510 | 0.0608 | - | - | - | - | - |
7.6248 | 2520 | 0.0633 | - | - | - | - | - |
7.6551 | 2530 | 0.0576 | - | - | - | - | - |
7.6853 | 2540 | 0.0613 | - | - | - | - | - |
7.7156 | 2550 | 0.054 | - | - | - | - | - |
7.7458 | 2560 | 0.0591 | - | - | - | - | - |
7.7761 | 2570 | 0.0659 | - | - | - | - | - |
7.8064 | 2580 | 0.0601 | - | - | - | - | - |
7.8366 | 2590 | 0.053 | - | - | - | - | - |
7.8669 | 2600 | 0.0536 | - | - | - | - | - |
7.8971 | 2610 | 0.0581 | - | - | - | - | - |
7.9274 | 2620 | 0.0603 | - | - | - | - | - |
7.9576 | 2630 | 0.0661 | - | - | - | - | - |
7.9879 | 2640 | 0.0588 | - | - | - | - | - |
8.0 | 2644 | - | 0.3340 | 0.3533 | 0.3541 | 0.3163 | 0.3651 |
8.0182 | 2650 | 0.0559 | - | - | - | - | - |
8.0484 | 2660 | 0.0566 | - | - | - | - | - |
8.0787 | 2670 | 0.0666 | - | - | - | - | - |
8.1089 | 2680 | 0.0601 | - | - | - | - | - |
8.1392 | 2690 | 0.0522 | - | - | - | - | - |
8.1694 | 2700 | 0.0527 | - | - | - | - | - |
8.1997 | 2710 | 0.0622 | - | - | - | - | - |
8.2300 | 2720 | 0.0577 | - | - | - | - | - |
8.2602 | 2730 | 0.0467 | - | - | - | - | - |
8.2905 | 2740 | 0.0762 | - | - | - | - | - |
8.3207 | 2750 | 0.0562 | - | - | - | - | - |
8.3510 | 2760 | 0.0475 | - | - | - | - | - |
8.3812 | 2770 | 0.0482 | - | - | - | - | - |
8.4115 | 2780 | 0.0536 | - | - | - | - | - |
8.4418 | 2790 | 0.0534 | - | - | - | - | - |
8.4720 | 2800 | 0.0588 | - | - | - | - | - |
8.5023 | 2810 | 0.0597 | - | - | - | - | - |
8.5325 | 2820 | 0.0587 | - | - | - | - | - |
8.5628 | 2830 | 0.0544 | - | - | - | - | - |
8.5930 | 2840 | 0.0577 | - | - | - | - | - |
8.6233 | 2850 | 0.0592 | - | - | - | - | - |
8.6536 | 2860 | 0.0554 | - | - | - | - | - |
8.6838 | 2870 | 0.0541 | - | - | - | - | - |
8.7141 | 2880 | 0.0495 | - | - | - | - | - |
8.7443 | 2890 | 0.0547 | - | - | - | - | - |
8.7746 | 2900 | 0.0646 | - | - | - | - | - |
8.8048 | 2910 | 0.0574 | - | - | - | - | - |
8.8351 | 2920 | 0.0486 | - | - | - | - | - |
8.8654 | 2930 | 0.0517 | - | - | - | - | - |
8.8956 | 2940 | 0.0572 | - | - | - | - | - |
8.9259 | 2950 | 0.0518 | - | - | - | - | - |
8.9561 | 2960 | 0.0617 | - | - | - | - | - |
8.9864 | 2970 | 0.0572 | - | - | - | - | - |
8.9985 | 2974 | - | 0.3434 | 0.3552 | 0.3694 | 0.3253 | 0.3727 |
9.0166 | 2980 | 0.0549 | - | - | - | - | - |
9.0469 | 2990 | 0.0471 | - | - | - | - | - |
9.0772 | 3000 | 0.0629 | - | - | - | - | - |
9.1074 | 3010 | 0.058 | - | - | - | - | - |
9.1377 | 3020 | 0.0531 | - | - | - | - | - |
9.1679 | 3030 | 0.051 | - | - | - | - | - |
9.1982 | 3040 | 0.0593 | - | - | - | - | - |
9.2284 | 3050 | 0.056 | - | - | - | - | - |
9.2587 | 3060 | 0.0452 | - | - | - | - | - |
9.2890 | 3070 | 0.0672 | - | - | - | - | - |
9.3192 | 3080 | 0.0547 | - | - | - | - | - |
9.3495 | 3090 | 0.0477 | - | - | - | - | - |
9.3797 | 3100 | 0.0453 | - | - | - | - | - |
9.4100 | 3110 | 0.0542 | - | - | - | - | - |
9.4402 | 3120 | 0.0538 | - | - | - | - | - |
9.4705 | 3130 | 0.0552 | - | - | - | - | - |
9.5008 | 3140 | 0.0586 | - | - | - | - | - |
9.5310 | 3150 | 0.0567 | - | - | - | - | - |
9.5613 | 3160 | 0.0499 | - | - | - | - | - |
9.5915 | 3170 | 0.0598 | - | - | - | - | - |
9.6218 | 3180 | 0.0546 | - | - | - | - | - |
9.6520 | 3190 | 0.0513 | - | - | - | - | - |
9.6823 | 3200 | 0.0549 | - | - | - | - | - |
9.7126 | 3210 | 0.0513 | - | - | - | - | - |
9.7428 | 3220 | 0.0536 | - | - | - | - | - |
9.7731 | 3230 | 0.0588 | - | - | - | - | - |
9.8033 | 3240 | 0.0531 | - | - | - | - | - |
9.8336 | 3250 | 0.0472 | - | - | - | - | - |
9.8638 | 3260 | 0.0486 | - | - | - | - | - |
9.8941 | 3270 | 0.0576 | - | - | - | - | - |
9.9244 | 3280 | 0.0526 | - | - | - | - | - |
9.9546 | 3290 | 0.0568 | - | - | - | - | - |
**9.9849** | **3300** | **0.0617** | **0.3333** | **0.3395** | **0.3504** | **0.3078** | **0.3464** |
- The bold row denotes the saved checkpoint.
Framework Versions
- Python: 3.10.8
- Sentence Transformers: 3.0.1
- Transformers: 4.41.2
- PyTorch: 2.1.2+cu121
- Accelerate: 0.33.0
- Datasets: 2.19.1
- Tokenizers: 0.19.1
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}
MatryoshkaLoss
@misc{kusupati2024matryoshka,
title={Matryoshka Representation Learning},
author={Aditya Kusupati and Gantavya Bhatt and Aniket Rege and Matthew Wallingford and Aditya Sinha and Vivek Ramanujan and William Howard-Snyder and Kaifeng Chen and Sham Kakade and Prateek Jain and Ali Farhadi},
year={2024},
eprint={2205.13147},
archivePrefix={arXiv},
primaryClass={cs.LG}
}
MultipleNegativesRankingLoss
@misc{henderson2017efficient,
title={Efficient Natural Language Response Suggestion for Smart Reply},
author={Matthew Henderson and Rami Al-Rfou and Brian Strope and Yun-hsuan Sung and Laszlo Lukacs and Ruiqi Guo and Sanjiv Kumar and Balint Miklos and Ray Kurzweil},
year={2017},
eprint={1705.00652},
archivePrefix={arXiv},
primaryClass={cs.CL}
}