---
language:
- en
license: apache-2.0
tags:
- sentence-transformers
- sentence-similarity
- feature-extraction
- generated_from_trainer
- dataset_size:6300
- loss:MatryoshkaLoss
- loss:MultipleNegativesRankingLoss
base_model: BAAI/bge-base-en-v1.5
widget:
- source_sentence: The cumulative basis adjustments associated with these hedging
relationships are a reduction of the amortized cost basis of the closed portfolios
of $19 million.
sentences:
- What are the main factors that influence the timing and cost of the company's
inventory purchases?
- What was the reduction in the amortized cost basis of the closed portfolios due
to cumulative basis adjustments in these hedging relationships?
- What was Garmin Ltd.'s net income for the fiscal year ended December 30, 2023?
- source_sentence: 'The components of the provision for income taxes were as follows:
U.S. Federal $ (314,757), U.S. State and Local $ (85,355), Foreign $ (1,162).
Effective income tax rate | 24.2% | | 23.9% | | 19.7% | for the years 2021,
2022, and 2023.'
sentences:
- How much of the lease obligations is payable within 12 months as of December 31,
2023?
- What are the components and the effective tax rates for the year 2023 as reported
in the financial statements?
- How many Dollar Tree Plus stores were there as of January 28, 2023?
- source_sentence: The Company may receive advanced royalty payments from licensees,
either in advance of a licensee’s subsequent sales to customers or, prior to the
completion of the Company’s performance obligation. The Wizards of the Coast and
Digital Gaming segment may also receive advanced payments from end users of its
digital games at the time of the initial purchase, through in-application purchases,
or through subscription services. Revenues on all licensee and digital gaming
advanced payments are deferred until the respective performance obligations are
satisfied, and these digital gaming revenues are recognized over a period of time,
determined based on either player usage patterns or the estimated playing life
of the user, or when additional downloadable content is made available, or as
with subscription services, ratably over the subscription term.
sentences:
- How does the Company recognize revenue from advanced royalty payments and digital
game purchases?
- What is the primary role of Canopy technology in the Health Services segment?
- Which section of a financial document provides an index to Financial Statements
and Supplementary Data?
- source_sentence: Item 8 covers Financial Statements and Supplementary Data.
sentences:
- How much did the prepaid expenses increase from 2022 to 2023?
- What strategies are outlined in the Company's human capital management?
- What type of data does Item 8 cover in the company's filing?
- source_sentence: When points are issued as a result of a stay by a Hilton Honors
member at an owned or leased hotel, we recognize a reduction in owned and leased
hotels revenues, since we are also the program sponsor.
sentences:
- What financial impact does the redemption of Hilton Honors points have on the
revenue of owned and leased hotels?
- What original companies formed IBM in 1911?
- What was the global gender equity status at Meta in July 2023?
pipeline_tag: sentence-similarity
library_name: sentence-transformers
metrics:
- cosine_accuracy@1
- cosine_accuracy@3
- cosine_accuracy@5
- cosine_accuracy@10
- cosine_precision@1
- cosine_precision@3
- cosine_precision@5
- cosine_precision@10
- cosine_recall@1
- cosine_recall@3
- cosine_recall@5
- cosine_recall@10
- cosine_ndcg@10
- cosine_mrr@10
- cosine_map@100
model-index:
- name: BGE base Financial Matryoshka
results:
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 768
type: dim_768
metrics:
- type: cosine_accuracy@1
value: 0.6714285714285714
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.8114285714285714
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.8485714285714285
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.9
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.6714285714285714
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.2704761904761904
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.16971428571428568
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.09
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.6714285714285714
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.8114285714285714
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.8485714285714285
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.9
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.7869239024966277
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.7507120181405897
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.7550416257512982
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 512
type: dim_512
metrics:
- type: cosine_accuracy@1
value: 0.6657142857142857
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.81
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.8542857142857143
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.8928571428571429
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.6657142857142857
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.27
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.17085714285714285
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.08928571428571426
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.6657142857142857
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.81
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.8542857142857143
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.8928571428571429
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.7812019485050782
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.7451230158730157
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.7500357971583163
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 256
type: dim_256
metrics:
- type: cosine_accuracy@1
value: 0.6628571428571428
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.7928571428571428
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.8428571428571429
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.8842857142857142
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.6628571428571428
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.2642857142857143
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.16857142857142854
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.08842857142857141
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.6628571428571428
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.7928571428571428
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.8428571428571429
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.8842857142857142
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.7743199196082401
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.7389903628117913
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.7442531468911058
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 128
type: dim_128
metrics:
- type: cosine_accuracy@1
value: 0.6671428571428571
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.77
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.8228571428571428
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.8685714285714285
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.6671428571428571
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.25666666666666665
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.16457142857142856
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.08685714285714285
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.6671428571428571
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.77
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.8228571428571428
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.8685714285714285
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.7655373626539865
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.7328270975056688
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.7378874490017019
name: Cosine Map@100
- task:
type: information-retrieval
name: Information Retrieval
dataset:
name: dim 64
type: dim_64
metrics:
- type: cosine_accuracy@1
value: 0.6285714285714286
name: Cosine Accuracy@1
- type: cosine_accuracy@3
value: 0.75
name: Cosine Accuracy@3
- type: cosine_accuracy@5
value: 0.7842857142857143
name: Cosine Accuracy@5
- type: cosine_accuracy@10
value: 0.8285714285714286
name: Cosine Accuracy@10
- type: cosine_precision@1
value: 0.6285714285714286
name: Cosine Precision@1
- type: cosine_precision@3
value: 0.25
name: Cosine Precision@3
- type: cosine_precision@5
value: 0.15685714285714283
name: Cosine Precision@5
- type: cosine_precision@10
value: 0.08285714285714285
name: Cosine Precision@10
- type: cosine_recall@1
value: 0.6285714285714286
name: Cosine Recall@1
- type: cosine_recall@3
value: 0.75
name: Cosine Recall@3
- type: cosine_recall@5
value: 0.7842857142857143
name: Cosine Recall@5
- type: cosine_recall@10
value: 0.8285714285714286
name: Cosine Recall@10
- type: cosine_ndcg@10
value: 0.7300345502506145
name: Cosine Ndcg@10
- type: cosine_mrr@10
value: 0.6984109977324261
name: Cosine Mrr@10
- type: cosine_map@100
value: 0.7040560866496234
name: Cosine Map@100
---
# BGE base Financial Matryoshka
This is a [sentence-transformers](https://www.SBERT.net) model finetuned from [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5) on the json dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
## Model Details
### Model Description
- **Model Type:** Sentence Transformer
- **Base model:** [BAAI/bge-base-en-v1.5](https://huggingface.co/BAAI/bge-base-en-v1.5)
- **Maximum Sequence Length:** 512 tokens
- **Output Dimensionality:** 768 dimensions
- **Similarity Function:** Cosine Similarity
- **Training Dataset:**
- json
- **Language:** en
- **License:** apache-2.0
### Model Sources
- **Documentation:** [Sentence Transformers Documentation](https://sbert.net)
- **Repository:** [Sentence Transformers on GitHub](https://github.com/UKPLab/sentence-transformers)
- **Hugging Face:** [Sentence Transformers on Hugging Face](https://huggingface.co/models?library=sentence-transformers)
### Full Model Architecture
```
SentenceTransformer(
(0): Transformer({'max_seq_length': 512, 'do_lower_case': True}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': True, 'pooling_mode_mean_tokens': False, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
(2): Normalize()
)
```
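The CLS-pooling plus `Normalize()` pipeline above can be sketched conceptually as follows. This is a toy illustration with random numbers standing in for the BERT hidden states, not the library's actual internals:

```python
import numpy as np

rng = np.random.default_rng(0)
# Stand-in for BertModel output: (batch, seq_len, hidden_dim)
hidden_states = rng.normal(size=(2, 16, 768))

# Pooling with pooling_mode_cls_token=True: take the first ([CLS]) token
cls_embedding = hidden_states[:, 0, :]

# Normalize(): L2-normalize so cosine similarity becomes a dot product
norms = np.linalg.norm(cls_embedding, axis=1, keepdims=True)
embedding = cls_embedding / norms

print(embedding.shape)  # (2, 768)
```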
## Usage
### Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
```bash
pip install -U sentence-transformers
```
Then you can load this model and run inference.
```python
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("Ram934/bge-base-financial-matryoshka")
# Run inference
sentences = [
'When points are issued as a result of a stay by a Hilton Honors member at an owned or leased hotel, we recognize a reduction in owned and leased hotels revenues, since we are also the program sponsor.',
'What financial impact does the redemption of Hilton Honors points have on the revenue of owned and leased hotels?',
'What original companies formed IBM in 1911?',
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# (3, 768)
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# torch.Size([3, 3])
```
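Because the model's final `Normalize()` module makes every embedding unit-length, cosine similarity reduces to a plain matrix product. A minimal numpy sketch of what `model.similarity` computes, with random unit vectors standing in for real embeddings:

```python
import numpy as np

rng = np.random.default_rng(42)
embeddings = rng.normal(size=(3, 768))
embeddings /= np.linalg.norm(embeddings, axis=1, keepdims=True)  # unit length

# Cosine similarity of unit vectors is just the pairwise dot product
similarities = embeddings @ embeddings.T

print(similarities.shape)           # (3, 3)
print(np.round(similarities.diagonal(), 3))  # each vector matches itself
```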
## Evaluation
### Metrics
#### Information Retrieval
* Datasets: `dim_768`, `dim_512`, `dim_256`, `dim_128` and `dim_64`
* Evaluated with [`InformationRetrievalEvaluator`](https://sbert.net/docs/package_reference/sentence_transformer/evaluation.html#sentence_transformers.evaluation.InformationRetrievalEvaluator)
| Metric | dim_768 | dim_512 | dim_256 | dim_128 | dim_64 |
|:--------------------|:-----------|:-----------|:-----------|:-----------|:---------|
| cosine_accuracy@1 | 0.6714 | 0.6657 | 0.6629 | 0.6671 | 0.6286 |
| cosine_accuracy@3 | 0.8114 | 0.81 | 0.7929 | 0.77 | 0.75 |
| cosine_accuracy@5 | 0.8486 | 0.8543 | 0.8429 | 0.8229 | 0.7843 |
| cosine_accuracy@10 | 0.9 | 0.8929 | 0.8843 | 0.8686 | 0.8286 |
| cosine_precision@1 | 0.6714 | 0.6657 | 0.6629 | 0.6671 | 0.6286 |
| cosine_precision@3 | 0.2705 | 0.27 | 0.2643 | 0.2567 | 0.25 |
| cosine_precision@5 | 0.1697 | 0.1709 | 0.1686 | 0.1646 | 0.1569 |
| cosine_precision@10 | 0.09 | 0.0893 | 0.0884 | 0.0869 | 0.0829 |
| cosine_recall@1 | 0.6714 | 0.6657 | 0.6629 | 0.6671 | 0.6286 |
| cosine_recall@3 | 0.8114 | 0.81 | 0.7929 | 0.77 | 0.75 |
| cosine_recall@5 | 0.8486 | 0.8543 | 0.8429 | 0.8229 | 0.7843 |
| cosine_recall@10 | 0.9 | 0.8929 | 0.8843 | 0.8686 | 0.8286 |
| **cosine_ndcg@10** | **0.7869** | **0.7812** | **0.7743** | **0.7655** | **0.73** |
| cosine_mrr@10 | 0.7507 | 0.7451 | 0.739 | 0.7328 | 0.6984 |
| cosine_map@100 | 0.755 | 0.75 | 0.7443 | 0.7379 | 0.7041 |
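The `dim_512` through `dim_64` columns come from evaluating truncated Matryoshka embeddings: keep only the first k dimensions and re-normalize before scoring. A sketch of that truncation step (random vectors stand in for model output; `truncate` is an illustrative helper, not a library function):

```python
import numpy as np

def truncate(emb: np.ndarray, k: int) -> np.ndarray:
    """Keep the first k Matryoshka dimensions and restore unit length."""
    sub = emb[:, :k]
    return sub / np.linalg.norm(sub, axis=1, keepdims=True)

rng = np.random.default_rng(0)
full = rng.normal(size=(4, 768))
full /= np.linalg.norm(full, axis=1, keepdims=True)

for k in (768, 512, 256, 128, 64):
    print(k, truncate(full, k).shape)
```

In practice, recent versions of Sentence Transformers expose a `truncate_dim` argument on `SentenceTransformer` so the same truncation can be applied at load time rather than by hand.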
## Training Details
### Training Dataset
#### json
* Dataset: json
* Size: 6,300 training samples
* Columns: <code>positive</code> and <code>anchor</code>
* Approximate statistics based on the first 1000 samples:
| | positive | anchor |
|:--------|:-----------------------------------------------------------------------------------|:----------------------------------------------------------------------------------|
| type | string | string |
* Samples:
  | positive | anchor |
  |:---------|:-------|
  | <code>All of our Company’s facilities and other operations in the United States and elsewhere around the world are subject to various environmental protection statutes and regulations, including those relating to the use and treatment of water resources, discharge of wastewater, and air emissions.</code> | <code>What types of environmental regulations does the company need to comply with?</code> |
  | <code>Domestically, diesel fuel prices were higher in fiscal 2022 than in the prior year and may increase further in fiscal 2023 because of international tensions.</code> | <code>How did diesel fuel prices affect the company’s freight costs in fiscal 2022?</code> |
  | <code>Our common stock trades on the NASDAQ Global Select Market, under the symbol “COST.”</code> | <code>What is the trading symbol for Costco's common stock on the NASDAQ Global Select Market?</code> |
* Loss: [`MatryoshkaLoss`](https://sbert.net/docs/package_reference/sentence_transformer/losses.html#matryoshkaloss) with these parameters:
```json
{
"loss": "MultipleNegativesRankingLoss",
"matryoshka_dims": [
768,
512,
256,
128,
64
],
"matryoshka_weights": [
1,
1,
1,
1,
1
],
"n_dims_per_step": -1
}
```
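Conceptually, this configuration evaluates MultipleNegativesRankingLoss at each truncation dimensionality and sums the results with the listed weights (all 1 here). A toy sketch of that reduction, with a made-up inner loss purely for illustration:

```python
def matryoshka_total(inner_loss, dims, weights):
    """Weighted sum of the inner loss evaluated at each truncated dim."""
    return sum(w * inner_loss(d) for d, w in zip(dims, weights))

# Toy inner loss: smaller truncations discard more information
toy_loss = lambda d: 1.0 / d

dims = [768, 512, 256, 128, 64]
weights = [1, 1, 1, 1, 1]
total = matryoshka_total(toy_loss, dims, weights)
print(round(total, 6))
```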
### Training Hyperparameters
#### Non-Default Hyperparameters
- `eval_strategy`: epoch
- `per_device_train_batch_size`: 32
- `per_device_eval_batch_size`: 16
- `gradient_accumulation_steps`: 16
- `learning_rate`: 2e-05
- `num_train_epochs`: 4
- `lr_scheduler_type`: cosine
- `warmup_ratio`: 0.1
- `tf32`: False
- `load_best_model_at_end`: True
- `optim`: adamw_torch_fused
- `batch_sampler`: no_duplicates
#### All Hyperparameters