SentenceTransformer based on google-bert/bert-base-uncased
This is a sentence-transformers model finetuned from google-bert/bert-base-uncased on the reason_unfiltered dataset. It maps sentences & paragraphs to a 768-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
Model Details
Model Description
- Model Type: Sentence Transformer
- Base model: google-bert/bert-base-uncased
- Maximum Sequence Length: 196 tokens
- Output Dimensionality: 768 dimensions
- Similarity Function: Cosine Similarity
- Training Dataset: reason_unfiltered
Model Sources
- Documentation: Sentence Transformers Documentation
- Repository: Sentence Transformers on GitHub
- Hugging Face: Sentence Transformers on Hugging Face
Full Model Architecture
SentenceTransformer(
(0): Transformer({'max_seq_length': 196, 'do_lower_case': False}) with Transformer model: BertModel
(1): Pooling({'word_embedding_dimension': 768, 'pooling_mode_cls_token': False, 'pooling_mode_mean_tokens': True, 'pooling_mode_max_tokens': False, 'pooling_mode_mean_sqrt_len_tokens': False, 'pooling_mode_weightedmean_tokens': False, 'pooling_mode_lasttoken': False, 'include_prompt': True})
)
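The Pooling module uses mean pooling (pooling_mode_mean_tokens): it averages the token embeddings produced by BERT while ignoring padding positions. A minimal sketch of that operation, using illustrative tensor names (token_embeddings of shape [batch, seq_len, 768], attention_mask of shape [batch, seq_len]), not the library's internal code:

import torch

def mean_pool(token_embeddings: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    # token_embeddings: [batch, seq_len, 768]; attention_mask: [batch, seq_len], 1 for real tokens
    mask = attention_mask.unsqueeze(-1).to(token_embeddings.dtype)  # [batch, seq_len, 1]
    summed = (token_embeddings * mask).sum(dim=1)                   # sum embeddings of non-padding tokens
    counts = mask.sum(dim=1).clamp(min=1e-9)                        # number of real tokens per input
    return summed / counts                                          # [batch, 768] sentence embeddings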
Usage
Direct Usage (Sentence Transformers)
First install the Sentence Transformers library:
pip install -U sentence-transformers
Then you can load this model and run inference.
from sentence_transformers import SentenceTransformer
# Download from the 🤗 Hub
model = SentenceTransformer("bwang0911/reasoning-bert-ccnews")
# Run inference
sentences = [
'Energy advocates call for new commitment to renewable growth',
'The piece below was submitted by CFE, VoteSolar, and Environment Connecticut in response to the latest delay in the shared solar pilot program.\nSolar and environmental advocates are calling for a new community solar program in Connecticut that will expand solar access, energy choices and consumer savings for families, municipalities, and businesses statewide. The demand follows today’s Department of Energy and Environmental Protection (DEEP) technical hearing where attendees reviewed the state’s current Shared Clean Energy Facilities pilot program. The pilot has stalled several times over the last two years, most recently following DEEP’s decision to scrap all the proposals they have received and issue a new request for projects. DEEP heard from many advocates and developers at the hearing who are frustrated with this latest delay and skeptical about the long term success of the pilot.\nThe current pilot program was meant to expand solar access to Connecticut energy customers who can’t put solar on their own roof, but it contained flaws that have prevented any development to date. As set out in the legislation, the program has several poor design elements and a goal too small to draw significant private sector interest. Below are statements from stakeholders in Connecticut’s clean energy economy:\n“For years, Connecticut has missed out on the opportunity to bring solar energy choices to all consumers and more clean energy jobs to the state,” said Sean Garren, Northeast Regional director for Vote Solar. “Connecticut’s lackluster community solar program hasn’t unlocked the benefits of solar access for a single resident to date due to poor design and a lack of ambition at the scale needed, brought about by the electric utilities’ intervention. We’re calling on the legislature to catch up to the rest of New England — and the nation — with a smart, well-structured community solar program designed to serve consumers statewide.”\n“Two years of foot dragging and refusal by the Department of Energy and Environmental Protection to follow the law and implement a community solar program is preventing tens of thousands of Connecticut families from gaining access to clean, affordable, secure solar power,” said Chris Phelps, State Director for Environment Connecticut. “Community solar is helping other states accelerate solar growth, create jobs, and cut pollution. Connecticut policy makers should take action now to create a bold community solar program.”\n“Shared solar programs have been sweeping the nation for the last decade, but Connecticut has been left in the shade — losing out on healthier air, investment dollars, and green jobs that would accompany a full-scale, statewide shared solar program,” said Claire Coleman, Climate and Energy Attorney for Connecticut Fund for the Environment. “DEEP’s decision to start over with the already overly-restrictive shared solar pilot puts Connecticut further in the dark. Our climate and economy cannot wait any longer. Connecticut’s leaders must move quickly to ramp up in-state renewables through a full-scale shared solar program if Connecticut is going to have any chance of meeting its obligations under the Global Warming Solutions Act to reduce greenhouse gas emissions.”\nVote Solar is a nonprofit organization working to foster economic development and energy independence by bringing solar energy to the mainstream nationwide. Learn more at votesolar.org.',
"The second text elaborates on the first by providing details about the specific context of the energy advocates' call for renewable growth. It identifies the advocates (CFE, VoteSolar, Environment Connecticut), the specific renewable energy program (community solar), and the reasons for their call, including program delays and design flaws.",
]
embeddings = model.encode(sentences)
print(embeddings.shape)
# [3, 768]
# Get the similarity scores for the embeddings
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)
# [3, 3]
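For retrieval-style usage, you can encode a query and a small corpus separately and rank the corpus by cosine similarity. A minimal sketch (the query and corpus below are illustrative):

from sentence_transformers import SentenceTransformer

model = SentenceTransformer("bwang0911/reasoning-bert-ccnews")

# Illustrative query and mini-corpus
query = "community solar program delays in Connecticut"
corpus = [
    "Energy advocates call for new commitment to renewable growth",
    "Fight Leaves Wayne Simmonds Shirtless",
    "Lightning's Braydon Coburn: Joining road trip",
]

query_embedding = model.encode(query)
corpus_embeddings = model.encode(corpus)

# Rank the corpus by cosine similarity to the query
scores = model.similarity(query_embedding, corpus_embeddings)  # shape [1, 3]
best = int(scores.argmax())
print(corpus[best], float(scores[0, best]))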
Evaluation
Metrics
Information Retrieval
- Datasets: mteb/nfcorpus, mteb/trec-covid, mteb/fiqa, and mteb/quora
- Evaluated with InformationRetrievalEvaluator
Metric | mteb/nfcorpus | mteb/trec-covid | mteb/fiqa | mteb/quora |
---|---|---|---|---|
cosine_accuracy@1 | 0.3127 | 0.62 | 0.1373 | 0.7256 |
cosine_accuracy@3 | 0.4768 | 0.82 | 0.2284 | 0.8531 |
cosine_accuracy@5 | 0.5325 | 0.92 | 0.2701 | 0.8898 |
cosine_accuracy@10 | 0.5975 | 0.94 | 0.3457 | 0.9263 |
cosine_precision@1 | 0.3127 | 0.62 | 0.1373 | 0.7256 |
cosine_precision@3 | 0.2549 | 0.56 | 0.0931 | 0.3332 |
cosine_precision@5 | 0.2099 | 0.552 | 0.0694 | 0.2198 |
cosine_precision@10 | 0.1656 | 0.512 | 0.0465 | 0.1215 |
cosine_recall@1 | 0.0312 | 0.0005 | 0.0698 | 0.6303 |
cosine_recall@3 | 0.0562 | 0.0014 | 0.1265 | 0.79 |
cosine_recall@5 | 0.0688 | 0.0024 | 0.1566 | 0.8381 |
cosine_recall@10 | 0.097 | 0.0044 | 0.2 | 0.8875 |
cosine_ndcg@10 | 0.2185 | 0.5323 | 0.1575 | 0.8013 |
cosine_mrr@10 | 0.4016 | 0.7307 | 0.1957 | 0.796 |
cosine_map@100 | 0.0895 | 0.2299 | 0.1281 | 0.7648 |
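The numbers above were produced with the Sentence Transformers InformationRetrievalEvaluator on the listed mteb datasets. A minimal sketch of how such an evaluation is wired up, with toy queries and documents standing in for the real query/corpus/qrels splits:

from sentence_transformers import SentenceTransformer
from sentence_transformers.evaluation import InformationRetrievalEvaluator

model = SentenceTransformer("bwang0911/reasoning-bert-ccnews")

# Toy data; the metrics reported above come from the full mteb splits
queries = {"q1": "community solar program delays"}
corpus = {
    "d1": "Solar advocates criticize delays in Connecticut's shared solar pilot program.",
    "d2": "Wayne Simmonds lost his jersey in a fight against the Ducks.",
}
relevant_docs = {"q1": {"d1"}}

evaluator = InformationRetrievalEvaluator(queries, corpus, relevant_docs, name="toy-ir")
results = evaluator(model)
print(results)  # accuracy@k, precision@k, recall@k, ndcg@10, mrr@10, map@100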
Training Details
Training Dataset
reason_unfiltered
- Dataset: reason_unfiltered at 2e4fb05
- Size: 44,978 training samples
- Columns: title, body, and reason
- Approximate statistics based on the first 1000 samples:
 | title | body | reason |
---|---|---|---|
type | string | string | string |
details | min: 6 tokens, mean: 15.34 tokens, max: 42 tokens | min: 21 tokens, mean: 178.04 tokens, max: 196 tokens | min: 28 tokens, mean: 59.19 tokens, max: 88 tokens |
- Samples (title, body, reason):

Sample 1
- title: Fight Leaves Wayne Simmonds Shirtless
- body: Reed Saxon/AP Images
Kevin Bieksa and Wayne Simmonds dropped the gloves just 95 seconds into last night’s 4-3 Ducks shootout win over the Flyers, and Bieksa immediately yanked his opponent’s jersey over his head, to the delight of the crowd and to grins from Simmonds and the officials.
That’s not supposed to happen. NHL players wear something called a fight strap, which binds the back of the jersey to the pants, preventing the jersey from being pulled off. (Losing a jersey is an advantage in a fight, as it gives the shirtless player’s opponent nothing to grab on to. Sabres enforcer Rob Ray was notorious for losing his gear in a fight, occasionally taking it off himself before clinching.) Any player who engaged in a fight without wearing a fight strap is subject to an automatic game misconduct.
Advertisement
Simmonds wasn’t ejected, though; at the one-minute mark of the video above, you can see he did have his fight strap properly attached. It just broke, which happens on occasion.
- reason: The article describes a hockey fight involving Wayne Simmonds, confirming the title's claim. It details the fight, including Simmonds' jersey being pulled off, and explains the rules and context around the incident, directly elaborating on the event suggested by the title.

Sample 2
- title: Merck CEO Kenneth Frazier ditches Trump over Charlottesville silence
- body: Merck CEO Kenneth C. Frazier resigned from the president’s council on manufacturing Monday in direct protest of President Donald Trump’s lack of condemnation of white nationalist actions in Charlottesville, Va. over the weekend.
In a statement, Frazier, who is African-American, said he believes the country’s strength comes from the diversity of its citizens and that he feels personally compelled to stand up for that diversity and against intolerance.
“America’s leaders must honor our fundamental values by clearly rejecting expressions of hatred, bigotry and group supremacy, which run counter to the American ideal that all people are created equal,” he wrote. “As CEO of Merck, and as a matter of personal conscience, I feel a responsibility to take a stand against intolerance and extremism.”
RELATED: At least one death has been confirmed after a car plowed into a crowd of protesters in Charlottesville
Trump immediately fired back at Frazier on Twitter, saying the Merck CEO now “will have...
- reason: The second text provides a detailed elaboration of the first. It explains the context of Kenneth Frazier's resignation, the reasons behind it (Trump's silence on Charlottesville), and includes Frazier's statement. It also provides additional background information about Frazier and the President's Manufacturing Council.

Sample 3
- title: Lightning's Braydon Coburn: Joining road trip
- body: Coburn (lower body) will travel with the team on its upcoming four-game road trip and is hoping to play at some point in the second half of the trip, Bryan Burns of the Lightning's official site reports.
The veteran blueliner is yet to play in the month of December, having already missed four games. However, the fact that Coburn is traveling with the team and has been given a chance to play at some point within the next week will be music to the ears of fantasy owners who benefited from Coburn's surprising production -- seven points in 25 games -- earlier in the season. Keep an eye out for updates as the trip progresses.
- reason: The second text elaborates on the first by providing details about Braydon Coburn's situation. It specifies that he will join the team on a road trip and offers context about his injury, recovery timeline, and potential for playing, directly expanding on the initial announcement.
- Loss: ReasoningGuidedRankingLoss with these parameters:
  { "scale": 20.0, "similarity_fct": "cos_sim" }
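ReasoningGuidedRankingLoss is a custom loss; its scale and cos_sim parameters mirror the in-batch-negatives ranking losses in Sentence Transformers (e.g. MultipleNegativesRankingLoss). The sketch below shows the scaled cosine-similarity cross-entropy at the core of that family of losses; it is an illustrative stand-in, not the actual ReasoningGuidedRankingLoss implementation:

import torch
import torch.nn.functional as F

def scaled_cosine_ranking_loss(anchors: torch.Tensor, positives: torch.Tensor, scale: float = 20.0) -> torch.Tensor:
    # anchors, positives: [batch, dim]; row i of positives is the positive for anchor i,
    # every other row in the batch serves as an in-batch negative
    scores = F.cosine_similarity(anchors.unsqueeze(1), positives.unsqueeze(0), dim=-1) * scale  # [batch, batch]
    labels = torch.arange(anchors.size(0), device=anchors.device)  # the diagonal holds the true pairs
    return F.cross_entropy(scores, labels)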
Training Hyperparameters
Non-Default Hyperparameters
- eval_strategy: steps
- per_device_train_batch_size: 256
- learning_rate: 1e-05
- warmup_ratio: 0.1
- fp16: True
- batch_sampler: no_duplicates
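These non-default values map directly onto SentenceTransformerTrainingArguments. A minimal sketch of the corresponding configuration (output_dir is illustrative):

from sentence_transformers import SentenceTransformerTrainingArguments
from sentence_transformers.training_args import BatchSamplers

args = SentenceTransformerTrainingArguments(
    output_dir="reasoning-bert-ccnews",         # illustrative path
    num_train_epochs=3,
    per_device_train_batch_size=256,
    learning_rate=1e-5,
    warmup_ratio=0.1,
    fp16=True,
    eval_strategy="steps",
    batch_sampler=BatchSamplers.NO_DUPLICATES,  # batch_sampler: no_duplicates
)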
All Hyperparameters
Click to expand
- overwrite_output_dir: False
- do_predict: False
- eval_strategy: steps
- prediction_loss_only: True
- per_device_train_batch_size: 256
- per_device_eval_batch_size: 8
- per_gpu_train_batch_size: None
- per_gpu_eval_batch_size: None
- gradient_accumulation_steps: 1
- eval_accumulation_steps: None
- torch_empty_cache_steps: None
- learning_rate: 1e-05
- weight_decay: 0.0
- adam_beta1: 0.9
- adam_beta2: 0.999
- adam_epsilon: 1e-08
- max_grad_norm: 1.0
- num_train_epochs: 3
- max_steps: -1
- lr_scheduler_type: linear
- lr_scheduler_kwargs: {}
- warmup_ratio: 0.1
- warmup_steps: 0
- log_level: passive
- log_level_replica: warning
- log_on_each_node: True
- logging_nan_inf_filter: True
- save_safetensors: True
- save_on_each_node: False
- save_only_model: False
- restore_callback_states_from_checkpoint: False
- no_cuda: False
- use_cpu: False
- use_mps_device: False
- seed: 42
- data_seed: None
- jit_mode_eval: False
- use_ipex: False
- bf16: False
- fp16: True
- fp16_opt_level: O1
- half_precision_backend: auto
- bf16_full_eval: False
- fp16_full_eval: False
- tf32: None
- local_rank: 0
- ddp_backend: None
- tpu_num_cores: None
- tpu_metrics_debug: False
- debug: []
- dataloader_drop_last: False
- dataloader_num_workers: 0
- dataloader_prefetch_factor: None
- past_index: -1
- disable_tqdm: False
- remove_unused_columns: True
- label_names: None
- load_best_model_at_end: False
- ignore_data_skip: False
- fsdp: []
- fsdp_min_num_params: 0
- fsdp_config: {'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}
- tp_size: 0
- fsdp_transformer_layer_cls_to_wrap: None
- accelerator_config: {'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None}
- deepspeed: None
- label_smoothing_factor: 0.0
- optim: adamw_torch
- optim_args: None
- adafactor: False
- group_by_length: False
- length_column_name: length
- ddp_find_unused_parameters: None
- ddp_bucket_cap_mb: None
- ddp_broadcast_buffers: False
- dataloader_pin_memory: True
- dataloader_persistent_workers: False
- skip_memory_metrics: True
- use_legacy_prediction_loop: False
- push_to_hub: False
- resume_from_checkpoint: None
- hub_model_id: None
- hub_strategy: every_save
- hub_private_repo: None
- hub_always_push: False
- gradient_checkpointing: False
- gradient_checkpointing_kwargs: None
- include_inputs_for_metrics: False
- include_for_metrics: []
- eval_do_concat_batches: True
- fp16_backend: auto
- push_to_hub_model_id: None
- push_to_hub_organization: None
- mp_parameters:
- auto_find_batch_size: False
- full_determinism: False
- torchdynamo: None
- ray_scope: last
- ddp_timeout: 1800
- torch_compile: False
- torch_compile_backend: None
- torch_compile_mode: None
- dispatch_batches: None
- split_batches: None
- include_tokens_per_second: False
- include_num_input_tokens_seen: False
- neftune_noise_alpha: None
- optim_target_modules: None
- batch_eval_metrics: False
- eval_on_start: False
- use_liger_kernel: False
- eval_use_gather_object: False
- average_tokens_across_devices: False
- prompts: None
- batch_sampler: no_duplicates
- multi_dataset_batch_sampler: proportional
Training Logs
Epoch | Step | Training Loss | mteb/nfcorpus_cosine_ndcg@10 | mteb/trec-covid_cosine_ndcg@10 | mteb/fiqa_cosine_ndcg@10 | mteb/quora_cosine_ndcg@10 |
---|---|---|---|---|---|---|
-1 | -1 | - | 0.0583 | 0.2174 | 0.0237 | 0.6103 |
0.0568 | 10 | 3.443 | - | - | - | - |
0.1136 | 20 | 2.9692 | - | - | - | - |
0.1705 | 30 | 2.1061 | - | - | - | - |
0.2273 | 40 | 1.3012 | 0.0901 | 0.3585 | 0.0642 | 0.7024 |
0.2841 | 50 | 0.9825 | - | - | - | - |
0.3409 | 60 | 0.7112 | - | - | - | - |
0.3977 | 70 | 0.5853 | - | - | - | - |
0.4545 | 80 | 0.5555 | 0.1714 | 0.5160 | 0.1287 | 0.7800 |
0.5114 | 90 | 0.4633 | - | - | - | - |
0.5682 | 100 | 0.4216 | - | - | - | - |
0.625 | 110 | 0.3846 | - | - | - | - |
0.6818 | 120 | 0.4017 | 0.1923 | 0.5446 | 0.1417 | 0.7890 |
0.7386 | 130 | 0.3606 | - | - | - | - |
0.7955 | 140 | 0.3731 | - | - | - | - |
0.8523 | 150 | 0.3451 | - | - | - | - |
0.9091 | 160 | 0.3352 | 0.2017 | 0.5343 | 0.1472 | 0.7951 |
0.9659 | 170 | 0.3364 | - | - | - | - |
1.0227 | 180 | 0.2606 | - | - | - | - |
1.0795 | 190 | 0.2627 | - | - | - | - |
1.1364 | 200 | 0.2641 | 0.2065 | 0.5449 | 0.1499 | 0.7963 |
1.1932 | 210 | 0.2448 | - | - | - | - |
1.25 | 220 | 0.2394 | - | - | - | - |
1.3068 | 230 | 0.2433 | - | - | - | - |
1.3636 | 240 | 0.2236 | 0.2096 | 0.5432 | 0.1519 | 0.7975 |
1.4205 | 250 | 0.221 | - | - | - | - |
1.4773 | 260 | 0.2215 | - | - | - | - |
1.5341 | 270 | 0.2291 | - | - | - | - |
1.5909 | 280 | 0.2433 | 0.2102 | 0.5322 | 0.1543 | 0.7994 |
1.6477 | 290 | 0.219 | - | - | - | - |
1.7045 | 300 | 0.2207 | - | - | - | - |
1.7614 | 310 | 0.2102 | - | - | - | - |
1.8182 | 320 | 0.2138 | 0.2163 | 0.5289 | 0.1553 | 0.8006 |
1.875 | 330 | 0.2076 | - | - | - | - |
1.9318 | 340 | 0.2076 | - | - | - | - |
1.9886 | 350 | 0.2066 | - | - | - | - |
2.0455 | 360 | 0.2046 | 0.2154 | 0.5339 | 0.1558 | 0.8006 |
2.1023 | 370 | 0.1844 | - | - | - | - |
2.1591 | 380 | 0.17 | - | - | - | - |
2.2159 | 390 | 0.1913 | - | - | - | - |
2.2727 | 400 | 0.165 | 0.2165 | 0.5339 | 0.1547 | 0.8014 |
2.3295 | 410 | 0.1878 | - | - | - | - |
2.3864 | 420 | 0.1841 | - | - | - | - |
2.4432 | 430 | 0.1683 | - | - | - | - |
2.5 | 440 | 0.1767 | 0.2178 | 0.5307 | 0.1565 | 0.8014 |
2.5568 | 450 | 0.1627 | - | - | - | - |
2.6136 | 460 | 0.161 | - | - | - | - |
2.6705 | 470 | 0.1717 | - | - | - | - |
2.7273 | 480 | 0.1832 | 0.2169 | 0.5341 | 0.1570 | 0.8012 |
2.7841 | 490 | 0.1673 | - | - | - | - |
2.8409 | 500 | 0.1517 | - | - | - | - |
2.8977 | 510 | 0.1797 | - | - | - | - |
2.9545 | 520 | 0.1862 | 0.2185 | 0.5323 | 0.1575 | 0.8013 |
Framework Versions
- Python: 3.10.12
- Sentence Transformers: 3.5.0.dev0
- Transformers: 4.50.0
- PyTorch: 2.6.0+cu124
- Accelerate: 1.5.2
- Datasets: 3.4.1
- Tokenizers: 0.21.1
Citation
BibTeX
Sentence Transformers
@inproceedings{reimers-2019-sentence-bert,
title = "Sentence-BERT: Sentence Embeddings using Siamese BERT-Networks",
author = "Reimers, Nils and Gurevych, Iryna",
booktitle = "Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing",
month = "11",
year = "2019",
publisher = "Association for Computational Linguistics",
url = "https://arxiv.org/abs/1908.10084",
}