Tags:
- Sentence Similarity
- sentence-transformers
- Safetensors
- bert
- feature-extraction
- dense
- Generated from Trainer
- dataset_size:262023
- loss:MultipleNegativesRankingLoss
- text-embeddings-inference
Instructions for using dpshade22/hf-e5-bible-75 with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- sentence-transformers
How to use dpshade22/hf-e5-bible-75 with sentence-transformers:
```python
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("dpshade22/hf-e5-bible-75")

sentences = [
    "query: what happened at holy week",
    "passage: He replied, “Go into the city to a certain man and tell him, ‘The Teacher says: My appointed time is near. I am going to celebrate the Passover with my disciples at your house.’”",
    "passage: Many nations will come and say,\n“Come, let us go up to the mountain of the Lord,\n to the temple of the God of Jacob.\nHe will teach us his ways,\n so that we may walk in his paths.”\nThe law will go out from Zion,\n the word of the Lord from Jerusalem.",
    "passage: But seek first his kingdom and his righteousness, and all these things will be given to you as well.",
]

embeddings = model.encode(sentences)
similarities = model.similarity(embeddings, embeddings)
print(similarities.shape)  # [4, 4]
```

- Notebooks
- Google Colab
- Kaggle
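For Sentence Transformers models, `model.similarity` defaults to cosine similarity (unless the model config overrides `similarity_fn_name`), so the 4×4 matrix above holds pairwise cosine similarities of the row embeddings. A minimal NumPy sketch of that computation, using toy 2-D vectors in place of the real 768-dimensional embeddings:

```python
import numpy as np

def cosine_similarity_matrix(embeddings):
    """Pairwise cosine similarity: normalize each row to unit length,
    then take the matrix of dot products."""
    emb = np.asarray(embeddings, dtype=np.float64)
    unit = emb / np.linalg.norm(emb, axis=1, keepdims=True)
    return unit @ unit.T

# Toy vectors standing in for the four encoded sentences above.
vecs = np.array([
    [1.0, 0.0],   # "query" direction
    [0.9, 0.1],   # nearly parallel -> high similarity to row 0
    [0.0, 1.0],   # orthogonal -> similarity 0 to row 0
    [0.5, 0.5],
])
sims = cosine_similarity_matrix(vecs)
print(sims.shape)  # (4, 4)
```

Each diagonal entry is 1 (a vector compared with itself), and off-diagonal entries fall in [-1, 1].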
```json
{
  "architectures": [
    "BertModel"
  ],
  "attention_probs_dropout_prob": 0.1,
  "classifier_dropout": null,
  "dtype": "float32",
  "gradient_checkpointing": false,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 512,
  "model_type": "bert",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "pad_token_id": 0,
  "position_embedding_type": "absolute",
  "transformers_version": "4.57.6",
  "type_vocab_size": 2,
  "use_cache": true,
  "vocab_size": 30522
}
```
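The `loss:MultipleNegativesRankingLoss` tag above names the training objective: an in-batch contrastive loss where each query's paired passage is the positive and every other passage in the batch serves as a negative. A minimal NumPy sketch under those assumptions (the scale of 20.0 is the library's usual default, but treat the exact hyperparameters here as illustrative):

```python
import numpy as np

def multiple_negatives_ranking_loss(query_emb, passage_emb, scale=20.0):
    """Sketch of MultipleNegativesRankingLoss: row i's positive passage
    sits on the diagonal; the loss is cross-entropy over each row of
    scaled cosine similarities, with the diagonal as the target class."""
    q = query_emb / np.linalg.norm(query_emb, axis=1, keepdims=True)
    p = passage_emb / np.linalg.norm(passage_emb, axis=1, keepdims=True)
    scores = scale * (q @ p.T)                    # (batch, batch) similarity grid
    scores -= scores.max(axis=1, keepdims=True)   # numerical stability
    log_probs = scores - np.log(np.exp(scores).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_probs))           # mean cross-entropy, target = diagonal

rng = np.random.default_rng(0)
queries = rng.normal(size=(4, 8))
passages = queries + 0.1 * rng.normal(size=(4, 8))  # near-duplicates -> low loss
print(multiple_negatives_ranking_loss(queries, passages))
```

When query and passage embeddings line up pairwise, the loss approaches zero; mismatched pairs push it up, which is what drives the embeddings of related query/passage pairs together during training.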