Update README.md
README.md CHANGED

@@ -73,6 +73,13 @@ model-index:
 
 This is a [sentence-transformers](https://www.SBERT.net) model. It maps sentences & paragraphs to a 1024-dimensional dense vector space and can be used for semantic textual similarity, semantic search, paraphrase mining, text classification, clustering, and more.
 
+1. bert-base-uncased was pretrained on a large corpus of open-access philosophy text.
+2. This model was further trained with TSDAE on a subset of sentences from this corpus for 6 epochs.
+3. The resulting model was fine-tuned with a cosine-similarity objective on the private "philsim" dataset.
+4. The resulting model was fine-tuned with a cosine-similarity objective on the beatai-philosophy dataset.
+
+Model internal name: pb-small-10e-tsdae6e-philsim-cosine-6e-beatai-20e
+
 ## Model Details
 
 ### Model Description
@@ -115,7 +122,7 @@ Then you can load this model and run inference.
 from sentence_transformers import SentenceTransformer
 
 # Download from the 🤗 Hub
-model = SentenceTransformer("dbourget/
+model = SentenceTransformer("dbourget/philai-embeddings-2.0")
 # Run inference
 sentences = [
     'scientific revolutions',