AgaMiko commited on
Commit
11a880b
1 Parent(s): ea1f972

Update README.md

Files changed (1)
  1. README.md +6 -0
README.md CHANGED
@@ -1,5 +1,11 @@
  ---
  license: cc-by-4.0
+ language:
+ - pl
+ datasets:
+ - Wikipedia
+ tags:
+ - sentence similarity
  ---
  # SHerbert - Polish SentenceBERT
  SentenceBERT is a modification of the pretrained BERT network that uses siamese and triplet network structures to derive semantically meaningful sentence embeddings that can be compared using cosine similarity. Training was based on the original paper [Siamese BERT models for the task of semantic textual similarity (STS)](https://arxiv.org/abs/1908.10084), with a slight modification of how the training data was used. The goal of the model is to generate different embeddings based on the semantic and topic similarity of the given text.
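The README above says the resulting sentence embeddings are compared with cosine similarity. As a minimal sketch of that comparison step (plain NumPy on toy vectors, not the actual SHerbert inference API; real embeddings would come from the BERT-based encoder and be much higher-dimensional):

```python
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Toy stand-ins for sentence embeddings of three sentences.
emb_a = np.array([0.2, 0.7, 0.1])
emb_b = np.array([0.21, 0.68, 0.12])   # close to emb_a
emb_c = np.array([-0.5, 0.1, 0.9])     # far from emb_a

print(cosine_similarity(emb_a, emb_b))  # close to 1.0: semantically similar
print(cosine_similarity(emb_a, emb_c))  # much lower: dissimilar
```

Sentences whose embeddings point in nearly the same direction score close to 1.0, which is what the siamese/triplet training objective encourages for semantically related text.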