ceggian/sbert_pt_reddit_softmax_256
Tags: Sentence Similarity · sentence-transformers · PyTorch · Transformers · bert · feature-extraction · text-embeddings-inference · Inference Endpoints
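Since the repository is tagged sentence-transformers / feature-extraction, it can presumably be loaded through the standard sentence-transformers API. A minimal sketch, assuming the package is installed and the repository is publicly downloadable; the example sentences are placeholders:

```python
from sentence_transformers import SentenceTransformer

# Load the model by its Hub id (shown in the header above).
model = SentenceTransformer("ceggian/sbert_pt_reddit_softmax_256")

# Encode a few example sentences into fixed-size embeddings.
sentences = [
    "This is an example sentence.",
    "Each sentence is converted to a vector.",
]
embeddings = model.encode(sentences)
print(embeddings.shape)  # (2, embedding_dimension)
```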
Branch: main · 1 contributor · History: 14 commits
Latest commit: 7772ca3 by ceggian, over 2 years ago ("Upload pytorch_model.bin with git-lfs")
| File | Size | Last commit | Last updated |
|---|---|---|---|
| 1_Pooling/ | | Delete 1_Pooling/test | over 2 years ago |
| .gitattributes | 1.17 kB | initial commit | over 2 years ago |
| README.md | 3.69 kB | Upload README.md | over 2 years ago |
| config.json | 682 Bytes | Upload config.json | over 2 years ago |
| config_sentence_transformers.json | 124 Bytes | Upload config_sentence_transformers.json | over 2 years ago |
| modules.json | 229 Bytes | Upload modules.json | over 2 years ago |
| pytorch_model.bin | 438 MB (LFS) | Upload pytorch_model.bin with git-lfs | over 2 years ago |
| sentence_bert_config.json | 53 Bytes | Upload sentence_bert_config.json | over 2 years ago |
| special_tokens_map.json | 112 Bytes | Upload special_tokens_map.json | over 2 years ago |
| tokenizer.json | 712 kB | Upload tokenizer.json | over 2 years ago |
| tokenizer_config.json | 393 Bytes | Upload tokenizer_config.json | over 2 years ago |
| vocab.txt | 232 kB | Upload vocab.txt | over 2 years ago |

pytorch_model.bin is a pickle-serialized checkpoint. Detected pickle imports (4): torch.FloatStorage, torch.LongStorage, collections.OrderedDict, torch._utils._rebuild_tensor_v2.
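Because pytorch_model.bin is a pickle-based checkpoint (the detected imports listed above are the usual torch tensor-rebuilding helpers), the raw weights can also be inspected directly with torch.load. A minimal sketch, assuming the file has already been downloaded locally (the local path is an assumption) and a PyTorch version recent enough to support weights_only:

```python
import torch

# weights_only=True restricts unpickling to tensors and other safe types,
# which avoids executing arbitrary code embedded in a pickle file.
state_dict = torch.load("pytorch_model.bin", map_location="cpu", weights_only=True)

# Inspect a few parameter names and shapes.
for name, tensor in list(state_dict.items())[:5]:
    print(name, tuple(tensor.shape))
```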