chrommium/two-step-finetuning-sbert
Text Classification · Transformers · PyTorch · bert · Inference Endpoints
main / two-step-finetuning-sbert / special_tokens_map.json
Commit History
add tokenizer
b55007b
chrommium committed on Nov 23, 2021
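The commit above adds the tokenizer files, including `special_tokens_map.json`. As a rough illustration (not the actual contents of this repo's file, which may differ), a BERT-style tokenizer's special-tokens map typically looks like this:

```python
import json

# Typical special_tokens_map.json contents for a BERT tokenizer.
# These values are an assumption based on standard BERT vocabularies,
# not taken from the chrommium/two-step-finetuning-sbert repo itself.
special_tokens_map = {
    "unk_token": "[UNK]",   # placeholder for out-of-vocabulary tokens
    "sep_token": "[SEP]",   # separates sentence pairs
    "pad_token": "[PAD]",   # pads sequences to a common length
    "cls_token": "[CLS]",   # classification token prepended to inputs
    "mask_token": "[MASK]", # masked-language-modeling placeholder
}

print(json.dumps(special_tokens_map, indent=2))
```

When a tokenizer is loaded with `transformers` (e.g. via `AutoTokenizer.from_pretrained`), this file tells the library which literal strings to treat as the special tokens.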