alenaa/hack_fulldata
Tags: Text Classification · Transformers · Safetensors · bert · Inference Endpoints · arxiv:1910.09700
Branch: main · 1 contributor · History: 5 commits
Latest commit: e4ad4a9 (verified), "Upload tokenizer" by alenaa, 5 months ago
File                      Size       Last commit                            Updated
.gitattributes            1.52 kB    initial commit                         5 months ago
README.md                 5.17 kB    Upload BertForSequenceClassification   5 months ago
config.json               918 Bytes  Upload BertForSequenceClassification   5 months ago
model.safetensors (LFS)   711 MB     Upload BertForSequenceClassification   5 months ago
special_tokens_map.json   125 Bytes  Upload tokenizer                       5 months ago
tokenizer_config.json     1.27 kB    Upload tokenizer                       5 months ago
vocab.txt                 1.65 MB    Upload tokenizer                       5 months ago
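
The files listed above (config.json, model.safetensors, and the tokenizer files) are exactly what the Transformers library reads when loading a sequence-classification checkpoint from the Hub. A minimal sketch of how this repo could be loaded and queried, assuming the repository is publicly accessible; the sample sentence and the interpretation of the predicted label are placeholders, not taken from the model card:

```python
# Minimal sketch: load alenaa/hack_fulldata from the Hugging Face Hub and
# classify one sentence. Assumes the repo is public; the example text below
# is a placeholder and the label names come from config.json's id2label map.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

repo_id = "alenaa/hack_fulldata"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForSequenceClassification.from_pretrained(repo_id)
model.eval()

inputs = tokenizer("Example input text", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(dim=-1).item()
print(model.config.id2label[predicted_id])
```

Loading through the Auto* classes keeps the snippet independent of the exact BERT variant: the architecture and label set are resolved from config.json, and the weights come from model.safetensors via LFS.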