Vashesh/BERT_finetuned2
Tags: Token Classification · Transformers · Safetensors · bert · Inference Endpoints
arXiv: 1910.09700
Branch: main · 1 contributor · History: 3 commits
Latest commit: a144f3a (verified) "Upload tokenizer" by Vashesh, about 1 month ago
File                     Size       Last commit                        Date
.gitattributes           1.52 kB    initial commit                     about 1 month ago
README.md                5.17 kB    Upload BertForTokenClassification  about 1 month ago
config.json              17.5 kB    Upload BertForTokenClassification  about 1 month ago
model.safetensors (LFS)  432 MB     Upload BertForTokenClassification  about 1 month ago
special_tokens_map.json  125 Bytes  Upload tokenizer                   about 1 month ago
tokenizer.json           669 kB     Upload tokenizer                   about 1 month ago
tokenizer_config.json    1.19 kB    Upload tokenizer                   about 1 month ago
vocab.txt                213 kB     Upload tokenizer                   about 1 month ago
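The files listed above (config.json, model.safetensors, and the tokenizer files) are everything the transformers library needs to load this checkpoint. A minimal sketch, assuming the repo Vashesh/BERT_finetuned2 is publicly accessible and your environment has network access; the example sentence and the aggregation_strategy setting are illustrative, and the model's actual label set is whatever was written into config.json at fine-tuning time:

```python
# Sketch: load this fine-tuned token-classification checkpoint from the Hub.
# Assumes network access and that "Vashesh/BERT_finetuned2" is a public repo.
from transformers import pipeline

repo_id = "Vashesh/BERT_finetuned2"

# The pipeline downloads config.json, model.safetensors, and the tokenizer
# files (tokenizer.json, vocab.txt, ...) listed in this repo.
ner = pipeline("token-classification", model=repo_id, aggregation_strategy="simple")

# Illustrative input; the entity labels come from the model's own config.json.
for entity in ner("Hugging Face was founded in New York City."):
    print(entity["word"], entity["entity_group"], round(entity["score"], 3))
```

Swapping in AutoTokenizer and AutoModelForTokenClassification works the same way if you need raw logits instead of aggregated entity spans.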