ankitv42/NER_Model

Tags: Token Classification · Transformers · Safetensors · bert · Inference Endpoints · arxiv:1910.09700
NER_Model · 1 contributor (ankitv42) · History: 3 commits
Latest commit: a5719e1 (verified), "Upload tokenizer", 20 days ago
| File | Size | Last commit message | Updated |
|---|---|---|---|
| .gitattributes | 1.52 kB | initial commit | 20 days ago |
| README.md | 5.17 kB | Upload BertForTokenClassification | 20 days ago |
| config.json | 1.04 kB | Upload BertForTokenClassification | 20 days ago |
| model.safetensors | 436 MB (LFS) | Upload BertForTokenClassification | 20 days ago |
| special_tokens_map.json | 695 Bytes | Upload tokenizer | 20 days ago |
| tokenizer.json | 711 kB | Upload tokenizer | 20 days ago |
| tokenizer_config.json | 1.26 kB | Upload tokenizer | 20 days ago |
| vocab.txt | 232 kB | Upload tokenizer | 20 days ago |
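
Because the repository holds a BertForTokenClassification checkpoint (model.safetensors, config.json) together with its tokenizer files, it can be loaded straight from the Hub with the standard transformers pipeline API. A minimal sketch follows; the example sentence and printed fields are illustrative only, and the actual entity labels depend on this model's config.json, which is not shown on this page.

```python
from transformers import pipeline

# Load the checkpoint and tokenizer directly from the Hub repository
# shown on this page. The entity labels emitted come from the model's
# own id2label mapping in config.json.
ner = pipeline(
    "token-classification",
    model="ankitv42/NER_Model",
    aggregation_strategy="simple",  # merge sub-word pieces into whole entities
)

# Hypothetical input sentence, used only to illustrate the output shape.
example = "Hugging Face is based in New York City."
for entity in ner(example):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```

With `aggregation_strategy="simple"`, the pipeline groups sub-word tokens back into full words, so each printed line is one predicted entity span with its label and confidence score.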