kabir5297/Deberta_Huge_data

Tags: Token Classification · Transformers · Safetensors · deberta-v2 · Inference Endpoints · arxiv:1910.09700
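The tags above identify this repository as a DeBERTa-v2 checkpoint for token classification with the Transformers library. Below is a minimal sketch of loading it through the pipeline API; the repo ID is taken from this page, while the example sentence and the aggregation setting are illustrative assumptions, and the returned labels depend on the id2label mapping stored in config.json.

```python
from transformers import pipeline

# Load the tokenizer and token-classification head directly from the Hub.
# The repo ID comes from this page; everything else here is illustrative.
classifier = pipeline(
    "token-classification",
    model="kabir5297/Deberta_Huge_data",
    aggregation_strategy="simple",  # merge sub-word pieces into word-level spans
)

# Example input is an assumption; output labels follow the checkpoint's config.
print(classifier("Hugging Face released DeBERTa-v2 checkpoints for token tagging."))
```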
Branch: main · Deberta_Huge_data · 1 contributor · History: 7 commits

Latest commit: Upload tokenizer (fa125e4, verified) by kabir5297, 9 months ago
File                      Size       LFS   Last commit message                      Last updated
.gitattributes            1.52 kB    -     initial commit                           9 months ago
README.md                 5.18 kB    -     Upload DebertaV2ForTokenClassification   9 months ago
added_tokens.json         23 Bytes   -     Upload tokenizer                         9 months ago
config.json               1.54 kB    -     Upload DebertaV2ForTokenClassification   9 months ago
model.safetensors         735 MB     LFS   Upload DebertaV2ForTokenClassification   9 months ago
special_tokens_map.json   286 Bytes  -     Upload tokenizer                         9 months ago
spm.model                 2.46 MB    LFS   Upload tokenizer                         9 months ago
tokenizer.json            8.66 MB    -     Upload tokenizer                         9 months ago
tokenizer_config.json     1.28 kB    -     Upload tokenizer                         9 months ago
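The weights and tokenizer assets listed above can also be fetched programmatically. A minimal sketch using huggingface_hub follows; the repo ID comes from this page, while the local directory name is an illustrative choice.

```python
from huggingface_hub import snapshot_download

# Download every file in the listing above (model.safetensors, spm.model,
# tokenizer and config JSONs) and return the local path.
# The repo ID is taken from this page; local_dir is an assumption.
local_path = snapshot_download(
    repo_id="kabir5297/Deberta_Huge_data",
    local_dir="deberta_huge_data",
)
print(local_path)
```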