Hugging Face model: mecoaoge2/ViHOS1
Tags: Token Classification · Transformers · Safetensors · xlm-roberta · Inference Endpoints
arXiv: 1910.09700
ViHOS1: 1 contributor, 3 commits. Latest commit da7a1b8 (verified) by mecoaoge2, "Upload tokenizer", 14 days ago.
Files:
.gitattributes             1.52 kB        initial commit                            14 days ago
README.md                  5.17 kB        Upload XLMRobertaForTokenClassification   14 days ago
config.json                826 Bytes      Upload XLMRobertaForTokenClassification   14 days ago
model.safetensors          388 MB (LFS)   Upload XLMRobertaForTokenClassification   14 days ago
sentencepiece.bpe.model    471 kB (LFS)   Upload tokenizer                          14 days ago
special_tokens_map.json    964 Bytes      Upload tokenizer                          14 days ago
tokenizer.json             1.1 MB         Upload tokenizer                          14 days ago
tokenizer_config.json      1.23 kB        Upload tokenizer                          14 days ago
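The tags above mark this as an XLM-RoBERTa checkpoint for token classification, so it should load with the standard `transformers` pipeline API. A minimal usage sketch, assuming that API; the `TOXIC` label name and the span-merging helper are illustrative assumptions, not details taken from this page (the model's README would define the real labels):

```python
def merge_token_spans(offsets, labels, target="TOXIC"):
    """Merge consecutive token (start, end) character offsets whose label
    equals `target` into contiguous character spans.

    `target` is a hypothetical label name; check the model card for the
    actual label set.
    """
    spans = []
    for (start, end), lab in zip(offsets, labels):
        if lab != target:
            continue
        if spans and start <= spans[-1][1]:
            # Token touches or overlaps the previous span: extend it.
            spans[-1] = (spans[-1][0], end)
        else:
            spans.append((start, end))
    return spans


if __name__ == "__main__":
    # Requires the `transformers` package and network access to download
    # ~388 MB of weights (model.safetensors above).
    from transformers import pipeline

    nlp = pipeline("token-classification", model="mecoaoge2/ViHOS1")
    # Each prediction carries `start`/`end` character offsets and an
    # `entity` label, which merge_token_spans can collapse into spans.
    print(nlp("example input text"))
```

The helper is pure Python, so span post-processing can be tested without downloading the model.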