linzw/PASTED-grammatical

Token Classification · Transformers · Safetensors · longformer
License: apache-2.0
Files and versions (branch: main)
1 contributor · History: 3 commits
Latest commit: Upload tokenizer (8700ed3, verified) by linzw, 11 months ago
File                      Size      Last commit                                Updated
.gitattributes            1.52 kB   initial commit                             11 months ago
README.md                 31 Bytes  initial commit                             11 months ago
config.json               1.02 kB   Upload LongformerForTokenClassification    11 months ago
merges.txt                456 kB    Upload tokenizer                           11 months ago
model.safetensors (LFS)   592 MB    Upload LongformerForTokenClassification    11 months ago
special_tokens_map.json   1.01 kB   Upload tokenizer                           11 months ago
tokenizer.json            2.11 MB   Upload tokenizer                           11 months ago
tokenizer_config.json     1.5 kB    Upload tokenizer                           11 months ago
vocab.json                798 kB    Upload tokenizer                           11 months ago
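The repository contains a Longformer token-classification checkpoint (model.safetensors, config.json) and its tokenizer files (vocab.json, merges.txt, tokenizer.json, etc.). Below is a minimal sketch of how such a checkpoint could be loaded with the Transformers library; the class name follows the commit messages above (LongformerForTokenClassification), while the example input text and the labels resolved through the config's id2label mapping are assumptions, since the README here does not document the label set.

```python
# Minimal sketch (not from the model card): loading this checkpoint with Transformers.
# The label names come from config.json's id2label mapping, which is not documented here.
import torch
from transformers import AutoTokenizer, LongformerForTokenClassification

model_id = "linzw/PASTED-grammatical"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = LongformerForTokenClassification.from_pretrained(model_id)
model.eval()

text = "An example sentence to tag token by token."  # placeholder input
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, seq_len, num_labels)

predicted_ids = logits.argmax(dim=-1)[0].tolist()
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
labels = [model.config.id2label[i] for i in predicted_ids]
for token, label in zip(tokens, labels):
    print(token, label)
```

AutoTokenizer resolves the uploaded tokenizer files (tokenizer.json, vocab.json, merges.txt, special_tokens_map.json, tokenizer_config.json) automatically, so no extra configuration should be needed beyond the model ID.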