Luciaaaaa/dummy-model

Tags: Token Classification · Transformers · Safetensors · gpt2 · Inference Endpoints · text-generation-inference · arxiv:1910.09700
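Given the tags above (Token Classification on a gpt2 backbone), the repo should load with the standard transformers pipeline. A minimal usage sketch; the example sentence is made up, and the model's actual label set is not shown on this page:

```python
from transformers import pipeline

# Load the repo as a token-classification pipeline, matching the
# "Token Classification" tag above.
classifier = pipeline("token-classification", model="Luciaaaaa/dummy-model")

# Made-up example input; the labels returned depend on how the
# GPT2ForTokenClassification head was trained.
print(classifier("Hugging Face is based in New York City."))
```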
Files and versions
Branch: main · 1 contributor · History: 8 commits
Latest commit: Luciaaaaa, "Upload tokenizer" (628f923, verified) · about 2 months ago
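The commit messages here and in the file listing below ("Upload tokenizer", "Upload GPT2ForTokenClassification") match the default messages that transformers' push_to_hub() writes, so the repo was likely populated along these lines. A hedged sketch, assuming a gpt2 base checkpoint; the actual training step is not shown on this page:

```python
from transformers import AutoTokenizer, GPT2ForTokenClassification

# Assumed starting point: a gpt2 checkpoint with a token-classification head.
model = GPT2ForTokenClassification.from_pretrained("gpt2")
tokenizer = AutoTokenizer.from_pretrained("gpt2")

# push_to_hub() defaults its commit message to the uploaded object's name,
# which would produce the messages seen in this repo's history.
model.push_to_hub("dummy-model")      # -> "Upload GPT2ForTokenClassification"
tokenizer.push_to_hub("dummy-model")  # -> "Upload tokenizer"
```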
File                      Size           Last commit                          Last updated
.gitattributes            1.52 kB        initial commit                       about 2 months ago
README.md                 5.18 kB        Upload GPT2ForTokenClassification    about 2 months ago
config.json               1.19 kB        Upload GPT2ForTokenClassification    about 2 months ago
merges.txt                456 kB         Upload tokenizer                     about 2 months ago
model.safetensors         1.39 GB (LFS)  Upload GPT2ForTokenClassification    about 2 months ago
special_tokens_map.json   119 Bytes      Upload tokenizer                     about 2 months ago
tokenizer_config.json     944 Bytes      Upload tokenizer                     about 2 months ago
vocab.json                927 kB         Upload tokenizer                     about 2 months ago
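To fetch these files outside the browser, huggingface_hub can download a single file or mirror the whole listing. A minimal sketch using the repo id from this page:

```python
from huggingface_hub import hf_hub_download, snapshot_download

# Grab one file from the listing, e.g. the 1.19 kB config.json.
config_path = hf_hub_download(repo_id="Luciaaaaa/dummy-model", filename="config.json")

# Or mirror the whole repo; model.safetensors (1.39 GB) is stored via LFS
# and is downloaded the same way.
local_dir = snapshot_download(repo_id="Luciaaaaa/dummy-model")
print(config_path, local_dir)
```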