Hugging Face
tner/roberta-large-tweetner7-2020
Tags: Token Classification · Transformers · PyTorch · tner/tweetner7 · roberta · Eval Results · Inference Endpoints
Files and versions
1 contributor (asahi417) · 6 commits · latest commit 8217eeb: "add tokenizer" (over 1 year ago)
eval/                      (folder)       model update     over 1 year ago
.gitattributes             1.17 kB        initial commit   over 1 year ago
config.json                13.3 kB        add model        over 1 year ago
merges.txt                 456 kB         add tokenizer    over 1 year ago
pytorch_model.bin          1.42 GB (LFS)  add model        over 1 year ago
special_tokens_map.json    239 Bytes      add tokenizer    over 1 year ago
tokenizer.json             1.36 MB        add tokenizer    over 1 year ago
tokenizer_config.json      366 Bytes      add tokenizer    over 1 year ago
vocab.json                 798 kB         add tokenizer    over 1 year ago

pytorch_model.bin is a pickle file. Detected pickle imports (4): collections.OrderedDict, torch.LongStorage, torch._utils._rebuild_tensor_v2, torch.FloatStorage.