pere/nb-roberta-base-scandinavian-long
Tags: Fill-Mask · Transformers · PyTorch · JAX · TensorBoard · roberta · Inference Endpoints
Files and versions — latest commit b6fe1e0 ("tokenizer") by pere, almost 3 years ago · 1 contributor · History: 87 commits
| File | Size | Last commit message | Last modified |
|---|---|---|---|
| .gitattributes | 1.22 kB | fist submit after clone | almost 3 years ago |
| README.md | 1.89 kB | Update README.md | almost 3 years ago |
| config.json | 701 Bytes | pytorch model | almost 3 years ago |
| create_config.py | 163 Bytes | fist submit after clone | almost 3 years ago |
| events.out.tfevents.1637101724.t1v-n-358ff5d1-w-0.857283.3.v2 | 5.93 MB (LFS) | Saving weights and logs of step 40001 | almost 3 years ago |
| events.out.tfevents.1637157195.t1v-n-358ff5d1-w-0.908301.3.v2 | 106 MB (LFS) | Saving weights and logs of step 710001 | almost 3 years ago |
| events.out.tfevents.1637704340.t1v-n-358ff5d1-w-0.1392746.3.v2 | 8.91 MB (LFS) | Saving weights and logs of step 60001 | almost 3 years ago |
| flax_model.msgpack | 499 MB (LFS) | Saving weights and logs of step 60001 | almost 3 years ago |
| generate_pytorch_model.py | 349 Bytes (LFS) | tokenizer | almost 3 years ago |
| merges.txt | 476 kB | tokenizer | almost 3 years ago |
| pytorch_model.bin | 499 MB (LFS) | pytorch model | almost 3 years ago |
| run_mlm_flax.py | 29.8 kB | fist submit after clone | almost 3 years ago |
| run_mlm_flax_stream.py | 26.8 kB | fist submit after clone | almost 3 years ago |
| run_stream_128.sh | 745 Bytes | a new start. Longer | almost 3 years ago |
| run_stream_512.sh | 742 Bytes | Saving weights and logs of step 10001 | almost 3 years ago |
| special_tokens_map.json | 239 Bytes | tokenizer | almost 3 years ago |
| tokenizer.json | 1.39 MB | tokenizer | almost 3 years ago |
| tokenizer_config.json | 291 Bytes | tokenizer | almost 3 years ago |
| train_tokenizer.py | 820 Bytes | fist submit after clone | almost 3 years ago |
| vocab.json | 818 kB | tokenizer | almost 3 years ago |

pytorch_model.bin is a pickle file; detected pickle imports: torch._utils._rebuild_tensor_v2, torch.LongStorage, torch.FloatStorage, collections.OrderedDict.
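The repo is tagged Fill-Mask and Transformers, and the file list contains both PyTorch (pytorch_model.bin) and Flax (flax_model.msgpack) weights plus a RoBERTa-style tokenizer (merges.txt, vocab.json, tokenizer.json), so it should load with the standard Transformers fill-mask pipeline. A minimal usage sketch follows; the example sentence (Norwegian) and the assumption that the mask token is the RoBERTa default "<mask>" are illustrative only, as the page shows no official usage snippet.

```python
from transformers import pipeline

# Load the model and tokenizer from the Hub.
# "pere/nb-roberta-base-scandinavian-long" is the repo id shown above.
fill_mask = pipeline(
    "fill-mask",
    model="pere/nb-roberta-base-scandinavian-long",
)

# RoBERTa tokenizers normally use "<mask>" as the mask token (assumption
# based on the roberta tag; check fill_mask.tokenizer.mask_token to be sure).
print(fill_mask.tokenizer.mask_token)

# Hypothetical Norwegian example: "Oslo is the capital of <mask>."
for prediction in fill_mask("Oslo er hovedstaden i <mask>."):
    print(prediction["token_str"], round(prediction["score"], 3))
```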