flax-community/bertin-roberta-large-spanish

Task: Fill-Mask
Libraries: Transformers, PyTorch, JAX, Safetensors
Language: Spanish
Tags: roberta, spanish, Inference Endpoints
License: cc-by-4.0
Files and versions
6 contributors · History: 58 commits

Latest commit 1bec239, about 3 years ago, by elishowk: "Automatic correction of README.md metadata. Contact website@huggingface.co for any question"
Name | Size | Last commit | Age
configs/ | — | Changed and added vocab and tokenizer | about 3 years ago
mc4/ | — | Fixed a couple of conditionals | about 3 years ago
.gitattributes | 737 Bytes | initial commit | about 3 years ago
.gitignore | 1.84 kB | Initial test with BETO's corpus | about 3 years ago
README.md | 2.36 kB | Automatic correction of README.md metadata. Contact website@huggingface.co for any question | about 3 years ago
config.json | 618 Bytes | Fix config for checkpoint | about 3 years ago
config.py | 256 Bytes | Preparing code for final runs | about 3 years ago
convert.py | 876 Bytes | Improved version of conversion script Flax → PyTorch | about 3 years ago
flax_model.msgpack | 250 MB (LFS) | Model at 210k steps, mlm acc 0.6537 | about 3 years ago
get_embeddings_and_perplexity.py | 1.53 kB | Add script to generate dataset of embeddings and perplexities. Add script to generate t-SNE plot for embedding and perplexity visualization. | about 3 years ago
merges.txt | 505 kB | Changed and added vocab and tokenizer | about 3 years ago
perplexity.py | 751 Bytes | Adding checkpointing, wandb, and new mlm script | about 3 years ago
pytorch_model.bin | 499 MB (LFS, pickle) | Model at 210k steps, mlm acc 0.6537 | about 3 years ago
run.sh | 883 Bytes | Adding base config and organizing configs | about 3 years ago
run_mlm_flax.py | 30 kB | Adding sampling to mc4 | about 3 years ago
run_mlm_flax_stream.py | 33 kB | Fix restoring steps | about 3 years ago
run_stream.sh | 932 Bytes | Preparing code for final runs | about 3 years ago
special_tokens_map.json | 239 Bytes | Changed and added vocab and tokenizer | about 3 years ago
tokenizer.json | 1.45 MB | Changed and added vocab and tokenizer | about 3 years ago
tokenizer_config.json | 292 Bytes | Changed and added vocab and tokenizer | about 3 years ago
tokens.py | 649 Bytes | Scripts for perplexity sampling and fixes | about 3 years ago
tokens.py.orig | 899 Bytes | Adjust batch size for extracting tokens | about 3 years ago
tsne_plot.py | 3.02 kB | Remove unused imports | about 3 years ago
vocab.json | 846 kB | Changed and added vocab and tokenizer | about 3 years ago

Note: pytorch_model.bin is a pickled checkpoint. Detected pickle imports (4): "torch.LongStorage", "torch.FloatStorage", "torch._utils._rebuild_tensor_v2", "collections.OrderedDict".