# nicholasKluge/Aira-2-portuguese-124M
- **Pipeline:** Text Generation
- **Frameworks:** Transformers, PyTorch, Safetensors
- **Training dataset:** nicholasKluge/instruct-aira-dataset
- **Language:** Portuguese
- **Tags:** gpt2, alignment, instruction tuned, text generation, conversation, assistant, carbon emissions, text-generation-inference, Inference Endpoints
- **Papers:** arXiv:1803.05457, arXiv:2109.07958, arXiv:2203.09509
- **License:** apache-2.0
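Since the repository ships a standard GPT-2 checkpoint for Transformers, it can be loaded with the usual `AutoModelForCausalLM`/`AutoTokenizer` pair. The sketch below is a minimal example, not taken from the model card: the Portuguese prompt and the sampling parameters are illustrative, and the repo's `added_tokens.json` suggests the model expects special instruction tokens, so consult the README for the exact prompt template.

```python
# Minimal sketch: load the checkpoint and sample a completion with the
# standard transformers API. Prompt format and sampling settings are
# assumptions, not the model card's official recipe.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nicholasKluge/Aira-2-portuguese-124M"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Qual é a capital do Brasil?"  # "What is the capital of Brazil?"
inputs = tokenizer(prompt, return_tensors="pt")

outputs = model.generate(
    **inputs,
    max_new_tokens=64,
    do_sample=True,
    top_k=50,
    temperature=0.7,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 defines no pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```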
## Files and versions (`main` branch)

4 contributors · 56 commits · latest commit: "Update README.md" (`fd9e154`, verified) by nicholasKluge, 5 months ago
All files are marked "Safe" by the Hub's scanner; the pickle imports it detects in the `.pt`/`.bin` checkpoints are listed in the Notes column.

| File | Size | Notes | Last commit | Updated |
|------|------|-------|-------------|---------|
| `.gitattributes` | 1.48 kB | | initial commit | over 1 year ago |
| `Aira_emissions.csv` | 776 Bytes | | Upload 17 files | about 1 year ago |
| `LICENSE` | 10.7 kB | | Create LICENSE | over 1 year ago |
| `README.md` | 5.92 kB | | Update README.md | 5 months ago |
| `added_tokens.json` | 123 Bytes | | Upload 5 files | about 1 year ago |
| `config.json` | 939 Bytes | | Update config.json | about 1 year ago |
| `generation_config.json` | 334 Bytes | | Update generation_config.json | 11 months ago |
| `lm_evaluation_harness_pt.ipynb` | 2.92 kB | | Upload lm_evaluation_harness_pt.ipynb | about 1 year ago |
| `merges.txt` | 508 kB | | Upload 5 files | about 1 year ago |
| `model.safetensors` | 498 MB | LFS | Upload 17 files | about 1 year ago |
| `optimizer.pt` | 649 MB | LFS; pickle imports: `collections.OrderedDict`, `torch._utils._rebuild_tensor_v2`, `torch.FloatStorage` | Upload 17 files | about 1 year ago |
| `pytorch_model.bin` | 498 MB | LFS; pickle imports: `collections.OrderedDict`, `torch._utils._rebuild_tensor_v2`, `torch.FloatStorage` | Upload 17 files | about 1 year ago |
| `rng_state.pt` | 5.81 kB | LFS; pickle imports: `collections.OrderedDict`, `torch._utils._rebuild_tensor_v2`, `torch.ByteStorage` | Upload 17 files | about 1 year ago |
| `scheduler.pt` | 563 Bytes | LFS; pickle, no problematic imports detected | Upload 17 files | about 1 year ago |
| `special_tokens_map.json` | 518 Bytes | | Upload 5 files | about 1 year ago |
| `tokenizer_config.json` | 985 Bytes | | Upload 5 files | about 1 year ago |
| `training_stats.parquet` | 2.36 kB | LFS | Upload 17 files | about 1 year ago |
| `vocab.json` | 1.05 MB | | Upload 5 files | about 1 year ago |