BAAI / Aquila2-7B
Organization: Beijing Academy of Artificial Intelligence
Tags: Text Generation · Transformers · Safetensors · aquila · conversational · custom_code
License: other
Aquila2-7B · 5 contributors · History: 45 commits
Latest commit f563a9e by hyxmmm ("Update tokenizer_config.json"), about 1 year ago
File                                      Size       Last commit message                               Updated
.gitattributes                            1.52 kB    initial commit                                    about 1 year ago
BAAI-Aquila-Model-License -Agreement.pdf  227 kB     Upload BAAI-Aquila-Model-License -Agreement.pdf   about 1 year ago
LICENSE                                   0 Bytes    initial commit                                    about 1 year ago
README.md                                 2.48 kB    Update README.md                                  about 1 year ago
README_zh.md                              1.88 kB    Update README_zh.md                               about 1 year ago
base_metrics.jpeg                         579 kB     Upload 2 files                                    about 1 year ago
base_metrics_CN.jpeg                      832 kB     Upload 2 files                                    about 1 year ago
config.json                               648 Bytes  Update config.json                                about 1 year ago
configuration_aquila.py                   5.42 kB    Create configuration_aquila.py                    about 1 year ago
generation_config.json                    142 Bytes  Upload 2 files                                    about 1 year ago
log.jpeg                                  401 kB     Upload log.jpeg                                   about 1 year ago
merges.txt                                1.36 MB    Upload merges.txt                                 about 1 year ago
modeling_aquila.py                        47.2 kB    Create modeling_aquila.py                         about 1 year ago
pytorch_model-00001-of-00002.bin (LFS)    11.6 GB    Upload pytorch_model-00001-of-00002.bin           about 1 year ago
pytorch_model-00002-of-00002.bin (LFS)    5.34 GB    Upload pytorch_model-00002-of-00002.bin           about 1 year ago
pytorch_model.bin (LFS)                   29.2 GB    Upload pytorch_model.bin                          about 1 year ago
pytorch_model.bin.index.json              26.8 kB    Upload 2 files                                    about 1 year ago
special_tokens_map.json                   114 Bytes  Upload 5 files                                    about 1 year ago
tokenizer.json                            5.12 MB    Upload 5 files                                    about 1 year ago
tokenizer_config.json                     244 Bytes  Update tokenizer_config.json                      about 1 year ago
trainer_state.json                        1.02 MB    Upload 5 files                                    about 1 year ago
vocab.json                                2.05 MB    Upload 5 files                                    about 1 year ago

Detected Pickle imports in the .bin checkpoints:
  pytorch_model-00001-of-00002.bin: collections.OrderedDict, torch.HalfStorage, torch._utils._rebuild_tensor_v2, torch.FloatStorage
  pytorch_model-00002-of-00002.bin: torch.FloatStorage, collections.OrderedDict, torch.HalfStorage, torch._utils._rebuild_tensor_v2
  pytorch_model.bin: collections.OrderedDict, torch.FloatStorage, torch._utils._rebuild_tensor_v2
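The "Detected Pickle imports" annotations above come from scanning the checkpoint files without loading them: a pickle file is a program for a small stack machine, and the classes it would import on load can be read straight out of its opcode stream. Below is a rough sketch of that idea using the stdlib's pickletools; it is an illustration, not Hugging Face's actual scanner, which is more thorough.

```python
# Sketch of a pickle-import scan: walk the opcode stream with pickletools
# and collect every module.name pair the file would import on load,
# WITHOUT unpickling it (so nothing gets executed).
import pickle
import pickletools
from collections import OrderedDict

def detect_pickle_imports(data: bytes) -> set[str]:
    imports: set[str] = set()
    strings: list[str] = []  # recent string constants (STACK_GLOBAL operands)
    for op, arg, _pos in pickletools.genops(data):
        if op.name in ("GLOBAL", "INST"):
            # arg is "module name" joined by a single space
            module, _, name = arg.partition(" ")
            imports.add(f"{module}.{name}")
        elif op.name == "STACK_GLOBAL":
            # operands are the last two string constants pushed on the stack
            if len(strings) >= 2:
                imports.add(f"{strings[-2]}.{strings[-1]}")
        elif "UNICODE" in op.name or "STRING" in op.name:
            strings.append(arg)
    return imports

data = pickle.dumps(OrderedDict(a=1))
print(detect_pickle_imports(data))  # → {'collections.OrderedDict'}
```

A torch checkpoint scanned this way surfaces entries like torch.FloatStorage and torch._utils._rebuild_tensor_v2, as listed above; an unexpected import (e.g. os or builtins.eval) is the red flag such scans exist to catch.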
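The pytorch_model.bin.index.json file ties the two shards together: its "weight_map" records which shard file holds each tensor, so a loader can open only the shards it needs. A minimal sketch, with made-up tensor names and total_size (the real index maps every Aquila2-7B parameter across the two shards listed above):

```python
# Hypothetical miniature of a sharded-checkpoint index such as
# pytorch_model.bin.index.json. Tensor names and total_size are
# invented for illustration.
import json

index_json = """
{
  "metadata": {"total_size": 14000000000},
  "weight_map": {
    "model.embed_tokens.weight": "pytorch_model-00001-of-00002.bin",
    "lm_head.weight": "pytorch_model-00002-of-00002.bin"
  }
}
"""

def shards_needed(index: dict, params: list[str]) -> set[str]:
    """Return the set of shard files required to load the given tensors."""
    return {index["weight_map"][p] for p in params}

index = json.loads(index_json)
print(shards_needed(index, ["lm_head.weight"]))  # → {'pytorch_model-00002-of-00002.bin'}
```

This is why the repo ships both the 29.2 GB single-file pytorch_model.bin and the 11.6 GB + 5.34 GB sharded pair: the sharded layout lets tooling download and memory-map the checkpoint piecewise.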