THUDM/LongAlign-6B-64k
Knowledge Engineering Group (KEG) & Data Mining at Tsinghua University
Tags: Text Generation · Transformers · PyTorch · English · Chinese · chatglm · feature-extraction · Long Context · custom_code
Dataset: THUDM/LongAlign-10k
arXiv: 2401.18058
License: apache-2.0
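The `custom_code` tag means this repo ships its own modeling and tokenizer code (`modeling_chatglm.py`, `tokenization_chatglm.py`), so loading it requires `trust_remote_code=True`. A minimal loading sketch, assuming the standard `transformers` auto classes; the helper name and dtype choice are illustrative, not taken from this page:

```python
def load_longalign(repo_id: str = "THUDM/LongAlign-6B-64k"):
    """Load the LongAlign tokenizer and model.

    trust_remote_code=True executes this repo's custom chatglm code,
    so review modeling_chatglm.py and tokenization_chatglm.py first.
    Downloading the weights fetches roughly 12.5 GB.
    """
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(
        repo_id,
        trust_remote_code=True,
        torch_dtype=torch.bfloat16,  # checkpoint tensors are stored as bfloat16
    )
    return tokenizer, model.eval()
```

Nothing runs at import time; call `load_longalign()` on a machine with enough memory for the two weight shards (about 12.5 GB combined).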
LongAlign-6B-64k · 2 contributors · History: 18 commits
Latest commit: 46f8079 (verified), 20 days ago, by SFconvertbot: Adding `safetensors` variant of this model
File | Size | Last commit | Age
assets/ | | Upload leaderboard.png | 10 months ago
.gitattributes | 1.52 kB | initial commit | 10 months ago
README.md | 3.37 kB | Update README.md | 10 months ago
config.json | 1.5 kB | Upload 12 files | 10 months ago
configuration_chatglm.py | 2.39 kB | Upload 12 files | 10 months ago
generation_config.json | 111 Bytes | Upload 12 files | 10 months ago
model-00001-of-00002.safetensors (LFS) | 9.99 GB | Adding `safetensors` variant of this model | 20 days ago
model-00002-of-00002.safetensors (LFS) | 2.5 GB | Adding `safetensors` variant of this model | 20 days ago
model.safetensors.index.json | 21.2 kB | Adding `safetensors` variant of this model | 20 days ago
modeling_chatglm.py | 47.7 kB | Upload 12 files | 10 months ago
pytorch_model-00001-of-00002.bin (LFS, pickle) | 9.99 GB | Upload pytorch_model-00001-of-00002.bin | 10 months ago
pytorch_model-00002-of-00002.bin (LFS, pickle) | 2.5 GB | Upload pytorch_model-00002-of-00002.bin with huggingface_hub | 10 months ago
pytorch_model.bin.index.json | 20.4 kB | Upload 12 files | 10 months ago
quantization.py | 14.7 kB | Upload 12 files | 10 months ago
special_tokens_map.json | 3 Bytes | Upload 12 files | 10 months ago
tokenization_chatglm.py | 10.8 kB | Update tokenization_chatglm.py | 9 months ago
tokenizer.model (LFS) | 1.02 MB | Upload 12 files | 10 months ago
tokenizer_config.json | 324 Bytes | Upload 12 files | 10 months ago

Detected pickle imports in both pytorch_model-*.bin shards (3): "collections.OrderedDict", "torch.BFloat16Storage", "torch._utils._rebuild_tensor_v2"
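The "Detected pickle imports" note exists because `.bin` checkpoints are Python pickles: unpickling imports and calls whatever global names the stream references, which is the arbitrary-code-execution risk that the `safetensors` variant avoids. A minimal stdlib sketch of how such import detection can work, scanning for GLOBAL and STACK_GLOBAL opcodes; this is an illustration, not Hugging Face's actual scanner:

```python
import collections
import pickle
import pickletools


def detect_pickle_imports(data: bytes) -> set[str]:
    """Return the module.name globals a pickle stream would import."""
    found = set()
    strings = []  # recent string constants, consumed by STACK_GLOBAL
    for opcode, arg, _pos in pickletools.genops(data):
        if opcode.name == "GLOBAL":
            # protocols 0-3: arg is "module name" separated by a space
            found.add(arg.replace(" ", "."))
        elif opcode.name == "STACK_GLOBAL" and len(strings) >= 2:
            # protocol 4+: module and name were pushed as the two
            # preceding string constants
            found.add(f"{strings[-2]}.{strings[-1]}")
        elif isinstance(arg, str):
            strings.append(arg)
    return found


data = pickle.dumps(collections.OrderedDict(x=1), protocol=4)
print(detect_pickle_imports(data))  # → {'collections.OrderedDict'}
```

Scanning opcodes this way never executes the payload, unlike `pickle.loads`; a real checkpoint would additionally report names like `torch.BFloat16Storage` and `torch._utils._rebuild_tensor_v2`, matching the list above.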