lmsys/vicuna-13b-delta-v0

Maintained by the Large Model Systems Organization (lmsys).

Tags: Text Generation · Transformers · PyTorch · llama · text-generation-inference
Papers: arXiv:2302.13971 · arXiv:2306.05685
Files and versions (ref: refs/pr/13): 1 contributor, 16 commits.
Latest commit by SFconvertbot: "Adding `safetensors` variant of this model" (c998b8a, verified, 7 months ago).
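As the repository name indicates, these are delta weights: the difference between Vicuna-13B v0 and the original LLaMA-13B base weights, published this way to comply with the LLaMA license. They are not usable on their own; they must first be added to the base model. The supported tool is FastChat's `python3 -m fastchat.model.apply_delta` command; the sketch below only illustrates the core operation (target = base + delta), with placeholder local paths:

```python
# Illustration only: the supported tool is FastChat's
#   python3 -m fastchat.model.apply_delta
# Paths below are placeholders.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

base = AutoModelForCausalLM.from_pretrained(
    "/path/to/llama-13b", torch_dtype=torch.float16, low_cpu_mem_usage=True
)
delta = AutoModelForCausalLM.from_pretrained(
    "lmsys/vicuna-13b-delta-v0", torch_dtype=torch.float16, low_cpu_mem_usage=True
)

base_state = base.state_dict()
for name, param in delta.state_dict().items():
    b = base_state[name]
    if param.shape == b.shape:
        # Recover the Vicuna weight: delta + base.
        param.data += b
    else:
        # If the delta checkpoint extends the embedding matrices
        # (larger vocab), add the base into the overlapping slice.
        param.data[: b.shape[0], : b.shape[1]] += b

delta.save_pretrained("/path/to/vicuna-13b-v0")
AutoTokenizer.from_pretrained("lmsys/vicuna-13b-delta-v0").save_pretrained(
    "/path/to/vicuna-13b-v0"
)
```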
| File | Size | Last commit | Updated |
|---|---|---|---|
| .gitattributes | 1.48 kB | initial commit | over 1 year ago |
| README.md | 2.27 kB | Update README.md | about 1 year ago |
| config.json | 507 Bytes | Update tokenizer config | over 1 year ago |
| generation_config.json | 137 Bytes | Upload LlamaForCausalLM | over 1 year ago |
| model-00001-of-00003.safetensors | 9.95 GB (LFS) | Adding `safetensors` variant of this model | 7 months ago |
| model-00002-of-00003.safetensors | 9.9 GB (LFS) | Adding `safetensors` variant of this model | 7 months ago |
| model-00003-of-00003.safetensors | 6.18 GB (LFS) | Adding `safetensors` variant of this model | 7 months ago |
| model.safetensors.index.json | 35.1 kB | Adding `safetensors` variant of this model | 7 months ago |
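The `.safetensors` shards added by this PR duplicate the pickle checkpoints below in a format that can be read without unpickling. A minimal loading sketch, assuming a `transformers` version recent enough to support the `use_safetensors` flag; note this loads the raw delta weights, not a chat-ready model:

```python
import torch
from transformers import AutoModelForCausalLM

# use_safetensors=True makes from_pretrained resolve the shards through
# model.safetensors.index.json instead of the pickle .bin files.
delta = AutoModelForCausalLM.from_pretrained(
    "lmsys/vicuna-13b-delta-v0",
    torch_dtype=torch.float16,
    use_safetensors=True,
)
```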
| File | Size | Last commit | Updated |
|---|---|---|---|
| pytorch_model-00001-of-00003.bin | 9.95 GB (LFS, pickle) | Upload LlamaForCausalLM | over 1 year ago |
| pytorch_model-00002-of-00003.bin | 9.9 GB (LFS, pickle) | Upload LlamaForCausalLM | over 1 year ago |
| pytorch_model-00003-of-00003.bin | 6.18 GB (LFS, pickle) | Upload LlamaForCausalLM | over 1 year ago |
| pytorch_model.bin.index.json | 33.4 kB | Upload LlamaForCausalLM | over 1 year ago |

Hugging Face's pickle scanner detects the same four imports in each `.bin` shard: `torch.FloatStorage`, `torch.HalfStorage`, `collections.OrderedDict`, and `torch._utils._rebuild_tensor_v2`, the standard contents of a serialized PyTorch state dict.
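All four flagged imports are on `torch.load`'s allowlist for `weights_only=True`, so a shard can be deserialized without executing arbitrary pickle code. A minimal sketch, assuming PyTorch 1.13 or newer and a locally downloaded shard:

```python
import torch

# weights_only=True restricts unpickling to tensor-related types
# (storages, OrderedDict, _rebuild_tensor_v2), which covers exactly
# the four imports flagged on these shards.
state_dict = torch.load(
    "pytorch_model-00001-of-00003.bin",
    map_location="cpu",
    weights_only=True,
)
print(f"{len(state_dict)} tensors in shard 1 of 3")
```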
| File | Size | Last commit | Updated |
|---|---|---|---|
| special_tokens_map.json | 411 Bytes | Update tokenizer config | over 1 year ago |
| tokenizer.model | 500 kB (LFS) | Upload tokenizer | over 1 year ago |
| tokenizer_config.json | 727 Bytes | Update tokenizer config | over 1 year ago |
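`tokenizer.model` is the SentencePiece vocabulary used by LLaMA-family models; `tokenizer_config.json` and `special_tokens_map.json` carry its settings. A quick check, assuming `transformers` and `sentencepiece` are installed:

```python
from transformers import AutoTokenizer

# Loads tokenizer.model plus the two JSON config files from this repo.
tok = AutoTokenizer.from_pretrained("lmsys/vicuna-13b-delta-v0", use_fast=False)
print(tok)  # repr shows vocabulary size and special tokens
```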