quantumaikr/QuantumLM (2 likes)
Tags: Text Generation, Transformers, PyTorch, English, llama, Inference Endpoints, text-generation-inference
License: cc-by-nc-4.0
QuantumLM: 1 contributor, 4 commits. Latest commit 9058130, "Upload tokenizer" by quantumaikr, 11 months ago.
File                               Size       Last commit               Age
.gitattributes                     1.52 kB    initial commit            12 months ago
README.md                          1.87 kB    Create README.md          12 months ago
config.json                        637 Bytes  Upload LlamaForCausalLM   12 months ago
generation_config.json             197 Bytes  Upload LlamaForCausalLM   12 months ago
pytorch_model-00001-of-00006.bin   9.96 GB    Upload LlamaForCausalLM   12 months ago
pytorch_model-00002-of-00006.bin   9.94 GB    Upload LlamaForCausalLM   12 months ago
pytorch_model-00003-of-00006.bin   9.94 GB    Upload LlamaForCausalLM   12 months ago
pytorch_model-00004-of-00006.bin   9.87 GB    Upload LlamaForCausalLM   12 months ago
pytorch_model-00005-of-00006.bin   9.87 GB    Upload LlamaForCausalLM   12 months ago
pytorch_model-00006-of-00006.bin   2.49 GB    Upload LlamaForCausalLM   12 months ago
pytorch_model.bin.index.json       29.9 kB    Upload LlamaForCausalLM   12 months ago
special_tokens_map.json            414 Bytes  Upload tokenizer          11 months ago
tokenizer.json                     1.84 MB    Upload tokenizer          11 months ago
tokenizer_config.json              724 Bytes  Upload tokenizer          11 months ago

All pytorch_model-*.bin shards are stored via Git LFS and are pickle-serialized; each shard's detected pickle imports are collections.OrderedDict, torch.FloatStorage, and torch._utils._rebuild_tensor_v2.
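The checkpoint above is split into six .bin shards plus a pytorch_model.bin.index.json file. In sharded PyTorch checkpoints, that index maps each parameter name to the shard that stores it, so a loader can fetch only the shards it needs. A minimal sketch of reading such an index; the JSON content and parameter names here are hypothetical placeholders, not the actual contents of this repo's index file:

```python
import json

# Hypothetical, heavily truncated stand-in for a sharded-checkpoint index
# (pytorch_model.bin.index.json): "weight_map" maps parameter names to shards.
index = json.loads("""
{
  "metadata": {"total_size": 0},
  "weight_map": {
    "model.embed_tokens.weight": "pytorch_model-00001-of-00006.bin",
    "lm_head.weight": "pytorch_model-00006-of-00006.bin"
  }
}
""")

# Resolve which shard holds a given tensor before downloading anything.
shard = index["weight_map"]["lm_head.weight"]
print(shard)  # pytorch_model-00006-of-00006.bin
```

This is why tools like transformers can load the model without concatenating the shards first: they consult the weight map and stream each shard independently.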
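The "Detected Pickle imports" entries above exist because torch .bin files are pickle streams, and unpickling can execute code referenced by the globals in the stream; hubs therefore scan which module.attr pairs a file would import before anyone loads it. A minimal standard-library sketch of such a scan, shown on a small synthetic pickle rather than a real checkpoint (this illustrates the idea only, not Hugging Face's actual scanner):

```python
import pickle
import pickletools
import collections

# Serialize a small object the way torch checkpoints are (as a pickle).
# Protocol 2 encodes globals with the GLOBAL opcode, whose argument is
# the "module attr" pair that unpickling would import.
payload = pickle.dumps(collections.OrderedDict(a=1), protocol=2)

# Walk the opcode stream and collect every global reference.
imports = sorted(
    arg for opcode, arg, _pos in pickletools.genops(payload)
    if opcode.name == "GLOBAL"
)
print(imports)  # ['collections OrderedDict']
```

A scanner like this never unpickles the data, so it is safe to run on untrusted files; it only reports what *would* be imported, which is exactly the list shown for each shard above.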