lambdalabs/pythia-12b-deduped-synthetic-instruct
Text Generation · Transformers · PyTorch · Dataset: Dahoas/synthetic-instruct-gptj-pairwise · English · gpt_neox · causal-lm · pythia · text-generation-inference · Inference Endpoints · License: apache-2.0
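The tags mark this as a GPT-NeoX causal language model served through the Transformers library. As a minimal sketch (not from the repo's model card), loading it might look like the following; the half-precision dtype and `device_map="auto"` are assumptions chosen so a 12B checkpoint can fit on typical GPU hardware:

```python
import torch
from transformers import AutoTokenizer, GPTNeoXForCausalLM

model_id = "lambdalabs/pythia-12b-deduped-synthetic-instruct"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = GPTNeoXForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # assumption: half precision to fit ~12B params in GPU memory
    device_map="auto",          # assumption: requires `accelerate` to place shards across devices
)

prompt = "How do I make an omelette?"  # hypothetical instruction-style prompt
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```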
Files and versions
1 contributor · History: 7 commits
Latest commit: a7e9c34, "Upload tokenizer" by chuanli-lambda, over 1 year ago
| File | Size | Last commit message | Last updated |
|---|---|---|---|
| .gitattributes | 1.48 kB | initial commit | over 1 year ago |
| README.md | 3.79 kB | Create README.md | over 1 year ago |
| config.json | 706 Bytes | Upload GPTNeoXForCausalLM | over 1 year ago |
| pytorch_model-00001-of-00005.bin | 9.87 GB | Upload GPTNeoXForCausalLM | over 1 year ago |
| pytorch_model-00002-of-00005.bin | 9.68 GB | Upload GPTNeoXForCausalLM | over 1 year ago |
| pytorch_model-00003-of-00005.bin | 9.68 GB | Upload GPTNeoXForCausalLM | over 1 year ago |
| pytorch_model-00004-of-00005.bin | 10 GB | Upload GPTNeoXForCausalLM | over 1 year ago |
| pytorch_model-00005-of-00005.bin | 8.29 GB | Upload GPTNeoXForCausalLM | over 1 year ago |
| pytorch_model.bin.index.json | 47.3 kB | Upload GPTNeoXForCausalLM | over 1 year ago |
| special_tokens_map.json | 131 Bytes | Upload tokenizer | over 1 year ago |
| tokenizer.json | 2.11 MB | Upload tokenizer | over 1 year ago |
| tokenizer_config.json | 529 Bytes | Upload tokenizer | over 1 year ago |

All five pytorch_model-*.bin shards are tracked with Git LFS. Hugging Face's pickle scanner detects the same four imports in each shard: torch.ByteStorage, torch.FloatStorage, torch._utils._rebuild_tensor_v2, and collections.OrderedDict, which are the standard imports of a pickle-serialized PyTorch checkpoint.
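The pytorch_model.bin.index.json file maps each weight name to the .bin shard that stores it, which is how Transformers reassembles the sharded checkpoint. A small sketch of inspecting it (assuming the `huggingface_hub` package is installed; the field names shown are the standard sharded-checkpoint index format):

```python
import json
from collections import Counter
from huggingface_hub import hf_hub_download

# Download only the shard index, not the multi-GB weight files.
index_path = hf_hub_download(
    "lambdalabs/pythia-12b-deduped-synthetic-instruct",
    "pytorch_model.bin.index.json",
)
with open(index_path) as f:
    index = json.load(f)

print(index["metadata"])                      # e.g. total checkpoint size in bytes
print(Counter(index["weight_map"].values()))  # number of tensors stored in each shard
```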