dmayhem93/toolformer_v0_epoch2

Tags: Text Generation, Transformers, PyTorch, gptj, Inference Endpoints
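The tags above (Text Generation, Transformers, PyTorch, gptj) indicate a GPT-J causal language model checkpoint. A minimal loading sketch, assuming the repository id shown on this page and enough memory for the roughly 24 GB of sharded weights; the dtype and device settings are illustrative, not prescribed by the repository:

```python
# Sketch: load the checkpoint as a causal LM with Transformers and generate a short completion.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "dmayhem93/toolformer_v0_epoch2"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,  # halves memory versus fp32; illustrative choice
    device_map="auto",          # requires the `accelerate` package; places shards across available devices
)

prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```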
Files and versions
1 contributor · History: 4 commits

Latest commit 0244f53 (verified, about 1 month ago) by SFconvertbot: Adding `safetensors` variant of this model
File | Size | Last commit | When
.gitattributes | 1.48 kB | initial commit | almost 2 years ago
added_tokens.json | 109 Bytes | add tokenizer | almost 2 years ago
config.json | 1.05 kB | Upload GPTJForCausalLM | almost 2 years ago
merges.txt | 456 kB | add tokenizer | almost 2 years ago
model-00001-of-00003.safetensors | 9.94 GB, LFS | Adding `safetensors` variant of this model | about 1 month ago
model-00002-of-00003.safetensors | 9.78 GB, LFS | Adding `safetensors` variant of this model | about 1 month ago
model-00003-of-00003.safetensors | 4.6 GB, LFS | Adding `safetensors` variant of this model | about 1 month ago
model.safetensors.index.json | 27.2 kB | Adding `safetensors` variant of this model | about 1 month ago
pytorch_model-00001-of-00003.bin | 9.94 GB, LFS, pickle | Upload GPTJForCausalLM | almost 2 years ago
pytorch_model-00002-of-00003.bin | 9.78 GB, LFS, pickle | Upload GPTJForCausalLM | almost 2 years ago
pytorch_model-00003-of-00003.bin | 4.6 GB, LFS, pickle | Upload GPTJForCausalLM | almost 2 years ago
pytorch_model.bin.index.json | 25.8 kB | Upload GPTJForCausalLM | almost 2 years ago
special_tokens_map.json | 99 Bytes | add tokenizer | almost 2 years ago
tokenizer_config.json | 764 Bytes | add tokenizer | almost 2 years ago
vocab.json | 999 kB | add tokenizer | almost 2 years ago

Each pytorch_model-*.bin shard is a pickle file; the detected pickle imports (4, identical for every shard) are collections.OrderedDict, torch.ByteStorage, torch.FloatStorage, and torch._utils._rebuild_tensor_v2.
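Because the repository carries both the pickle-based pytorch_model-*.bin shards and the model-*.safetensors shards added by SFconvertbot, recent Transformers releases will normally pick the safetensors files on their own. A hedged sketch of requesting them explicitly so the pickle shards are never deserialized; the `use_safetensors` argument exists in current Transformers versions, though older releases may not accept it:

```python
# Sketch: ask for the safetensors shards explicitly instead of the pickle .bin files.
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "dmayhem93/toolformer_v0_epoch2",
    use_safetensors=True,  # load model-0000x-of-00003.safetensors rather than pytorch_model-*.bin
)
```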