nlpie/Llama2-MedTuned-7b
Tags: Text Generation · Transformers · PyTorch · llama · biomedical · clinical · medical · text-generation-inference · Inference Endpoints
arXiv: 2401.00579
License: apache-2.0
Files and versions
3 contributors · History: 8 commits
Latest commit 8ce5468 (verified, 9 months ago) by SFconvertbot: Adding `safetensors` variant of this model
.gitattributes (1.52 kB, Safe) · initial commit · 11 months ago
README.md (1.95 kB, Safe) · Update README.md · 10 months ago
config.json (721 Bytes, Safe) · added the first version of the model and tokenizer · 11 months ago
generation_config.json (159 Bytes, Safe) · added the first version of the model and tokenizer · 11 months ago
model-00001-of-00002.safetensors (9.98 GB, LFS) · Adding `safetensors` variant of this model · 9 months ago
model-00002-of-00002.safetensors (3.5 GB, LFS) · Adding `safetensors` variant of this model · 9 months ago
model.safetensors.index.json (25.1 kB, Safe) · Adding `safetensors` variant of this model · 9 months ago
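The two safetensors shards above are stitched together by `model.safetensors.index.json`: its `weight_map` maps each tensor name to the shard file that stores it, so a loader knows which file to open for which weight. A minimal sketch of resolving tensors to shards; the tensor names and `total_size` below are illustrative placeholders, not values read from this repo's actual index:

```python
import json

# Illustrative document in the model.safetensors.index.json format:
# "metadata" records the total checkpoint size in bytes, and
# "weight_map" maps every tensor name to the shard that stores it.
index_json = """
{
  "metadata": {"total_size": 13476831232},
  "weight_map": {
    "model.embed_tokens.weight": "model-00001-of-00002.safetensors",
    "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00002.safetensors",
    "lm_head.weight": "model-00002-of-00002.safetensors"
  }
}
"""

index = json.loads(index_json)

def shard_for(tensor_name: str) -> str:
    """Return the shard file that stores a given tensor."""
    return index["weight_map"][tensor_name]

print(shard_for("lm_head.weight"))  # model-00002-of-00002.safetensors
```

Grouping `weight_map` values also tells a loader which tensors it can read together from one shard, which is how sharded checkpoints are loaded without holding all files open at once.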
pytorch_model-00001-of-00002.bin (9.98 GB, LFS, pickle; detected pickle imports: torch._utils._rebuild_tensor_v2, collections.OrderedDict, torch.HalfStorage) · added the first version of the model and tokenizer · 11 months ago
pytorch_model-00002-of-00002.bin (3.5 GB, LFS, pickle; detected pickle imports: torch.HalfStorage, collections.OrderedDict, torch._utils._rebuild_tensor_v2) · added the first version of the model and tokenizer · 11 months ago
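The "detected pickle imports" notes on the two `.bin` shards come from a static scan of the pickle opcode stream: the `GLOBAL` and `STACK_GLOBAL` opcodes reveal which module attributes unpickling would import (here `torch._utils._rebuild_tensor_v2`, `collections.OrderedDict`, `torch.HalfStorage`), which is why pickled checkpoints can run arbitrary code on load while safetensors files cannot. A simplified sketch of such a scan, using only the standard library; this is an assumption-laden toy, and real scanners interpret the pickle stack rather than just tracking the last two string constants:

```python
import collections
import pickle
import pickletools

def pickle_imports(data: bytes) -> set[tuple[str, str]]:
    """List (module, name) pairs a pickle would import, without unpickling it."""
    imports = set()
    strings = []  # string constants seen so far in the opcode stream
    for op, arg, pos in pickletools.genops(data):
        if op.name == "GLOBAL":
            # Older protocols encode "module name" inline in the opcode argument.
            module, name = arg.split(" ", 1)
            imports.add((module, name))
        elif op.name == "STACK_GLOBAL":
            # Newer protocols push module and name as string constants first.
            if len(strings) >= 2:
                imports.add((strings[-2], strings[-1]))
        if isinstance(arg, str):
            strings.append(arg)
    return imports

payload = pickle.dumps(collections.OrderedDict(a=1))
print(sorted(pickle_imports(payload)))
```

Running this on a pickle of an `OrderedDict` reports the `collections.OrderedDict` import without ever executing the payload, which is the same idea behind the flags shown in this file listing.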
pytorch_model.bin.index.json (24 kB, Safe) · added the first version of the model and tokenizer · 11 months ago
special_tokens_map.json (96 Bytes, Safe) · added the first version of the model and tokenizer · 11 months ago
tokenizer.json (1.84 MB, Safe) · added the first version of the model and tokenizer · 11 months ago
tokenizer.model (500 kB, LFS, Safe) · added the first version of the model and tokenizer · 11 months ago
tokenizer_config.json (895 Bytes, Safe) · added the first version of the model and tokenizer · 11 months ago