togethercomputer/evo-1-131k-base (by Together)
Tags: Text Generation, Transformers, Safetensors, stripedhyena, long context, deep signal processing, hybrid, biology, genomics, custom_code
Papers: 7 on arXiv
License: apache-2.0
evo-1-131k-base: 4 contributors, 23 commits
Latest commit c9e2eda by Zymrael: "add pt ckpt" (8 months ago)
Files (all last modified 8 months ago):

File                              Size       Last commit message
.gitattributes                    1.52 kB    initial commit
README.md                         4.1 kB     Update README.md
cache.py                          1.38 kB    init
config.json                       1.73 kB    Fix auto tokenizer import reference format in auto map as list for slow and fast.
configuration_hyena.py            3.13 kB    init
engine.py                         13.5 kB    init
generation_config.json            69 Bytes   Upload model
layers.py                         5.39 kB    init
model-00001-of-00003.safetensors  4.98 GB    Upload model (LFS)
model-00002-of-00003.safetensors  4.93 GB    Upload model (LFS)
model-00003-of-00003.safetensors  3 GB       Upload model (LFS)
model.py                          19.4 kB    init
model.safetensors.index.json      34.9 kB    Upload model
modeling_hyena.py                 5.55 kB    init
positional_embeddings.py          4.94 kB    init
pytorch_model.pt                  16.8 GB    add pt ckpt (LFS)
special_tokens_map.json           3 Bytes    Update byte tokenizer to be compatible with auto tokenizer and clean-up.
streamer.py                       3.94 kB    init
tokenizer.py                      4.37 kB    Remove tokenizer.json and replace tokenizer.py with correct version.
tokenizer_config.json             299 Bytes  Fix auto tokenizer import reference format in auto map as list for slow and fast.
utils.py                          2.87 kB    init

Note: pytorch_model.pt is a pickle checkpoint. Detected pickle imports (4): collections.OrderedDict, torch.FloatStorage, torch._utils._rebuild_tensor_v2, torch.BFloat16Storage.
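The pytorch_model.pt checkpoint above is flagged because pickle files can execute arbitrary code when loaded, so the Hub reports which modules a pickle would import before you trust it. Such a scan can be sketched by walking the pickle opcode stream without ever unpickling it. This is a minimal standard-library sketch, not the Hub's actual scanner; `detect_pickle_imports` is a hypothetical helper name:

```python
import collections
import pickle
import pickletools

def detect_pickle_imports(data: bytes) -> set[str]:
    """List the module.name pairs a pickle stream would import, without unpickling it."""
    imports = set()
    recent_strings = []  # STACK_GLOBAL consumes module and name pushed as strings
    for op, arg, _pos in pickletools.genops(data):
        if op.name in ("SHORT_BINUNICODE", "BINUNICODE", "BINUNICODE8", "UNICODE"):
            recent_strings.append(arg)
        elif op.name == "GLOBAL":
            # Older protocols encode "module name" directly in the opcode argument.
            module, name = arg.split(" ", 1)
            imports.add(f"{module}.{name}")
        elif op.name == "STACK_GLOBAL":
            # Protocol 4+: the two most recent string pushes are module and qualname.
            imports.add(f"{recent_strings[-2]}.{recent_strings[-1]}")
    return imports

data = pickle.dumps(collections.OrderedDict(a=1))
print(detect_pickle_imports(data))  # contains "collections.OrderedDict"
```

Because only the opcode stream is inspected, nothing from the pickle is ever executed; a real scanner would additionally check the detected imports against an allowlist (as the Hub's warning implies). Safetensors shards like model-00001-of-00003.safetensors avoid the issue entirely, since that format stores raw tensor data with no code.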