togethercomputer/evo-1-131k-base
Tags: Text Generation · Transformers · Safetensors · stripedhyena · long context · deep signal processing · hybrid · biology · genomics · custom_code
arXiv: 7 papers
License: apache-2.0
evo-1-131k-base — 4 contributors, 36 commits
Latest commit: c63a55a (verified) by maxall4, "Update config.json", 13 days ago
| File | Size | Last commit message | Last updated |
|------|------|---------------------|--------------|
| .gitattributes | 1.52 kB | initial commit | 10 months ago |
| README.md | 5.17 kB | Update README.md | 8 months ago |
| cache.py | 1.38 kB | init | 10 months ago |
| config.json | 1.72 kB | Update config.json | 13 days ago |
| configuration_hyena.py | 3.13 kB | init | 10 months ago |
| engine.py | 13.5 kB | init | 10 months ago |
| generation_config.json | 69 Bytes | Upload model | 10 months ago |
| layers.py | 5.39 kB | init | 10 months ago |
| model-00001-of-00003.safetensors | 4.98 GB (LFS) | Upload model | 10 months ago |
| model-00002-of-00003.safetensors | 4.93 GB (LFS) | Upload model | 10 months ago |
| model-00003-of-00003.safetensors | 3 GB (LFS) | Upload model | 10 months ago |
| model.py | 19.8 kB | Update model.py | 25 days ago |
| model.safetensors.index.json | 34.9 kB | Upload model | 10 months ago |
| modeling_hyena.py | 6.92 kB | Update modeling_hyena.py | 25 days ago |
| positional_embeddings.py | 4.94 kB | init | 10 months ago |
| pytorch_model.pt | 16.8 GB (LFS) | add pt ckpt | 10 months ago |
| special_tokens_map.json | 3 Bytes | Update byte tokenizer to be compatible with auto tokenizer and clean-up. | 10 months ago |
| streamer.py | 3.94 kB | init | 10 months ago |
| tokenizer.py | 4.4 kB | Update batch encode plus first argument to match HF convention. | 10 months ago |
| tokenizer_config.json | 299 Bytes | Fix auto tokenizer import reference format in auto map as list for slow and fast. | 10 months ago |
| utils.py | 2.87 kB | init | 10 months ago |

Note: pytorch_model.pt is a pickle-format checkpoint. The Hub's scan detected 4 pickle imports: "torch.BFloat16Storage", "collections.OrderedDict", "torch._utils._rebuild_tensor_v2", "torch.FloatStorage".