aws-neuron / optimum-neuron-cache
Organization: AWS Inferentia and Trainium
License: apache-2.0
optimum-neuron-cache / inference-cache-config (branch: main)
6 contributors · History: 61 commits
Latest commit: dacorvo (HF staff) · Add DeepSeek distilled versions of LLama 8B · 509e6bf (verified) · 30 days ago
File                    Size        Updated             Last commit
gpt2.json               398 Bytes   11 months ago       Add more gpt2 configurations
granite.json            1.3 kB      2 months ago        Add configuration for granite models
llama-variants.json     1.45 kB     30 days ago         Add DeepSeek distilled versions of LLama 8B
llama.json              1.67 kB     5 months ago        Update inference-cache-config/llama.json
llama2-70b.json         287 Bytes   8 months ago        Create llama2-70b.json
llama3-70b.json         584 Bytes   about 1 month ago   Add DeepSeek distilled model
llama3.1-70b.json       289 Bytes   5 months ago        Rename inference-cache-config/Llama3.1-70b.json to inference-cache-config/llama3.1-70b.json
mistral-variants.json   1.04 kB     5 months ago        Remove obsolete mistral variants
mistral.json            1.87 kB     about 2 months ago  Update inference-cache-config/mistral.json
mixtral.json            583 Bytes   5 months ago        Update inference-cache-config/mixtral.json
qwen2.5-large.json      849 Bytes   about 1 month ago   Update inference-cache-config/qwen2.5-large.json
qwen2.5.json            2.69 kB     about 1 month ago   Add DeepSeek distilled models
stable-diffusion.json   1.91 kB     5 months ago        Update inference-cache-config/stable-diffusion.json
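
For reference, a minimal sketch of how one of the configuration files listed above could be fetched and inspected with the huggingface_hub client. Only the repository id (aws-neuron/optimum-neuron-cache) and the file path (inference-cache-config/gpt2.json) are taken from this listing; the internal structure of the JSON file is not assumed, so the script simply pretty-prints whatever it contains.

import json

from huggingface_hub import hf_hub_download

# Download one of the JSON files shown in the listing above from the Hub.
config_path = hf_hub_download(
    repo_id="aws-neuron/optimum-neuron-cache",
    filename="inference-cache-config/gpt2.json",
)

# Pretty-print the cached inference configurations declared in the file,
# without assuming anything about their internal layout.
with open(config_path) as f:
    cache_config = json.load(f)

print(json.dumps(cache_config, indent=2))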