This is the repository card of serizard1005/sage-attention, pushed to the Hub. It was built to be used with the kernels library. This card was automatically generated.
How to use
```python
# Make sure `kernels` is installed: `pip install -U kernels`
from kernels import get_kernel

# Download the compiled kernel from the Hub and load it as a module
kernel_module = get_kernel("serizard1005/sage-attention")

# Access one of the exported functions
per_block_int8 = kernel_module.per_block_int8
per_block_int8(...)
```
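As a fuller sketch, the loaded module's `sageattn` function can be called on half-precision CUDA tensors. This is a hypothetical example, not part of the card: the `tensor_layout` and `is_causal` keyword arguments follow the upstream SageAttention project and are assumptions here, and the guard simply skips the call when `torch`, `kernels`, or a CUDA device is unavailable.

```python
# Hedged sketch (assumption: CUDA GPU with torch and kernels installed;
# sageattn's keyword arguments mirror the upstream SageAttention API).
try:
    import torch
    from kernels import get_kernel
    stack_available = torch.cuda.is_available()
except ImportError:
    stack_available = False

if stack_available:
    kernel_module = get_kernel("serizard1005/sage-attention")
    # Attention inputs shaped (batch, heads, seq_len, head_dim),
    # i.e. the "HND" layout assumed below.
    q = torch.randn(2, 8, 1024, 64, dtype=torch.float16, device="cuda")
    k = torch.randn_like(q)
    v = torch.randn_like(q)
    out = kernel_module.sageattn(q, k, v, tensor_layout="HND", is_causal=False)
    print(out.shape)  # same shape as q
```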
Available functions
- per_block_int8
- per_warp_int8
- sub_mean
- per_channel_fp8
- sageattn
- sageattn3_blackwell
Benchmarks
No benchmark available yet.