Tags: Text Classification · Transformers · Safetensors · mistral · feature-extraction · reward_model · custom_code · text-generation-inference

Enable flash_attention_2 support since the underlying Mistral model supports it

#3 opened by winglian
No description provided.
lievan changed pull request status to merged
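For context, opting a Transformers model into FlashAttention-2 is typically done at load time via the `attn_implementation` argument. A minimal sketch, assuming a hypothetical repo id and that `flash-attn` is installed on a supported GPU (the model must be loaded in fp16 or bf16):

```python
import torch
from transformers import AutoModelForSequenceClassification

model = AutoModelForSequenceClassification.from_pretrained(
    "org/mistral-reward-model",           # hypothetical repo id
    torch_dtype=torch.bfloat16,           # FA2 requires fp16 or bf16
    attn_implementation="flash_attention_2",
    trust_remote_code=True,               # this model ships custom_code
)
```

For models that ship custom modeling code, enabling this usually also means setting `_supports_flash_attn_2 = True` on the custom `PreTrainedModel` subclass so Transformers permits the `flash_attention_2` implementation; the exact change in this PR is not shown on the page.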
