transformers 4.35.3 is not released yet, right?

#1
by mohan007 - opened

For the code snippets to work, we need transformers 4.35.3, but it has not been released yet; it does not appear on any tagged branch or in the releases.
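Until the release lands, a script can fail fast if the installed transformers is too old. A minimal sketch; the `at_least` helper below is hypothetical, not a transformers API, and only handles plain dotted numeric version strings:

```python
# Hypothetical helper: compare dotted numeric version strings (e.g. "4.35.3")
# by converting each component to an integer, so "4.10.0" > "4.9.0".
def at_least(installed: str, required: str) -> bool:
    to_parts = lambda v: tuple(int(p) for p in v.split("."))
    return to_parts(installed) >= to_parts(required)

print(at_least("4.36.0", "4.35.3"))  # True: newer than required
print(at_least("4.35.2", "4.35.3"))  # False: one patch release too old
```

In a real script you would pass `transformers.__version__` as the first argument, or use `packaging.version.Version` instead of a hand-rolled comparison.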

Llava Hugging Face org

Hi @mohan007
Yes, it is not released yet. We will announce the integration once https://github.com/huggingface/transformers/pull/27662 gets merged, and it should be included in the next release.

When is the next release? I am waiting for that.

Llava Hugging Face org

We just merged the PR, you can try to run the model by installing transformers from source:

pip install -q git+https://github.com/huggingface/transformers.git

Also have a look at the Google Colab demo: https://colab.research.google.com/drive/1qsl6cd2c8gGtEW1xV5io7S8NHh-Cp1TV?usp=sharing

Can we expect any speedups from Flash Attention compared with inference from the original research authors' repo?

Llava Hugging Face org

Hi @mohan007
We did not compare the performance of our FA2 path against the original repository. Comparing our native implementation with their native implementation showed similar performance, but that might differ once FA2 is used.
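For anyone who wants to try FA2 themselves, a configuration sketch of how the model could be loaded with Flash Attention 2 enabled. This assumes a CUDA GPU, the `flash-attn` package installed, and a transformers version that supports the `attn_implementation` argument; the model id is the LLaVA 1.5 7B checkpoint from the integration:

```python
import torch
from transformers import AutoProcessor, LlavaForConditionalGeneration

model_id = "llava-hf/llava-1.5-7b-hf"

processor = AutoProcessor.from_pretrained(model_id)
# FA2 requires a half-precision dtype (fp16 or bf16) and a CUDA device.
model = LlavaForConditionalGeneration.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    attn_implementation="flash_attention_2",
).to("cuda")
```

Omitting `attn_implementation` falls back to the native attention path, which is what the comparison above refers to.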
