Will there be a 2048 token length version?

#7 · opened by kk3dmax

Will there be a 2048 token length version? (512 tokens is not enough for most RAG tasks; see the chunking sketch below for the current workaround.)
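Until a longer-context release is available, a common workaround for a 512-token embedding model in RAG pipelines is to split documents into overlapping chunks and embed each chunk separately. The sketch below is only illustrative: it assumes the model in question is `WhereIsAI/UAE-Large-V1` (a 512-token model), and the chunk and stride sizes are arbitrary choices, not values from this thread.

```python
# Hedged sketch: chunk long documents to fit a 512-token embedding model.
# Model name, chunk size, and stride are assumptions for illustration only.
from transformers import AutoTokenizer
from sentence_transformers import SentenceTransformer

MODEL_NAME = "WhereIsAI/UAE-Large-V1"  # assumed 512-token embedding model
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = SentenceTransformer(MODEL_NAME)

def chunk_text(text: str, max_tokens: int = 480, stride: int = 400) -> list[str]:
    """Split text into overlapping chunks that each fit within the 512-token limit.

    max_tokens < 512 leaves room for special tokens; the overlap
    (max_tokens - stride) helps preserve context across chunk boundaries.
    """
    ids = tokenizer.encode(text, add_special_tokens=False)
    chunks = []
    for start in range(0, len(ids), stride):
        window = ids[start:start + max_tokens]
        chunks.append(tokenizer.decode(window))
        if start + max_tokens >= len(ids):
            break
    return chunks

doc = "..."  # a long document that exceeds 512 tokens
chunk_embeddings = model.encode(chunk_text(doc))  # one vector per chunk
```

Each chunk is then indexed as its own retrieval unit; a longer-context model would simply reduce how aggressively documents need to be split.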

WhereIsAI org

Hi @kk3dmax , thanks for following our work!

Yes, we are working on a more efficient, smaller model.
It will also support a longer context (>4096 tokens).
Stay tuned :)

Thanks bro!

Any news, @SeanLee97? :)

Hello @SeanLee97! I would like to express my great admiration to you and your team for your amazing work! :) I should also mention that I'm an undergrad currently working on a RAG application as a graduation project, and it relies on your embedding model, so I'm very eager to hear any news about the upcoming model. Keep up the good work!
