Is this project an ALiBi version of Falcon-7B?

#1
by quaful - opened

I noticed the name of the project is "falcon-7b-alibi", and I was intrigued by the possibility that ALiBi (Attention with Linear Biases) has been integrated into the Falcon-7B language model. As the name suggests, ALiBi replaces learned positional embeddings with a distance-based attention bias, which can let a model extrapolate to context lengths well beyond those seen in training. I wanted to reach out for confirmation and further details about this integration.
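For context, the core of ALiBi is simple: instead of adding position embeddings to the input, each attention head adds a fixed, head-specific linear penalty to its attention logits, proportional to the query-key distance. The sketch below (my own illustration, not code from this project) builds that bias matrix with NumPy, assuming the number of heads is a power of two as in the original ALiBi formulation:

```python
import numpy as np

def alibi_slopes(n_heads):
    # Per-head slopes: a geometric sequence starting at 2^(-8/n_heads).
    # Assumes n_heads is a power of two, as in the ALiBi paper.
    start = 2 ** (-8 / n_heads)
    return [start ** (i + 1) for i in range(n_heads)]

def alibi_bias(n_heads, seq_len):
    # Bias added to attention logits before softmax:
    # for query position i attending to key position j (j <= i),
    # the bias is -slope * (i - j), so distant keys are penalized linearly.
    slopes = np.array(alibi_slopes(n_heads))[:, None, None]
    pos = np.arange(seq_len)
    distance = pos[None, :] - pos[:, None]   # entry [i, j] = j - i
    # Zero out future positions (j > i); a causal mask handles those anyway.
    return slopes * np.minimum(distance, 0)  # shape (n_heads, seq_len, seq_len)
```

Because the slopes are fixed rather than learned, the same bias formula extends naturally to sequence lengths longer than those used in training, which is where the context-length benefit comes from.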

To clarify, could you please confirm whether the "falcon-7b-alibi" project incorporates ALiBi? If so, I would greatly appreciate it if you could share how far the usable context length has been extended in this Falcon-7B variant.

Understanding the increased context length would be valuable for my research and work, as it could improve the model's performance on tasks requiring longer-range contextual understanding.

Thank you in advance for your time and attention to this matter. I look forward to hearing from you soon.
