Use in transformers code doesn't run?

#2
by emaloney - opened

The only code showing how this model can be used, the snippet under the Use in Transformers tab, does not appear to work.

# Load model directly
from transformers import AutoTokenizer, HF_ColBERT

tokenizer = AutoTokenizer.from_pretrained("Intel/ColBERT-NQ")
model = HF_ColBERT.from_pretrained("Intel/ColBERT-NQ")

Running this code raises "ImportError: cannot import name 'HF_ColBERT' from 'transformers'", even with the latest version of transformers. Using the built-in transformers RAG classes instead gives "AssertionError: Config has to be initialized with question_encoder and generator config".
Any ideas?

Intel org

The HF_ColBERT class isn't part of the transformers library. For example usage, see:

https://github.com/IntelLabs/fastRAG/blob/main/examples/plaid_colbert_pipeline.ipynb

Behind the scenes we use the ColBERT package, colbert-ai. If you would like to see it supported in Transformers, open an issue at https://github.com/stanford-futuredata/ColBERT.
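For context, the late-interaction (MaxSim) scoring that ColBERT models like this one are built around can be sketched in a few lines. This is a minimal illustration with made-up toy embeddings, not the colbert-ai implementation: each query token is matched against its best document token, and the maxima are summed.

```python
import numpy as np

def maxsim_score(Q, D):
    """ColBERT-style MaxSim: Q is (num_query_tokens, dim),
    D is (num_doc_tokens, dim); rows are L2-normalized, so the
    dot products below are cosine similarities."""
    sim = Q @ D.T                 # similarity of every query token to every doc token
    return sim.max(axis=1).sum()  # best doc token per query token, summed over the query

# Toy 2-D token embeddings (hypothetical values, purely for illustration)
Q = np.array([[1.0, 0.0], [0.0, 1.0]])
D = np.array([[1.0, 0.0], [0.6, 0.8]])
print(maxsim_score(Q, D))  # 1.0 + 0.8 = 1.8
```

In practice the colbert-ai package handles tokenization, encoding, and indexing (e.g. PLAID) for you; the fastRAG notebook linked above shows the full pipeline.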
