Using the Q model and the A model with the Inference API?

#1
by latitude - opened

Typically, you would use this model together with the multi-QA_v1-mpnet-asymmetric-Q model: encode the question with the Q model and the candidate answers with the A model, then compare them by cosine similarity. But the Inference API seems to use the same model for both sides. Is there a way to call the Inference API on a list of sentences, get back a list of vectors, and do the comparison myself?
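For reference, here is a minimal sketch of doing the comparison client-side: call the hosted feature-extraction pipeline once per model and compute cosine similarity locally. The endpoint URL, the `sentence-transformers/...` model IDs, and the `embed` helper are assumptions for illustration; substitute your own API token.

```python
import numpy as np

def cosine_similarity(a, b):
    """Cosine similarity between two 1-D vectors."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def embed(sentences, model_id, api_token):
    """Fetch embeddings from the hosted feature-extraction pipeline.

    Assumes the standard Inference API endpoint; this is a sketch,
    not guaranteed to match the current API surface.
    """
    import requests  # only needed for the actual API call
    response = requests.post(
        f"https://api-inference.huggingface.co/pipeline/feature-extraction/{model_id}",
        headers={"Authorization": f"Bearer {api_token}"},
        json={"inputs": sentences},
    )
    response.raise_for_status()
    return np.array(response.json())

# Hypothetical usage (requires a valid token and the real model IDs):
# q_vec = embed(["How big is London?"],
#               "sentence-transformers/multi-qa_v1-mpnet-asymmetric-Q", token)[0]
# a_vecs = embed(answers,
#                "sentence-transformers/multi-qa_v1-mpnet-asymmetric-A", token)
# scores = sorted(cosine_similarity(q_vec, a) for a in a_vecs)
```

Encoding questions and answers with their respective models and ranking locally keeps the asymmetric setup intact even though each API call only hits one model.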