Run model with transformers?

#2 opened by lewtun (HF staff)

Hello, very nice work with PairRM - it looks quite handy for quality control of preference datasets :)

I was wondering if it's possible to run the model natively in transformers instead of requiring the llm-blender library?

LLM Blender org

Thank you for the question. However, PairRM contains some self-designed layers, and it's difficult to make it compatible with the existing code of the transformers library.

Besides, llm-blender also provides some simple interface functions so that PairRM can be used properly, such as compare(), compare_conversations(), best_of_n_generate(), etc. Integrating with transformers alone would not provide these interfaces.
We have also kept the package requirements for installing llm-blender to a minimum; you can check the setup.py in our GitHub repo. Therefore, I think installing llm-blender won't cause many package conflicts in your Python environment.
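To illustrate what an interface like best_of_n_generate() is doing conceptually: a pairwise ranker compares candidate responses head to head and keeps the preferred one. The sketch below is a toy illustration only, with a hypothetical stand-in scorer (it is not the llm-blender API; the real PairRM ranker replaces `pairwise_score` with a trained model).

```python
# Toy sketch of best-of-n selection driven by pairwise comparisons.
# `pairwise_score` is a hypothetical stand-in for a trained pairwise
# ranker such as PairRM; here it simply prefers the longer answer.

def pairwise_score(instruction: str, cand_a: str, cand_b: str) -> float:
    """Return > 0 if cand_a is preferred over cand_b, < 0 otherwise."""
    return float(len(cand_a) - len(cand_b))

def best_of_n(instruction: str, candidates: list[str]) -> str:
    """Keep the winner of each successive pairwise comparison."""
    best = candidates[0]
    for challenger in candidates[1:]:
        if pairwise_score(instruction, challenger, best) > 0:
            best = challenger
    return best

print(best_of_n("Explain gravity.",
                ["Yes.", "Objects with mass attract each other.", "Mass attracts."]))
```

With a real pairwise reward model plugged in, the same loop selects the highest-quality response out of n sampled generations.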

If there are other scenarios where installing llm-blender causes conflicts, please let us know, and we will see if we can resolve them.

LLM Blender org

Hey, we now support Hugging Face compatible loading here: llm-blender/PairRM-hf
