Model Export to ONNX format

#32
by Desjajja - opened

Is there any pipeline to export this model to ONNX (via torch.onnx, optimum, etc.)? I intend to do further inference acceleration, and an ONNX file is a must. However, none of the frameworks above supports baichuan yet.

Baichuan Intelligent Technology org

I have no idea. Is there any framework that supports llama?

Yes, HF Optimum supports llama, but baichuan is not even on their roadmap LOL
