How many GPUs are required to fine-tune bge-m3 on 1 million triplets?

#18
by wilfoderek - opened

Congratulations to the whole BAAI team for the excellent work!
I am currently collecting 1 million triplets (query, list[pos], list[neg]). Now I wonder how many GPUs are required for the fine-tuning?
Any suggestions are welcome, friends.
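
For reference, here is a minimal sketch of how one such triplet record could be written out, assuming the JSONL layout (fields "query", "pos", "neg") that FlagEmbedding-style fine-tuning data typically uses; the example strings and file name are placeholders:

```python
import json

# One training example per line: a query with its positive and negative passages.
# Field names ("query", "pos", "neg") assume the usual FlagEmbedding JSONL layout.
example = {
    "query": "how to brew pour-over coffee",
    "pos": ["Pour-over brewing involves slowly pouring hot water over ground coffee."],
    "neg": ["Espresso machines force pressurized water through a fine grind."],
}

with open("train_triplets.jsonl", "w", encoding="utf-8") as f:
    f.write(json.dumps(example, ensure_ascii=False) + "\n")
```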

Beijing Academy of Artificial Intelligence org

Thanks for your interest in our work! I think 8×A100 GPUs are enough.

@wilfoderek were you able to fine-tune the model? I fine-tuned it, and it now gives me a 0.9995 similarity score for everything, no matter what the string is. I must have goofed up the training process, I guess.
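
A quick sanity check along these lines can confirm whether the embedding space has collapsed. This is only a sketch: the checkpoint path is a placeholder, and it assumes the standard BGEM3FlagModel loader from FlagEmbedding:

```python
import numpy as np
from FlagEmbedding import BGEM3FlagModel

# Placeholder path: point this at your fine-tuned checkpoint directory.
model = BGEM3FlagModel("path/to/finetuned-bge-m3", use_fp16=True)

pairs = [
    ("how to brew pour-over coffee",
     "Pour-over brewing involves slowly pouring hot water over ground coffee."),
    ("how to brew pour-over coffee",
     "The 1969 moon landing was broadcast live around the world."),
]

for query, passage in pairs:
    vecs = model.encode([query, passage])["dense_vecs"]
    q, p = vecs[0], vecs[1]
    # Explicit cosine similarity, in case the vectors are not already normalized.
    score = float(np.dot(q, p) / (np.linalg.norm(q) * np.linalg.norm(p)))
    print(f"{score:.4f}  {passage[:50]}")
```

A healthy model should score the unrelated passage clearly lower; if both pairs land near 0.9995, the training likely collapsed the embeddings (for example, a learning rate that is too high or a misconfigured loss).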
