Model size

#60
by Juli784 - opened

How large is the model size (million parameters) of BAAI/bge-m3?

Beijing Academy of Artificial Intelligence org

bge-m3 has a similar number of parameters to bert-large in its transformer layers, but it has a much larger token embedding table (about 250k tokens in the vocabulary). Overall, bge-m3 has 560M parameters.
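The breakdown above can be checked with rough arithmetic. This sketch assumes bert-large-style dimensions (hidden size 1024, 24 layers) and a ~250k vocabulary, and ignores small terms like biases, LayerNorm, and position embeddings:

```python
# Approximate parameter count for bge-m3 (assumed figures, not exact config values)
hidden = 1024        # bert-large hidden size
layers = 24          # bert-large depth
vocab = 250_000      # ~250k-token vocabulary

# Per layer: attention projections (4 * hidden^2) + feed-forward (2 * hidden * 4*hidden)
per_layer = 4 * hidden**2 + 2 * hidden * (4 * hidden)
transformer = layers * per_layer   # ~302M, comparable to bert-large's encoder
embeddings = vocab * hidden        # ~256M, far larger than bert-large's ~31M

total = transformer + embeddings
print(f"transformer ~{transformer/1e6:.0f}M + embeddings ~{embeddings/1e6:.0f}M "
      f"= ~{total/1e6:.0f}M")
```

The total lands near 560M, with the embedding table alone accounting for roughly half of the model's parameters.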
