May I ask if you are considering expanding the dataset and model scale to the 30B–65B range?

#5
by Jackdiy - opened

Personally, I am very much looking forward to your new model.
In fact, I believe that the Pygmalion 13B version is superior to other models of the same kind.
However, with the emergence of high-scoring models at 30B and above, such as Falcon and Guanaco, as a fan I still hope to see Pygmalion release larger-scale models.

Of course, I have also seen models such as Manticore and Hippogriff trained on the Pygmalion dataset, so it seems you are also collaborating with others to produce higher-quality models.

Pygmalion org

Hi @Jackdiy , thanks for your interest in Pygmalion!

We are, in fact, working on scaling up (and down!) in the near future. Scaling past 33B would be difficult for us with our current compute, so it might take a while before we break through that size class.

If compute is really an issue, I would be happy to assist by renting some GPUs to help train 33B and 40B models. I can also help test quantized versions for you.
