Love the name

#1
opened by mlabonne

😂

Owner

Haha, a merge of your models had to have some reference to their creator! Without a doubt, the "Monarch" models are the best ones on the leaderboard (that work without bugs). Whenever I get the chance, I'll thank you for all your contributions...

Question: I'm fine-tuning a merge of your models on the Alpaca programming dataset. It's going to take about 18 hours to train, and it's looking pretty good. My question comes from some scientific papers I read about "emergent properties" such as chain-of-thought (CoT): supposedly, once LLMs started being trained on code, that's roughly when they became more "intelligent" and CoT, the model's way of "reasoning in steps", emerged. Do you think this is a good approach? What other programming datasets would you use? Alpaca has about 20k rows.

I'd recommend the Magicoder datasets, which are a lot more evolved, especially for what you want to build. I think CodeAlpaca is a little outdated at this point. Happy to see your results!

Owner

Thanks Maxime!
