This is a merge of 6 models that were finetuned on Llama 3 8B. It has performed pretty decently on some coding tasks for its parameter size. I went looking through models because many people cannot run 33B coding models (like Deepseek) locally.
Juggernaut X V10 is pretty good; it's a few weeks old but not very popular. Try it out and let me know what you guys think. I think it is pretty good for daily use.
The goal: make the ultimate coding finetune to compete with the likes of closed-source models, using the code_bagel dataset!
Made by @rombodawg of RepleteAi, the code_bagel dataset contains over 800 million tokens of deduplicated and uncensored code from only reputable sources on Hugging Face. The data is formatted in the Alpaca instruct format for ease of use in training.
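For anyone unfamiliar with the Alpaca instruct format, records are simple instruction/input/output triples. Here is a minimal sketch of what one such record and a typical training prompt look like (the example record is hypothetical, not taken from code_bagel; the prompt template is the commonly used Alpaca-style one, which individual finetunes may vary):

```python
import json

# Hypothetical Alpaca-instruct record (illustrative, not from the actual dataset)
record = {
    "instruction": "Write a Python function that reverses a string.",
    "input": "",  # optional extra context; often empty
    "output": "def reverse_string(s):\n    return s[::-1]",
}

# A common Alpaca-style prompt template used when training on such records
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    f"### Instruction:\n{record['instruction']}\n\n"
    f"### Response:\n{record['output']}"
)

print(json.dumps(record, indent=2))
print(prompt)
```

When the `input` field is non-empty, trainers usually insert an extra `### Input:` section between the instruction and the response.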