Artigenz-Coder-DS-6.7B
The Artigenz team intends to create a family of code generation models that can run fast on local machines.
Artigenz-Coder-DS-6.7B is the first in this family, with 6.7B parameters and a 13 GB memory footprint 🌟
About the model
Artigenz-Coder-DS-6.7B was fine-tuned from DeepSeek-Coder-6.7B-Base. The dataset and scripts will be open-sourced soon.
We have open-sourced the model weights on 🤗 Hugging Face, check them out here!
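A minimal way to try the weights locally is with the 🤗 `transformers` library. This is a sketch, not an official usage guide: it assumes the repository id `Artigenz/Artigenz-Coder-DS-6.7B` and standard causal-LM loading, and downloading the roughly 13 GB of weights requires a machine with enough RAM or VRAM.

```python
# Sketch: load Artigenz-Coder-DS-6.7B with transformers and generate code.
# Assumes the Hugging Face repo id "Artigenz/Artigenz-Coder-DS-6.7B".
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Artigenz/Artigenz-Coder-DS-6.7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision keeps the footprint near 13 GB
    device_map="auto",           # place layers on GPU(s) if available (needs accelerate)
)

prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

With greedy decoding as above, the model completes the prompt deterministically; sampling parameters (`do_sample=True`, `temperature`, `top_p`) can be passed to `generate` for more varied completions.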
Team
What's Next ❓
The dataset and fine-tuning scripts used to train Artigenz-Coder-DS-6.7B will be released soon for the open-source community to use freely 🛠️
1B and 3B models from the Artigenz family are next on the roadmap, with the long-term goal of enabling ⚡ fast local inference for code generation.
Special Thanks to the Open Source Community ❤️
We extend our deepest gratitude to the open source community, especially the BigCode Project, Magicoder, Hugging Face, DeepSeek, WizardCoder, and Code Llama, whose work enabled the research community to build powerful LLMs.
Many more people are needed to close the gap between proprietary and open source models, and we are committed to contributing our part toward that goal.
Get in Touch
You can reach out to us on LinkedIn or via email for any queries or collaborations! 😊