Artigenz-Coder-DS-6.7B

The Artigenz team intends to create a family of code generation models that can run very fast on local machines.

Artigenz-Coder-DS-6.7B is the first in this family with 6.7B parameters and 13GB memory footprint 🌟

HomePage

About the model

Artigenz-Coder-DS-6.7B was fine-tuned from DeepSeek-Coder-6.7B-Base. The dataset and scripts will be open-sourced soon.

We have open-sourced our model weights on 🤗 HF, check them out here!
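A minimal sketch of loading the weights with the 🤗 `transformers` library, using the model id from the Hub (the prompt and generation settings here are illustrative, not a recommended configuration):

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Artigenz/Artigenz-Coder-DS-6.7B"

# Download tokenizer and BF16 weights from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # keep the checkpoint's native dtype (BF16)
    device_map="auto",    # place layers on available GPU/CPU
)

# Illustrative code-completion prompt
prompt = "def fibonacci(n):"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Greedy decoding of a short completion
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Note that `device_map="auto"` requires the `accelerate` package; on a machine without a GPU the ~13GB footprint will be held in system RAM.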

Team

Nikita Agarwal

LinkedIn

AI Researcher

ex Data Scientist at Microsoft

IIIT - Hyderabad, India

Vivek Verma

LinkedIn Google Scholar

Post Doctoral Associate

Florida International University

202 Citations

Nalin Abrol

LinkedIn

ex Software Engineer - Plivo (YC S21)

Published in OHBM 2019

IIIT - Hyderabad, India

Special Thanks ❤️

Manish Shrivastava

LinkedIn University

Assistant Professor

Natural Language Processing

IIIT - Hyderabad, India

Manas Kumar Verma

LinkedIn YC

CEO

AlgoUniversity (YC S21)

IIIT - Hyderabad, India

Nikhil Tadigoppula

IOI

AI Researcher

Bronze medalist, International Olympiad in Informatics 2013

IIIT - Hyderabad, India

What's Next ❓

The dataset and finetuning scripts used to train Artigenz-Coder-DS-6.7B will be released soon for the open-source community to use freely. 🛠️

1B and 3B models from the Artigenz family are next on the roadmap, with the long-term goal of enabling ⚡ fast local inference for code generation.

Special Thanks to the Open Source Community ❤️

We extend our deepest gratitude to the open source community, especially the BigCode Project, Magicoder, Hugging Face, DeepSeek, WizardCoder, and Code Llama, which have enabled the research community to build powerful LLMs.

We need many more people to close the gap between proprietary and open-source models, and we are committed to contributing our bit to this goal.

Get in Touch

You can reach out to us on LinkedIn or via email for any queries or collaborations! 😊

Email agarwal1503.nikita@gmail.com