---
datasets:
  - ajibawa-2023/Code-290k-ShareGPT
language:
  - en
tags:
  - code
license: other
---

# Code-290k-6.7B-Instruct

This model is trained on top of DeepSeek-Coder-6.7B-Instruct, using my existing dataset Code-290k-ShareGPT. It was trained on around 290,000 code samples covering Python, Java, JavaScript, Go, C++, Rust, Ruby, SQL, MySQL, R, Julia, Haskell, and other languages, each accompanied by a detailed explanation. The model uses the Alpaca prompt format, and besides generating code it will also give you an explanation.

**Training**

The entire dataset was trained on 4 x A100 80GB GPUs. Training for 3 epochs took around 85 hours. The DeepSeek-Coder codebase and DeepSpeed were used for training.

This is a fully fine-tuned model.

Links to quantized versions of the model are given below.

**Exllama**

Exllama v2: Link

I am extremely thankful to Bartowski for making the quantized version of the model.

**Example Prompt:**

```
This is a conversation with your helpful AI assistant. AI assistant can generate Code in various Programming Languages along with necessary explanation.

### Instruction:
{instruction}

### Response:
```

You can modify the above prompt as per your requirements. I have used the Alpaca format.
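
For reference, here is a minimal inference sketch using the Hugging Face transformers library. It is not from the original card: the repo id ajibawa-2023/Code-290k-6.7B-Instruct, the sample instruction, and the generation settings are all assumptions for illustration.

```python
# Minimal sketch: assumes the transformers and accelerate libraries are
# installed and that the model lives at this (assumed) repo id.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ajibawa-2023/Code-290k-6.7B-Instruct"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)

# Build the Alpaca-style prompt described above.
system = ("This is a conversation with your helpful AI assistant. "
          "AI assistant can generate Code in various Programming Languages "
          "along with necessary explanation.")
instruction = "Write a Python function that checks whether a number is prime."
prompt = f"{system}\n\n### Instruction:\n{instruction}\n\n### Response:\n"

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=512, do_sample=False)

# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:],
                       skip_special_tokens=True))
```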

I want to say a special thanks to the Open Source community for helping and guiding me to better understand AI/model development.

Thank you for your love & support.

**Examples**

1. Bayes' theorem - Python
2. Fermat's little theorem
3. The Arrhenius equation using R
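
As a rough illustration of the first example above, here is the kind of Bayes' theorem snippet in Python that the model is shown producing in the screenshots. This is my own sketch, not the model's actual output; the prevalence, sensitivity, and false-positive numbers are made up for the example.

```python
# Illustrative sketch only (not the model's actual output).
def bayes_theorem(p_h: float, p_e_given_h: float, p_e: float) -> float:
    """Posterior P(H|E) = P(E|H) * P(H) / P(E)."""
    return p_e_given_h * p_h / p_e

# Hypothetical example: a test with 1% prevalence, 99% sensitivity,
# and a 5% false-positive rate.
p_h = 0.01                                  # P(disease)
p_e_given_h = 0.99                          # P(positive | disease)
p_e = p_e_given_h * p_h + 0.05 * (1 - p_h)  # total probability of a positive test
print(f"P(disease | positive) = {bayes_theorem(p_h, p_e_given_h, p_e):.4f}")  # ~0.1667
```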