---
license: other
datasets:
- rombodawg/LosslessMegaCodeTrainingV2_1m_Evol_Uncensored
---
This is one of the first models trained on the LosslessMegaCodeTrainingV2_1m_Evol_Uncensored dataset. The version of the dataset used for this model was only loosely filtered, with parameters that aren't anything to write home about, but plans for much more refined filtering are in the works.
- This model was made as a collaboration between me and andreaskoepf, who is an affiliate of Open Assistant.
This model is extremely good at coding, and may be one of the best coding models for its size, outperforming any other 7b-parameter model. Plans for bigger models are coming in the future.
### Prompt template
[chatml](https://github.com/openai/openai-python/blob/main/chatml.md) format is used:
"<|im_start|>system\n{system message}<|im_end|>\n<|im_start|>user\n{user prompt}<|im_end|>\n<|im_start|>assistant\n{Assistant answer}<|im_end|>\n"
multi-line:
```
"""
<|im_start|>system
{system message}<|im_end|>
<|im_start|>user
{user prompt}<|im_end|>
<|im_start|>assistant
{Assistant answer}<|im_end|>
"""
```
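As a quick sketch (not part of the original model release), the ChatML template above can be assembled in Python like so; the system and user messages are illustrative:

```python
# Minimal helper that assembles a ChatML-formatted prompt string,
# ending at the assistant turn so the model completes the answer.
def build_chatml_prompt(system_message: str, user_prompt: str) -> str:
    return (
        f"<|im_start|>system\n{system_message}<|im_end|>\n"
        f"<|im_start|>user\n{user_prompt}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )

prompt = build_chatml_prompt(
    "You are a helpful coding assistant.",
    "Write a Python function that reverses a string.",
)
print(prompt)
```

Note that the prompt is left open after `<|im_start|>assistant\n`, since the model is expected to generate the assistant answer and emit `<|im_end|>` itself.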
Benchmarks for the model can be found at the link below (the model is listed there as andreaskoepf/llama2-7b-megacode2_min100):
- https://tju01.github.io/FastEval-OpenAssistant/
Training information:
- https://wandb.ai/open-assistant/public-sft/runs/run17_megacode_min100
The link for the full dataset is below:
- https://huggingface.co/datasets/rombodawg/LosslessMegaCodeTrainingV2_1m_Evol_Uncensored
The link for the filtered dataset used to make this model is below:
- https://huggingface.co/datasets/andreaskoepf/megacode2-min100
The original posting for this model can be found at the link below:
- https://huggingface.co/andreaskoepf/llama2-7b-megacode2_min100