---
base_model: opencsg/opencsg-starcoder2-15b-v0.1
pipeline_tag: text-generation
inference: false
license: bigcode-openrail-m
model_creator: OpenCSG
model_name: opencsg-starcoder2-15b-v0.1
model_type: starcoder2
tags:
- code
quantized_by: arzeth
---
# Model Info
- Model creator: OpenCSG
- Original card (has more info): https://huggingface.co/opencsg/opencsg-starcoder2-15b-v0.1
| Layers | Context | Template |
|---|---|---|
| 40 | 16384 | I think it's Alpaca. ChatML seems to work too, but answers are probably worse? |
Below is the Alpaca template. I think there should be a newline (`\n`) after `### Response:`. The system prompt, which is the first line, can of course be changed:
```
You are an exceptionally intelligent coding assistant that consistently delivers accurate and reliable responses to user instructions.
### Instruction:
{instruction}
### Response:
```
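For illustration only, here is a minimal Python sketch of how such a prompt string could be assembled; the helper name and the exact spacing between sections are assumptions, not taken from the original card:

```python
# Minimal sketch: build an Alpaca-style prompt for this model.
# The trailing newline after "### Response:" follows the note above;
# the exact spacing between sections is an assumption.

SYSTEM_PROMPT = (
    "You are an exceptionally intelligent coding assistant that consistently "
    "delivers accurate and reliable responses to user instructions."
)

def build_alpaca_prompt(instruction: str, system_prompt: str = SYSTEM_PROMPT) -> str:
    """Return the prompt text to feed to the model (hypothetical helper)."""
    return (
        f"{system_prompt}\n"
        "### Instruction:\n"
        f"{instruction}\n"
        "### Response:\n"  # newline after "### Response:" as suggested above
    )

if __name__ == "__main__":
    print(build_alpaca_prompt("Write a Python function that reverses a string."))
```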
# Quantization info
Quantized without an importance matrix (imatrix), using llama.cpp b2333 (2024-03-04).
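As a hedged example (not from the original card), one way to run a GGUF quant of this model locally is through the llama-cpp-python bindings; the file name below is a placeholder for whichever quant file you downloaded, and the sampling settings are arbitrary:

```python
# Sketch of running a GGUF quant with the llama-cpp-python bindings.
# "opencsg-starcoder2-15b-v0.1.Q4_K_M.gguf" is a placeholder file name.
from llama_cpp import Llama

llm = Llama(
    model_path="opencsg-starcoder2-15b-v0.1.Q4_K_M.gguf",
    n_ctx=16384,  # matches the context length listed in the table above
)

# Alpaca-style prompt as described earlier in the card.
prompt = (
    "You are an exceptionally intelligent coding assistant that consistently "
    "delivers accurate and reliable responses to user instructions.\n"
    "### Instruction:\n"
    "Write a Python function that reverses a string.\n"
    "### Response:\n"
)

out = llm(
    prompt,
    max_tokens=256,
    stop=["### Instruction:"],  # stop before the model starts a new turn
)
print(out["choices"][0]["text"])
```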