This is the CodeLlama 34B base model fine-tuned on raw text chunks from the OpenAssistant-Guanaco dataset rather than on Q&A pairs, so it struggles to determine where an answer ends. It is recommended to use a stop string such as "### Human:" to prevent the model from talking to itself (see the example below).

Prompt template:

### Human: {prompt}
### Assistant:
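
A minimal sketch of generating with this prompt template and truncating at the stop string, using the Hugging Face `transformers` library. The repository id `MODEL_ID` is a placeholder, not the actual model path, and the generation settings are illustrative assumptions:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "path/to/this-model"  # placeholder: replace with the real repo id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

# Build the prompt in the template shown above.
prompt = "### Human: Write a Python function that reverses a string.\n### Assistant:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
)

# Decode only the newly generated tokens (everything after the prompt).
new_tokens = outputs[0][inputs["input_ids"].shape[1]:]
text = tokenizer.decode(new_tokens, skip_special_tokens=True)

# Because the model was trained on raw text chunks, it may start a new
# "### Human:" turn on its own; cut the output off at the stop string.
answer = text.split("### Human:")[0].strip()
print(answer)
```

If your serving stack supports stop sequences directly (for example a `stop` parameter), passing `"### Human:"` there achieves the same effect without post-processing.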