
Model Card for kanelsnegl-v0.1-GGUF

!!! This model is built for fun and learning and needs much more finetuning !!! For now, only a Q4_K quantization is provided.

Kanelsnegl Logo

Model Description

Base model: Zephyr-7b-alpha, finetuned on DDSC/partial-danish-gigaword-no-twitter with a maximum sequence length of 512. QLoRA completion finetuning was applied to all linear layers. This model is mostly fun tinkering for personal learning purposes; a heavily instruction-tuned base model was chosen in the hope of transferring some of its behaviors into the Danish finetune.
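
The training script is not included here, but a minimal sketch of the kind of QLoRA setup described above (using transformers and peft) is shown below. The base-model repo id, LoRA rank/alpha, dropout, and other hyperparameters are assumptions, not the values actually used.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

base_id = "HuggingFaceH4/zephyr-7b-alpha"  # assumed HF id of the base model

# Load the base model in 4-bit so the LoRA adapters can be trained on modest hardware
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id,
                                             quantization_config=bnb_config,
                                             device_map="auto")

# Attach LoRA adapters to all linear projection layers, as described above
lora_config = LoraConfig(
    r=16,               # assumed rank
    lora_alpha=32,      # assumed scaling factor
    lora_dropout=0.05,  # assumed dropout
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "gate_proj", "up_proj", "down_proj"],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)

# Completion-style training on DDSC/partial-danish-gigaword-no-twitter with a
# maximum length of 512 (e.g. via trl's SFTTrainer) would follow from here.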

The model often produces somewhat entertaining hallucinations, and instruction following does not really work yet, but supervised fine-tuning may give it more controlled behavior.

Works with Ollama, but not particularly well.
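
If you want to try it with Ollama anyway, a local GGUF file can be wrapped in a simple Modelfile; the file path and parameters below are a sketch and may need adjusting.

FROM ./kanelsnegl-v0.1-Q4_K.gguf
PARAMETER temperature 0.8
PARAMETER repeat_penalty 1.2

Then create and run the model:

ollama create kanelsnegl -f Modelfile
ollama run kanelsnegl "Aalborg er en dejlig by."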

Usage with CTransformers
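
ctransformers can be installed from PyPI; for GPU offloading you need a build with CUDA support (the exact installation extra varies by version, so check the ctransformers docs):

pip install ctransformers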

from ctransformers import AutoModelForCausalLM

# Set gpu_layers to the number of layers to offload to GPU. Set to 0 if no GPU acceleration is available on your system.
llm = AutoModelForCausalLM.from_pretrained('RJuro/kanelsnegl-v0.1-GGUF', 
                                           model_file="kanelsnegl-v0.1-Q4_K.gguf", 
                                           model_type="mistral", 
                                           gpu_layers=50
                                           )

print(llm("Aalborg er en dejlig by.", 
          top_k=50, 
          top_p=0.95, 
          temperature=0.8, 
          repetition_penalty=1.2, 
          reset=True,
          max_new_tokens=128))

Returns 😂

Det var det også før de bakker hul i jorden og lægger køretøjerne på grunden, så man ikke kan komme ud af gaderne.
Det er jo bare fordi vi har et stort problem med at have en stor by udenfor Aalborg, der bliver ved med at vokse og vokse.
Jeg bor i Aalborg, men jeg kører hver dag til Frederikshavn på arbejde. Jeg er ikke så glad for det, da den traf