Jokestral

This model was created by fine-tuning unsloth/mistral-7b-v0.3-bnb-4bit on the Short Jokes dataset, so its only purpose is generating cringe jokes.
Just write the first few words of a joke and the model completes it.

Usage

Google Colab example

pip install transformers
pip install --no-deps "trl<0.9.0" peft accelerate bitsandbytes
from transformers import AutoTokenizer, AutoModelForCausalLM

# Load the model and move it to the GPU so it matches the inputs below.
model = AutoModelForCausalLM.from_pretrained("SantaBot/Jokestral_16bit").to("cuda")
tokenizer = AutoTokenizer.from_pretrained("SantaBot/Jokestral_16bit")

inputs = tokenizer(
    [
        "My doctor"  # YOUR PROMPT HERE
    ], return_tensors="pt").to("cuda")

outputs = model.generate(**inputs, max_new_tokens=64, use_cache=True)
print(tokenizer.batch_decode(outputs))

The output should be something like :
['<s> My doctor told me I have to stop m4sturb4t1ng. I asked him why and he said "Because I\'m trying to examine you."\n</s>']
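Note that the decoded string still contains the `<s>`/`</s>` special tokens. You can pass `skip_special_tokens=True` to `batch_decode` to drop them, or clean the text yourself. A minimal sketch of such a cleanup helper (the function name and token list are illustrative, not part of the model card):

```python
def strip_special_tokens(text, tokens=("<s>", "</s>")):
    """Remove BOS/EOS markers from a decoded string and trim whitespace."""
    for tok in tokens:
        text = text.replace(tok, "")
    return text.strip()

raw = '<s> My doctor told me I have to stop. </s>'
print(strip_special_tokens(raw))  # -> My doctor told me I have to stop.
```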

Model size: 7.25B params (Safetensors, FP16)
