---
license: apache-2.0
---
DrKlaus-7B
DrKlaus-7B is an SFT (supervised fine-tuning) model made with AutoSloth by macadeliccc.
Process
Original Model: mistralai/Mistral-7B-Instruct-v0.2
Dataset: medalpaca/medical_meadow_wikidoc_patient_information
Learning Rate: 3e-05
Steps: 80
Warmup Steps: 8
Per Device Train Batch Size: 24
Gradient Accumulation Steps: 12
Optimizer: adamw_8bit
Max Sequence Length: 1024
Max Prompt Length: 512
Max Length: 1024
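For reference, the hyperparameters above roughly correspond to a TRL SFTTrainer run like the sketch below. This is not the exact AutoSloth pipeline; the output directory, the dataset column names, the [INST] prompt formatting, and the exact placement of the SFTTrainer keyword arguments (which vary across trl versions) are assumptions.

# Minimal SFT sketch using the hyperparameters listed above (not the AutoSloth script).
from datasets import load_dataset
from transformers import TrainingArguments
from trl import SFTTrainer

# Load the instruction dataset named in the card.
dataset = load_dataset(
    "medalpaca/medical_meadow_wikidoc_patient_information", split="train"
)

def to_text(example):
    # Column names (instruction / input / output) are an assumption based on
    # the Alpaca-style schema the medical_meadow datasets typically use.
    return {
        "text": f"[INST] {example['instruction']}\n{example['input']} [/INST] {example['output']}"
    }

dataset = dataset.map(to_text)

args = TrainingArguments(
    output_dir="drklaus-7b-sft",          # assumed output path
    learning_rate=3e-5,
    max_steps=80,
    warmup_steps=8,
    per_device_train_batch_size=24,
    gradient_accumulation_steps=12,
    optim="adamw_8bit",                   # 8-bit AdamW via bitsandbytes
)

trainer = SFTTrainer(
    model="mistralai/Mistral-7B-Instruct-v0.2",
    args=args,
    train_dataset=dataset,
    dataset_text_field="text",
    max_seq_length=1024,                  # kwarg placement differs across trl versions
)
trainer.train()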
💻 Usage
!pip install -qU transformers
from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline
model_id = "macadeliccc/DrKlaus-7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Example prompt
prompt = "Your example prompt here"

# Generate a response
pipe = pipeline("text-generation", model=model, tokenizer=tokenizer)
outputs = pipe(prompt, max_length=50, num_return_sequences=1)
print(outputs[0]["generated_text"])
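Since the base model is Mistral-7B-Instruct-v0.2, prompts generally work best in its [INST] chat format. Assuming the fine-tuned tokenizer still ships that chat template (an assumption, not confirmed by the card), it can be applied before generation:

# Build a Mistral-style prompt via the tokenizer's chat template
# (assumes the template was inherited from the base model).
messages = [{"role": "user", "content": "What are the common symptoms of anemia?"}]
chat_prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

outputs = pipe(chat_prompt, max_new_tokens=256, num_return_sequences=1)
print(outputs[0]["generated_text"])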