import torch
from peft import AutoPeftModelForCausalLM

path_to_adapter = "macadeliccc/Samantha-Qwen-2-7B-lora"

model = AutoPeftModelForCausalLM.from_pretrained(
    # path to the output directory
    path_to_adapter,
    device_map="auto",
    trust_remote_code=True
).eval()

# load the vision module, resampler, and embedding weights saved alongside the adapter
vpm_resampler_embedtokens_weight = torch.load(f"{path_to_adapter}/vpm_resampler_embedtokens.pt")

msg = model.load_state_dict(vpm_resampler_embedtokens_weight, strict=False)
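Once the model is loaded, it can be prompted like other Qwen2 chat models, which use the ChatML format. As a minimal sketch, the helper below assembles a ChatML prompt by hand; `build_chatml_prompt` is an illustrative function, not part of this repository, and in practice you would use `apply_chat_template` on the base model's tokenizer instead.

```python
# Hypothetical helper: builds a ChatML-style prompt string for illustration.
# Prefer tokenizer.apply_chat_template from the base Qwen2 tokenizer in real use.
def build_chatml_prompt(messages):
    parts = []
    for m in messages:
        # Each turn is wrapped in <|im_start|>role ... <|im_end|> markers
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>")
    # Open an assistant turn to cue the model to generate a reply
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

prompt = build_chatml_prompt([
    {"role": "system", "content": "You are Samantha, a helpful assistant."},
    {"role": "user", "content": "Hello!"},
])
```

The resulting string can be tokenized and passed to `model.generate` as usual.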

Model tree for macadeliccc/Samantha-Qwen2-7B-LoRa

Base model: Qwen/Qwen2-7B