A cutting-edge foundation for your very own LLM.
🌐 TigerBot • 🤗 Hugging Face
This model was fine-tuned from tigerbot-7b-sft on medical-qa data using QLoRA.

If you have already downloaded tigerbot-7b-sft, you only need to download the qlora_ckpt_2400_adapter_model adapter weights and merge them into the base model:
```python
import transformers
from peft import PeftModel

# Load the base model, then attach the QLoRA adapter weights.
model = transformers.AutoModelForCausalLM.from_pretrained("TigerResearch/tigerbot-7b-sft")
model = PeftModel.from_pretrained(model, './adapter_model', is_trainable=False)

# Merge the adapter into the base weights and drop the PEFT wrapper.
model = model.merge_and_unload()
```
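After merging, the model behaves like a regular causal LM. Below is a minimal inference sketch; the tokenizer repo, the example question, and the generation parameters are illustrative assumptions, not part of the original card.

```python
from transformers import AutoTokenizer

# Assumption: the base model's tokenizer is reused for the merged model.
tokenizer = AutoTokenizer.from_pretrained("TigerResearch/tigerbot-7b-sft")

# Hypothetical medical question; generation settings are illustrative.
prompt = "What are the common symptoms of iron-deficiency anemia?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=200, do_sample=True, top_p=0.95, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```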
Alternatively, load the fully merged model directly:
```python
from transformers import AutoTokenizer, AutoModelForCausalLM
from accelerate import infer_auto_device_map, dispatch_model
from accelerate.utils import get_balanced_memory

# Load the already-merged checkpoint; the accelerate helpers are used in the
# multi-GPU dispatch sketch below.
tokenizer = AutoTokenizer.from_pretrained("TigerResearch/medical-bot-peft-from-tigerbot-7b-sft")
model = AutoModelForCausalLM.from_pretrained("TigerResearch/medical-bot-peft-from-tigerbot-7b-sft")
```
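If the model does not fit on a single GPU, the imported accelerate helpers can split it across devices. The sketch below shows one common way to combine them; the `no_split_module_classes` value, the prompt template, and the generation settings are assumptions, not part of the original card.

```python
# Sketch: balance the model across available GPUs (values are assumptions).
max_memory = get_balanced_memory(model)
device_map = infer_auto_device_map(
    model,
    max_memory=max_memory,
    no_split_module_classes=["BloomBlock"],  # assumes the BLOOM-based tigerbot architecture
)
model = dispatch_model(model, device_map=device_map, offload_buffers=True)

# Hypothetical query; the instruction-style prompt template is an assumption.
prompt = "\n\n### Instruction:\nWhat should I do about a persistent dry cough?\n\n### Response:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```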