
Is it possible to merge the finetuned version with the original Mistral model into one model?

#4 opened by AiModelsMarket

If it is possible, please tell me how to do it. I'm a novice looking for help. Thank you very much!

Not really a "merge", but basically "yes": you use the Torch/PEFT adapters on top of the base model instead.

To use the model, you can simply duplicate this Space on an A10G: https://huggingface.co/spaces/pseudo

Or you can duplicate the Space tonic/mistralmed_chat: see the Files tab for the files. You can also clone that repository and run it locally. Hope this helps!

Thanks for the answer. I have a laptop with 16 GB RAM and an Nvidia RTX 4500 (6 GB VRAM). I want to use the model locally: if I quantize a Mistral model to a 2-bit .gguf, it runs locally. But I want to quantize your finetuned model, so I first need it merged into one model with the original Mistral, and then quantized so that it fits on my laptop. Please, if you know how, tell me how to merge it into one model. If I ask noob questions, please forgive my ignorance; I'm just a noob amazed by AI and its opportunities :).
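In case it helps: once you have the merged full-weight model saved in a local folder, the usual route to a 2-bit .gguf is llama.cpp's conversion and quantization tools. A rough sketch, assuming a recent llama.cpp checkout and a merged model in `./merged-model` (script and binary names have changed between llama.cpp versions, so check the ones in your copy):

```shell
# Get llama.cpp and its conversion-script dependencies.
git clone https://github.com/ggerganov/llama.cpp
pip install -r llama.cpp/requirements.txt

# Convert the merged Hugging Face checkpoint to a full-precision GGUF file.
python llama.cpp/convert_hf_to_gguf.py ./merged-model --outfile merged-f16.gguf

# Build the quantizer, then quantize (Q2_K is the ~2-bit preset).
cmake -B llama.cpp/build llama.cpp
cmake --build llama.cpp/build --target llama-quantize
./llama.cpp/build/bin/llama-quantize merged-f16.gguf merged-Q2_K.gguf Q2_K
```

The resulting `merged-Q2_K.gguf` can then be loaded with llama.cpp (or frontends built on it) on a 6 GB GPU, though quality at 2-bit is noticeably degraded; Q4_K_M is often a better trade-off if it fits.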
