Instructions for using PhaseOfCode/sevzero-llama3-8b-sft-primary with libraries, inference providers, notebooks, and local apps. Follow these links to get started.
- Libraries
- PEFT
How to use PhaseOfCode/sevzero-llama3-8b-sft-primary with PEFT:

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM

base_model = AutoModelForCausalLM.from_pretrained("unsloth/Meta-Llama-3.1-8B-Instruct")
model = PeftModel.from_pretrained(base_model, "PhaseOfCode/sevzero-llama3-8b-sft-primary")
```

- Notebooks
- Google Colab
- Kaggle
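The PEFT snippet for this model only loads the adapter onto the base model; a minimal generation sketch is below. The tokenizer source, dtype, device placement, and prompt are illustrative assumptions, not part of the model card:

```python
# Sketch: load the LoRA adapter onto the base model and run one chat turn.
# Assumptions: the tokenizer is taken from the base repo, bf16 + device_map
# are illustrative choices, and the prompt is arbitrary.
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "unsloth/Meta-Llama-3.1-8B-Instruct"
adapter_id = "PhaseOfCode/sevzero-llama3-8b-sft-primary"

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)
model = PeftModel.from_pretrained(base_model, adapter_id)

messages = [{"role": "user", "content": "Explain LoRA adapters in one sentence."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
outputs = model.generate(inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

Loading this way keeps the adapter weights separate from the base checkpoint; `model.merge_and_unload()` can be called afterwards if a single merged model is preferred for deployment.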
- Xet hash: 2e56cad9ac2a38f4b7de211953881b613c19e51838c05adf8f7984c139c2e8f5
- Size of remote file: 17.2 MB
- SHA256: 6b9e4e7fb171f92fd137b777cc2714bf87d11576700a1dcd7a399e7bbe39537b
Xet efficiently stores large files inside Git by splitting them into unique, content-defined chunks, which accelerates uploads and downloads through deduplication.
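The chunking idea can be illustrated with a toy content-defined chunker. This is only a sketch of the general technique (boundaries chosen from the data itself, so identical content produces identical chunks for deduplication); the hash, mask, and size constants are made up and do not reflect Xet's actual algorithm:

```python
# Toy content-defined chunking: a rolling value over the bytes decides where
# chunk boundaries fall, so repeated content tends to produce repeated chunks
# that a store can deduplicate. Constants are illustrative only.
import hashlib

def chunk_bytes(data: bytes, min_size: int = 64, mask: int = 0x3F) -> list[bytes]:
    chunks = []
    start = 0
    rolling = 0
    for i, b in enumerate(data):
        rolling = (rolling * 31 + b) & 0xFFFFFFFF
        # Cut a chunk when the rolling value hits the mask pattern,
        # but never before the minimum chunk size.
        if i - start + 1 >= min_size and (rolling & mask) == 0:
            chunks.append(data[start:i + 1])
            start = i + 1
            rolling = 0
    if start < len(data):
        chunks.append(data[start:])  # trailing remainder
    return chunks

data = b"hello world " * 200
chunks = chunk_bytes(data)
unique = {hashlib.sha256(c).hexdigest() for c in chunks}
print(len(chunks), "chunks,", len(unique), "unique")
```

Because boundaries depend on content rather than fixed offsets, inserting bytes near the start of a file shifts only the chunks around the edit, leaving later chunks (and their hashes) unchanged.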