---
tags:
  - text-generation
  - causal-lm
  - transformers
  - peft
library_name: transformers
model-index:
  - name: Llama-3-8B Fine-tuned
    results: []
---

# Fine-Tuned Llama-3-8B Model

This model is a fine-tuned version of `NousResearch/Meta-Llama-3-8B`, trained with LoRA adapters on top of an 8-bit quantized base model.
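For context, an 8-bit + LoRA fine-tune with `peft` is typically set up along the lines below. The exact LoRA rank, target modules, and training hyperparameters used for this model are not documented in this card, so the values shown are illustrative assumptions only.

```python
# Illustrative sketch only: the actual LoRA/quantization settings for this
# model are not published here. This shows a typical 8-bit + LoRA setup.
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

base = AutoModelForCausalLM.from_pretrained(
    "NousResearch/Meta-Llama-3-8B",
    quantization_config=BitsAndBytesConfig(load_in_8bit=True),  # 8-bit base weights
    device_map="auto",
)
base = prepare_model_for_kbit_training(base)  # prepare quantized model for training

lora_cfg = LoraConfig(
    r=16,                                    # assumed rank, not a documented value
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],     # assumed attention projections
    task_type="CAUSAL_LM",
)
model = get_peft_model(base, lora_cfg)
model.print_trainable_parameters()
```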

## Usage

To load the model:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "ubiodee/Test_Plutus"
model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True)
tokenizer = AutoTokenizer.from_pretrained(model_name)
```
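
Continuing from the snippet above, text can then be generated with the standard `generate` API; the prompt and sampling settings below are placeholders, not recommended values.

```python
# Example inference call; prompt and generation settings are placeholders.
prompt = "Hello, what can you help me with?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

output_ids = model.generate(
    **inputs,
    max_new_tokens=200,
    do_sample=True,
    temperature=0.7,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```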