
# llama3_70b

Llama 3 70B fine-tuned on the no_robots dataset using fsdp_qlora on an 8-GPU cluster.

Training code: `git clone https://github.com/bigsnarfdude/fsdp-qlora`
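
For reference, a minimal sketch of inspecting the fine-tuning data, assuming the dataset referred to here is `HuggingFaceH4/no_robots` on the Hugging Face Hub (the exact dataset id is not stated in this card):

```python
# Minimal sketch: inspect the instruction-tuning data.
# Assumes HuggingFaceH4/no_robots; split names are taken from whatever the dataset exposes.
from datasets import load_dataset

dataset = load_dataset("HuggingFaceH4/no_robots")
print(dataset)  # show available splits and their sizes

first_split = list(dataset.keys())[0]
example = next(iter(dataset[first_split]))
print(example)  # one chat-style record (messages with roles)
```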


```python
import torch
import transformers

model_id = "vincentoh/llama3_70b_no_robot_fsdp_qlora"

# Load the model into a text-generation pipeline in bfloat16,
# sharded across available GPUs via device_map="auto".
pipeline = transformers.pipeline(
    "text-generation",
    model=model_id,
    model_kwargs={"torch_dtype": torch.bfloat16},
    device_map="auto",
)
print(pipeline("Why is the sky blue?"))
```
Safetensors · Model size: 70.6B params · Tensor type: FP16