newbie here

#3
by developerbayman - opened

I'm trying to load this in my code. My question is: do I need to download the model(s) manually, or will they download automatically when I put this in my code? Also, is there a way to download everything in one go? Any help would be appreciated. I'm just trying to get a Jarvis-like assistant that can operate my computer, help me write code, keep me company, and execute custom voice commands. I'm on an Orange Pi 5. Truthfully, I know very little about what I'm doing. I had a successful ChatGPT chat script using the OpenAI API, but I want to be able to use it while offline, and once I get this working and trainable... I will disappear into the night (probably). Is this part below just the path to the models once I download them?
from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("Aeala/GPT4-x-AlpacaDente2-30b")
model = AutoModelForCausalLM.from_pretrained("Aeala/GPT4-x-AlpacaDente2-30b")

Should work! The string you're passing is the Hugging Face repo ID, not a local path — `from_pretrained` will download all the files automatically on first run and cache them locally, so later runs load straight from disk and you don't need to grab anything by hand (it should ignore the 4bit.safetensors unless you're intentionally trying to load in GPTQ 4-bit). Good luck! Let me know if you run into any issues and I hope you enjoy the model! ^~^
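If you'd rather fetch everything in one go up front (and then run fully offline), here's a minimal sketch using `huggingface_hub.snapshot_download`, which pulls every file in a repo; the `ignore_patterns` filter and the helper function name are just illustrative choices, not anything this repo requires:

```python
# Sketch: pre-download a full model repo so it can be loaded offline later.
# Assumes huggingface_hub is installed (it comes along with transformers).
from huggingface_hub import snapshot_download

def fetch_model(repo_id: str) -> str:
    """Download every file in the repo and return the local directory.

    The ignore pattern skips the GPTQ 4-bit weights, mirroring what
    from_pretrained would ignore anyway for a normal (non-GPTQ) load.
    """
    return snapshot_download(
        repo_id,
        ignore_patterns=["*4bit*"],  # skip 4-bit safetensors unless you want GPTQ
    )

if __name__ == "__main__":
    local_dir = fetch_model("Aeala/GPT4-x-AlpacaDente2-30b")
    print(local_dir)
    # Later, with no network at all, point the loaders at the cached copy:
    # tokenizer = AutoTokenizer.from_pretrained(local_dir, local_files_only=True)
    # model = AutoModelForCausalLM.from_pretrained(local_dir, local_files_only=True)
```

Fair warning: for a 30B model this is tens of gigabytes, so make sure the drive backing your cache (`~/.cache/huggingface` by default, movable via the `HF_HOME` environment variable) has room.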
