Quick start

```python
from ctransformers import AutoModelForCausalLM

# Load the GGUF weights; set gpu_layers > 0 to offload layers to the GPU.
model = AutoModelForCausalLM.from_pretrained(
    "npvinHnivqn/GGUF-openchat",
    model_file="openchat.gguf",
    model_type="llama",
    gpu_layers=0,
    context_length=768,
)

# Generate a completion for the prompt.
print(model("AI will ", temperature=0.1))
```
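The snippet above assumes the `ctransformers` package is installed; the base wheel runs on CPU, and the project also publishes hardware-specific extras (the `[cuda]` extra shown here is one of them):

```shell
# CPU-only install
pip install ctransformers

# Optional: install with CUDA support instead, then set gpu_layers > 0
pip install ctransformers[cuda]
```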
Model details

- Format: GGUF
- Model size: 13B params
- Architecture: llama