
./main: No such file or directory

#6
by romyull - opened

Hi, I tried to run TinyLlama-1.1B on my local machine. I converted it with this command:
python3 convert.py ../TinyLlama-1.1B-Chat-v1.0
I also tried the quantized model from here. I tried multiple times, like below, to get output:
romy@romy-VM:~/SLM/llama.cpp$ ./main -m ../TinyLlama-1.1B-Chat-v1.0/ggml-model-f16.gguf --color -c 2048 --temp 0.7 --repeat_penalty 1.1 -n -1 -p "<|system|>\n{system_message}\n<|user|>\n{prompt}\n<|assistant|>"

and
romy@romy-VM:~/SLM/llama.cpp$ ./main -t 3 -m ../TinyLlama-1.1B-Chat-v1.0/ggml-model-f16.gguf --color -c 2048 --temp 0.7 --repeat_penalty 1.1 -n -1 -p "### Instruction: write a story about llamas\n### Response:"

Every time I get a response like bash: ./main: No such file or directory.
Could you please tell me where I am making a mistake? I am completely new to working on this.

This might indicate that you haven't built llama.cpp.
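A minimal sketch of the usual fix, assuming a plain checkout of llama.cpp in ~/SLM/llama.cpp and the Makefile build that produces ./main (newer llama.cpp releases use CMake and name the binary llama-cli under build/bin instead):

```bash
# Build the binaries first, from the llama.cpp checkout.
cd ~/SLM/llama.cpp
make

# Alternatively, with CMake (binaries end up under build/bin on recent versions):
# cmake -B build
# cmake --build build --config Release

# Once ./main exists, the original command should run:
./main -m ../TinyLlama-1.1B-Chat-v1.0/ggml-model-f16.gguf --color -c 2048 \
  --temp 0.7 --repeat_penalty 1.1 -n -1 \
  -p "<|system|>\n{system_message}\n<|user|>\n{prompt}\n<|assistant|>"
```

The "No such file or directory" message comes from bash itself: it cannot find a ./main executable in the current directory, which is why building the project (rather than changing the model or prompt) is what resolves it.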

romyull changed discussion status to closed
