How to create this model in Ollama?

#2 by heyiammahu - opened

2023/12/16 21:21:42 parser.go:62: WARNING: Unknown command: [PAD51186]
2023/12/16 21:21:42 parser.go:62: WARNING: Unknown command: [PAD51187]
2023/12/16 21:21:42 parser.go:62: WARNING: Unknown command: [PAD51188]
2023/12/16 21:21:42 parser.go:62: WARNING: Unknown command: [PAD51189]
2023/12/16 21:21:42 parser.go:62: WARNING: Unknown command: [PAD51190]
2023/12/16 21:21:42 parser.go:62: WARNING: Unknown command: [PAD51191]
2023/12/16 21:21:42 parser.go:62: WARNING: Unknown command: [PAD51192]
2023/12/16 21:21:42 parser.go:62: WARNING: Unknown command: [PAD51193]
2023/12/16 21:21:42 parser.go:62: WARNING: Unknown command: [PAD51194]
2023/12/16 21:21:42 parser.go:62: WARNING: Unknown command: [PAD51195]
2023/12/16 21:21:42 parser.go:62: WARNING: Unknown command: [PAD51196]
2023/12/16 21:21:42 parser.go:62: WARNING: Unknown command: [PAD51197]
2023/12/16 21:21:42 parser.go:62: WARNING: Unknown command: [PAD51198]

Error: no FROM line for the model was specified

This is the error I am getting when I run: ollama create phi-2-q4 -f ./phi-2_Q8_0.gguf

Any idea how to create it?
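For reference, the -f flag of ollama create expects a Modelfile rather than the GGUF itself, which is why the parser emits those warnings and complains about a missing FROM line. A minimal sketch, assuming the GGUF sits in the current directory:

FROM ./phi-2_Q8_0.gguf

Save that as a file called Modelfile, then run:

ollama create phi-2-q4 -f Modelfile

This clears the FROM error; whether the model then actually loads depends on Ollama picking up the llama.cpp support discussed below.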

Yeah, but it is not working in Ollama.

Support has been added to llama.cpp master, so the ball is in Ollama's court now.

https://github.com/ggerganov/llama.cpp/commit/b9e74f9bca5fdf7d0a22ed25e7a9626335fdfa48
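If you want to sanity-check the GGUF outside of Ollama, a llama.cpp build that includes that commit should load it directly; a rough sketch (binary name, flags, and the sample prompt may differ with your version):

./main -m ./phi-2_Q8_0.gguf -e -p "Instruct: Write a haiku about winter.\nOutput:" -n 128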

The LM Studio beta has been updated, so I am trying it there now!

Thanks!!

Not a good model.
(Screenshots of the model's outputs attached.)

I find it interesting that a model of only ~3B parameters will soon be runnable anywhere. It won't do math on its own, though; you would probably have to build chain-of-thought into the prompts or use external tools for post-processing.

@namankhator: thanks for the feedback! Please recall that this is a base completion model, so the format of your question really matters. When you give an instruction, I recommend using the format:

Instruct: YOUR INSTRUCTION
Output:

Moreover, for any kind of reasoning it's useful to add "Let's think step by step", even for easy questions. If you do both of those things, it works for your example.
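For example, a prompt along these lines (the question itself is just an illustration) should do better than a bare question:

Instruct: A train travels 60 km in 1.5 hours. What is its average speed? Let's think step by step.
Output: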

Hey @sebubeck

Thanks for the recommendations.
I believe Instruct and Output are already set. (attached image from LM Studio)

I tried the prompt format you suggested, but it still did not work.
(Screenshots of the LM Studio outputs attached.)

I will try it on tasks other than reasoning and will update here if need be.
