Ollama submission

#5
by buzsh - opened

Any chance we'll see an entry there?

Very excited for what is to come with this! Looking forward to seeing how this performs on-device at WWDC.

I'm pulling and going to try to run it on ollama.

I'm pulling and going to try to run it on ollama.

Any luck?

Reading config.json:

```json
  "architectures": [
    "OpenELMForCausalLM"
  ],
```

Seems like it's a whole new architecture, so we'll have to wait for llama.cpp to add it and for Ollama to pull in those changes. It should be possible in the future, just not now. You can check the progress here; Ollama will likely announce support in a release once it lands, and you can also search for related issues or pull requests after support is merged into llama.cpp. There isn't much point in opening any until the work on the llama.cpp end is complete.

No, we need to wait for llama.cpp to add support for this architecture.
