Unable to load model in Swift Chat example

#7
by ltouati - opened

Hi Team,

Am I doing anything wrong? I've just downloaded the model to use it on my MBP M1, and I get this message:
No model could be loaded: Error Domain=com.apple.mlassetio Code=1 "Failed to parse the model specification. Error: Field number 14 has wireType 6, which is not supported." UserInfo={NSLocalizedDescription=Failed to parse the model specification. Error: Field number 14 has wireType 6, which is not supported.}

Core ML Projects org

Hi @ltouati ! What version of macOS and Xcode are you using?

I'm running macOS Sonoma and Xcode Version 15.0.1 (15A507)

Hi,

I am having the same issue: MBP M1 Pro, same error message, same versions (macOS Sonoma 14.1, Xcode 15.0.1). Any suggestions on how to fix it?

Core ML Projects org

I've been trying to reproduce this issue, but it works fine for me. There are only a couple of things I can think of:

  • First, ensure you have plenty of free disk space. Core ML preparation of large models can create big temporary files.
  • Second, make sure the files have been correctly downloaded. If you downloaded them with git, it might be the case that the LFS binaries have not been expanded. I would recommend downloading the repo with the huggingface-cli command-line tool, like so (a Python alternative is sketched right after these commands):
pip install --upgrade "huggingface_hub[cli]"
huggingface-cli download coreml-projects/Llama-2-7b-chat-coreml --local-dir-use-symlinks False --local-dir ~/Desktop/Llama-2-7b-chat-coreml
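
If you prefer to stay in Python, the same download can be done with the huggingface_hub library directly. This is only a minimal sketch; the repo id (coreml-projects/Llama-2-7b-chat-coreml) and the Desktop destination mirror the commands above, so adjust them if your setup differs:

# Minimal sketch: download the Core ML model with huggingface_hub instead of the CLI.
# Assumes the repo id coreml-projects/Llama-2-7b-chat-coreml and a Desktop destination,
# matching the huggingface-cli command above.
from pathlib import Path
from huggingface_hub import snapshot_download

local_dir = Path.home() / "Desktop" / "Llama-2-7b-chat-coreml"
snapshot_download(
    repo_id="coreml-projects/Llama-2-7b-chat-coreml",
    local_dir=local_dir,
    local_dir_use_symlinks=False,  # store real files instead of symlinks into the HF cache
)
print(f"Model downloaded to {local_dir}")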

That would place the model files in a Llama-2-7b-chat-coreml folder on your desktop. As a reference, the file llama-2-7b-chat.mlpackage/Data/com.apple.CoreML/weights should have a size of 13476985472 bytes.
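
If you want to verify the download programmatically rather than in Finder, here is a small Python sketch (using the Desktop location assumed above) that compares the size of the weights on disk against that reference value:

# Quick sanity check: verify that the Core ML weights were fully downloaded.
# Assumes the model lives in ~/Desktop/Llama-2-7b-chat-coreml, as in the commands above.
from pathlib import Path

EXPECTED_BYTES = 13476985472  # reference size quoted above
weights_path = (
    Path.home()
    / "Desktop/Llama-2-7b-chat-coreml/llama-2-7b-chat.mlpackage/Data/com.apple.CoreML/weights"
)

# The path may be a single file or a folder of weight shards; count bytes either way.
if weights_path.is_dir():
    actual = sum(f.stat().st_size for f in weights_path.rglob("*") if f.is_file())
else:
    actual = weights_path.stat().st_size

print(f"weights size: {actual} bytes (expected {EXPECTED_BYTES})")
if actual != EXPECTED_BYTES:
    print("Size mismatch: the download is probably incomplete or LFS files were not pulled.")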

Please let me know if that helps!

I'm able to replicate the issue when cloning the repo without LFS. Neither Weights.bin nor model.mlmodel is downloaded correctly unless LFS is enabled.

To solve it, download both files manually from the Hugging Face site, or alternatively run git lfs pull after you've cloned the repo.
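
A quick way to tell whether git gave you the real binaries or just un-expanded LFS pointer stubs: a pointer is a tiny text file whose first line starts with "version https://git-lfs.github.com/spec/v1". A rough Python sketch (the clone location is an assumption; point it at wherever you cloned the model):

# Detect git-lfs pointer stubs that were never replaced with the real binaries.
# A pointer file is a small text file starting with "version https://git-lfs.github.com/spec/v1".
# The repo path below is an assumption; adjust it to your clone location.
from pathlib import Path

repo = Path.home() / "Repositories" / "Llama-2-7b-chat-coreml"
for path in repo.rglob("*"):
    if not path.is_file():
        continue
    with open(path, "rb") as f:
        head = f.read(64)
    if head.startswith(b"version https://git-lfs.github.com/spec/v1"):
        print(f"LFS pointer, not real data: {path} ({path.stat().st_size} bytes)")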

In my case, my hard disk was also full, and it took me a while to figure that out. macOS Sonoma didn't show any warning or error.

@pcuenq first of all, thanks for thinking of and helping us (the Swift developers!!). I've been trying to run swift-chat with this model. It has the right size, as mentioned above, but it still fails to load. In my case it was complaining about a missing Manifest.json file; I added that file manually to the folder, and now the error goes like this:

Compiling model file://Repositories/Llama-2-7b-chat-coreml/
No model could be loaded: Error Domain=com.apple.CoreML Code=3 "Failed to read model package at file://..Repositories/Llama-2-7b-chat-coreml/. Error: Item does not exist for identifier: 4C2A622E-2520-4610-9DB5-C52668265512" UserInfo={NSLocalizedDescription=Failed to read model package at file://..Repositories/Llama-2-7b-chat-coreml/. Error: Item does not exist for identifier: 4C2A622E-2520-4610-9DB5-C52668265512}
Failed to read model package at file://..Repositories/Llama-2-7b-chat-coreml/. Error: Item does not exist for identifier: 4C2A622E-2520-4610-9DB5-C52668265512

Running on macOS Sonoma 14
Xcode 15.0 (15A240d)
M1 MacBook Pro, 16 GB
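
The "Item does not exist for identifier" message usually means Manifest.json references an item whose file isn't actually present in the package, which is exactly what a hand-added manifest or a partial download produces. As a rough diagnostic, and assuming the standard .mlpackage layout (a top-level Manifest.json whose itemInfoEntries map identifiers to paths under Data/), a Python sketch like this lists which entries are missing; the package path is an assumption based on the log above:

# Rough diagnostic for "Item does not exist for identifier" errors.
# Assumes the standard .mlpackage layout: a top-level Manifest.json whose
# itemInfoEntries map identifiers to paths relative to the Data/ folder.
import json
from pathlib import Path

package = Path.home() / "Repositories/Llama-2-7b-chat-coreml/llama-2-7b-chat.mlpackage"
manifest = json.loads((package / "Manifest.json").read_text())

for identifier, entry in manifest.get("itemInfoEntries", {}).items():
    item_path = package / "Data" / entry.get("path", "")
    status = "ok" if item_path.exists() else "MISSING"
    print(f"{identifier}: {entry.get('path')} -> {status}")

If entries show up as missing, patching Manifest.json by hand won't help; re-downloading the whole folder as described earlier is the safer fix.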
