Can't compile in Xcode
Hi there, I've been trying to add this model to an Xcode project (Xcode Version 16.2 beta (16B5100e)), but just adding the model files generates errors that I can't seem to solve. Any help would be appreciated.
The project has only one dependency, the transformers package added via SPM.
These are the errors I get just by adding the model:
```
Sandbox: coremlcompiler(14211) deny(1) file-read-data /Users/.../Repos/OpenELM/OpenELM/models--corenet-community--coreml-OpenELM-270M/blobs/3642097f29ba8d36a15de0a0512c7d780047d35c
Sandbox: coremlcompiler(14211) deny(1) file-read-data /Users/.../Repos/OpenELM/OpenELM/models--corenet-community--coreml-OpenELM-270M/blobs/3642097f29ba8d36a15de0a0512c7d780047d35c
/Users/.../Repos/OpenELM/coremlc:1:1 Failed to read model package at file:///Users/.../Repos/OpenELM/OpenELM/models--corenet-community--coreml-OpenELM-270M/snapshots/a7f63dbafbbef9ef87c35602ca1318acc045331e/OpenELM-270M-128-float32.mlpackage/. Error: Input stream is not valid
```
Hello @0xjorgev!
How did you download the model? I believe it's a problem with symbolic links in the downloaded folder. If you have the `huggingface-cli` command-line utility installed [1], I recommend you use the following command:
```
huggingface-cli download --local-dir model corenet-community/coreml-OpenELM-270M --include "*.mlpackage/*"
```
That will download the model and put it inside a folder called `model`.
[1] You can use `brew install huggingface-cli` otherwise.
Yes, by default it replicates the cache structure used by transformers and other libraries, which relies on symlinks that are not supported by Xcode. Using `--local-dir` bypasses that format.
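If you want to confirm that this is what happened, here is a quick sketch (the path is a placeholder) that walks a Hub-style cache snapshot and prints any entries that are symlinks; a `--local-dir` download should print nothing:
```swift
import Foundation

// Quick sketch: list symlinks inside a Hub-style cache snapshot.
// The path is a placeholder; point it at the models--... folder in your project.
let cacheURL = URL(fileURLWithPath: "models--corenet-community--coreml-OpenELM-270M/snapshots")
let enumerator = FileManager.default.enumerator(at: cacheURL,
                                                includingPropertiesForKeys: [.isSymbolicLinkKey])
while let fileURL = enumerator?.nextObject() as? URL {
    let values = try? fileURL.resourceValues(forKeys: [.isSymbolicLinkKey])
    if values?.isSymbolicLink == true {
        // These are the entries Xcode's coremlcompiler fails to read.
        print("symlink:", fileURL.path)
    }
}
```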
Closing the issue now!