Error deploying model for inference on Azure
As the title says, I'm trying to deploy a model on Azure ML using a custom scoring script.
When I try to load the model in the init() handler, I get the following error: "RuntimeError: torch.UntypedStorage(): Storage device not recognized: mps"
Can you suggest how to load the model in a platform-independent way?
For Azure inference I'm using a Linux Ubuntu 20.04 base image.
Hello!
I'm not very familiar with AzureML, but "RuntimeError: torch.UntypedStorage(): Storage device not recognized: mps" sounds like you're trying to use the model with the mps device, whereas your image is Linux Ubuntu (and mps is macOS-only). Could it be that you're specifying the device as "mps" somewhere? I would recommend "cpu" if you don't have a GPU in your instance.
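For reference, a common pattern is to pick the device at runtime instead of hard-coding it, so the same code runs on a Mac with mps and on a CPU-only Linux image (a minimal sketch, not specific to your scoring script):

```python
import torch

# Select whichever accelerator exists on the current machine,
# falling back to CPU on a CPU-only Linux inference image.
if torch.cuda.is_available():
    device = torch.device("cuda")
elif torch.backends.mps.is_available():
    device = torch.device("mps")  # Apple Silicon only
else:
    device = torch.device("cpu")
```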
- Tom Aarsen
Hi Tom!
Thank you for your feedback. Indeed, I had to specify map_location="cpu" while loading the model. As a beginner, I also made the mistake of saving and loading the model with joblib instead of torch.save() and torch.load().
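In case it helps others, here is a minimal sketch of that pattern, using a hypothetical nn.Linear model as a stand-in for the real architecture:

```python
import torch
import torch.nn as nn

# Hypothetical model for illustration; replace with your actual architecture.
model = nn.Linear(10, 2)

# Save with torch.save() rather than joblib so PyTorch handles
# tensor serialization and device mapping.
torch.save(model.state_dict(), "model.pt")

# In the scoring script's init(), load onto CPU regardless of the
# device the model was trained on (e.g. an Apple "mps" machine).
state_dict = torch.load("model.pt", map_location="cpu")

model = nn.Linear(10, 2)
model.load_state_dict(state_dict)
model.eval()
```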
Now it works on Azure!
Thank you!