Windows not supported for torch.compile

#2
by dkjsnnr1 - opened
```
...
--> 702         raise RuntimeError("Windows not yet supported for torch.compile")
```

Is there a way to work around this, or is it Linux-only?

It seems this issue can be avoided by using a nightly build of PyTorch. Find your version here: https://download.pytorch.org/whl/nightly/cpu/torch/
You then need to comment out the Windows check in `eval_frame.py` in two places (just follow the file paths in the error traceback):

```python
# if sys.platform == "win32":
#     raise RuntimeError("Windows not yet supported for torch.compile")
```
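An alternative to patching the installed package is to guard the `torch.compile` call yourself. A minimal sketch, assuming the platform check mirrors the one in `eval_frame.py` (the helper name is mine, not part of torch):

```python
import sys

def supports_torch_compile(platform: str = sys.platform) -> bool:
    """Return True on platforms where torch.compile is allowed.

    Mirrors the Windows check in eval_frame.py that the workaround
    above comments out, without editing site-packages.
    """
    return platform != "win32"

# Guard the call instead of patching the library; `model` is whatever
# nn.Module you would otherwise compile:
# model = torch.compile(model) if supports_torch_compile() else model
```

This keeps your environment reproducible: nothing breaks when PyTorch is reinstalled or upgraded.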

But even with that change, the sample code still fails on my side:


```
TypeError                                 Traceback (most recent call last)
Cell In[1], line 8
      7 try:
----> 8     model = HQQModelForCausalLM.from_quantized("PrunaAI/microsoft-Phi-3-mini-128k-instruct-HQQ-1bit-smashed", device_map='auto')
      9 except:

TypeError: HQQWrapper.from_quantized() got an unexpected keyword argument 'device_map'

During handling of the above exception, another exception occurred:

TypeError                                 Traceback (most recent call last)
Cell In[1], line 10
      8     model = HQQModelForCausalLM.from_quantized("PrunaAI/microsoft-Phi-3-mini-128k-instruct-HQQ-1bit-smashed", device_map='auto')
      9 except:
---> 10     model = AutoHQQHFModel.from_quantized("PrunaAI/microsoft-Phi-3-mini-128k-instruct-HQQ-1bit-smashed")
     11 tokenizer = AutoTokenizer.from_pretrained("microsoft/Phi-3-mini-128k-instruct")
     13 input_ids = tokenizer("What is the color of prunes?,", return_tensors='pt').to(model.device)["input_ids"]

File c:\Users\User\AppData\Local\pypoetry\Cache\virtualenvs\project-VJC9LFwh-py3.11\Lib\site-packages\hqq\models\base.py:458, in BaseHQQModel.from_quantized(cls, save_dir_or_hub, compute_dtype, device, cache_dir, adapter)
    455 save_dir = cls.try_snapshot_download(save_dir_or_hub, cache_dir)
    457 # Load model from config
--> 458 model = cls.create_model(save_dir)
    460 # Track save directory
    461 model.save_dir = save_dir

TypeError: BaseHQQHFModel.create_model() missing 1 required positional argument: 'kwargs'
```
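Regarding the first `TypeError`: the `from_quantized` signature shown in the traceback (`save_dir_or_hub, compute_dtype, device, cache_dir, adapter`) accepts `device`, not `device_map`, so dropping `device_map='auto'` should get past that error. As a generic pattern, unexpected keyword arguments can be filtered against the callee's signature before the call; a sketch with a hypothetical stand-in loader (not part of hqq):

```python
import inspect

def call_with_supported_kwargs(func, *args, **kwargs):
    # Keep only keyword arguments the callee explicitly declares, so an
    # extra kwarg like device_map is dropped instead of raising TypeError.
    params = inspect.signature(func).parameters
    accepted = {k: v for k, v in kwargs.items() if k in params}
    return func(*args, **accepted)

# Stand-in loader that, like from_quantized above, takes `device`
# but not `device_map`:
def load(path, device=None):
    return (path, device)

call_with_supported_kwargs(load, "repo", device="cuda", device_map="auto")
# -> ("repo", "cuda")
```

Note this filter only sees explicitly named parameters, so it would also drop arguments meant for a `**kwargs` catch-all; passing only the documented keywords directly is the simpler fix when the signature is known.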
