Any way to load this with oobabooga?

#3 opened by GamingDaveUK

Tried it and I get an error that it's missing a config file:
Traceback (most recent call last):
File "F:\oobabooga_windows\text-generation-webui\server.py", line 67, in load_model_wrapper
shared.model, shared.tokenizer = load_model(shared.model_name)
File "F:\oobabooga_windows\text-generation-webui\modules\models.py", line 74, in load_model
shared.model_type = find_model_type(model_name)
File "F:\oobabooga_windows\text-generation-webui\modules\models.py", line 62, in find_model_type
config = AutoConfig.from_pretrained(Path(f'{shared.args.model_dir}/{model_name}'), trust_remote_code=shared.args.trust_remote_code)
File "F:\oobabooga_windows\installer_files\env\lib\site-packages\transformers\models\auto\configuration_auto.py", line 916, in from_pretrained
config_dict, unused_kwargs = PretrainedConfig.get_config_dict(pretrained_model_name_or_path, **kwargs)
File "F:\oobabooga_windows\installer_files\env\lib\site-packages\transformers\configuration_utils.py", line 573, in get_config_dict
config_dict, kwargs = cls._get_config_dict(pretrained_model_name_or_path, **kwargs)
File "F:\oobabooga_windows\installer_files\env\lib\site-packages\transformers\configuration_utils.py", line 628, in _get_config_dict
resolved_config_file = cached_file(
File "F:\oobabooga_windows\installer_files\env\lib\site-packages\transformers\utils\hub.py", line 380, in cached_file
raise EnvironmentError(
OSError: models\mayank31398_starcoder-GPTQ-4bit-128g does not appear to have a file named config.json. Checkout 'https://huggingface.co/models\mayank31398_starcoder-GPTQ-4bit-128g/None' for available files.

hey @GamingDaveUK ,
you need to copy all the files from the original StarCoder model (except for the .bin weight files)
into this model's directory, and then it will load for you. I still can't get it to generate properly, though;
I think we need to pass the configuration matching the input format the model expects to receive.
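The copy step described above could be scripted like this; a minimal sketch, assuming the original bigcode/starcoder files have already been downloaded locally (both directory paths below are placeholders for your own setup):

```python
import shutil
from pathlib import Path

def copy_support_files(src_dir, dst_dir, exclude_suffixes=(".bin",)):
    """Copy config/tokenizer files from the original model directory into the
    quantized model's directory, skipping the full-precision weight files."""
    src, dst = Path(src_dir), Path(dst_dir)
    dst.mkdir(parents=True, exist_ok=True)
    copied = []
    for f in src.iterdir():
        # Skip weight shards (*.bin); everything else (config.json,
        # tokenizer.json, etc.) is needed for the webui to load the model.
        if f.is_file() and f.suffix not in exclude_suffixes:
            shutil.copy2(f, dst / f.name)
            copied.append(f.name)
    return copied

# Example (placeholder paths):
# copy_support_files("models/bigcode_starcoder",
#                    "models/mayank31398_starcoder-GPTQ-4bit-128g")
```

This just automates the manual copy; it does not address the generation error discussed further down.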

I really want to use this

I copied the config files from bigcode/starcoder and now I can load the model in the oobabooga webui, but when I try to generate, I get the following error:

Traceback (most recent call last):
File "C:\oobabooga_windows\text-generation-webui\modules\callbacks.py", line 73, in gentask
ret = self.mfunc(callback=_callback, **self.kwargs)
File "C:\oobabooga_windows\text-generation-webui\modules\text_generation.py", line 274, in generate_with_callback
shared.model.generate(**kwargs)
File "C:\oobabooga_windows\installer_files\env\lib\site-packages\torch\utils_contextlib.py", line 115, in decorate_context
return func(*args, **kwargs)
File "C:\oobabooga_windows\installer_files\env\lib\site-packages\transformers\generation\utils.py", line 1568, in generate
return self.sample(
File "C:\oobabooga_windows\installer_files\env\lib\site-packages\transformers\generation\utils.py", line 2651, in sample
next_tokens = torch.multinomial(probs, num_samples=1).squeeze(1)
RuntimeError: probability tensor contains either inf, nan or element < 0
