Unable to load the model using transformers

#6
by whiskycody - opened

Hi,

I'm trying to load the model using the code below, but I'm getting an error when downloading the model.

Code:

# Load model directly

from transformers import AutoProcessor, AutoModelForCausalLM

processor = AutoProcessor.from_pretrained("LanguageBind/Video-LLaVA-7B")
model = AutoModelForCausalLM.from_pretrained("LanguageBind/Video-LLaVA-7B")

Output error:

File /kaggle/working/transformers/utils/hub.py:452, in cached_file(path_or_repo_id, filename, cache_dir, force_download, resume_download, proxies, token, revision, local_files_only, subfolder, repo_type, user_agent, _raise_exceptions_for_gated_repo, _raise_exceptions_for_missing_entries, _raise_exceptions_for_connection_errors, _commit_hash, **deprecated_kwargs)
    450     if revision is None:
    451         revision = "main"
--> 452     raise EnvironmentError(
    453         f"{path_or_repo_id} does not appear to have a file named {full_filename}. Checkout "
    454         f"'https://huggingface.co/{path_or_repo_id}/{revision}' for available files."
    455     ) from e
    456 except HTTPError as err:
    457     resolved_file = _get_cache_file_to_return(path_or_repo_id, full_filename, cache_dir, revision)

OSError: LanguageBind/Video-LLaVA-7B does not appear to have a file named preprocessor_config.json. Checkout 'https://huggingface.co/LanguageBind/Video-LLaVA-7B/main' for available files.

How can I solve this issue? Is the model available through transformers?

Hey!

The model is not yet supported by transformers. We are in the process of adding it; please take a look here
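
For reference, once Video-LLaVA support lands in a released transformers version, loading should look roughly like the sketch below. This is only an assumption based on how other LLaVA-style models are exposed in transformers; the VideoLlavaProcessor / VideoLlavaForConditionalGeneration class names and the converted "LanguageBind/Video-LLaVA-7B-hf" checkpoint id are not confirmed in this thread.

# Sketch only: assumes Video-LLaVA support is present in your installed
# transformers version and that a converted "-hf" checkpoint exists on the Hub.
from transformers import VideoLlavaProcessor, VideoLlavaForConditionalGeneration

model_id = "LanguageBind/Video-LLaVA-7B-hf"  # assumed converted repo id, not the original checkpoint
processor = VideoLlavaProcessor.from_pretrained(model_id)
model = VideoLlavaForConditionalGeneration.from_pretrained(model_id)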
