The code for using the model is broken

#7
by elgrancapitanbeto - opened

from transformers import pipeline

pipe = pipeline("text-generation", model="Lin-Chen/ShareGPT4V-7B")
config.json: 100%|████████████████████████████| 1.17k/1.17k [00:00<00:00, 11.6MB/s]
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/opt/conda/lib/python3.10/site-packages/transformers/pipelines/__init__.py", line 751, in pipeline
    config = AutoConfig.from_pretrained(model, _from_pipeline=task, **hub_kwargs, **model_kwargs)
  File "/opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 1050, in from_pretrained
    config_class = CONFIG_MAPPING[config_dict["model_type"]]
  File "/opt/conda/lib/python3.10/site-packages/transformers/models/auto/configuration_auto.py", line 748, in __getitem__
    raise KeyError(key)
KeyError: 'share4v'
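For context on the traceback: `AutoConfig.from_pretrained` reads the `model_type` field from the checkpoint's `config.json` and looks it up in a registry of architectures known to `transformers`; this checkpoint declares `model_type: "share4v"`, which stock `transformers` does not register, so the lookup raises `KeyError`. Here is a minimal self-contained sketch of that lookup (the registry entries below are illustrative stand-ins, not the real contents of `transformers`' `CONFIG_MAPPING`):

```python
# Illustrative stand-in for transformers' model_type -> config registry.
CONFIG_MAPPING = {
    "llama": "LlamaConfig",
    "clip": "CLIPConfig",
}

def auto_config_for(model_type: str) -> str:
    """Mimic the model_type lookup AutoConfig.from_pretrained performs."""
    if model_type not in CONFIG_MAPPING:
        # transformers raises KeyError(key) here, as seen in the traceback
        raise KeyError(model_type)
    return CONFIG_MAPPING[model_type]

auto_config_for("llama")  # a registered model_type resolves normally

try:
    auto_config_for("share4v")
except KeyError as exc:
    print(f"KeyError: {exc}")  # prints: KeyError: 'share4v'
```

So the failure is not a broken download or a version mismatch in the usual sense: the `pipeline` call cannot work until code that registers the `share4v` architecture is available.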

I have the same problem. Did you solve it?

I have the same question. I have tried different versions of Transformers, but it did not work.

Maybe you can use the inference code provided in our repository to run the model.