czczup committed
Commit 830ef86
1 Parent(s): ab2e749

Update README.md

Files changed (1): README.md (+3, −3)
README.md CHANGED

@@ -10,7 +10,7 @@ datasets:
 pipeline_tag: visual-question-answering
 ---
 
-# Model Card for Mini-InternVL-Chat-V1.5
+# Model Card for Mini-InternVL-Chat-2B-V1.5
 <p align="center">
   <img src="https://cdn-uploads.huggingface.co/production/uploads/64119264f0f81eb569e0d569/D60YzQBIzvoCvLRp2gZ0A.jpeg" alt="Image Description" width="300" height="300" />
 </p>
@@ -61,7 +61,7 @@ TODO
 
 ## Model Usage
 
-We provide an example code to run Mini-InternVL-Chat-V1.5 using `transformers`.
+We provide an example code to run Mini-InternVL-Chat-2B-V1.5 using `transformers`.
 
 You also can use our [online demo](https://internvl.opengvlab.com/) for a quick experience of this model.
 
@@ -155,7 +155,7 @@ def load_image(image_file, input_size=448, max_num=6):
     return pixel_values
 
 
-path = "OpenGVLab/Mini-InternVL-Chat-V1-5"
+path = "OpenGVLab/Mini-InternVL-Chat-2B-V1-5"
 model = AutoModel.from_pretrained(
     path,
     torch_dtype=torch.bfloat16,
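
For context, the snippet touched by the last hunk is the README's `transformers` loading example. Below is a minimal sketch of how that snippet continues, assuming the standard InternVL remote-code interface (the `load_image` tiling helper defined earlier in the README and a `model.chat(...)` method); the image path, question, and generation settings are placeholders for illustration and are not part of this commit.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Renamed repository id introduced by this commit.
path = "OpenGVLab/Mini-InternVL-Chat-2B-V1-5"

# trust_remote_code is needed because the chat/vision logic ships with the repo.
model = AutoModel.from_pretrained(
    path,
    torch_dtype=torch.bfloat16,
    low_cpu_mem_usage=True,
    trust_remote_code=True,
).eval().cuda()
tokenizer = AutoTokenizer.from_pretrained(path, trust_remote_code=True)

# `load_image` is the helper defined earlier in the README; it resizes the
# image into up to `max_num` 448x448 tiles and returns a pixel tensor.
pixel_values = load_image("./example.jpg", max_num=6).to(torch.bfloat16).cuda()

# Placeholder question and generation settings for illustration.
generation_config = dict(num_beams=1, max_new_tokens=512, do_sample=False)
question = "Please describe the image in detail."
response = model.chat(tokenizer, pixel_values, question, generation_config)
print(response)
```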