czczup committed
Commit c65b4db • 1 Parent(s): bd1705a

Update README.md

Files changed (1):
  1. README.md +5 -2
README.md CHANGED
@@ -18,7 +18,10 @@ pipeline_tag: visual-question-answering
 
 > _Two interns holding hands, symbolizing the integration of InternViT and InternLM._
 
-\[[InternVL 1.5 Technical Report](https://arxiv.org/abs/2404.16821)\] \[[CVPR Paper](https://arxiv.org/abs/2312.14238)\] \[[GitHub](https://github.com/OpenGVLab/InternVL)\] \[[Chat Demo](https://internvl.opengvlab.com/)\] \[[中文解读](https://zhuanlan.zhihu.com/p/675877376)\]
+[\[🆕 Blog\]](https://internvl.github.io/blog/) [\[📜 InternVL 1.0 Paper\]](https://arxiv.org/abs/2312.14238) [\[📜 InternVL 1.5 Report\]](https://arxiv.org/abs/2404.16821) [\[🗨️ Chat Demo\]](https://internvl.opengvlab.com/) [\[🤗 HuggingFace Demo\]](https://huggingface.co/spaces/OpenGVLab/InternVL)
+
+[\[🚀 Quick Start\]](#model-usage) [\[🌐 Community-hosted API\]](https://rapidapi.com/adushar1320/api/internvl-chat) [\[📖 中文解读\]](https://zhuanlan.zhihu.com/p/675877376)
+
 
 You can run multimodal large models using a 1080Ti now.
 
@@ -55,7 +58,7 @@ As shown in the figure below, we adopted the same model architecture as InternVL
 
 ## Performance
 
-![image/png](https://cdn-uploads.huggingface.co/production/uploads/64119264f0f81eb569e0d569/BbsilHS8PjwZwlc330_g4.png)
+![image/png](https://cdn-uploads.huggingface.co/production/uploads/64119264f0f81eb569e0d569/ngl8oZvNrjItWtLUQqB2V.png)
 
 ## Model Usage
 
 