amitha/mllava-llama2-en-zh
Pipeline: Visual Question Answering
Libraries: Transformers, Safetensors
Dataset: LinkSoul/Chinese-LLaVA-Vision-Instructions
Languages: English, Chinese
Tags: llava_llama, llava, vlm, custom_code
arXiv: 2406.11665
License: apache-2.0
Commit History
Update README.md · d9d0008 · verified · amitha committed on Jun 19
Update README.md · 910fdd1 · verified · amitha committed on Jun 19
Upload LlavaLlamaForCausalLM · 657cf67 · verified · amitha committed on Jun 19
Update README.md · 0a732bf · verified · amitha committed on Jun 19
Update README.md · 5a8c844 · verified · amitha committed on Jun 19
Update README.md · 72edfb1 · verified · amitha committed on Jun 19
Update README.md · 73c3414 · verified · amitha committed on Jun 19
initial commit · 3ea8889 · verified · amitha committed on Jun 19