DavidNguyen committed on
Commit
d212107
1 Parent(s): 5f9460e

Update README.md

Files changed (1)
  1. README.md +0 -2
README.md CHANGED
@@ -19,8 +19,6 @@ Mixture of Experts (MoEs) plays an essential role in the development of more eff
 
 We have released five MoE algorithms trained with `microsoft/Phi-3-mini-4k-instruct` as the LLM backbone and `SigLIP` as the vision encoder. These models were trained on the [LLAVA-665K dataset](https://huggingface.co/datasets/liuhaotian/LLaVA-Instruct-150K). We evaluated these state-of-the-art algorithms on 11 benchmarks, examining various aspects of MoE algorithm performance.
 
-
- Replace `<link-to-LLAVA-665K>` with the actual URL to the LLAVA-665K dataset. Let me know if you need more adjustments!
 | Model | MoE Method | AI2D | Text VQA | GQA | Hallusion<br>Benchmark | MathVista<br>Validation | MMBench EN<br>dev | MMMU<br>Validation | MMStar | POPE | SQA IMG<br>Full | MME | AVG |
 |---------------------|---------------------|-------|----------|-------|-------------------------|-------------------------|---------------------|---------------------|--------|--------|------------------|-----------|-------|
 | SigLIP 224 + Phi3 | SMoE-R | 64.35 | 40.35 | 60.03 | **41.75** | 28.7 | 67.96 | 40.22 | 39.47 | 84.31 | 80.71 | 1,655.81 | 54.78 |
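For reference, the LLAVA-665K instruction mix linked in the README can be pulled directly from the Hub. Below is a minimal sketch, not taken from this commit: the filename `llava_v1_5_mix665k.json` is an assumption about the layout of the `liuhaotian/LLaVA-Instruct-150K` dataset repo.

```python
# Illustrative only: fetch the LLaVA-665K instruction mix referenced above.
# The filename llava_v1_5_mix665k.json is an assumption, not confirmed by this commit.
import json
from huggingface_hub import hf_hub_download

path = hf_hub_download(
    repo_id="liuhaotian/LLaVA-Instruct-150K",
    filename="llava_v1_5_mix665k.json",  # assumed filename
    repo_type="dataset",
)

with open(path) as f:
    samples = json.load(f)  # list of multimodal instruction-tuning records

print(len(samples))                 # roughly 665K samples if the mix file is used
print(samples[0]["conversations"])  # human/gpt turns for the first record
```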