MalayMMLU
Launched on 27 September 2024
English | Bahasa Melayu
Publication • Code • Poster
Introduction
MalayMMLU is the first Massive Multitask Language Understanding (MMLU) benchmark for the Malay language. The benchmark comprises 24,213 questions spanning Malaysia's primary (Year 1-6) and secondary (Form 1-5) education levels, organized into 5 broad topics that are divided into 22 subjects.
| Topic | Subjects |
|---|---|
| STEM | Computer Science (Secondary), Biology (Secondary), Chemistry (Secondary), Computer Literacy (Secondary), Mathematics (Primary, Secondary), Additional Mathematics (Secondary), Design and Technology (Primary, Secondary), Core Science (Primary, Secondary), Information and Communication Technology (Primary), Automotive Technology (Secondary) |
| Language | Malay Language (Primary, Secondary) |
| Social Sciences | Geography (Secondary), Local Studies (Primary), History (Primary, Secondary) |
| Others | Life Skills (Primary, Secondary), Principles of Accounting (Secondary), Economics (Secondary), Business (Secondary), Agriculture (Secondary) |
| Humanities | Quran and Sunnah Education (Secondary), Islamic Education (Primary, Secondary), Sports Science Knowledge (Secondary) |
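The taxonomy above (topics split into subjects, each tagged Primary or Secondary) lends itself to simple per-topic aggregation when analyzing the question set. A minimal sketch, assuming a hypothetical record layout with `topic`, `subject`, and `level` fields (these names are illustrative, not the published schema):

```python
from collections import Counter

# Each benchmark question carries its topic, subject, and education level,
# so question counts can be tallied per topic with a Counter.
questions = [
    {"topic": "STEM", "subject": "Biology", "level": "Secondary"},
    {"topic": "STEM", "subject": "Mathematics", "level": "Primary"},
    {"topic": "Language", "subject": "Malay Language", "level": "Primary"},
]

per_topic = Counter(q["topic"] for q in questions)
print(per_topic.most_common())  # [('STEM', 2), ('Language', 1)]
```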
Results
Zero-shot evaluation results on MalayMMLU (first-token accuracy)
| Organization | Model | Visual | Language | Humanities | STEM | Social Sciences | Others | Average |
|---|---|---|---|---|---|---|---|---|
| | Random | | 38.01 | 42.09 | 36.31 | 36.01 | 38.07 | 38.02 |
| OpenAI | GPT-4o | ✓ | 87.12 | 88.12 | 83.83 | 82.58 | 83.09 | 84.98 |
| | GPT-4 | ✓ | 82.90 | 83.91 | 78.80 | 77.29 | 77.33 | 80.11 |
| | GPT-4o mini | ✓ | 82.03 | 81.50 | 78.51 | 75.67 | 76.30 | 78.78 |
| | GPT-3.5 | | 69.62 | 71.01 | 67.17 | 66.70 | 63.73 | 67.78 |
| Meta | LLaMA-3.1 (70B) | | 78.75 | 82.59 | 78.96 | 77.20 | 75.32 | 78.44 |
| | LLaMA-3.1 (8B) | | 65.47 | 67.17 | 64.10 | 62.59 | 62.13 | 64.24 |
| | LLaMA-3 (8B) | | 63.93 | 66.21 | 62.26 | 62.97 | 61.38 | 63.46 |
| | LLaMA-2 (13B) | | 45.58 | 50.72 | 44.13 | 44.55 | 40.87 | 45.26 |
| | LLaMA-2 (7B) | | 47.47 | 52.74 | 48.71 | 50.72 | 48.19 | 49.61 |
| | LLaMA-3.2 (3B) | | 58.52 | 60.66 | 56.65 | 54.06 | 52.75 | 56.45 |
| | LLaMA-3.2 (1B) | | 38.88 | 43.30 | 40.65 | 40.56 | 39.55 | 40.46 |
| Qwen (Alibaba) | Qwen-2.5 (72B) | | 79.09 | 79.95 | 80.88 | 75.80 | 75.05 | 77.79 |
| | Qwen-2.5 (32B) | | 76.96 | 76.70 | 79.74 | 72.35 | 70.88 | 74.83 |
| | Qwen-2-VL (7B) | ✓ | 68.16 | 63.62 | 67.58 | 60.38 | 59.08 | 63.49 |
| | Qwen-2-VL (2B) | ✓ | 58.22 | 55.56 | 57.51 | 53.67 | 55.10 | 55.83 |
| | Qwen-1.5 (14B) | | 64.47 | 60.64 | 61.97 | 57.66 | 58.05 | 60.47 |
| | Qwen-1.5 (7B) | | 60.13 | 59.14 | 58.62 | 54.26 | 54.67 | 57.18 |
| | Qwen-1.5 (4B) | | 48.39 | 52.01 | 51.37 | 50.00 | 49.10 | 49.93 |
| | Qwen-1.5 (1.8B) | | 42.70 | 43.37 | 43.68 | 43.12 | 44.42 | 43.34 |
| Zhipu | GLM-4-Plus | | 78.04 | 75.63 | 77.49 | 74.07 | 72.66 | 75.48 |
| | GLM-4-Air | | 67.88 | 69.56 | 70.20 | 66.06 | 66.18 | 67.60 |
| | GLM-4-Flash | | 63.52 | 65.69 | 66.31 | 63.21 | 63.59 | 64.12 |
| | GLM-4 | | 63.39 | 56.72 | 54.40 | 57.24 | 55.00 | 58.07 |
| | GLM-4 (9B) | | 58.51 | 60.48 | 56.32 | 55.04 | 53.97 | 56.87 |
| Google | Gemma-2 (9B) | | 75.83 | 72.83 | 75.07 | 69.72 | 70.33 | 72.51 |
| | Gemma (7B) | | 45.53 | 50.92 | 46.13 | 47.33 | 46.27 | 47.21 |
| | Gemma (2B) | | 46.50 | 51.15 | 49.20 | 48.06 | 48.79 | 48.46 |
| SAIL (Sea) | Sailor (14B) | | 78.40 | 72.88 | 69.63 | 69.47 | 68.67 | 72.29 |
| | Sailor (7B) | | 74.54 | 68.62 | 62.79 | 64.69 | 63.61 | 67.58 |
| Cohere for AI | Command R (32B) | | 71.68 | 71.49 | 66.68 | 67.19 | 63.64 | 68.47 |
| OpenGVLab | InternVL2 (40B) | ✓ | 70.36 | 68.49 | 64.88 | 65.93 | 60.54 | 66.51 |
| Damo (Alibaba) | SeaLLM-v2.5 (7B) | | 69.75 | 67.94 | 65.29 | 62.66 | 63.61 | 65.89 |
| Mistral | Pixtral (12B) | ✓ | 64.81 | 62.68 | 64.72 | 63.93 | 59.49 | 63.25 |
| | Mistral Small (22B) | | 65.19 | 65.03 | 63.36 | 61.58 | 59.99 | 63.05 |
| | Mistral-v0.3 (7B) | | 56.97 | 59.29 | 57.14 | 58.28 | 56.56 | 57.71 |
| | Mistral-v0.2 (7B) | | 56.23 | 59.86 | 57.10 | 56.65 | 55.22 | 56.92 |
| Microsoft | Phi-3 (14B) | | 60.07 | 58.89 | 60.91 | 58.73 | 55.24 | 58.72 |
| | Phi-3 (3.8B) | | 52.24 | 55.52 | 54.81 | 53.70 | 51.74 | 53.43 |
| 01.AI | Yi-1.5 (9B) | | 56.20 | 53.36 | 57.47 | 50.53 | 49.75 | 53.08 |
| Stability AI | StableLM 2 (12B) | | 53.40 | 54.84 | 51.45 | 51.79 | 50.16 | 52.45 |
| | StableLM 2 (1.6B) | | 43.92 | 51.10 | 45.27 | 46.14 | 46.75 | 46.48 |
| Baichuan | Baichuan-2 (7B) | | 40.41 | 47.35 | 44.37 | 46.33 | 43.54 | 44.30 |
| Mesolitica | MaLLaM-v2 (5B) | | 42.57 | 46.44 | 42.24 | 40.82 | 38.74 | 42.08 |
| Yellow.ai | Komodo (7B) | | 43.62 | 45.53 | 39.34 | 39.75 | 39.48 | 41.72 |
Citation
@InProceedings{MalayMMLU2024,
author = {Poh, Soon Chang and Yang, Sze Jue and Tan, Jeraelyn Ming Li and Chieng, Lawrence Leroy Tze Yao and Tan, Jia Xuan and Yu, Zhenyu and Foong, Chee Mun and Chan, Chee Seng},
title = {MalayMMLU: A Multitask Benchmark for the Low-Resource Malay Language},
booktitle = {Findings of the Association for Computational Linguistics: EMNLP 2024},
month = {November},
year = {2024},
}
Feedback
Suggestions and opinions (whether positive or negative) are most welcome. Please get in touch by emailing cs.chan at um.edu.my.