Tags: Text Generation · Transformers · PyTorch · Thai · English · mpt · custom_code · text-generation-inference
Rasu23 committed
Commit f0a4dc0
1 Parent(s): 729a5f4

Update README.md

Files changed (1):
  1. README.md +11 -11
README.md CHANGED
@@ -92,19 +92,19 @@ Finetuning datasets are sourced from [LAION OIG chip2 and infill_dbpedia (Apache
 # Evaluation
 We performed human and machine evaluations on XQuAD zero-shot and one-shot settings:
 ## XQuAD
-| Model | Exact Match (Zero-shot) | F1 (Zero-shot) | Exact Match (One-shot) | F1 (One-shot) |
-|:--------------:|:-----------------------:|:--------------:|:----------------------:|:-------------:|
-| openthaigpt7B | 18.57 | 28.40 | 30.42 | 39.76 |
-| SeaLLM7B | 6.30 | 21.67 | 34.45 | 47.80 |
-| Typhoon-7b | - | 34.46 | - | **54.03** |
-| WangchanLion7B | **37.56** | **49.84** | **39.24** | 51.07 |
+| Model | F1 (Zero-shot) | F1 (One-shot) |
+|:--------------:|:--------------:|:-------------:|
+| openthaigpt7B | 27.3487 | 34.3104 |
+| SeaLLM7B V2 | 16.1104 | 25.7399 |
+| Typhoon-7b | 34.46 | **54.03** |
+| WangchanLion7B | **45.8763** | 49.9145 |
 
 ## iAPP Wiki QA
-| Model | Exact Match (Zero-shot) | F1 (Zero-shot) | Exact Match (One-shot) | F1 (One-shot) |
-|:--------------:|:-----------------------:|:--------------:|:----------------------:|:-------------:|
-| openthaigpt7B | 22.06 | 40.07 | 31.39 | 47.98 |
-| SeaLLM7B | 8.25 | 34.40 | 40.05 | 58.27 |
-| WangchanLion7B | **55.48** | **67.93** | **56.43** | **68.85** |
+| Model | F1 (Zero-shot) | F1 (One-shot) |
+|:--------------:|:--------------:|:-------------:|
+| openthaigpt7B | 40.0614 | 46.6883 |
+| SeaLLM7B V2 | 23.6425 | 28.9934 |
+| WangchanLion7B | **58.9051** | **62.9776** |
 
 # What WangchanLion offers:
 - Transparent pretrained model: The development of SEA-LION is community-driven, with different ASEAN collaborators contributing pretraining datasets. The SEA-LION developers ensure that all datasets are safe and can be utilized without commercial restrictions. This transparency extends to the provision of pretraining code, ensuring anyone can replicate SEA-LION using the provided datasets.
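The tables above report Exact Match and F1 in the usual SQuAD/XQuAD style. A minimal sketch of those two metrics is below — this is not the authors' evaluation script, and the whitespace tokenization is an assumption for illustration; scoring Thai answers would need a proper word segmenter.

```python
from collections import Counter

def exact_match(prediction: str, reference: str) -> float:
    # 1.0 only when the stripped strings are identical, else 0.0.
    return float(prediction.strip() == reference.strip())

def token_f1(prediction: str, reference: str) -> float:
    # Token-overlap F1: harmonic mean of precision and recall over
    # tokens shared between prediction and reference. Whitespace
    # tokenization is a simplification; Thai has no word spaces,
    # so real evaluation would use a segmenter.
    pred_tokens = prediction.split()
    ref_tokens = reference.split()
    common = Counter(pred_tokens) & Counter(ref_tokens)
    num_same = sum(common.values())
    if num_same == 0:
        return 0.0
    precision = num_same / len(pred_tokens)
    recall = num_same / len(ref_tokens)
    return 2 * precision * recall / (precision + recall)

print(exact_match("Bangkok", "Bangkok"))                          # 1.0
print(round(token_f1("the capital is Bangkok", "Bangkok"), 4))    # 0.4
```

Dataset-level scores such as those in the tables are then the mean of these per-example values over all questions.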