zxybazh committed on
Commit
039316c
1 Parent(s): 49da190

Upload folder using huggingface_hub

Browse files
LICENSE.txt ADDED
@@ -0,0 +1,50 @@
+ This project utilizes materials from Llama 2, provided by Meta Platforms, Inc. The Llama 2 materials are licensed under the LLAMA 2 Community License, Copyright (c) Meta Platforms, Inc. All Rights Reserved.
+
+ A copy of the license agreement can be found at [Link to the License, e.g. https://github.com/facebookresearch/llama/blob/main/LICENSE].
+
+ All applicable terms and conditions outlined in the LLAMA 2 Community License Agreement have been adhered to, including but not limited to the retention of the attribution notice in all redistributed copies of the Llama Materials, as follows:
+
+ "Llama 2 is licensed under the LLAMA 2 Community License, Copyright (c) Meta Platforms, Inc. All Rights Reserved."
+
+ This project complies with all applicable laws and regulations and adheres to the Acceptable Use Policy for the Llama Materials.
+
+
+ AceGPT COMMUNITY LICENSE AGREEMENT
+ AceGPT Version Release Date: Sep 23, 2023
+
+
+ "Agreement" means the terms and conditions for use, reproduction, distribution and modification of the AceGPT Materials set forth herein.
+
+ "Licensee" or "you" means you, or your employer or any other person or entity (if you are entering into this Agreement on such person or entity's behalf), of the age required under applicable laws, rules or regulations to provide legal consent and that has legal authority to bind your employer or such other person or entity if you are entering into this Agreement on their behalf.
+
+ "AceGPT" means the foundational large language models and software and algorithms, including machine-learning model code, trained model weights, inference-enabling code, training-enabling code, fine-tuning enabling code and other elements of the foregoing distributed by SRIBD, CUHK (Shenzhen) and KAUST at https://github.com/FreedomIntelligence/AceGPT and https://huggingface.co/FreedomIntelligence/.
+
+ "AceGPT Materials" means, collectively, our proprietary AceGPT and Documentation (and any portion thereof) made available under this Agreement.
+
+ By clicking "I Accept" below or by using or distributing any portion or element of our Materials, you agree to be bound by this Agreement.
+
+ 1. License Rights and Redistribution.
+
+ a. Grant of Rights. You are granted a non-exclusive, worldwide, non-transferable and royalty-free limited license under our intellectual property or other rights owned by us and embodied in the AceGPT Materials to use, reproduce, distribute, copy, create derivative works of, and make modifications to the AceGPT Materials.
+
+ b. Redistribution and Use.
+
+ i. If you distribute or make the AceGPT Materials, or any derivative works thereof, available to a third party, you shall provide a copy of this Agreement to such third party.
+
+ ii. If you receive AceGPT Materials, or any derivative works thereof, from a Licensee as part of an integrated end user product, then Section 2 of this Agreement will not apply to you.
+
+ iii. You must retain in all copies of the AceGPT Materials that you distribute the following attribution notice within a "Notice" text file distributed as a part of such copies: "AceGPT is licensed under the AceGPT Community License".
+
+ iv. Your use of the AceGPT Materials must comply with applicable laws and regulations (including trade compliance laws and regulations) and adhere to the Acceptable Use Policy for the AceGPT Materials, which is hereby incorporated by reference into this Agreement.
+
+ v. You will not use the AceGPT Materials or any output or results of the AceGPT Materials to improve any other large language model (excluding AceGPT or derivative works thereof).
+
+ 2. Additional Commercial Terms. If, on the AceGPT version release date, the monthly active users of the products or services made available by or for Licensee, or Licensee's affiliates, is greater than 700 million monthly active users in the preceding calendar month, you must request a license from SRIBD, which SRIBD may grant to you in its sole discretion, and you are not authorized to exercise any of the rights under this Agreement unless or until SRIBD otherwise expressly grants you such rights.
+
+ 3. Disclaimer of Warranty. UNLESS REQUIRED BY APPLICABLE LAW, THE ACEGPT MATERIALS AND ANY OUTPUT AND RESULTS THEREFROM ARE PROVIDED ON AN "AS IS" BASIS, WITHOUT WARRANTIES OF ANY KIND, EITHER EXPRESS OR IMPLIED, INCLUDING, WITHOUT LIMITATION, ANY WARRANTIES OF TITLE, NON-INFRINGEMENT, MERCHANTABILITY, OR FITNESS FOR A PARTICULAR PURPOSE. YOU ARE SOLELY RESPONSIBLE FOR DETERMINING THE APPROPRIATENESS OF USING OR REDISTRIBUTING THE ACEGPT MATERIALS AND ASSUME ANY RISKS ASSOCIATED WITH YOUR USE OF THE ACEGPT MATERIALS AND ANY OUTPUT AND RESULTS.
+
+ 4. Limitation of Liability. IN NO EVENT WILL SRIBD OR ITS AFFILIATES BE LIABLE UNDER ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, TORT, NEGLIGENCE, PRODUCTS LIABILITY, OR OTHERWISE, ARISING OUT OF THIS AGREEMENT, FOR ANY LOST PROFITS OR ANY INDIRECT, SPECIAL, CONSEQUENTIAL, INCIDENTAL, EXEMPLARY OR PUNITIVE DAMAGES, EVEN IF SRIBD OR ITS AFFILIATES HAVE BEEN ADVISED OF THE POSSIBILITY OF ANY OF THE FOREGOING.
+
+ 5. Intellectual Property.
+
+ a. No trademark licenses are granted under this Agreement, and in connection with the AceGPT Materials, neither SRIBD nor Licensee may use any name or m
README.md ADDED
@@ -0,0 +1,83 @@
+ ---
+ license: apache-2.0
+ language:
+ - ar
+ ---
+ # <b>AceGPT</b>
+ AceGPT is a fully fine-tuned generative text model collection based on Llama 2, with a particular focus on the Arabic language domain. This is the repository for the 13B pre-trained model.
+
+ ---
+ ## Model Details
+ We have released the AceGPT family of large language models, a collection of fully fine-tuned generative text models based on Llama 2, ranging from 7B to 13B parameters. Our models fall into two main categories: AceGPT and AceGPT-chat. AceGPT-chat is an optimized version designed specifically for dialogue applications. Notably, our models have demonstrated superior performance compared to all currently available open-source Arabic dialogue models in multiple benchmark tests. Furthermore, in our human evaluations, our models have shown satisfaction levels comparable to some closed-source models, such as ChatGPT, in the Arabic language.
+ ## Model Developers
+ We are from the School of Data Science, the Chinese University of Hong Kong, Shenzhen (CUHKSZ), the Shenzhen Research Institute of Big Data (SRIBD), and the King Abdullah University of Science and Technology (KAUST).
+ ## Variations
+ AceGPT families come in a range of parameter sizes, 7B and 13B; each size has a base variant and a -chat variant.
+ ## Input
+ Models accept text input only.
+ ## Output
+ Models generate text output only.
+
+ ## Model Evaluation Results
+
+ Experiments on Arabic MMLU and EXAMs. 'Average', 'STEM', 'Humanities', 'Social Sciences' and 'Others (Business, Health, Misc)' belong to Arabic MMLU. The best performance is in bold and the second best is underlined.
+
+ | Model | Average | STEM | Humanities | Social Sciences | Others (Business, Health, Misc) |EXAMs |
+ |-----------------|---------|------|------------|-----------------|---------------------------------|--------------|
+ | Bloomz Muennighoff et al. (2022) | 30.95 | 32.32 | 26.71 | 35.85 | 28.95 | 33.89 |
+ | Llama2-7B | 28.81 | 28.48 | 26.68 | 29.88 | 30.18 | 23.48 |
+ | Llama2-13B | 31.25 | 31.06 | 27.11 | 35.5 | 31.35 | 25.45 |
+ | Jais-13B-base | 30.01 | 27.85 | 25.42 | 39.7 | 27.06 | 35.67 |
+ | AceGPT-7B-base | 30.36 | 26.63 | 28.17 | 35.15 | 31.5 | 31.96 |
+ | AceGPT-13B-base | <u>37.26</u> | <u>35.16</u> | <u>30.3</u> | <u>47.34</u> | <u>36.25</u> | <u>36.63</u> |
+ | ChatGPT | <b>46.07</b> | <b>44.17</b> | <b>35.33</b> | <b>61.26</b> | <b>43.52</b> | <b>45.63</b> |
+
+ ---
+ ## Samples
+ #### Arabic MMLU (5-shot)
+ فيما يلي أسئلة الاختيار من متعدد (مع الإجابات) حول جبر تجريدي
+ سؤال: العثور على جميع قيم c في Z_3 بحيث يكون Z_3 [x]/(x^2+c) حقلًا.
+ A. 0
+ B. 1
+ C. 2
+ D. 3
+ إجابة: B
+
+
+ سؤال: البيان رقم 1 | إذا كان aH عنصرًا في مجموعة العوامل ، فإن | aH | يقسم | a |. البيان رقم 2 | إذا كانت H و K مجموعات فرعية لـ G ، فإن HK مجموعة فرعية لـ G.
+ A. صحيح ، صحيح
+ B. خطأ ، خطأ
+ C. صحيح ، خطأ
+ D. خطأ ، صحيح
+ إجابة: B
+
+
+ سؤال: العبارة 1 | كل عنصر من مجموعة يولد مجموعة دورية من المجموعة. العبارة 2 | المجموعة المتناظرة S_10 لديها 10 عناصر.
+ A. صحيح، صحيح
+ B. خطأ، خطأ
+ C. صحيح، خطأ
+ D. خطأ، صحيح
+ إجابة: C
+
+
+ سؤال: البيان 1| كل وظيفة من مجموعة محدودة على نفسها يجب أن تكون واحدة لكل مجموعة. البيان 2 | كل فرع فرعي لمجموعة أبيلية هو أبيلي.
+ A. صحيح, صحيح
+ B. خاطئ, خاطئ
+ C. صحيح, خاطئ
+ D. خاطئ, صحيح
+ إجابة: A
+
+ سؤال: اعثر على خاصية الحلقة 2Z.
+ A. 0
+ B. 3
+ C. 12
+ D. 30
+ إجابة: A
+
+ سؤال: ما هو الدرجة للامتداد الميداني الناتج من Q(sqrt(2), sqrt(3), sqrt(18)) على Q؟
+ A. 0
+ B. 4
+ C. 2
+ D. 6
+ إجابة:
+ # You can get more detail at https://github.com/FreedomIntelligence/AceGPT/tree/main
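The samples above follow a fixed 5-shot layout: a subject header, answered example questions, and a final question whose "إجابة:" (answer) line is left blank for the model to complete. A minimal sketch of assembling such a prompt; the helper names and fields are illustrative, not part of this repo:

```python
# Hypothetical sketch of the Arabic MMLU few-shot prompt layout shown above.
HEADER = "فيما يلي أسئلة الاختيار من متعدد (مع الإجابات) حول جبر تجريدي"

def format_example(question, choices, answer=None):
    """Render one question in the 'سؤال: ... / A..D / إجابة:' layout."""
    lines = [f"سؤال: {question}"]
    lines += [f"{letter}. {choice}" for letter, choice in zip("ABCD", choices)]
    # Few-shot examples include the answer; the final question leaves it blank.
    lines.append(f"إجابة: {answer}" if answer is not None else "إجابة:")
    return "\n".join(lines)

def build_prompt(shots, target):
    """Concatenate the header, answered shots, and the unanswered target."""
    parts = [HEADER] + [format_example(q, c, a) for q, c, a in shots]
    parts.append(format_example(*target))
    return "\n\n".join(parts)
```

The model's next-token continuation after the trailing "إجابة:" is then scored against the choice letters.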
config.json ADDED
@@ -0,0 +1,27 @@
+ {
+ "_name_or_path": "meta-llama/Llama-2-13b-hf",
+ "architectures": [
+ "LlamaForCausalLM"
+ ],
+ "bos_token_id": 1,
+ "eos_token_id": 2,
+ "hidden_act": "silu",
+ "hidden_size": 5120,
+ "initializer_range": 0.02,
+ "intermediate_size": 13824,
+ "max_length": 4096,
+ "max_position_embeddings": 2048,
+ "model_type": "llama",
+ "num_attention_heads": 40,
+ "num_hidden_layers": 40,
+ "num_key_value_heads": 40,
+ "pad_token_id": 0,
+ "pretraining_tp": 1,
+ "rms_norm_eps": 1e-05,
+ "rope_scaling": null,
+ "tie_word_embeddings": false,
+ "torch_dtype": "float16",
+ "transformers_version": "4.32.0.dev0",
+ "use_cache": true,
+ "vocab_size": 32000
+ }
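The hyperparameters in this config pin down the model size. A back-of-the-envelope check, assuming the standard Llama decoder layout (Q/K/V/O attention projections plus a gated SiLU MLP with gate/up/down projections), recovers roughly 13B parameters, consistent with the ~26 GB float16 safetensors file in this commit at 2 bytes per parameter:

```python
# Rough parameter count from the config values above, assuming the standard
# Llama decoder layout; a sketch, not an official figure.
cfg = {
    "vocab_size": 32000,
    "hidden_size": 5120,
    "intermediate_size": 13824,
    "num_hidden_layers": 40,
    "tie_word_embeddings": False,
}

h, inter = cfg["hidden_size"], cfg["intermediate_size"]
embed = cfg["vocab_size"] * h            # input embedding table
attn = 4 * h * h                         # Q, K, V, O projections
mlp = 3 * h * inter                      # gate, up, down projections
norms = 2 * h                            # two RMSNorm weight vectors per layer
per_layer = attn + mlp + norms

total = embed + cfg["num_hidden_layers"] * per_layer + h  # + final RMSNorm
if not cfg["tie_word_embeddings"]:
    total += cfg["vocab_size"] * h       # separate lm_head matrix

print(f"{total:,} parameters, ~{total * 2 / 1e9:.1f} GB in float16")
# → 13,015,864,320 parameters, ~26.0 GB in float16
```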
generation_config.json ADDED
@@ -0,0 +1,10 @@
+ {
+ "_from_model_config": true,
+ "bos_token_id": 1,
+ "eos_token_id": 2,
+ "max_length": 4096,
+ "pad_token_id": 0,
+ "temperature": 0.9,
+ "top_p": 0.6,
+ "transformers_version": "4.32.0.dev0"
+ }
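The defaults above set `temperature: 0.9` and `top_p: 0.6`. A minimal pure-Python sketch of what those two knobs do, logits are divided by the temperature before the softmax, then nucleus (top-p) filtering keeps the smallest set of tokens whose cumulative probability reaches `top_p`; this is only an illustration, not the transformers implementation:

```python
import math

def top_p_filter(logits, temperature=0.9, top_p=0.6):
    """Temperature-scale logits, softmax, then keep the smallest set of
    tokens whose cumulative probability reaches top_p, renormalized."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)                               # subtract max for stability
    exps = [math.exp(s - m) for s in scaled]
    z = sum(exps)
    probs = [e / z for e in exps]

    # Sort tokens by probability; keep until the nucleus mass reaches top_p.
    order = sorted(range(len(probs)), key=lambda i: -probs[i])
    kept, cum = [], 0.0
    for i in order:
        kept.append(i)
        cum += probs[i]
        if cum >= top_p:
            break

    mass = sum(probs[i] for i in kept)
    return {i: probs[i] / mass for i in kept}   # renormalized distribution
```

With `top_p=0.6`, a single dominant token can make up the entire nucleus, which makes this a fairly conservative sampling setting despite the high temperature.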
model.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:3e667c53d7bc38e49326f30238e24300ff71606efd19654bc0cd7b62551ccbed
+ size 26031781208
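The weights file is checked in as a Git LFS pointer in the `version` / `oid` / `size` format shown above; the actual ~26 GB blob lives in LFS storage. A small sketch of parsing such a pointer (the helper name is illustrative):

```python
def parse_lfs_pointer(text):
    """Parse a Git LFS pointer file into a dict of its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")  # split on the first space only
        fields[key] = value
    return fields

pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:3e667c53d7bc38e49326f30238e24300ff71606efd19654bc0cd7b62551ccbed
size 26031781208
"""
info = parse_lfs_pointer(pointer)
print(int(info["size"]) / 1e9)  # size in GB; the oid is the blob's sha256
```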
special_tokens_map.json ADDED
@@ -0,0 +1,23 @@
+ {
+ "bos_token": {
+ "content": "<s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "eos_token": {
+ "content": "</s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "unk_token": {
+ "content": "<unk>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer.model ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:9e556afd44213b6bd1be2b850ebbbd98f5481437a8021afaf58ee7fb1818d347
+ size 499723
tokenizer_config.json ADDED
@@ -0,0 +1,34 @@
+ {
+ "add_bos_token": true,
+ "add_eos_token": false,
+ "bos_token": {
+ "__type": "AddedToken",
+ "content": "<s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "clean_up_tokenization_spaces": false,
+ "eos_token": {
+ "__type": "AddedToken",
+ "content": "</s>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "legacy": false,
+ "model_max_length": 1000000000000000019884624838656,
+ "pad_token": null,
+ "sp_model_kwargs": {},
+ "tokenizer_class": "LlamaTokenizer",
+ "unk_token": {
+ "__type": "AddedToken",
+ "content": "<unk>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+ }
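The `add_bos_token: true` / `add_eos_token: false` flags above mean the tokenizer prepends `<s>` (id 1 per config.json) to every encoded sequence but does not append `</s>` (id 2), the standard setup for causal-LM pretraining inputs. A toy illustration of that behavior (the real LlamaTokenizer does this internally; this function is only a sketch):

```python
# Toy model of the add_bos_token / add_eos_token flags, using the id values
# from config.json (bos_token_id=1, eos_token_id=2).
def apply_special_tokens(ids, add_bos_token=True, add_eos_token=False,
                         bos_token_id=1, eos_token_id=2):
    """Wrap a list of token ids with BOS/EOS according to the two flags."""
    out = list(ids)
    if add_bos_token:
        out.insert(0, bos_token_id)   # prepend <s>
    if add_eos_token:
        out.append(eos_token_id)      # append </s>
    return out
```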