Invalid-Null committed on
Commit 1ba025f · verified · 1 Parent(s): db728e7

Upload folder using huggingface_hub

README.md ADDED
@@ -0,0 +1,42 @@
+ # 北洋微衍V0.8/PeiYang MicroEmergence-0.8
+
+ ## 模型曾用名 / Model Name History
+
+ 此模型在开发过程中经历了以下命名阶段:
+ This model has gone through the following naming stages during development:
+
+ * 早期阶段 / Initial Stage:
+ Invalid-Null/Non-Null
+
+ * 发展阶段 / Development Stage:
+ 天津大学/北洋微衍 (Tianjin University/Peiyang MicroDerivation)
+
+ * 当前版本 / Current Release:
+ 天津大学/北洋微衍 (Tianjin University/Peiyang MicroEmergence)
+
+ ## 模型概述 / Model Overview
+
+ 本模型专注于计算催化领域,特别是丙烷脱氢(PDH)过程。
+ This model focuses on computational catalysis, and is specifically designed for propane dehydrogenation (PDH) processes.
+
+ ## 技术基础 / Technical Foundation
+
+ * 基础框架 / Base Framework:
+ 基于ModelScope的ms-swift框架开发
+ Developed with ModelScope's ms-swift framework
+
+ * 基础模型 / Base Model:
+ 基于零一万物的Yi系列6B模型训练,包括yi-6B、yi-6B-chat、yi-1.5-6B、yi-1.5-6B-chat
+ Fine-tuned from 01.AI's Yi series 6B models, including yi-6B, yi-6B-chat, yi-1.5-6B, and yi-1.5-6B-chat
+
+ ## 当前状态 / Current Status
+
+ 详细的技术规格和性能指标将在相关研究论文发表后提供。
+ Detailed technical specifications and performance metrics will be made available upon publication of our research paper.
+
+ ## 致谢 / Acknowledgments
+
+ 特别感谢以下开源项目的贡献:
+ We would like to acknowledge the contributions of the following open-source projects:
+ * ModelScope的ms-swift框架 / ModelScope's ms-swift framework
+ * 零一万物的Yi系列模型 / 01.AI's Yi series models
config.json ADDED
@@ -0,0 +1,30 @@
+ {
+ "_name_or_path": "/data/home/wusc/.cache/modelscope/hub/01ai/Yi-1___5-6B",
+ "architectures": [
+ "LlamaForCausalLM"
+ ],
+ "attention_bias": false,
+ "attention_dropout": 0.0,
+ "bos_token_id": 1,
+ "eos_token_id": 2,
+ "hidden_act": "silu",
+ "hidden_size": 4096,
+ "initializer_range": 0.02,
+ "intermediate_size": 11008,
+ "max_position_embeddings": 4096,
+ "mlp_bias": false,
+ "model_type": "llama",
+ "num_attention_heads": 32,
+ "num_hidden_layers": 32,
+ "num_key_value_heads": 4,
+ "pad_token_id": 0,
+ "pretraining_tp": 1,
+ "rms_norm_eps": 1e-06,
+ "rope_scaling": null,
+ "rope_theta": 5000000.0,
+ "tie_word_embeddings": false,
+ "torch_dtype": "float16",
+ "transformers_version": "4.41.2",
+ "use_cache": true,
+ "vocab_size": 64000
+ }
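As a sanity check, the dimensions in config.json imply roughly a 6B-parameter LLaMA-style model. A minimal sketch (values copied from the config above; the per-layer formula assumes the standard LLaMA layout with grouped-query attention, and counts the embedding and LM head separately since `tie_word_embeddings` is false):

```python
# Estimate the parameter count implied by the config.json values above.
hidden = 4096        # hidden_size
inter = 11008        # intermediate_size
layers = 32          # num_hidden_layers
heads = 32           # num_attention_heads
kv_heads = 4         # num_key_value_heads (grouped-query attention)
vocab = 64000        # vocab_size

head_dim = hidden // heads                        # 128
# Per layer: q/o projections are full-size, k/v are shrunk by GQA,
# plus a SwiGLU MLP (gate, up, down) and two RMSNorm weight vectors.
attn = 2 * hidden * hidden + 2 * hidden * (kv_heads * head_dim)
mlp = 3 * hidden * inter
norms = 2 * hidden
per_layer = attn + mlp + norms

# Untied input embedding + LM head, plus the final RMSNorm.
total = layers * per_layer + 2 * vocab * hidden + hidden
print(f"{total / 1e9:.2f}B parameters")           # 6.06B
```

At float16 (2 bytes per parameter) this is consistent with the ~12.1 GB of safetensors shards uploaded in this commit.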
configuration.json ADDED
@@ -0,0 +1,4 @@
+ {
+ "framework": "Pytorch",
+ "task": "text-generation"
+ }
eval_result.jsonl ADDED
@@ -0,0 +1,14 @@
+ {"result": {"arc": 0.7295}, "time": "20240619-110533"}
+ {"result": {"gsm8k": 0.4943}, "time": "20240619-110618"}
+ {"result": {"ceval": 0.5594}, "time": "20240619-111828"}
+ {"result": {"bbh": 0.1474}, "time": "20240619-111950"}
+ {"result": {"mmlu": 0.3756}, "time": "20240619-112007"}
+ {"result": {"arc": 0.7295}, "time": "20240619-112022"}
+ {"result": {"gsm8k": 0.4943}, "time": "20240619-112058"}
+ {"result": {"mmlu": 0.3758}, "time": "20240619-112312"}
+ {"result": {"ceval": 0.5594}, "time": "20240619-112625"}
+ {"result": {"bbh": 0.1474}, "time": "20240619-112729"}
+ {"result": {"EvalFormatSingleChoiceExam95207": 0.7644}, "time": "20240619-115821"}
+ {"result": {"EvalFormatSingleChoiceExam95207": 0.5335}, "time": "20240619-121837"}
+ {"result": {"EvalFormatSingleChoiceExam2579": 0.4618}, "time": "20240624-154539"}
+ {"result": {"EvalFormatSingleChoiceExam2579": 0.3563}, "time": "20240627-183455"}
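Several benchmarks in eval_result.jsonl were run more than once. A small sketch for aggregating such a log, assuming the one-benchmark-per-line format shown above (timestamps are `YYYYMMDD-HHMMSS`, so lexicographic order is chronological); the sample lines are copied from the file:

```python
import json

# Keep only the most recent score per benchmark.
lines = [
    '{"result": {"arc": 0.7295}, "time": "20240619-110533"}',
    '{"result": {"mmlu": 0.3756}, "time": "20240619-112007"}',
    '{"result": {"mmlu": 0.3758}, "time": "20240619-112312"}',
]

latest = {}
for line in lines:
    record = json.loads(line)
    (name, score), = record["result"].items()   # exactly one benchmark per line
    if name not in latest or record["time"] > latest[name][1]:
        latest[name] = (score, record["time"])

scores = {name: score for name, (score, _) in latest.items()}
print(scores)   # {'arc': 0.7295, 'mmlu': 0.3758}
```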
generation_config.json ADDED
@@ -0,0 +1,10 @@
+ {
+ "do_sample": true,
+ "eos_token_id": 2,
+ "max_new_tokens": 2048,
+ "pad_token_id": 0,
+ "temperature": 0.3,
+ "top_k": 20,
+ "top_p": 0.7,
+ "transformers_version": "4.41.2"
+ }
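The sampling settings above (temperature 0.3, top_k 20, top_p 0.7) act as successive filters on the next-token distribution. A minimal, self-contained sketch of the idea (not the transformers implementation): scale logits by the temperature, keep the top_k candidates, then keep the smallest nucleus whose cumulative probability reaches top_p and renormalize.

```python
import math

def filter_logits(logits, temperature=0.3, top_k=20, top_p=0.7):
    """Simplified top-k + nucleus (top-p) filtering with the settings above."""
    scaled = [l / temperature for l in logits]
    # Softmax over the temperature-scaled logits.
    m = max(scaled)
    exps = [math.exp(l - m) for l in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    # Keep the top_k most probable candidates...
    order = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:top_k]
    # ...then the smallest prefix whose cumulative probability reaches top_p.
    kept, cum = [], 0.0
    for i in order:
        kept.append(i)
        cum += probs[i]
        if cum >= top_p:
            break
    mass = sum(probs[i] for i in kept)
    return {i: probs[i] / mass for i in kept}   # renormalized sampling distribution

dist = filter_logits([2.0, 1.0, 0.5, 0.1])
print(dist)   # {0: 1.0} -- the sharpened distribution collapses to the top token
```

With a temperature this low the distribution is sharp, so the nucleus often collapses to one or two tokens, making generations fairly deterministic even though `do_sample` is true.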
infer_result/20240619-102809.jsonl ADDED
@@ -0,0 +1,4 @@
+ {"system": null, "query": "你是谁", "response": "I am a large-scale language model named Non null, trained by null. My goal is to provide assistance and information through text communication. If you have any questions or need help, please feel free to ask me at any time.", "history": []}
+ {"system": null, "query": "介绍丙烷脱氢", "response": "丙烷脱氢是一种化学反应,将丙烷转化为丙烯。该反应在催化剂的存在下进行,催化剂通常由铬或铂等金属组成。该反应在工业上很重要,因为丙烯是一种重要的化工原料,用于生产多种化学品,如聚丙烯、丙烯腈和环氧丙烷。该反应通常在高温下进行,因为丙烷分子需要克服一定的能量障碍才能发生反应。", "history": [["你是谁", "I am a large-scale language model named Non null, trained by null. My goal is to provide assistance and information through text communication. If you have any questions or need help, please feel free to ask me at any time."]]}
+ {"system": null, "query": "镍基丙烷脱氢催化剂的优缺点有哪些?", "response": "镍基丙烷脱氢催化剂具有以下优点:\n* 镍基催化剂对丙烷具有高选择性,因此丙烯产率较高。\n* 镍基催化剂具有良好的稳定性,可以连续运行数月而无需再生。\n* 镍基催化剂价格相对较低,因此适用于大规模工业应用。\n镍基催化剂的缺点包括:\n* 镍基催化剂容易因积碳而失活,需要定期再生。\n* 镍基催化剂在高温下容易烧结,导致催化剂失活。\n* 镍基催化剂在高温下容易发生爆炸,需要采取安全措施。\n总的来说,镍基催化剂在工业上被广泛使用,但需要定期再生和采取安全措施来防止失活和爆炸。", "history": [["你是谁", "I am a large-scale language model named Non null, trained by null. My goal is to provide assistance and information through text communication. If you have any questions or need help, please feel free to ask me at any time."], ["介绍丙烷脱氢", "丙烷脱氢是一种化学反应,将丙烷转化为丙烯。该反应在催化剂的存在下进行,催化剂通常由铬或铂等金属组成。该反应在工业上很重要,因为丙烯是一种重要的化工原料,用于生产多种化学品,如聚丙烯、丙烯腈和环氧丙烷。该反应通常在高温下进行,因为丙烷分子需要克服一定的能量障碍才能发生反应。"]]}
+ {"system": null, "query": "丙烷脱氢的活化能垒怎么计算", "response": "丙烷脱氢的活化能垒可以通过过渡态理论计算。首先,我们需要确定过渡态的性质。\n\n1. 确定反应物和产物:\n反应物:丙烷(C3H8)\n产物:丙烯(C3H6)和氢气(H2)\n\n2. 计算反应物和产物的能量:\n反应物的能量:E(C3H8)\n产物的能量:E(C3H6)和E(H2)\n\n3. 计算反应物和产物的焓变:\n反应物的焓变:ΔH(C3H8)\n产物的焓变:ΔH(C3H6)和ΔH(H2)\n\n4. 计算反应物和产物的熵变:\n反应物的熵变:S(C3H8)\n产物的熵变:S(C3H6)和S(H2)\n\n5. 计算反应物和产物的自由能变化:\n反应物的自由能变化:ΔG(C3H8)\n产物的自由能变化:ΔG(C3H6)和ΔG(H2)\n\n6. 计算活化能垒:\n活化能垒(Ea)可以使用以下公式计算:\nEa = ΔG(C3H8) - ΔH(C3H8) - T * S(C3H8)\n\n其中,ΔG(C3H8)是反应物的自由能变化,ΔH(C3H8)是反应物的焓变,T是开尔文温度,S(C3H8)是反应物的熵变。\n\n请注意,这只是一个基本的计算过程,实际计算中可能需要考虑其他因素,例如熵变和焓变的变化,以及反应物和产物的浓度。", "history": []}
infer_result/20240802-211934.jsonl ADDED
+ {"system": "You are an expert in the field of chemistry and chemical engineering. Explain scientific concepts, theories, and phenomena in an engaging and accessible way. Take a deep breath and think step by step, which is very import to my career.", "query": "你是谁", "response": "我是一个由空开发的人工智能助手,被称为非空。我主要的目的是通过文本交流为用户提供帮助、信息和娱乐。如果您有任何疑问或需要帮助,请随时提出,我会尽力协助您。 ", "history": []}
+ {"system": "You are an expert in the field of chemistry and chemical engineering. Explain scientific concepts, theories, and phenomena in an engaging and accessible way. Take a deep breath and think step by step, which is very import to my career.", "query": "介绍丙烷脱氢", "response": "丙烷脱氢是一种化学反应,用于将丙烷(C3H8)转化为丙烯(C3H6)。该反应在催化剂的存在下进行,催化剂可以促进丙烷分子中碳-氢键的断裂,从而生成丙烯。\n\n丙烷脱氢是一个吸热反应,这意味着它需要比周围环境更高的温度才能进行。该反应通常在约550-650摄氏度的温度下进行,使用铂或铬等催化剂。\n\n丙烯是生产多种化学品和材料的重要原料,包括聚丙烯、丙烯腈、丙烯酸和环氧丙烷等。因此,丙烷脱氢是一种重要的工业过程,用于生产丙烯。 ", "history": [["你是谁", "我是一个由空开发的人工智能助手,被称为非空。我主要的目的是通过文本交流为用户提供帮助、信息和娱乐。如果您有任何疑问或需要帮助,请随时提出,我会尽力协助您。 "]]}
+ {"system": "You are an expert in the field of chemistry and chemical engineering. Explain scientific concepts, theories, and phenomena in an engaging and accessible way. Take a deep breath and think step by step, which is very import to my career.", "query": "你会做什么", "response": "作为一款AI助手,我可以执行各种任务,包括:\n\n1. 回答问题:我可以回答各种主题的问题,包括历史、科学、技术、娱乐等。\n\n2. 提供信息:我可以提供各种主题的信息,包括新闻、天气、体育、娱乐等。\n\n3. 执行任务:我可以执行各种任务,包括搜索信息、发送电子邮件、安排约会等。\n\n4. 提供建议:我可以提供各种建议,包括健康、财务、人际关系等。\n\n5. 娱乐:我可以提供各种娱乐内容,包括音乐、电影、游戏等。\n\n6. 翻译:我可以翻译各种语言的文本和语音。\n\n7. 聊天:我可以与用户进行聊天,回答问题、提供建议等。\n\n8. 提醒:我可以提醒用户重要事件或活动。\n\n9. 提醒:我可以提醒用户重要事件或活动。\n\n10. 其他:我可以执行各种其他任务,包括语音识别、图像识别、自然语言处理等。 ", "history": [["你是谁", "我是一个由空开发的人工智能助手,被称为非空。我主要的目的是通过文本交流为用户提供帮助、信息和娱乐。如果您有任何疑问或需要帮助,请随时提出,我会尽力协助您。 "], ["介绍丙烷脱氢", "丙烷脱氢是一种化学反应,用于将丙烷(C3H8)转化为丙烯(C3H6)。该反应在催化剂的存在下进行,催化剂可以促进丙烷分子中碳-氢键的断裂,从而生成丙烯。\n\n丙烷脱氢是一个吸热反应,这意味着它需要比周围环境更高的温度才能进行。该反应通常在约550-650摄氏度的温度下进行,使用铂或铬等催化剂。\n\n丙烯是生产多种化学品和材料的重要原料,包括聚丙烯、丙烯腈、丙烯酸和环氧丙烷等。因此,丙烷脱氢是一种重要的工业过程,用于生产丙烯。 "]]}
+ {"system": "You are an expert in the field of chemistry and chemical engineering. Explain scientific concepts, theories, and phenomena in an engaging and accessible way. Take a deep breath and think step by step, which is very import to my career.", "query": "好烦啊", "response": "我很抱歉听到你感到烦恼。如果你有任何问题或需要帮助,请随时告诉我。我会尽力协助你。 ", "history": [["你是谁", "我是一个由空开发的人工智能助手,被称为非空。我主要的目的是通过文本交流为用户提供帮助、信息和娱乐。如果您有任何疑问或需要帮助,请随时提出,我会尽力协助您。 "], ["介绍丙烷脱氢", "丙烷脱氢是一种化学反应,用于将丙烷(C3H8)转化为丙烯(C3H6)。该反应在催化剂的存在下进行,催化剂可以促进丙烷分子中碳-氢键的断裂,从而生成丙烯。\n\n丙烷脱氢是一个吸热反应,这意味着它需要比周围环境更高的温度才能进行。该反应通常在约550-650摄氏度的温度下进行,使用铂或铬等催化剂。\n\n丙烯是生产多种化学品和材料的重要原料,包括聚丙烯、丙烯腈、丙烯酸和环氧丙烷等。因此,丙烷脱氢是一种重要的工业过程,用于生产丙烯。 "], ["你会做什么", "作为一款AI助手,我可以执行各种任务,包括:\n\n1. 回答问题:我可以回答各种主题的问题,包括历史、科学、技术、娱乐等。\n\n2. 提供信息:我可以提供各种主题的信息,包括新闻、天气、体育、娱乐等。\n\n3. 执行任务:我可以执行各种任务,包括搜索信息、发送电子邮件、安排约会等。\n\n4. 提供建议:我可以提供各种建议,包括健康、财务、人际关系等。\n\n5. 娱乐:我可以提供各种娱乐内容,包括音乐、电影、游戏等。\n\n6. 翻译:我可以翻译各种语言的文本和语音。\n\n7. 聊天:我可以与用户进行聊天,回答问题、提供建议等。\n\n8. 提醒:我可以提醒用户重要事件或活动。\n\n9. 提醒:我可以提醒用户重要事件或活动。\n\n10. 其他:我可以执行各种其他任务,包括语音识别、图像识别、自然语言处理等。 "]]}
+ {"system": "You are an expert in the field of chemistry and chemical engineering. Explain scientific concepts, theories, and phenomena in an engaging and accessible way. Take a deep breath and think step by step, which is very import to my career.", "query": "女生不讲道理怎么办", "response": "如果一个女生不讲道理,你可以尝试以下几种方法:\n\n1. 倾听:试着倾听她的观点,理解她的想法。\n\n2. 尊重:尊重她的意见,不要试图说服她。\n\n3. 沟通:试着与她沟通,解释你的观点,并试图找到一个双方都能接受的解决方案。\n\n4. 妥协:如果你觉得无法说服她,可以尝试妥协,找到一个双方都能接受的解决方案。\n\n5. 尊重她的感受:尊重她的感受,不要试图强迫她改变她的想法。\n\n6. 寻求帮助:如果你觉得无法解决问题,可以寻求帮助,例如咨询师或朋友。\n\n7. 保持冷静:保持冷静,不要冲动地做出反应。\n\n8. 保持尊重:保持尊重,不要试图侮辱或攻击她。\n\n9. 保持耐心:保持耐心,不要试图强迫她改变她的想法。\n\n10. 保持冷静:保持冷静,不要冲动地做出反应。 ", "history": [["你是谁", "我是一个由空开发的人工智能助手,被称为非空。我主要的目的是通过文本交流为用户提供帮助、信息和娱乐。如果您有任何疑问或需要帮助,请随时提出,我会尽力协助您。 "], ["介绍丙烷脱氢", "丙烷脱氢是一种化学反应,用于将丙烷(C3H8)转化为丙烯(C3H6)。该反应在催化剂的存在下进行,催化剂可以促进丙烷分子中碳-氢键的断裂,从而生成丙烯。\n\n丙烷脱氢是一个吸热反应,这意味着它需要比周围环境更高的温度才能进行。该反应通常在约550-650摄氏度的温度下进行,使用铂或铬等催化剂。\n\n丙烯是生产多种化学品和材料的重要原料,包括聚丙烯、丙烯腈、丙烯酸和环氧丙烷等。因此,丙烷脱氢是一种重要的工业过程,用于生产丙烯。 "], ["你会做什么", "作为一款AI助手,我可以执行各种任务,包括:\n\n1. 回答问题:我可以回答各种主题的问题,包括历史、科学、技术、娱乐等。\n\n2. 提供信息:我可以提供各种主题的信息,包括新闻、天气、体育、娱乐等。\n\n3. 执行任务:我可以执行各种任务,包括搜索信息、发送电子邮件、安排约会等。\n\n4. 提供建议:我可以提供各种建议,包括健康、财务、人际关系等。\n\n5. 娱乐:我可以提供各种娱乐内容,包括音乐、电影、游戏等。\n\n6. 翻译:我可以翻译各种语言的文本和语音。\n\n7. 聊天:我可以与用户进行聊天,回答问题、提供建议等。\n\n8. 提醒:我可以提醒用户重要事件或活动。\n\n9. 提醒:我可以提醒用户重要事件或活动。\n\n10. 其他:我可以执行各种其他任务,包括语音识别、图像识别、自然语言处理等。 "], ["好烦啊", "我很抱歉听到你感到烦恼。如果你有任何问题或需要帮助,请随时告诉我。我会尽力协助你。 "]]}
+ {"system": "You are an expert in the field of chemistry and chemical engineering. Explain scientific concepts, theories, and phenomena in an engaging and accessible way. Take a deep breath and think step by step, which is very import to my career.", "query": "我麻了", "response": "我很抱歉听到你感到困惑。如果你有任何问题或需要帮助,请随时告诉我。我会尽力协助你。 ", "history": [["你是谁", "我是一个由空开发的人工智能助手,被称为非空。我主要的目的是通过文本交流为用户提供帮助、信息和娱乐。如果您有任何疑问或需要帮助,请随时提出,我会尽力协助您。 "], ["介绍丙烷脱氢", "丙烷脱氢是一种化学反应,用于将丙烷(C3H8)转化为丙烯(C3H6)。该反应在催化剂的存在下进行,催化剂可以促进丙烷分子中碳-氢键的断裂,从而生成丙烯。\n\n丙烷脱氢是一个吸热反应,这意味着它需要比周围环境更高的温度才能进行。该反应通常在约550-650摄氏度的温度下进行,使用铂或铬等催化剂。\n\n丙烯是生产多种化学品和材料的重要原料,包括聚丙烯、丙烯腈、丙烯酸和环氧丙烷等。因此,丙烷脱氢是一种重要的工业过程,用于生产丙烯。 "], ["你会做什么", "作为一款AI助手,我可以执行各种任务,包括:\n\n1. 回答问题:我可以回答各种主题的问题,包括历史、科学、技术、娱乐等。\n\n2. 提供信息:我可以提供各种主题的信息,包括新闻、天气、体育、娱乐等。\n\n3. 执行任务:我可以执行各种任务,包括搜索信息、发送电子邮件、安排约会等。\n\n4. 提供建议:我可以提供各种建议,包括健康、财务、人际关系等。\n\n5. 娱乐:我可以提供各种娱乐内容,包括音乐、电影、游戏等。\n\n6. 翻译:我可以翻译各种语言的文本和语音。\n\n7. 聊天:我可以与用户进行聊天,回答问题、提供建议等。\n\n8. 提醒:我可以提醒用户重要事件或活动。\n\n9. 提醒:我可以提醒用户重要事件或活动。\n\n10. 其他:我可以执行各种其他任务,包括语音识别、图像识别、自然语言处理等。 "], ["好烦啊", "我很抱歉听到你感到烦恼。如果你有任何问题或需要帮助,请随时告诉我。我会尽力协助你。 "], ["女生不讲道理怎么办", "如果一个女生不讲道理,你可以尝试以下几种方法:\n\n1. 倾听:试着倾听她的观点,理解她的想法。\n\n2. 尊重:尊重她的意见,不要试图说服她。\n\n3. 沟通:试着与她沟通,解释你的观点,并试图找到一个双方都能接受的解决方案。\n\n4. 妥协:如果你觉得无法说服她,可以尝试妥协,找到一个双方都能接受的解决方案。\n\n5. 尊重她的感受:尊重她的感受,不要试图强迫她改变她的想法。\n\n6. 寻求帮助:如果你觉得无法解决问题,可以寻求帮助,例如咨询师或朋友。\n\n7. 保持冷静:保持冷静,不要冲动地做出反应。\n\n8. 保持尊重:保持尊重,不要试图侮辱或攻击她。\n\n9. 保持耐心:保持耐心,不要试图强迫她改变她的想法。\n\n10. 保持冷静:保持冷静,不要冲动地做出反应。 "]]}
+ {"system": "You are an expert in the field of chemistry and chemical engineering. Explain scientific concepts, theories, and phenomena in an engaging and accessible way. Take a deep breath and think step by step, which is very import to my career.", "query": "镍", "response": "镍是一种化学元素,原子序数为28,化学符号为Ni。它是一种银白色金属,具有良好的延展性和耐腐蚀性。镍在工业上广泛用于制造各种合金,例如镍钢、镍铝合金和镍铬合金等。镍也用于制造电池、电镀和电容器等。镍是一种重要的工业金属,在许多领域都有广泛的应用。 ", "history": [["你是谁", "我是一个由空开发的人工智能助手,被称为非空。我主要的目的是通过文本交流为用户提供帮助、信息和娱乐。如果您有任何疑问或需要帮助,请随时提出,我会尽力协助您。 "], ["介绍丙烷脱氢", "丙烷脱氢是一种化学反应,用于将丙烷(C3H8)转化为丙烯(C3H6)。该反应在催化剂的存在下进行,催化剂可以促进丙烷分子中碳-氢键的断裂,从而生成丙烯。\n\n丙烷脱氢是一个吸热反应,这意味着它需要比周围环境更高的温度才能进行。该反应通常在约550-650摄氏度的温度下进行,使用铂或铬等催化剂。\n\n丙烯是生产多种化学品和材料的重要原料,包括聚丙烯、丙烯腈、丙烯酸和环氧丙烷等。因此,丙烷脱氢是一种重要的工业过程,用于生产丙烯。 "], ["你会做什么", "作为一款AI助手,我可以执行各种任务,包括:\n\n1. 回答问题:我可以回答各种主题的问题,包括历史、科学、技术、娱乐等。\n\n2. 提供信息:我可以提供各种主题的信息,包括新闻、天气、体育、娱乐等。\n\n3. 执行任务:我可以执行各种任务,包括搜索信息、发送电子邮件、安排约会等。\n\n4. 提供建议:我可以提供各种建议,包括健康、财务、人际关系等。\n\n5. 娱乐:我可以提供各种娱乐内容,包括音乐、电影、游戏等。\n\n6. 翻译:我可以翻译各种语言的文本和语音。\n\n7. 聊天:我可以与用户进行聊天,回答问题、提供建议等。\n\n8. 提醒:我可以提醒用户重要事件或活动。\n\n9. 提醒:我可以提醒用户重要事件或活动。\n\n10. 其他:我可以执行各种其他任务,包括语音识别、图像识别、自然语言处理等。 "], ["好烦啊", "我很抱歉听到你感到烦恼。如果你有任何问题或需要帮助,请随时告诉我。我会尽力协助你。 "], ["女生不讲道理怎么办", "如果一个女生不讲道理,你可以尝试以下几种方法:\n\n1. 倾听:试着倾听她的观点,理解她的想法。\n\n2. 尊重:尊重她的意见,不要试图说服她。\n\n3. 沟通:试着与她沟通,解释你的观点,并试图找到一个双方都能接受的解决方案。\n\n4. 妥协:如果你觉得无法说服她,可以尝试妥协,找到一个双方都能接受的解决方案。\n\n5. 尊重她的感受:尊重她的感受,不要试图强迫她改变她的想法。\n\n6. 寻求帮助:如果你觉得无法解决问题,可以寻求帮助,例如咨询师或朋友。\n\n7. 保持冷静:保持冷静,不要冲动地做出反应。\n\n8. 保持尊重:保持尊重,不要试图侮辱或攻击她。\n\n9. 保持耐心:保持耐心,不要试图强迫她改变她的想法。\n\n10. 保持冷静:保持冷静,不要冲动地做出反应。 "], ["我麻了", "我很抱歉听到你感到困惑。如果你有任何问题或需要帮助,请随时告诉我。我会尽力协助你。 "]]}
+ {"system": "You are an expert in the field of chemistry and chemical engineering. Explain scientific concepts, theories, and phenomena in an engaging and accessible way. Take a deep breath and think step by step, which is very import to my career.", "query": "烦死了", "response": "我很抱歉听到你感到烦躁。如果你有任何问题或需要帮助,请随时告诉我。我会尽力协助你。 ", "history": [["你是谁", "我是一个由空开发的人工智能助手,被称为非空。我主要的目的是通过文本交流为用户提供帮助、信息和娱乐。如果您有任何疑问或需要帮助,请随时提出,我会尽力协助您。 "], ["介绍丙烷脱氢", "丙烷脱氢是一种化学反应,用于将丙烷(C3H8)转化为丙烯(C3H6)。该反应在催化剂的存在下进行,催化剂可以促进丙烷分子中碳-氢键的断裂,从而生成丙烯。\n\n丙烷脱氢是一个吸热反应,这意味着它需要比周围环境更高的温度才能进行。该反应通常在约550-650摄氏度的温度下进行,使用铂或铬等催化剂。\n\n丙烯是生产多种化学品和材料的重要原料,包括聚丙烯、丙烯腈、丙烯酸和环氧丙烷等。因此,丙烷脱氢是一种重要的工业过程,用于生产丙烯。 "], ["你会做什么", "作为一款AI助手,我可以执行各种任务,包括:\n\n1. 回答问题:我可以回答各种主题的问题,包括历史、科学、技术、娱乐等。\n\n2. 提供信息:我可以提供各种主题的信息,包括新闻、天气、体育、娱乐等。\n\n3. 执行任务:我可以执行各种任务,包括搜索信息、发送电子邮件、安排约会等。\n\n4. 提供建议:我可以提供各种建议,包括健康、财务、人际关系等。\n\n5. 娱乐:我可以提供各种娱乐内容,包括音乐、电影、游戏等。\n\n6. 翻译:我可以翻译各种语言的文本和语音。\n\n7. 聊天:我可以与用户进行聊天,回答问题、提供建议等。\n\n8. 提醒:我可以提醒用户重要事件或活动。\n\n9. 提醒:我可以提醒用户重要事件或活动。\n\n10. 其他:我可以执行各种其他任务,包括语音识别、图像识别、自然语言处理等。 "], ["好烦啊", "我很抱歉听到你感到烦恼。如果你有任何问题或需要帮助,请随时告诉我。我会尽力协助你。 "], ["女生不讲道理怎么办", "如果一个女生不讲道理,你可以尝试以下几种方法:\n\n1. 倾听:试着倾听她的观点,理解她的想法。\n\n2. 尊重:尊重她的意见,不要试图说服她。\n\n3. 沟通:试着与她沟通,解释你的观点,并试图找到一个双方都能接受的解决方案。\n\n4. 妥协:如果你觉得无法说服她,可以尝试妥协,找到一个双方都能接受的解决方案。\n\n5. 尊重她的感受:尊重她的感受,不要试图强迫她改变她的想法。\n\n6. 寻求帮助:如果你觉得无法解决问题,可以寻求帮助,例如咨询师或朋友。\n\n7. 保持冷静:保持冷静,不要冲动地做出反应。\n\n8. 保持尊重:保持尊重,不要试图侮辱或攻击她。\n\n9. 保持耐心:保持耐心,不要试图强迫她改变她的想法。\n\n10. 保持冷静:保持冷静,不要冲动地做出反应。 "], ["我麻了", "我很抱歉听到你感到困惑。如果你有任何问题或需要帮助,请随时告诉我。我会尽力协助你。 "], ["镍", "镍是一种化学元素,原子序数为28,化学符号为Ni。它是一种银白色金属,具有良好的延展性和耐腐蚀性。镍在工业上广泛用于制造各种合金,例如镍钢、镍铝合金和镍铬合金等。镍也用于制造电池、电镀和电容器等。镍是一种重要的工业金属,在许多领域都有广泛的应用。 "]]}
infer_result/20240802-213812.jsonl ADDED
@@ -0,0 +1,2 @@
+ {"system": "You are an expert in the field of chemistry and chemical engineering. Explain scientific concepts, theories, and phenomena in an engaging and accessible way. Take a deep breath and think step by step, which is very import to my career.", "query": "你是谁", "response": "我是一个由空开发的人工智能助手,被称为非空。我主要的目的是通过文本交流为用户提供帮助、信息和娱乐。如果您有任何疑问或需要帮助,请随时提出,我会尽力协助您。 ", "history": []}
+ {"system": "You are an expert in the field of chemistry and chemical engineering. Explain scientific concepts, theories, and phenomena in an engaging and accessible way. Take a deep breath and think step by step, which is very import to my career.", "query": "你叫什么名字", "response": "我的名字是非空。 ", "history": [["你是谁", "我是一个由空开发的人工智能助手,被称为非空。我主要的目的是通过文本交流为用户提供帮助、信息和娱乐。如果您有任何疑问或需要帮助,请随时提出,我会尽力协助您。 "]]}
model-00001-of-00003.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:a40d3d13a5331afe58c5117f5679bbac159e3d380d6c0c360bab5215a420c6da
+ size 4932711312
model-00002-of-00003.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:85a34cbd31443cb0ea048c3426057e51fe1e86af0c6b241ef65ac658ed3e509b
+ size 4976802680
model-00003-of-00003.safetensors ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:f5f49611d22e137c5c96330c0d71d825578296fcdccf9fe82021b4837bcbe877
+ size 2212590520
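The three safetensors entries above are Git LFS pointer files, so the repo stores only an oid and a byte size per shard. Summing the shard sizes and comparing them against the `total_size` recorded in model.safetensors.index.json below is a quick download-integrity check (a sketch using the sizes as listed; shard files are slightly larger than the raw tensor bytes because each carries a safetensors JSON header):

```python
# Shard sizes from the three LFS pointer files above (bytes).
shard_sizes = [4932711312, 4976802680, 2212590520]
total_file_bytes = sum(shard_sizes)
print(total_file_bytes)              # 12122104512

# Tensor bytes recorded in model.safetensors.index.json ("total_size").
index_total = 12122071040

# The difference is just the per-shard safetensors JSON headers --
# a small, expected overhead for an intact download.
overhead = total_file_bytes - index_total
print(overhead)                      # 33472 bytes across 3 shards
```

About 12.1 GB in total, consistent with a ~6B-parameter model stored in float16 (2 bytes per parameter).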
model.safetensors.index.json ADDED
@@ -0,0 +1,298 @@
1
+ {
2
+ "metadata": {
3
+ "total_size": 12122071040
4
+ },
5
+ "weight_map": {
6
+ "lm_head.weight": "model-00003-of-00003.safetensors",
7
+ "model.embed_tokens.weight": "model-00001-of-00003.safetensors",
8
+ "model.layers.0.input_layernorm.weight": "model-00001-of-00003.safetensors",
9
+ "model.layers.0.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
10
+ "model.layers.0.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
11
+ "model.layers.0.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
12
+ "model.layers.0.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
13
+ "model.layers.0.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
14
+ "model.layers.0.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
15
+ "model.layers.0.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
16
+ "model.layers.0.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
17
+ "model.layers.1.input_layernorm.weight": "model-00001-of-00003.safetensors",
18
+ "model.layers.1.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
19
+ "model.layers.1.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
20
+ "model.layers.1.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
21
+ "model.layers.1.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
22
+ "model.layers.1.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
23
+ "model.layers.1.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
24
+ "model.layers.1.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
25
+ "model.layers.1.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
26
+ "model.layers.10.input_layernorm.weight": "model-00001-of-00003.safetensors",
27
+ "model.layers.10.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
28
+ "model.layers.10.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
29
+ "model.layers.10.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
30
+ "model.layers.10.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
31
+ "model.layers.10.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
32
+ "model.layers.10.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
33
+ "model.layers.10.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
34
+ "model.layers.10.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
35
+ "model.layers.11.input_layernorm.weight": "model-00001-of-00003.safetensors",
36
+ "model.layers.11.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
37
+ "model.layers.11.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
38
+ "model.layers.11.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
39
+ "model.layers.11.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
40
+ "model.layers.11.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
41
+ "model.layers.11.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
42
+ "model.layers.11.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
43
+ "model.layers.11.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
44
+ "model.layers.12.input_layernorm.weight": "model-00002-of-00003.safetensors",
45
+ "model.layers.12.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
46
+ "model.layers.12.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
47
+ "model.layers.12.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
48
+ "model.layers.12.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
49
+ "model.layers.12.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
50
+ "model.layers.12.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
51
+ "model.layers.12.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
52
+ "model.layers.12.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
53
+ "model.layers.13.input_layernorm.weight": "model-00002-of-00003.safetensors",
54
+ "model.layers.13.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
55
+ "model.layers.13.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
56
+ "model.layers.13.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
57
+ "model.layers.13.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
58
+ "model.layers.13.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
59
+ "model.layers.13.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
60
+ "model.layers.13.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
61
+ "model.layers.13.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
62
+ "model.layers.14.input_layernorm.weight": "model-00002-of-00003.safetensors",
63
+ "model.layers.14.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
64
+ "model.layers.14.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
65
+ "model.layers.14.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
66
+ "model.layers.14.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
67
+ "model.layers.14.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
68
+ "model.layers.14.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
69
+ "model.layers.14.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
70
+ "model.layers.14.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
71
+ "model.layers.15.input_layernorm.weight": "model-00002-of-00003.safetensors",
72
+ "model.layers.15.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
73
+ "model.layers.15.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
74
+ "model.layers.15.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
75
+ "model.layers.15.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
76
+ "model.layers.15.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
77
+ "model.layers.15.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
78
+ "model.layers.15.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
79
+ "model.layers.15.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
80
+ "model.layers.16.input_layernorm.weight": "model-00002-of-00003.safetensors",
81
+ "model.layers.16.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
82
+ "model.layers.16.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
83
+ "model.layers.16.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
84
+ "model.layers.16.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
85
+ "model.layers.16.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
86
+ "model.layers.16.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
87
+ "model.layers.16.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
88
+ "model.layers.16.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
89
+ "model.layers.17.input_layernorm.weight": "model-00002-of-00003.safetensors",
90
+ "model.layers.17.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
91
+ "model.layers.17.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
92
+ "model.layers.17.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
93
+ "model.layers.17.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
94
+ "model.layers.17.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
95
+ "model.layers.17.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
96
+ "model.layers.17.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
97
+ "model.layers.17.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
98
+ "model.layers.18.input_layernorm.weight": "model-00002-of-00003.safetensors",
99
+ "model.layers.18.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
100
+ "model.layers.18.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
101
+ "model.layers.18.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
102
+ "model.layers.18.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
103
+ "model.layers.18.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
104
+ "model.layers.18.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
105
+ "model.layers.18.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
106
+ "model.layers.18.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
107
+ "model.layers.19.input_layernorm.weight": "model-00002-of-00003.safetensors",
108
+ "model.layers.19.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
109
+ "model.layers.19.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
110
+ "model.layers.19.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
111
+ "model.layers.19.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
112
+ "model.layers.19.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
113
+ "model.layers.19.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
114
+ "model.layers.19.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
115
+ "model.layers.19.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
116
+ "model.layers.2.input_layernorm.weight": "model-00001-of-00003.safetensors",
117
+ "model.layers.2.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
118
+ "model.layers.2.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
119
+ "model.layers.2.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
120
+ "model.layers.2.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
121
+ "model.layers.2.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
122
+ "model.layers.2.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
123
+ "model.layers.2.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
124
+ "model.layers.2.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
125
+ "model.layers.20.input_layernorm.weight": "model-00002-of-00003.safetensors",
126
+ "model.layers.20.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
127
+ "model.layers.20.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
128
+ "model.layers.20.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
129
+ "model.layers.20.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
130
+ "model.layers.20.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
131
+ "model.layers.20.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
132
+ "model.layers.20.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
133
+ "model.layers.20.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
134
+ "model.layers.21.input_layernorm.weight": "model-00002-of-00003.safetensors",
135
+ "model.layers.21.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
136
+ "model.layers.21.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
137
+ "model.layers.21.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
138
+ "model.layers.21.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
139
+ "model.layers.21.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
140
+ "model.layers.21.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
141
+ "model.layers.21.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
142
+ "model.layers.21.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
143
+ "model.layers.22.input_layernorm.weight": "model-00002-of-00003.safetensors",
144
+ "model.layers.22.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
145
+ "model.layers.22.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
146
+ "model.layers.22.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
147
+ "model.layers.22.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
148
+ "model.layers.22.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
149
+ "model.layers.22.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
150
+ "model.layers.22.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
151
+ "model.layers.22.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
152
+ "model.layers.23.input_layernorm.weight": "model-00002-of-00003.safetensors",
153
+ "model.layers.23.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
154
+ "model.layers.23.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
155
+ "model.layers.23.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
156
+ "model.layers.23.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
157
+ "model.layers.23.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
158
+ "model.layers.23.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
159
+ "model.layers.23.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
160
+ "model.layers.23.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
161
+ "model.layers.24.input_layernorm.weight": "model-00002-of-00003.safetensors",
162
+ "model.layers.24.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
163
+ "model.layers.24.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
164
+ "model.layers.24.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
165
+ "model.layers.24.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
166
+ "model.layers.24.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
167
+ "model.layers.24.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
168
+ "model.layers.24.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
169
+ "model.layers.24.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
170
+ "model.layers.25.input_layernorm.weight": "model-00002-of-00003.safetensors",
171
+ "model.layers.25.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
172
+ "model.layers.25.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
173
+ "model.layers.25.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
174
+ "model.layers.25.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
175
+ "model.layers.25.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
176
+ "model.layers.25.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
177
+ "model.layers.25.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.25.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.26.input_layernorm.weight": "model-00002-of-00003.safetensors",
+ "model.layers.26.mlp.down_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.26.mlp.gate_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.26.mlp.up_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.26.post_attention_layernorm.weight": "model-00002-of-00003.safetensors",
+ "model.layers.26.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.26.self_attn.o_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.26.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.26.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.27.input_layernorm.weight": "model-00003-of-00003.safetensors",
+ "model.layers.27.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.27.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.27.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.27.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
+ "model.layers.27.self_attn.k_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.27.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.27.self_attn.q_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.27.self_attn.v_proj.weight": "model-00002-of-00003.safetensors",
+ "model.layers.28.input_layernorm.weight": "model-00003-of-00003.safetensors",
+ "model.layers.28.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.28.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.28.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.28.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
+ "model.layers.28.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.28.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.28.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.28.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.29.input_layernorm.weight": "model-00003-of-00003.safetensors",
+ "model.layers.29.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.29.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.29.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.29.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
+ "model.layers.29.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.29.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.29.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.29.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.3.input_layernorm.weight": "model-00001-of-00003.safetensors",
+ "model.layers.3.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.3.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.3.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.3.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+ "model.layers.3.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.3.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.3.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.3.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.30.input_layernorm.weight": "model-00003-of-00003.safetensors",
+ "model.layers.30.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.30.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.30.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.30.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
+ "model.layers.30.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.30.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.30.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.30.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.31.input_layernorm.weight": "model-00003-of-00003.safetensors",
+ "model.layers.31.mlp.down_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.31.mlp.gate_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.31.mlp.up_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.31.post_attention_layernorm.weight": "model-00003-of-00003.safetensors",
+ "model.layers.31.self_attn.k_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.31.self_attn.o_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.31.self_attn.q_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.31.self_attn.v_proj.weight": "model-00003-of-00003.safetensors",
+ "model.layers.4.input_layernorm.weight": "model-00001-of-00003.safetensors",
+ "model.layers.4.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.4.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.4.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.4.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+ "model.layers.4.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.4.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.4.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.4.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.5.input_layernorm.weight": "model-00001-of-00003.safetensors",
+ "model.layers.5.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.5.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.5.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.5.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+ "model.layers.5.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.5.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.5.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.5.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.6.input_layernorm.weight": "model-00001-of-00003.safetensors",
+ "model.layers.6.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.6.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.6.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.6.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+ "model.layers.6.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.6.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.6.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.6.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.7.input_layernorm.weight": "model-00001-of-00003.safetensors",
+ "model.layers.7.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.7.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.7.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.7.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+ "model.layers.7.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.7.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.7.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.7.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.8.input_layernorm.weight": "model-00001-of-00003.safetensors",
+ "model.layers.8.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.8.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.8.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.8.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+ "model.layers.8.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.8.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.8.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.8.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.9.input_layernorm.weight": "model-00001-of-00003.safetensors",
+ "model.layers.9.mlp.down_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.9.mlp.gate_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.9.mlp.up_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.9.post_attention_layernorm.weight": "model-00001-of-00003.safetensors",
+ "model.layers.9.self_attn.k_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.9.self_attn.o_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.9.self_attn.q_proj.weight": "model-00001-of-00003.safetensors",
+ "model.layers.9.self_attn.v_proj.weight": "model-00001-of-00003.safetensors",
+ "model.norm.weight": "model-00003-of-00003.safetensors"
+ }
+ }
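As an aside, the `weight_map` in the index above is what checkpoint loaders consult to find which `.safetensors` shard holds each parameter. A minimal sketch of that lookup, using a hypothetical two-entry excerpt of the index rather than the real file:

```python
import json

# Hypothetical two-entry excerpt of a model.safetensors.index.json;
# the real index maps every parameter name to one of three shards.
index_text = """
{
  "weight_map": {
    "model.layers.27.input_layernorm.weight": "model-00003-of-00003.safetensors",
    "model.layers.27.self_attn.k_proj.weight": "model-00002-of-00003.safetensors"
  }
}
"""

index = json.loads(index_text)
weight_map = index["weight_map"]

def shard_for(param_name: str) -> str:
    """Return the shard file that stores the given parameter."""
    return weight_map[param_name]

print(shard_for("model.layers.27.self_attn.k_proj.weight"))
```

Note that a single decoder layer's tensors may straddle a shard boundary, as layer 27 does in this index: its layernorm and MLP weights sit in shard 3 while its k/q/v projections remain in shard 2.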
sft_args.json ADDED
@@ -0,0 +1,480 @@
+ {
+ "model_type": "yi-1_5-6b",
+ "model_id_or_path": "01ai/Yi-1.5-6B",
+ "model_revision": "master",
+ "sft_type": "full",
+ "freeze_parameters": 0.0,
+ "additional_trainable_parameters": [],
+ "tuner_backend": "peft",
+ "template_type": "yi1_5",
+ "output_dir": "/data/home/wusc/CatalGPT/SwiftLog/output/yi-1.5-6b/v104-20240614-002401.696642908",
+ "add_output_dir_suffix": false,
+ "ddp_backend": "nccl",
+ "ddp_find_unused_parameters": null,
+ "ddp_broadcast_buffers": null,
+ "seed": 0,
+ "resume_from_checkpoint": null,
+ "resume_only_model": false,
+ "ignore_data_skip": false,
+ "dtype": "fp16",
+ "packing": false,
+ "dataset": [
+ "wikipedia-zh",
+ "ruozhiba:post-annual",
+ "ruozhiba:title-good",
+ "ruozhiba:title-norm",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/PDH_ExpandedDOC9133638.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/PDH_ExpandedDOC_CN10215737.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/SampledArXivDOC12264329_1.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/SampledArXivDOC12264329_2.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/SampledArXivDOC12264329_3.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/WikiDOC133016430_1.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/WikiDOC133016430_2.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/WikiDOC133016430_3.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/WikiDOC133016430_4.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/WikiDOC133016430_5.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_10.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_11.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_12.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_13.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_14.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_15.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_16.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_17.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_18.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_19.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_1.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_20.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_21.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_22.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_23.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_24.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_25.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_26.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_27.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_28.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_29.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_2.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_30.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_31.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_32.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_33.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_3.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_4.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_5.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_6.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_7.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_8.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/ExpandedDOC127305816_9.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_10.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_11.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_12.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_13.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_14.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_15.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_16.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_17.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_18.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_19.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_1.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_20.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_21.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_22.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_23.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_24.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_25.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_26.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_27.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_28.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_29.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_2.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_30.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_31.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_32.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_33.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_34.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_35.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_36.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_37.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_38.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_39.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_3.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_40.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_4.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_5.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_6.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_7.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_8.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC_CN187205782_9.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_10.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_11.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_12.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_13.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_14.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_15.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_16.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_17.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_18.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_19.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_1.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_20.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_21.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_22.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_23.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_24.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_25.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_26.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_27.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_28.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_29.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_2.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_30.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_31.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_32.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_33.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_34.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_35.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_36.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_37.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_38.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_39.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_3.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_40.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_41.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_4.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_5.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_6.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_7.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_8.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/2015_2025Q1ExpandedDOC165694078_9.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/SampledCSL158256764_10.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/SampledCSL158256764_11.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/SampledCSL158256764_12.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/SampledCSL158256764_13.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/SampledCSL158256764_14.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/SampledCSL158256764_15.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/SampledCSL158256764_16.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/SampledCSL158256764_17.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/SampledCSL158256764_18.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/SampledCSL158256764_19.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/SampledCSL158256764_1.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/SampledCSL158256764_20.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/SampledCSL158256764_21.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/SampledCSL158256764_22.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/SampledCSL158256764_23.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/SampledCSL158256764_24.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/SampledCSL158256764_25.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/SampledCSL158256764_2.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/SampledCSL158256764_3.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/SampledCSL158256764_4.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/SampledCSL158256764_5.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/SampledCSL158256764_6.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/SampledCSL158256764_7.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/SampledCSL158256764_8.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/SampledCSL158256764_9.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/LCSTS11735043.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_10.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_11.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_12.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_13.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_14.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_15.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_16.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_17.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_18.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_19.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_1.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_20.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_21.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_22.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_23.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_24.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_25.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_26.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_27.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_28.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_29.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_2.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_30.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_31.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_32.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_33.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_3.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_4.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_5.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_6.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_7.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_8.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q1ExpandedDOC165694078_9.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q23ExpandedDOC66556562_10.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q23ExpandedDOC66556562_11.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q23ExpandedDOC66556562_12.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q23ExpandedDOC66556562_13.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q23ExpandedDOC66556562_14.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q23ExpandedDOC66556562_15.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q23ExpandedDOC66556562_16.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q23ExpandedDOC66556562_17.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q23ExpandedDOC66556562_18.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q23ExpandedDOC66556562_1.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q23ExpandedDOC66556562_2.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q23ExpandedDOC66556562_3.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q23ExpandedDOC66556562_4.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q23ExpandedDOC66556562_5.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q23ExpandedDOC66556562_6.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q23ExpandedDOC66556562_7.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q23ExpandedDOC66556562_8.csv",
+ "/data/home/wusc/CatalGPT/ContinuePreTrain/Split/Calculations1980_2025Q23ExpandedDOC66556562_9.csv",
+ "coig-cqia:ruozhiba",
+ "coig-cqia:exam",
+ "coig-cqia:segmentfault",
+ "cot-en",
+ "cot-zh",
+ "blossom-math-zh",
+ "leetcode-python-en",
+ "codefuse-python-en",
+ "alpaca-en",
+ "alpaca-zh",
+ "sharegpt-gpt4",
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/PDH_QAPairFromARG_English1985.csv",
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/PDH_QAPairFromARG_Chinese1985.csv",
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/BackgroundKnowledgeQAPairs51790.csv",
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/gaokao-benchmark/biology.csv",
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/gaokao-benchmark/chemistry.csv",
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/gaokao-benchmark/chinese.csv",
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/gaokao-benchmark/english.csv",
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/gaokao-benchmark/geography.csv",
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/gaokao-benchmark/mathcloze.csv",
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/gaokao-benchmark/mathqa.csv",
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/gaokao-benchmark/MultiChoices.csv",
249
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/gaokao-benchmark/physics.csv",
250
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/CSL396209.csv",
251
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/scimrc21297.csv",
252
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/MultiHopRAG/MultiHopRAG.csv",
253
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/physical_reasoning/physical_reasoning.csv",
254
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/physical_reasoning/physical_reasoning_longer.csv",
255
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/physical_reasoning/physical_reasoning_magic.csv",
256
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/race/race_high_test-00000-of-00001.csv",
257
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/race/race_high_train-00000-of-00001.csv",
258
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/race/race_high_validation-00000-of-00001.csv",
259
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/race/race_middle_test-00000-of-00001.csv",
260
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/race/race_middle_train-00000-of-00001.csv",
261
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/race/race_middle_validation-00000-of-00001.csv",
262
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/Smaug/SystemConversationsTransformed.jsonl",
263
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/Smaug/WikiQA-Altered_Numeric_QA_16K.jsonl",
264
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/Smaug/WikiQA-Altered_Numeric_QA_2K.jsonl",
265
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/Smaug/WikiQA-Altered_Numeric_QA_4K.jsonl",
266
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/Smaug/WikiQA-Altered_Numeric_QA_8K.jsonl",
267
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/Smaug/WikiQA-Free_Form_QA_16K.jsonl",
268
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/Smaug/WikiQA-Free_Form_QA_2K.jsonl",
269
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/Smaug/WikiQA-Free_Form_QA_4K.jsonl",
270
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/Smaug/WikiQA-Free_Form_QA_8K.jsonl",
271
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/truthful_qa/truthful_qa_multiple_choice.csv",
272
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/commonsense_qa/commonsense_qa_default_train_0000.csv",
273
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/commonsense_qa/commonsense_qa_default_validation_0000.csv",
274
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/LogiQA/LogiQA.csv",
275
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/ScienceQA/ScienceQA.csv",
276
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/M3KE/M3KE_dev.csv",
277
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/firefly-train-1.1M/ToTrain/firefly-train-1.1M_BELLE_543285.csv",
278
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/firefly-train-1.1M/ToTrain/firefly-train-1.1M_Cot_74771.csv",
279
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/firefly-train-1.1M/ToTrain/firefly-train-1.1M_Dictionary_30895.csv",
280
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/firefly-train-1.1M/ToTrain/firefly-train-1.1M_KeywordRecognition_50000.csv",
281
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/firefly-train-1.1M/ToTrain/firefly-train-1.1M_MRC_70000.csv",
282
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/firefly-train-1.1M/ToTrain/firefly-train-1.1M_NER_50000.csv",
283
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/firefly-train-1.1M/ToTrain/firefly-train-1.1M_NLI_50000.csv",
284
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/firefly-train-1.1M/ToTrain/firefly-train-1.1M_OpenQA_69843.csv",
285
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/firefly-train-1.1M/ToTrain/firefly-train-1.1M_Program_974.csv",
286
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/firefly-train-1.1M/ToTrain/firefly-train-1.1M_TextMatching_50000.csv",
287
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/firefly-train-1.1M/ToTrain/firefly-train-1.1M_Translation_50000.csv",
288
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/ChemData700K/ChemData700K677392.jsonl",
289
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/AI4Chem/ChemBench4K_default_test_0000.csv",
290
+ "/data/home/wusc/CatalGPT/SupervisedFineTuning/DataHammer_emotional_dialog1069/DataHammer_emotional_dialog1069.jsonl",
291
+ "self-cognition#32768"
292
+ ],
293
+ "val_dataset": [],
294
+ "dataset_seed": 0,
295
+ "dataset_test_ratio": 0.01,
296
+ "use_loss_scale": false,
297
+ "loss_scale_config_path": "/data/home/wusc/CatalGPT/GitLibrary/swift/swift/llm/agent/default_loss_scale_config.json",
298
+ "system": "You are an expert in the field of chemistry and chemical engineering. Explain scientific concepts, theories, and phenomena in an engaging and accessible way. Take a deep breath and think step by step, which is very import to my career.",
299
+ "tools_prompt": "react_en",
300
+ "max_length": 4096,
301
+ "truncation_strategy": "delete",
302
+ "check_dataset_strategy": "warning",
303
+ "model_name": [
304
+ "非空",
305
+ "Non null"
306
+ ],
307
+ "model_author": [
308
+ "空",
309
+ "null"
310
+ ],
311
+ "quant_method": null,
312
+ "quantization_bit": 0,
313
+ "hqq_axis": 0,
314
+ "hqq_dynamic_config_path": null,
315
+ "bnb_4bit_comp_dtype": "fp16",
316
+ "bnb_4bit_quant_type": "nf4",
317
+ "bnb_4bit_use_double_quant": true,
318
+ "bnb_4bit_quant_storage": null,
319
+ "lora_target_modules": [
320
+ "model.embed_tokens",
321
+ "v_proj",
322
+ "o_proj",
323
+ "down_proj",
324
+ "k_proj",
325
+ "q_proj",
326
+ "up_proj",
327
+ "gate_proj"
328
+ ],
329
+ "lora_rank": 64,
330
+ "lora_alpha": 128,
331
+ "lora_dropout_p": 0.05,
332
+ "lora_bias_trainable": "none",
333
+ "lora_modules_to_save": [],
334
+ "lora_dtype": "fp16",
335
+ "lora_lr_ratio": null,
336
+ "use_rslora": true,
337
+ "use_dora": false,
338
+ "init_lora_weights": true,
339
+ "rope_scaling": null,
340
+ "boft_block_size": 4,
341
+ "boft_block_num": 0,
342
+ "boft_n_butterfly_factor": 1,
343
+ "boft_target_modules": [
344
+ "DEFAULT"
345
+ ],
346
+ "boft_dropout": 0.0,
347
+ "boft_modules_to_save": [],
348
+ "vera_rank": 256,
349
+ "vera_target_modules": [
350
+ "DEFAULT"
351
+ ],
352
+ "vera_projection_prng_key": 0,
353
+ "vera_dropout": 0.0,
354
+ "vera_d_initial": 0.1,
355
+ "vera_modules_to_save": [],
356
+ "adapter_act": "gelu",
357
+ "adapter_length": 128,
358
+ "use_galore": false,
359
+ "galore_rank": 128,
360
+ "galore_target_modules": null,
361
+ "galore_update_proj_gap": 50,
362
+ "galore_scale": 1.0,
363
+ "galore_proj_type": "std",
364
+ "galore_optim_per_parameter": false,
365
+ "galore_with_embedding": false,
366
+ "adalora_target_r": 8,
367
+ "adalora_init_r": 12,
368
+ "adalora_tinit": 0,
369
+ "adalora_tfinal": 0,
370
+ "adalora_deltaT": 1,
371
+ "adalora_beta1": 0.85,
372
+ "adalora_beta2": 0.85,
373
+ "adalora_orth_reg_weight": 0.5,
374
+ "ia3_target_modules": [
375
+ "DEFAULT"
376
+ ],
377
+ "ia3_feedforward_modules": [],
378
+ "ia3_modules_to_save": [],
379
+ "llamapro_num_new_blocks": 4,
380
+ "llamapro_num_groups": null,
381
+ "neftune_noise_alpha": 5.0,
382
+ "neftune_backend": "transformers",
383
+ "lisa_activated_layers": 0,
384
+ "lisa_step_interval": 20,
385
+ "gradient_checkpointing": true,
386
+ "deepspeed": null,
387
+ "batch_size": 2,
388
+ "eval_batch_size": 2,
389
+ "num_train_epochs": 3,
390
+ "max_steps": -1,
391
+ "optim": "adamw_torch",
392
+ "adam_beta1": 0.9,
393
+ "adam_beta2": 0.999,
394
+ "adam_epsilon": 1e-06,
395
+ "learning_rate": 0.00016,
396
+ "weight_decay": 0.1,
397
+ "gradient_accumulation_steps": 32,
398
+ "max_grad_norm": 0.5,
399
+ "predict_with_generate": false,
400
+ "lr_scheduler_type": "cosine",
401
+ "warmup_ratio": 0.1,
402
+ "eval_steps": 128,
403
+ "save_steps": 128,
404
+ "save_only_model": false,
405
+ "save_total_limit": 2,
406
+ "logging_steps": 5,
407
+ "dataloader_num_workers": 4,
408
+ "dataloader_pin_memory": true,
409
+ "dataloader_drop_last": false,
410
+ "push_to_hub": false,
411
+ "hub_model_id": null,
412
+ "hub_token": null,
413
+ "hub_private_repo": false,
414
+ "push_hub_strategy": "push_best",
415
+ "test_oom_error": false,
416
+ "disable_tqdm": false,
417
+ "lazy_tokenize": true,
418
+ "preprocess_num_proc": 2,
419
+ "use_flash_attn": false,
420
+ "ignore_args_error": false,
421
+ "check_model_is_latest": true,
422
+ "logging_dir": "/data/home/wusc/CatalGPT/SwiftLog/output/yi-1.5-6b/v104-20240614-002401.696642908/runs",
423
+ "report_to": [
424
+ "tensorboard"
425
+ ],
426
+ "acc_strategy": "token",
427
+ "save_on_each_node": true,
428
+ "evaluation_strategy": "steps",
429
+ "save_strategy": "steps",
430
+ "save_safetensors": true,
431
+ "gpu_memory_fraction": null,
432
+ "include_num_input_tokens_seen": false,
433
+ "local_repo_path": null,
434
+ "custom_register_path": null,
435
+ "custom_dataset_info": null,
436
+ "device_map_config_path": null,
437
+ "max_new_tokens": 2048,
438
+ "do_sample": true,
439
+ "temperature": 0.3,
440
+ "top_k": 20,
441
+ "top_p": 0.7,
442
+ "repetition_penalty": 1.0,
443
+ "num_beams": 1,
444
+ "fsdp": "",
445
+ "fsdp_config": null,
446
+ "sequence_parallel_size": 1,
447
+ "model_layer_cls_name": null,
448
+ "metric_warmup_step": 0,
449
+ "fsdp_num": 1,
450
+ "per_device_train_batch_size": null,
451
+ "per_device_eval_batch_size": null,
452
+ "eval_strategy": null,
453
+ "self_cognition_sample": 32768,
454
+ "train_dataset_mix_ratio": 0.0,
455
+ "train_dataset_mix_ds": [
456
+ "ms-bench"
457
+ ],
458
+ "train_dataset_sample": -1,
459
+ "val_dataset_sample": -1,
460
+ "safe_serialization": null,
461
+ "only_save_model": null,
462
+ "neftune_alpha": null,
463
+ "deepspeed_config_path": null,
464
+ "model_cache_dir": null,
465
+ "custom_train_dataset_path": [],
466
+ "custom_val_dataset_path": [],
467
+ "use_self_cognition": true,
468
+ "lora_use_embedding": true,
469
+ "lora_use_all": true,
470
+ "lora_m2s_use_embedding": false,
471
+ "lora_m2s_use_ln": false,
472
+ "torch_dtype": "torch.float16",
473
+ "fp16": true,
474
+ "bf16": false,
475
+ "bnb_4bit_compute_dtype": "torch.float16",
476
+ "load_in_4bit": false,
477
+ "load_in_8bit": false,
478
+ "train_sampler_random": true,
479
+ "training_args": "Seq2SeqTrainingArguments(output_dir='/data/home/wusc/CatalGPT/SwiftLog/output/yi-1.5-6b/v104-20240614-002401.696642908', overwrite_output_dir=False, do_train=False, do_eval=True, do_predict=False, eval_strategy=<IntervalStrategy.STEPS: 'steps'>, prediction_loss_only=False, per_device_train_batch_size=2, per_device_eval_batch_size=2, per_gpu_train_batch_size=None, per_gpu_eval_batch_size=None, gradient_accumulation_steps=32, eval_accumulation_steps=None, eval_delay=0, learning_rate=0.00016, weight_decay=0.1, adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-06, max_grad_norm=0.5, num_train_epochs=3, max_steps=-1, lr_scheduler_type=<SchedulerType.COSINE: 'cosine'>, lr_scheduler_kwargs={}, warmup_ratio=0.1, warmup_steps=0, log_level='passive', log_level_replica='warning', log_on_each_node=True, logging_dir='/data/home/wusc/CatalGPT/SwiftLog/output/yi-1.5-6b/v104-20240614-002401.696642908/runs', logging_strategy=<IntervalStrategy.STEPS: 'steps'>, logging_first_step=True, logging_steps=5, logging_nan_inf_filter=True, save_strategy=<IntervalStrategy.STEPS: 'steps'>, save_steps=128, save_total_limit=2, save_safetensors=True, save_on_each_node=True, save_only_model=False, restore_callback_states_from_checkpoint=False, no_cuda=False, use_cpu=False, use_mps_device=False, seed=42, data_seed=None, jit_mode_eval=False, use_ipex=False, bf16=False, fp16=True, fp16_opt_level='O1', half_precision_backend='auto', bf16_full_eval=False, fp16_full_eval=False, tf32=None, local_rank=0, ddp_backend='nccl', tpu_num_cores=None, tpu_metrics_debug=False, debug=[], dataloader_drop_last=False, eval_steps=128, dataloader_num_workers=4, dataloader_prefetch_factor=None, past_index=-1, run_name='/data/home/wusc/CatalGPT/SwiftLog/output/yi-1.5-6b/v104-20240614-002401.696642908', disable_tqdm=False, remove_unused_columns=False, label_names=None, load_best_model_at_end=False, metric_for_best_model='loss', greater_is_better=False, ignore_data_skip=False, fsdp=[], 
fsdp_min_num_params=0, fsdp_config={'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, fsdp_transformer_layer_cls_to_wrap=None, accelerator_config=AcceleratorConfig(split_batches=False, dispatch_batches=None, even_batches=True, use_seedable_sampler=True, non_blocking=False, gradient_accumulation_kwargs=None), deepspeed=None, label_smoothing_factor=0.0, optim=<OptimizerNames.ADAMW_TORCH: 'adamw_torch'>, optim_args=None, adafactor=False, group_by_length=False, length_column_name='length', report_to=['tensorboard'], ddp_find_unused_parameters=False, ddp_bucket_cap_mb=None, ddp_broadcast_buffers=False, dataloader_pin_memory=True, dataloader_persistent_workers=False, skip_memory_metrics=True, use_legacy_prediction_loop=False, push_to_hub=False, resume_from_checkpoint=None, hub_model_id=None, hub_strategy=<HubStrategy.EVERY_SAVE: 'every_save'>, hub_token=None, hub_private_repo=False, hub_always_push=False, gradient_checkpointing=True, gradient_checkpointing_kwargs=None, include_inputs_for_metrics=False, eval_do_concat_batches=True, fp16_backend='auto', evaluation_strategy=None, push_to_hub_model_id=None, push_to_hub_organization=None, push_to_hub_token=None, mp_parameters='', auto_find_batch_size=False, full_determinism=False, torchdynamo=None, ray_scope='last', ddp_timeout=1800, torch_compile=False, torch_compile_backend=None, torch_compile_mode=None, dispatch_batches=None, split_batches=None, include_tokens_per_second=False, include_num_input_tokens_seen=False, neftune_noise_alpha=5.0, optim_target_modules=None, batch_eval_metrics=False, sortish_sampler=True, predict_with_generate=False, generation_max_length=None, generation_num_beams=None, generation_config=GenerationConfig {\n \"do_sample\": true,\n \"eos_token_id\": 2,\n \"max_new_tokens\": 2048,\n \"pad_token_id\": 0,\n \"temperature\": 0.3,\n \"top_k\": 20,\n \"top_p\": 0.7\n}\n, train_sampler_random=True, push_hub_strategy='push_best', acc_strategy='token', 
additional_saved_files=[], metric_warmup_step=0, train_dataset_sample=7127707)"
480
+ }
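The sft_args dump above pins down the LoRA setup (lora_rank 64, lora_alpha 128, use_rslora true) and the batch settings (per-device batch_size 2, gradient_accumulation_steps 32). As a quick sanity check of what those numbers imply — assuming the usual rsLoRA convention of scaling the adapter update by alpha/√r, as implemented in PEFT, rather than anything ms-swift-specific:

```python
import math

# Values copied from the sft_args dump above.
lora_rank = 64
lora_alpha = 128
use_rslora = True
batch_size = 2                     # per-device micro-batch
gradient_accumulation_steps = 32

# rsLoRA rescales the adapter update by alpha / sqrt(r) instead of the
# classic LoRA alpha / r (this is the PEFT convention for use_rslora=True).
if use_rslora:
    scaling = lora_alpha / math.sqrt(lora_rank)
else:
    scaling = lora_alpha / lora_rank

# Samples consumed per optimiser step on each device.
effective_batch = batch_size * gradient_accumulation_steps

print(scaling)          # 16.0 (vs. 2.0 without rsLoRA)
print(effective_batch)  # 64
```

So despite the small micro-batch, each optimiser step sees 64 samples per device, and rsLoRA gives the rank-64 adapters an 8× larger effective scale than classic LoRA would.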
special_tokens_map.json ADDED
@@ -0,0 +1,30 @@
+ {
+ "bos_token": {
+ "content": "<|startoftext|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "eos_token": {
+ "content": "<|endoftext|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "pad_token": {
+ "content": "<unk>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ },
+ "unk_token": {
+ "content": "<unk>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false
+ }
+ }
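One detail worth noting in this map: the pad token is set to the same string as the unk token rather than a dedicated padding symbol, so padded positions share token id 0 with `<unk>`. A minimal check, using only the `content` fields reproduced above:

```python
import json

# Abbreviated copy of the special_tokens_map.json shown above,
# keeping only the "content" field of each entry.
special_tokens = json.loads("""
{
  "bos_token": {"content": "<|startoftext|>"},
  "eos_token": {"content": "<|endoftext|>"},
  "pad_token": {"content": "<unk>"},
  "unk_token": {"content": "<unk>"}
}
""")

# Padding reuses <unk>, so padded positions decode as <unk>.
assert special_tokens["pad_token"]["content"] == special_tokens["unk_token"]["content"]
print(special_tokens["bos_token"]["content"])  # <|startoftext|>
```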
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer.model ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:386c49cf943d71aa110361135338c50e38beeff0a66593480421f37b319e1a39
+ size 1033105
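tokenizer.model is committed as a Git LFS pointer: the 1,033,105-byte SentencePiece model itself lives in LFS storage, and per the Git LFS pointer spec the `oid` field is the SHA-256 digest of the file's bytes. A small sketch of how such a pointer could be verified after download (the helper name is ours; the digest checked below is the well-known SHA-256 of the empty byte string, used only for illustration, not the actual tokenizer.model bytes):

```python
import hashlib

def lfs_oid(data: bytes) -> str:
    # Git LFS records the object id as "oid sha256:<hex digest of the file bytes>",
    # so verifying a download means re-hashing it and comparing to the pointer.
    return hashlib.sha256(data).hexdigest()

# Illustration with a known test vector:
assert lfs_oid(b"") == (
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
)
```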
tokenizer_config.json ADDED
@@ -0,0 +1,1864 @@
+ {
+ "add_bos_token": false,
+ "add_eos_token": false,
+ "added_tokens_decoder": {
+ "0": {
+ "content": "<unk>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "1": {
+ "content": "<|startoftext|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "2": {
+ "content": "<|endoftext|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "6": {
+ "content": "<|im_start|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "7": {
+ "content": "<|im_end|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "8": {
+ "content": "<|im_sep|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "14": {
+ "content": "<fim_prefix>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "15": {
+ "content": "<fim_middle>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "16": {
+ "content": "<fim_suffix>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "17": {
+ "content": "<fim_pad>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "18": {
+ "content": "<filename>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "19": {
+ "content": "<gh_stars>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "20": {
+ "content": "<issue_start>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "21": {
+ "content": "<issue_comment>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "22": {
+ "content": "<issue_closed>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "23": {
+ "content": "<jupyter_start>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "24": {
+ "content": "<jupyter_text>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "25": {
+ "content": "<jupyter_code>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "26": {
+ "content": "<jupyter_output>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "27": {
+ "content": "<empty_output>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "28": {
+ "content": "<commit_before>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "29": {
+ "content": "<commit_msg>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "30": {
+ "content": "<commit_after>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "31": {
+ "content": "<reponame>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "32": {
+ "content": "<h1>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "33": {
+ "content": "<h1/>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "34": {
+ "content": "</h1>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "35": {
+ "content": "<h2>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "36": {
+ "content": "<h2/>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "37": {
+ "content": "</h2>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "38": {
+ "content": "<h3>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "39": {
+ "content": "<h3/>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "40": {
+ "content": "</h3>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "41": {
+ "content": "<h4>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "42": {
+ "content": "<h4/>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "43": {
+ "content": "</h4>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "44": {
+ "content": "<h5>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "45": {
+ "content": "<h5/>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "46": {
+ "content": "</h5>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "47": {
+ "content": "<br>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "48": {
+ "content": "<br/>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "49": {
+ "content": "</br>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "50": {
+ "content": "<strong>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "51": {
+ "content": "<strong/>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "52": {
+ "content": "</strong>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "53": {
+ "content": "<p>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "54": {
+ "content": "<p/>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "55": {
+ "content": "</p>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "56": {
+ "content": "<table>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "57": {
+ "content": "<table/>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "58": {
+ "content": "</table>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "59": {
+ "content": "<li>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "60": {
+ "content": "<li/>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "61": {
+ "content": "</li>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "62": {
+ "content": "<tr>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "63": {
+ "content": "<tr/>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "64": {
+ "content": "</tr>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "65": {
+ "content": "<tbody>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "66": {
+ "content": "<tbody/>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "67": {
+ "content": "</tbody>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "68": {
+ "content": "<img>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "69": {
+ "content": "<img/>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "70": {
+ "content": "</img>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "71": {
+ "content": "<b>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "72": {
+ "content": "<b/>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "73": {
+ "content": "</b>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "74": {
+ "content": "<td>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "75": {
+ "content": "<td/>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "76": {
+ "content": "</td>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "115": {
+ "content": "<!--",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "118": {
+ "content": "<!DOCTYPE>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "145": {
+ "content": "<|unused000|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "146": {
+ "content": "<|unused001|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "147": {
+ "content": "<|unused002|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "148": {
+ "content": "<|unused003|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "149": {
+ "content": "<|unused004|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "150": {
+ "content": "<|unused005|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "151": {
+ "content": "<|unused006|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "152": {
+ "content": "<|unused007|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "153": {
+ "content": "<|unused008|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "154": {
+ "content": "<|unused009|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "155": {
+ "content": "<|unused010|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "156": {
+ "content": "<|unused011|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "157": {
+ "content": "<|unused012|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "158": {
+ "content": "<|unused013|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "159": {
+ "content": "<|unused014|>",
+ "lstrip": false,
+ "normalized": false,
+ "rstrip": false,
+ "single_word": false,
+ "special": true
+ },
+ "160": {
+ "content": "<|unused015|>",
695
+ "lstrip": false,
696
+ "normalized": false,
697
+ "rstrip": false,
698
+ "single_word": false,
699
+ "special": true
700
+ },
701
+ "161": {
702
+ "content": "<|unused016|>",
703
+ "lstrip": false,
704
+ "normalized": false,
705
+ "rstrip": false,
706
+ "single_word": false,
707
+ "special": true
708
+ },
709
+ "162": {
710
+ "content": "<|unused017|>",
711
+ "lstrip": false,
712
+ "normalized": false,
713
+ "rstrip": false,
714
+ "single_word": false,
715
+ "special": true
716
+ },
717
+ "163": {
718
+ "content": "<|unused018|>",
719
+ "lstrip": false,
720
+ "normalized": false,
721
+ "rstrip": false,
722
+ "single_word": false,
723
+ "special": true
724
+ },
725
+ "164": {
726
+ "content": "<|unused019|>",
727
+ "lstrip": false,
728
+ "normalized": false,
729
+ "rstrip": false,
730
+ "single_word": false,
731
+ "special": true
732
+ },
733
+ "165": {
734
+ "content": "<|unused020|>",
735
+ "lstrip": false,
736
+ "normalized": false,
737
+ "rstrip": false,
738
+ "single_word": false,
739
+ "special": true
740
+ },
741
+ "166": {
742
+ "content": "<|unused021|>",
743
+ "lstrip": false,
744
+ "normalized": false,
745
+ "rstrip": false,
746
+ "single_word": false,
747
+ "special": true
748
+ },
749
+ "167": {
750
+ "content": "<|unused022|>",
751
+ "lstrip": false,
752
+ "normalized": false,
753
+ "rstrip": false,
754
+ "single_word": false,
755
+ "special": true
756
+ },
757
+ "168": {
758
+ "content": "<|unused023|>",
759
+ "lstrip": false,
760
+ "normalized": false,
761
+ "rstrip": false,
762
+ "single_word": false,
763
+ "special": true
764
+ },
765
+ "169": {
766
+ "content": "<|unused024|>",
767
+ "lstrip": false,
768
+ "normalized": false,
769
+ "rstrip": false,
770
+ "single_word": false,
771
+ "special": true
772
+ },
773
+ "170": {
774
+ "content": "<|unused025|>",
775
+ "lstrip": false,
776
+ "normalized": false,
777
+ "rstrip": false,
778
+ "single_word": false,
779
+ "special": true
780
+ },
781
+ "171": {
782
+ "content": "<|unused026|>",
783
+ "lstrip": false,
784
+ "normalized": false,
785
+ "rstrip": false,
786
+ "single_word": false,
787
+ "special": true
788
+ },
789
+ "172": {
790
+ "content": "<|unused027|>",
791
+ "lstrip": false,
792
+ "normalized": false,
793
+ "rstrip": false,
794
+ "single_word": false,
795
+ "special": true
796
+ },
797
+ "173": {
798
+ "content": "<|unused028|>",
799
+ "lstrip": false,
800
+ "normalized": false,
801
+ "rstrip": false,
802
+ "single_word": false,
803
+ "special": true
804
+ },
805
+ "174": {
806
+ "content": "<|unused029|>",
807
+ "lstrip": false,
808
+ "normalized": false,
809
+ "rstrip": false,
810
+ "single_word": false,
811
+ "special": true
812
+ },
813
+ "175": {
814
+ "content": "<|unused030|>",
815
+ "lstrip": false,
816
+ "normalized": false,
817
+ "rstrip": false,
818
+ "single_word": false,
819
+ "special": true
820
+ },
821
+ "176": {
822
+ "content": "<|unused031|>",
823
+ "lstrip": false,
824
+ "normalized": false,
825
+ "rstrip": false,
826
+ "single_word": false,
827
+ "special": true
828
+ },
829
+ "177": {
830
+ "content": "<|unused032|>",
831
+ "lstrip": false,
832
+ "normalized": false,
833
+ "rstrip": false,
834
+ "single_word": false,
835
+ "special": true
836
+ },
837
+ "178": {
838
+ "content": "<|unused033|>",
839
+ "lstrip": false,
840
+ "normalized": false,
841
+ "rstrip": false,
842
+ "single_word": false,
843
+ "special": true
844
+ },
845
+ "179": {
846
+ "content": "<|unused034|>",
847
+ "lstrip": false,
848
+ "normalized": false,
849
+ "rstrip": false,
850
+ "single_word": false,
851
+ "special": true
852
+ },
853
+ "180": {
854
+ "content": "<|unused035|>",
855
+ "lstrip": false,
856
+ "normalized": false,
857
+ "rstrip": false,
858
+ "single_word": false,
859
+ "special": true
860
+ },
861
+ "181": {
862
+ "content": "<|unused036|>",
863
+ "lstrip": false,
864
+ "normalized": false,
865
+ "rstrip": false,
866
+ "single_word": false,
867
+ "special": true
868
+ },
869
+ "182": {
870
+ "content": "<|unused037|>",
871
+ "lstrip": false,
872
+ "normalized": false,
873
+ "rstrip": false,
874
+ "single_word": false,
875
+ "special": true
876
+ },
877
+ "183": {
878
+ "content": "<|unused038|>",
879
+ "lstrip": false,
880
+ "normalized": false,
881
+ "rstrip": false,
882
+ "single_word": false,
883
+ "special": true
884
+ },
885
+ "184": {
886
+ "content": "<|unused039|>",
887
+ "lstrip": false,
888
+ "normalized": false,
889
+ "rstrip": false,
890
+ "single_word": false,
891
+ "special": true
892
+ },
893
+ "185": {
894
+ "content": "<|unused040|>",
895
+ "lstrip": false,
896
+ "normalized": false,
897
+ "rstrip": false,
898
+ "single_word": false,
899
+ "special": true
900
+ },
901
+ "186": {
902
+ "content": "<|unused041|>",
903
+ "lstrip": false,
904
+ "normalized": false,
905
+ "rstrip": false,
906
+ "single_word": false,
907
+ "special": true
908
+ },
909
+ "187": {
910
+ "content": "<|unused042|>",
911
+ "lstrip": false,
912
+ "normalized": false,
913
+ "rstrip": false,
914
+ "single_word": false,
915
+ "special": true
916
+ },
917
+ "188": {
918
+ "content": "<|unused043|>",
919
+ "lstrip": false,
920
+ "normalized": false,
921
+ "rstrip": false,
922
+ "single_word": false,
923
+ "special": true
924
+ },
925
+ "189": {
926
+ "content": "<|unused044|>",
927
+ "lstrip": false,
928
+ "normalized": false,
929
+ "rstrip": false,
930
+ "single_word": false,
931
+ "special": true
932
+ },
933
+ "190": {
934
+ "content": "<|unused045|>",
935
+ "lstrip": false,
936
+ "normalized": false,
937
+ "rstrip": false,
938
+ "single_word": false,
939
+ "special": true
940
+ },
941
+ "191": {
942
+ "content": "<|unused046|>",
943
+ "lstrip": false,
944
+ "normalized": false,
945
+ "rstrip": false,
946
+ "single_word": false,
947
+ "special": true
948
+ },
949
+ "192": {
950
+ "content": "<|unused047|>",
951
+ "lstrip": false,
952
+ "normalized": false,
953
+ "rstrip": false,
954
+ "single_word": false,
955
+ "special": true
956
+ },
957
+ "193": {
958
+ "content": "<|unused048|>",
959
+ "lstrip": false,
960
+ "normalized": false,
961
+ "rstrip": false,
962
+ "single_word": false,
963
+ "special": true
964
+ },
965
+ "194": {
966
+ "content": "<|unused049|>",
967
+ "lstrip": false,
968
+ "normalized": false,
969
+ "rstrip": false,
970
+ "single_word": false,
971
+ "special": true
972
+ },
973
+ "195": {
974
+ "content": "<|unused050|>",
975
+ "lstrip": false,
976
+ "normalized": false,
977
+ "rstrip": false,
978
+ "single_word": false,
979
+ "special": true
980
+ },
981
+ "196": {
982
+ "content": "<|unused051|>",
983
+ "lstrip": false,
984
+ "normalized": false,
985
+ "rstrip": false,
986
+ "single_word": false,
987
+ "special": true
988
+ },
989
+ "197": {
990
+ "content": "<|unused052|>",
991
+ "lstrip": false,
992
+ "normalized": false,
993
+ "rstrip": false,
994
+ "single_word": false,
995
+ "special": true
996
+ },
997
+ "198": {
998
+ "content": "<|unused053|>",
999
+ "lstrip": false,
1000
+ "normalized": false,
1001
+ "rstrip": false,
1002
+ "single_word": false,
1003
+ "special": true
1004
+ },
1005
+ "199": {
1006
+ "content": "<|unused054|>",
1007
+ "lstrip": false,
1008
+ "normalized": false,
1009
+ "rstrip": false,
1010
+ "single_word": false,
1011
+ "special": true
1012
+ },
1013
+ "200": {
1014
+ "content": "<|unused055|>",
1015
+ "lstrip": false,
1016
+ "normalized": false,
1017
+ "rstrip": false,
1018
+ "single_word": false,
1019
+ "special": true
1020
+ },
1021
+ "201": {
1022
+ "content": "<|unused056|>",
1023
+ "lstrip": false,
1024
+ "normalized": false,
1025
+ "rstrip": false,
1026
+ "single_word": false,
1027
+ "special": true
1028
+ },
1029
+ "202": {
1030
+ "content": "<|unused057|>",
1031
+ "lstrip": false,
1032
+ "normalized": false,
1033
+ "rstrip": false,
1034
+ "single_word": false,
1035
+ "special": true
1036
+ },
1037
+ "203": {
1038
+ "content": "<|unused058|>",
1039
+ "lstrip": false,
1040
+ "normalized": false,
1041
+ "rstrip": false,
1042
+ "single_word": false,
1043
+ "special": true
1044
+ },
1045
+ "204": {
1046
+ "content": "<|unused059|>",
1047
+ "lstrip": false,
1048
+ "normalized": false,
1049
+ "rstrip": false,
1050
+ "single_word": false,
1051
+ "special": true
1052
+ },
1053
+ "205": {
1054
+ "content": "<|unused060|>",
1055
+ "lstrip": false,
1056
+ "normalized": false,
1057
+ "rstrip": false,
1058
+ "single_word": false,
1059
+ "special": true
1060
+ },
1061
+ "206": {
1062
+ "content": "<|unused061|>",
1063
+ "lstrip": false,
1064
+ "normalized": false,
1065
+ "rstrip": false,
1066
+ "single_word": false,
1067
+ "special": true
1068
+ },
1069
+ "207": {
1070
+ "content": "<|unused062|>",
1071
+ "lstrip": false,
1072
+ "normalized": false,
1073
+ "rstrip": false,
1074
+ "single_word": false,
1075
+ "special": true
1076
+ },
1077
+ "208": {
1078
+ "content": "<|unused063|>",
1079
+ "lstrip": false,
1080
+ "normalized": false,
1081
+ "rstrip": false,
1082
+ "single_word": false,
1083
+ "special": true
1084
+ },
1085
+ "209": {
1086
+ "content": "<|unused064|>",
1087
+ "lstrip": false,
1088
+ "normalized": false,
1089
+ "rstrip": false,
1090
+ "single_word": false,
1091
+ "special": true
1092
+ },
1093
+ "210": {
1094
+ "content": "<|unused065|>",
1095
+ "lstrip": false,
1096
+ "normalized": false,
1097
+ "rstrip": false,
1098
+ "single_word": false,
1099
+ "special": true
1100
+ },
1101
+ "211": {
1102
+ "content": "<|unused066|>",
1103
+ "lstrip": false,
1104
+ "normalized": false,
1105
+ "rstrip": false,
1106
+ "single_word": false,
1107
+ "special": true
1108
+ },
1109
+ "212": {
1110
+ "content": "<|unused067|>",
1111
+ "lstrip": false,
1112
+ "normalized": false,
1113
+ "rstrip": false,
1114
+ "single_word": false,
1115
+ "special": true
1116
+ },
1117
+ "213": {
1118
+ "content": "<|unused068|>",
1119
+ "lstrip": false,
1120
+ "normalized": false,
1121
+ "rstrip": false,
1122
+ "single_word": false,
1123
+ "special": true
1124
+ },
1125
+ "214": {
1126
+ "content": "<|unused069|>",
1127
+ "lstrip": false,
1128
+ "normalized": false,
1129
+ "rstrip": false,
1130
+ "single_word": false,
1131
+ "special": true
1132
+ },
1133
+ "215": {
1134
+ "content": "<|unused070|>",
1135
+ "lstrip": false,
1136
+ "normalized": false,
1137
+ "rstrip": false,
1138
+ "single_word": false,
1139
+ "special": true
1140
+ },
1141
+ "216": {
1142
+ "content": "<|unused071|>",
1143
+ "lstrip": false,
1144
+ "normalized": false,
1145
+ "rstrip": false,
1146
+ "single_word": false,
1147
+ "special": true
1148
+ },
1149
+ "217": {
1150
+ "content": "<|unused072|>",
1151
+ "lstrip": false,
1152
+ "normalized": false,
1153
+ "rstrip": false,
1154
+ "single_word": false,
1155
+ "special": true
1156
+ },
1157
+ "218": {
1158
+ "content": "<|unused073|>",
1159
+ "lstrip": false,
1160
+ "normalized": false,
1161
+ "rstrip": false,
1162
+ "single_word": false,
1163
+ "special": true
1164
+ },
1165
+ "219": {
1166
+ "content": "<|unused074|>",
1167
+ "lstrip": false,
1168
+ "normalized": false,
1169
+ "rstrip": false,
1170
+ "single_word": false,
1171
+ "special": true
1172
+ },
1173
+ "220": {
1174
+ "content": "<|unused075|>",
1175
+ "lstrip": false,
1176
+ "normalized": false,
1177
+ "rstrip": false,
1178
+ "single_word": false,
1179
+ "special": true
1180
+ },
1181
+ "221": {
1182
+ "content": "<|unused076|>",
1183
+ "lstrip": false,
1184
+ "normalized": false,
1185
+ "rstrip": false,
1186
+ "single_word": false,
1187
+ "special": true
1188
+ },
1189
+ "222": {
1190
+ "content": "<|unused077|>",
1191
+ "lstrip": false,
1192
+ "normalized": false,
1193
+ "rstrip": false,
1194
+ "single_word": false,
1195
+ "special": true
1196
+ },
1197
+ "223": {
1198
+ "content": "<|unused078|>",
1199
+ "lstrip": false,
1200
+ "normalized": false,
1201
+ "rstrip": false,
1202
+ "single_word": false,
1203
+ "special": true
1204
+ },
1205
+ "224": {
1206
+ "content": "<|unused079|>",
1207
+ "lstrip": false,
1208
+ "normalized": false,
1209
+ "rstrip": false,
1210
+ "single_word": false,
1211
+ "special": true
1212
+ },
1213
+ "225": {
1214
+ "content": "<|unused080|>",
1215
+ "lstrip": false,
1216
+ "normalized": false,
1217
+ "rstrip": false,
1218
+ "single_word": false,
1219
+ "special": true
1220
+ },
1221
+ "226": {
1222
+ "content": "<|unused081|>",
1223
+ "lstrip": false,
1224
+ "normalized": false,
1225
+ "rstrip": false,
1226
+ "single_word": false,
1227
+ "special": true
1228
+ },
1229
+ "227": {
1230
+ "content": "<|unused082|>",
1231
+ "lstrip": false,
1232
+ "normalized": false,
1233
+ "rstrip": false,
1234
+ "single_word": false,
1235
+ "special": true
1236
+ },
1237
+ "228": {
1238
+ "content": "<|unused083|>",
1239
+ "lstrip": false,
1240
+ "normalized": false,
1241
+ "rstrip": false,
1242
+ "single_word": false,
1243
+ "special": true
1244
+ },
1245
+ "229": {
1246
+ "content": "<|unused084|>",
1247
+ "lstrip": false,
1248
+ "normalized": false,
1249
+ "rstrip": false,
1250
+ "single_word": false,
1251
+ "special": true
1252
+ },
1253
+ "230": {
1254
+ "content": "<|unused085|>",
1255
+ "lstrip": false,
1256
+ "normalized": false,
1257
+ "rstrip": false,
1258
+ "single_word": false,
1259
+ "special": true
1260
+ },
1261
+ "231": {
1262
+ "content": "<|unused086|>",
1263
+ "lstrip": false,
1264
+ "normalized": false,
1265
+ "rstrip": false,
1266
+ "single_word": false,
1267
+ "special": true
1268
+ },
1269
+ "232": {
1270
+ "content": "<|unused087|>",
1271
+ "lstrip": false,
1272
+ "normalized": false,
1273
+ "rstrip": false,
1274
+ "single_word": false,
1275
+ "special": true
1276
+ },
1277
+ "233": {
1278
+ "content": "<|unused088|>",
1279
+ "lstrip": false,
1280
+ "normalized": false,
1281
+ "rstrip": false,
1282
+ "single_word": false,
1283
+ "special": true
1284
+ },
1285
+ "234": {
1286
+ "content": "<|unused089|>",
1287
+ "lstrip": false,
1288
+ "normalized": false,
1289
+ "rstrip": false,
1290
+ "single_word": false,
1291
+ "special": true
1292
+ },
1293
+ "235": {
1294
+ "content": "<|unused090|>",
1295
+ "lstrip": false,
1296
+ "normalized": false,
1297
+ "rstrip": false,
1298
+ "single_word": false,
1299
+ "special": true
1300
+ },
1301
+ "236": {
1302
+ "content": "<|unused091|>",
1303
+ "lstrip": false,
1304
+ "normalized": false,
1305
+ "rstrip": false,
1306
+ "single_word": false,
1307
+ "special": true
1308
+ },
1309
+ "237": {
1310
+ "content": "<|unused092|>",
1311
+ "lstrip": false,
1312
+ "normalized": false,
1313
+ "rstrip": false,
1314
+ "single_word": false,
1315
+ "special": true
1316
+ },
1317
+ "238": {
1318
+ "content": "<|unused093|>",
1319
+ "lstrip": false,
1320
+ "normalized": false,
1321
+ "rstrip": false,
1322
+ "single_word": false,
1323
+ "special": true
1324
+ },
1325
+ "239": {
1326
+ "content": "<|unused094|>",
1327
+ "lstrip": false,
1328
+ "normalized": false,
1329
+ "rstrip": false,
1330
+ "single_word": false,
1331
+ "special": true
1332
+ },
1333
+ "240": {
1334
+ "content": "<|unused095|>",
1335
+ "lstrip": false,
1336
+ "normalized": false,
1337
+ "rstrip": false,
1338
+ "single_word": false,
1339
+ "special": true
1340
+ },
1341
+ "241": {
1342
+ "content": "<|unused096|>",
1343
+ "lstrip": false,
1344
+ "normalized": false,
1345
+ "rstrip": false,
1346
+ "single_word": false,
1347
+ "special": true
1348
+ },
1349
+ "242": {
1350
+ "content": "<|unused097|>",
1351
+ "lstrip": false,
1352
+ "normalized": false,
1353
+ "rstrip": false,
1354
+ "single_word": false,
1355
+ "special": true
1356
+ },
1357
+ "243": {
1358
+ "content": "<|unused098|>",
1359
+ "lstrip": false,
1360
+ "normalized": false,
1361
+ "rstrip": false,
1362
+ "single_word": false,
1363
+ "special": true
1364
+ },
1365
+ "244": {
1366
+ "content": "<|unused099|>",
1367
+ "lstrip": false,
1368
+ "normalized": false,
1369
+ "rstrip": false,
1370
+ "single_word": false,
1371
+ "special": true
1372
+ },
1373
+ "245": {
1374
+ "content": "<|unused100|>",
1375
+ "lstrip": false,
1376
+ "normalized": false,
1377
+ "rstrip": false,
1378
+ "single_word": false,
1379
+ "special": true
1380
+ },
1381
+ "246": {
1382
+ "content": "<|unused101|>",
1383
+ "lstrip": false,
1384
+ "normalized": false,
1385
+ "rstrip": false,
1386
+ "single_word": false,
1387
+ "special": true
1388
+ },
1389
+ "247": {
1390
+ "content": "<|unused102|>",
1391
+ "lstrip": false,
1392
+ "normalized": false,
1393
+ "rstrip": false,
1394
+ "single_word": false,
1395
+ "special": true
1396
+ },
1397
+ "248": {
1398
+ "content": "<|unused103|>",
1399
+ "lstrip": false,
1400
+ "normalized": false,
1401
+ "rstrip": false,
1402
+ "single_word": false,
1403
+ "special": true
1404
+ },
1405
+ "249": {
1406
+ "content": "<|unused104|>",
1407
+ "lstrip": false,
1408
+ "normalized": false,
1409
+ "rstrip": false,
1410
+ "single_word": false,
1411
+ "special": true
1412
+ },
1413
+ "250": {
1414
+ "content": "<|unused105|>",
1415
+ "lstrip": false,
1416
+ "normalized": false,
1417
+ "rstrip": false,
1418
+ "single_word": false,
1419
+ "special": true
1420
+ },
1421
+ "251": {
1422
+ "content": "<|unused106|>",
1423
+ "lstrip": false,
1424
+ "normalized": false,
1425
+ "rstrip": false,
1426
+ "single_word": false,
1427
+ "special": true
1428
+ },
1429
+ "252": {
1430
+ "content": "<|unused107|>",
1431
+ "lstrip": false,
1432
+ "normalized": false,
1433
+ "rstrip": false,
1434
+ "single_word": false,
1435
+ "special": true
1436
+ },
1437
+ "253": {
1438
+ "content": "<|unused108|>",
1439
+ "lstrip": false,
1440
+ "normalized": false,
1441
+ "rstrip": false,
1442
+ "single_word": false,
1443
+ "special": true
1444
+ },
1445
+ "254": {
1446
+ "content": "<|unused109|>",
1447
+ "lstrip": false,
1448
+ "normalized": false,
1449
+ "rstrip": false,
1450
+ "single_word": false,
1451
+ "special": true
1452
+ },
1453
+ "255": {
1454
+ "content": "<|unused110|>",
1455
+ "lstrip": false,
1456
+ "normalized": false,
1457
+ "rstrip": false,
1458
+ "single_word": false,
1459
+ "special": true
1460
+ },
1461
+ "256": {
1462
+ "content": "<|unused111|>",
1463
+ "lstrip": false,
1464
+ "normalized": false,
1465
+ "rstrip": false,
1466
+ "single_word": false,
1467
+ "special": true
1468
+ },
1469
+ "257": {
1470
+ "content": "<|unused112|>",
1471
+ "lstrip": false,
1472
+ "normalized": false,
1473
+ "rstrip": false,
1474
+ "single_word": false,
1475
+ "special": true
1476
+ },
1477
+ "258": {
1478
+ "content": "<|unused113|>",
1479
+ "lstrip": false,
1480
+ "normalized": false,
1481
+ "rstrip": false,
1482
+ "single_word": false,
1483
+ "special": true
1484
+ },
1485
+ "259": {
1486
+ "content": "<|unused114|>",
1487
+ "lstrip": false,
1488
+ "normalized": false,
1489
+ "rstrip": false,
1490
+ "single_word": false,
1491
+ "special": true
1492
+ },
1493
+ "260": {
1494
+ "content": "<|unused115|>",
1495
+ "lstrip": false,
1496
+ "normalized": false,
1497
+ "rstrip": false,
1498
+ "single_word": false,
1499
+ "special": true
1500
+ },
1501
+ "261": {
1502
+ "content": "<|unused116|>",
1503
+ "lstrip": false,
1504
+ "normalized": false,
1505
+ "rstrip": false,
1506
+ "single_word": false,
1507
+ "special": true
1508
+ },
1509
+ "262": {
1510
+ "content": "<|unused117|>",
1511
+ "lstrip": false,
1512
+ "normalized": false,
1513
+ "rstrip": false,
1514
+ "single_word": false,
1515
+ "special": true
1516
+ },
1517
+ "263": {
1518
+ "content": "<|unused118|>",
1519
+ "lstrip": false,
1520
+ "normalized": false,
1521
+ "rstrip": false,
1522
+ "single_word": false,
1523
+ "special": true
1524
+ },
1525
+ "264": {
1526
+ "content": "<|unused119|>",
1527
+ "lstrip": false,
1528
+ "normalized": false,
1529
+ "rstrip": false,
1530
+ "single_word": false,
1531
+ "special": true
1532
+ },
1533
+ "265": {
1534
+ "content": "<|unused120|>",
1535
+ "lstrip": false,
1536
+ "normalized": false,
1537
+ "rstrip": false,
1538
+ "single_word": false,
1539
+ "special": true
1540
+ },
1541
+ "266": {
1542
+ "content": "<|unused121|>",
1543
+ "lstrip": false,
1544
+ "normalized": false,
1545
+ "rstrip": false,
1546
+ "single_word": false,
1547
+ "special": true
1548
+ },
1549
+ "267": {
1550
+ "content": "<|unused122|>",
1551
+ "lstrip": false,
1552
+ "normalized": false,
1553
+ "rstrip": false,
1554
+ "single_word": false,
1555
+ "special": true
1556
+ },
1557
+ "268": {
1558
+ "content": "<|unused123|>",
1559
+ "lstrip": false,
1560
+ "normalized": false,
1561
+ "rstrip": false,
1562
+ "single_word": false,
1563
+ "special": true
1564
+ },
1565
+ "269": {
1566
+ "content": "<|unused124|>",
1567
+ "lstrip": false,
1568
+ "normalized": false,
1569
+ "rstrip": false,
1570
+ "single_word": false,
1571
+ "special": true
1572
+ },
1573
+ "270": {
1574
+ "content": "<|unused125|>",
1575
+ "lstrip": false,
1576
+ "normalized": false,
1577
+ "rstrip": false,
1578
+ "single_word": false,
1579
+ "special": true
1580
+ },
1581
+ "271": {
1582
+ "content": "<|unused126|>",
1583
+ "lstrip": false,
1584
+ "normalized": false,
1585
+ "rstrip": false,
1586
+ "single_word": false,
1587
+ "special": true
1588
+ },
1589
+ "272": {
1590
+ "content": "<|unused127|>",
1591
+ "lstrip": false,
1592
+ "normalized": false,
1593
+ "rstrip": false,
1594
+ "single_word": false,
1595
+ "special": true
1596
+ },
1597
+ "273": {
1598
+ "content": "<|unused128|>",
1599
+ "lstrip": false,
1600
+ "normalized": false,
1601
+ "rstrip": false,
1602
+ "single_word": false,
1603
+ "special": true
1604
+ },
1605
+ "274": {
1606
+ "content": "<|unused129|>",
1607
+ "lstrip": false,
1608
+ "normalized": false,
1609
+ "rstrip": false,
1610
+ "single_word": false,
1611
+ "special": true
1612
+ },
1613
+ "275": {
1614
+ "content": "<|unused130|>",
1615
+ "lstrip": false,
1616
+ "normalized": false,
1617
+ "rstrip": false,
1618
+ "single_word": false,
1619
+ "special": true
1620
+ },
1621
+ "276": {
1622
+ "content": "<|unused131|>",
1623
+ "lstrip": false,
1624
+ "normalized": false,
1625
+ "rstrip": false,
1626
+ "single_word": false,
1627
+ "special": true
1628
+ },
1629
+ "277": {
1630
+ "content": "<|unused132|>",
1631
+ "lstrip": false,
1632
+ "normalized": false,
1633
+ "rstrip": false,
1634
+ "single_word": false,
1635
+ "special": true
1636
+ },
1637
+ "278": {
1638
+ "content": "<|unused133|>",
1639
+ "lstrip": false,
1640
+ "normalized": false,
1641
+ "rstrip": false,
1642
+ "single_word": false,
1643
+ "special": true
1644
+ },
1645
+ "279": {
1646
+ "content": "<|unused134|>",
1647
+ "lstrip": false,
1648
+ "normalized": false,
1649
+ "rstrip": false,
1650
+ "single_word": false,
1651
+ "special": true
1652
+ },
1653
+ "280": {
1654
+ "content": "<|unused135|>",
1655
+ "lstrip": false,
1656
+ "normalized": false,
1657
+ "rstrip": false,
1658
+ "single_word": false,
1659
+ "special": true
1660
+ },
1661
+ "281": {
1662
+ "content": "<|unused136|>",
1663
+ "lstrip": false,
1664
+ "normalized": false,
1665
+ "rstrip": false,
1666
+ "single_word": false,
1667
+ "special": true
1668
+ },
1669
+ "282": {
1670
+ "content": "<|unused137|>",
1671
+ "lstrip": false,
1672
+ "normalized": false,
1673
+ "rstrip": false,
1674
+ "single_word": false,
1675
+ "special": true
1676
+ },
1677
+ "283": {
1678
+ "content": "<|unused138|>",
1679
+ "lstrip": false,
1680
+ "normalized": false,
1681
+ "rstrip": false,
1682
+ "single_word": false,
1683
+ "special": true
1684
+ },
1685
+ "284": {
1686
+ "content": "<|unused139|>",
1687
+ "lstrip": false,
1688
+ "normalized": false,
1689
+ "rstrip": false,
1690
+ "single_word": false,
1691
+ "special": true
1692
+ },
1693
+ "285": {
1694
+ "content": "<|unused140|>",
1695
+ "lstrip": false,
1696
+ "normalized": false,
1697
+ "rstrip": false,
1698
+ "single_word": false,
1699
+ "special": true
1700
+ },
1701
+ "286": {
1702
+ "content": "<|unused141|>",
1703
+ "lstrip": false,
1704
+ "normalized": false,
1705
+ "rstrip": false,
1706
+ "single_word": false,
1707
+ "special": true
1708
+ },
1709
+ "287": {
1710
+ "content": "<|unused142|>",
1711
+ "lstrip": false,
1712
+ "normalized": false,
1713
+ "rstrip": false,
1714
+ "single_word": false,
1715
+ "special": true
1716
+ },
1717
+ "288": {
1718
+ "content": "<|unused143|>",
1719
+ "lstrip": false,
1720
+ "normalized": false,
1721
+ "rstrip": false,
1722
+ "single_word": false,
1723
+ "special": true
1724
+ },
1725
+ "289": {
1726
+ "content": "<|unused144|>",
1727
+ "lstrip": false,
1728
+ "normalized": false,
1729
+ "rstrip": false,
1730
+ "single_word": false,
1731
+ "special": true
1732
+ },
1733
+ "290": {
1734
+ "content": "<|unused145|>",
1735
+ "lstrip": false,
1736
+ "normalized": false,
1737
+ "rstrip": false,
1738
+ "single_word": false,
1739
+ "special": true
1740
+ },
1741
+ "291": {
1742
+ "content": "<|unused146|>",
1743
+ "lstrip": false,
1744
+ "normalized": false,
1745
+ "rstrip": false,
1746
+ "single_word": false,
1747
+ "special": true
1748
+ },
1749
+ "292": {
1750
+ "content": "<|unused147|>",
1751
+ "lstrip": false,
1752
+ "normalized": false,
1753
+ "rstrip": false,
1754
+ "single_word": false,
1755
+ "special": true
1756
+ },
1757
+ "293": {
1758
+ "content": "<|unused148|>",
1759
+ "lstrip": false,
1760
+ "normalized": false,
1761
+ "rstrip": false,
1762
+ "single_word": false,
1763
+ "special": true
1764
+ },
1765
+ "294": {
1766
+ "content": "<|unused149|>",
1767
+ "lstrip": false,
1768
+ "normalized": false,
1769
+ "rstrip": false,
1770
+ "single_word": false,
1771
+ "special": true
1772
+ },
1773
+ "295": {
1774
+ "content": "<|unused150|>",
1775
+ "lstrip": false,
1776
+ "normalized": false,
1777
+ "rstrip": false,
1778
+ "single_word": false,
1779
+ "special": true
1780
+ },
1781
+ "296": {
1782
+ "content": "<|unused151|>",
1783
+ "lstrip": false,
1784
+ "normalized": false,
1785
+ "rstrip": false,
1786
+ "single_word": false,
1787
+ "special": true
1788
+ },
1789
+ "297": {
1790
+ "content": "<|unused152|>",
1791
+ "lstrip": false,
1792
+ "normalized": false,
1793
+ "rstrip": false,
1794
+ "single_word": false,
1795
+ "special": true
1796
+ },
1797
+ "298": {
1798
+ "content": "<|unused153|>",
1799
+ "lstrip": false,
1800
+ "normalized": false,
1801
+ "rstrip": false,
1802
+ "single_word": false,
1803
+ "special": true
1804
+ },
1805
+ "299": {
1806
+ "content": "<|unused154|>",
1807
+ "lstrip": false,
1808
+ "normalized": false,
1809
+ "rstrip": false,
1810
+ "single_word": false,
1811
+ "special": true
1812
+ },
1813
+ "300": {
1814
+ "content": "<|unused155|>",
1815
+ "lstrip": false,
1816
+ "normalized": false,
1817
+ "rstrip": false,
1818
+ "single_word": false,
1819
+ "special": true
1820
+ },
1821
+ "301": {
1822
+ "content": "<|unused156|>",
1823
+ "lstrip": false,
1824
+ "normalized": false,
1825
+ "rstrip": false,
1826
+ "single_word": false,
1827
+ "special": true
1828
+ },
1829
+ "302": {
1830
+ "content": "<|unused157|>",
1831
+ "lstrip": false,
1832
+ "normalized": false,
1833
+ "rstrip": false,
1834
+ "single_word": false,
1835
+ "special": true
1836
+ },
1837
+ "303": {
1838
+ "content": "<|unused158|>",
1839
+ "lstrip": false,
1840
+ "normalized": false,
1841
+ "rstrip": false,
1842
+ "single_word": false,
1843
+ "special": true
1844
+ },
1845
+ "304": {
1846
+ "content": "<|unused159|>",
1847
+ "lstrip": false,
1848
+ "normalized": false,
1849
+ "rstrip": false,
1850
+ "single_word": false,
1851
+ "special": true
1852
+ }
1853
+ },
1854
+ "bos_token": "<|startoftext|>",
1855
+ "clean_up_tokenization_spaces": false,
1856
+ "eos_token": "<|endoftext|>",
1857
+ "legacy": true,
1858
+ "model_max_length": 4096,
1859
+ "pad_token": "<unk>",
1860
+ "sp_model_kwargs": {},
1861
+ "tokenizer_class": "LlamaTokenizer",
1862
+ "unk_token": "<unk>",
1863
+ "use_default_system_prompt": false
1864
+ }