Update README.md
README.md
CHANGED
@@ -7,29 +7,6 @@ pipeline_tag: text-generation
 library_name: transformers
 ---
 
-#
-
-
-
-## Model Introduction
-
-The **GLM-4.5** series models are foundation models designed for intelligent agents. GLM-4.5 has **355** billion total parameters with **32** billion active parameters, while GLM-4.5-Air adopts a more compact design with **106** billion total parameters and **12** billion active parameters. GLM-4.5 models unify reasoning, coding, and intelligent agent capabilities to meet the complex demands of intelligent agent applications.
-
-Both GLM-4.5 and GLM-4.5-Air are hybrid reasoning models that provide two modes: thinking mode for complex reasoning and tool usage, and non-thinking mode for immediate responses.
-
-We have open-sourced the base models, hybrid reasoning models, and FP8 versions of the hybrid reasoning models for both GLM-4.5 and GLM-4.5-Air. They are released under the MIT open-source license and can be used commercially and for secondary development.
-
-As demonstrated in our comprehensive evaluation across 12 industry-standard benchmarks, GLM-4.5 achieves exceptional performance with a score of **63.2**, in **3rd** place among all proprietary and open-source models. Notably, GLM-4.5-Air delivers competitive results at **59.8** while maintaining superior efficiency.
-
-
-
-For more eval results, showcases, and technical details, please visit
-our [technical blog](https://z.ai/blog/glm-4.5). The technical report will be released soon.
-
-The model code, tool parser, and reasoning parser can be found in the implementations of [transformers](https://github.com/huggingface/transformers/tree/main/src/transformers/models/glm4_moe), [vLLM](https://github.com/vllm-project/vllm/blob/main/vllm/model_executor/models/glm4_moe_mtp.py) and [SGLang](https://github.com/sgl-project/sglang/blob/main/python/sglang/srt/models/glm4_moe.py).
-
-## Quick Start
-
-**Note**: This is a base model, not for chat.
-
-Please refer to our [GitHub page](https://github.com/zai-org/GLM-4.5) for more details.
+
+# INTELLECT-3-Base
+
+This is a clone of the [GLM-4.5-Air-Base](https://huggingface.co/zai-org/GLM-4.5-Air-Base) model with a custom chat template adapted from [Qwen3-Coder](https://huggingface.co/Qwen/Qwen3-Coder-30B-A3B-Instruct). For more detail, see this [Notion](https://www.notion.so/primeintellect/INTELLECT-3-Chat-Template-25f72940136f80fb970afaa1357ba9e7?source=copy_link) page.
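The added line references a custom chat template adapted from Qwen3-Coder. As a rough sketch of the ChatML-style rendering that Qwen-family templates produce (the real template is a Jinja string in the tokenizer config; the markers and helper below are illustrative assumptions, not the actual INTELLECT-3 template — see the linked Notion page for that):

```python
# Minimal sketch of ChatML-style prompt rendering, the format family that
# Qwen-style chat templates build on. Token markers here are assumptions
# for illustration; the shipped template may differ.

def render_chatml(messages, add_generation_prompt=True):
    """Render a list of {"role", "content"} dicts into a ChatML prompt."""
    parts = []
    for m in messages:
        parts.append(f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>\n")
    if add_generation_prompt:
        # Open an assistant turn so the model continues from here.
        parts.append("<|im_start|>assistant\n")
    return "".join(parts)

prompt = render_chatml([{"role": "user", "content": "Hello"}])
print(prompt)
# <|im_start|>user
# Hello<|im_end|>
# <|im_start|>assistant
```

In practice this rendering is done by `tokenizer.apply_chat_template(...)` in transformers, which reads the template bundled with the model rather than a hand-rolled function like the one above.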