Update pipeline tag and add paper link to metadata and content
This PR improves the model card by:
* Updating the `pipeline_tag` to `text-generation`, which accurately reflects the model's capabilities in generating textual responses for medical reasoning tasks. This change will ensure the model is discoverable under the correct pipeline on the Hugging Face Hub (https://huggingface.co/models?pipeline_tag=text-generation).
* Adding the paper ID `2509.02208` to the metadata, linking the model to its official Hugging Face paper page ([`2509.02208`](https://huggingface.co/papers/2509.02208)) for enhanced visibility and context.
* Adding a prominent link to the paper at the top of the model card's content for better immediate visibility of the research artifact.
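Taken together, the metadata edits would leave the card's YAML front matter looking roughly like this (a sketch reconstructed from the PR's diff; field ordering follows the diff, and the pre-existing fields are assumed unchanged):

```yaml
base_model:
- Qwen/Qwen2.5-32B
language:
- en
- zh
library_name: transformers
license: apache-2.0
tags:
- chat
pipeline_tag: text-generation
paper: 2509.02208
```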
README.md
CHANGED

```diff
@@ -1,16 +1,21 @@
 ---
-license: apache-2.0
-tags:
-- chat
-library_name: transformers
+base_model:
+- Qwen/Qwen2.5-32B
 language:
 - en
 - zh
-base_model:
-- Qwen/Qwen2.5-32B
+library_name: transformers
+license: apache-2.0
+tags:
+- chat
+pipeline_tag: text-generation
+paper: 2509.02208
 ---
 
 # Baichuan-M2-32B
+
+This repository contains the model presented in [Baichuan-M2: Scaling Medical Capability with Large Verifier System](https://huggingface.co/papers/2509.02208).
+
 [](https://opensource.org/licenses/Apache-2.0)
 [](https://huggingface.co/baichuan-inc/Baichuan-M2-32B)
 [](https://huggingface.co/baichuan-inc/Baichuan-M2-32B-GPTQ-Int4)
@@ -105,8 +110,10 @@ try:
 except ValueError:
     index = 0
 
-thinking_content = tokenizer.decode(output_ids[:index], skip_special_tokens=True).strip("\n")
-content = tokenizer.decode(output_ids[index:], skip_special_tokens=True).strip("\n")
+thinking_content = tokenizer.decode(output_ids[:index], skip_special_tokens=True).strip("
+")
+content = tokenizer.decode(output_ids[index:], skip_special_tokens=True).strip("
+")
 
 print("thinking content:", thinking_content)
 print("content:", content)
@@ -165,5 +172,4 @@ Thank you to the open-source community. We commit to continuous contribution and
 
 **Empowering Healthcare with AI, Making Health Accessible to All**
 
-</div>
-
+</div>
```
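For context on the code hunk above: the `thinking_content` and `content` lines split the generated token ids at the end-of-thinking marker located by the preceding `try`/`except ValueError` search. A minimal, self-contained sketch of that split, assuming a hypothetical end-of-thinking token id (`106` here is a stand-in; the real card decodes with the Baichuan-M2 tokenizer):

```python
# Hypothetical stand-in for the tokenizer's end-of-thinking token id.
THINK_END_ID = 106

def split_thinking(output_ids, think_end_id=THINK_END_ID):
    """Return (thinking_ids, content_ids), split just after the last
    end-of-thinking token; everything is content if the token is absent."""
    try:
        # Reversed search finds the *last* occurrence, mirroring the
        # try/except ValueError pattern in the model card's example.
        index = len(output_ids) - output_ids[::-1].index(think_end_id)
    except ValueError:
        index = 0
    return output_ids[:index], output_ids[index:]

thinking, content = split_thinking([1, 2, 106, 7, 8])
print(thinking)  # [1, 2, 106]
print(content)   # [7, 8]
```

In the card's actual snippet, each half is then passed through `tokenizer.decode(..., skip_special_tokens=True).strip("\n")` before printing.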