reaperdoesntknow committed on
Commit 812d32b · verified · 1 Parent(s): 8dad452

Cross-link: DistilQwen collection spotlight — 2026-03-29

Files changed (1)
  1. README.md +1 −1
README.md CHANGED
@@ -210,7 +210,7 @@ Not intended for:
 
 ## From the Convergent Intelligence Portfolio
 
-**[DistilQwen Collection](https://huggingface.co/collections/reaperdoesntknow/distilqwen-69bf40ec669117e3f069ef1c)** — Proof-weighted distillation from Qwen3-30B-A3B → 1.7B and 0.6B. Three teacher variants (Instruct, Thinking, Coder), nine models, 2,788 combined downloads. Structure beats scale.
+**[DistilQwen Collection](https://huggingface.co/collections/reaperdoesntknow/distilqwen-69bf40ec669117e3f069ef1c)** — Our only BF16 series. Proof-weighted distillation from Qwen3-30B-A3B → 1.7B and 0.6B on H100. Three teacher variants (Instruct, Thinking, Coder), nine models, 2,788 combined downloads. The rest of the portfolio proves structure beats scale on CPU. This collection shows what happens when you give the methodology real hardware.
 
 Top model: [Qwen3-1.7B-Coder-Distilled-SFT](https://huggingface.co/reaperdoesntknow/Qwen3-1.7B-Coder-Distilled-SFT) — 508 downloads