Shengkun nielsr HF Staff committed on
Commit 87ca365 · verified · 1 Parent(s): 8b3bb3f

Add library name and pipeline tag (#1)


- Add library name and pipeline tag (1ee2f5de47cbc3728ccdfb393adcd755e7a09714)


Co-authored-by: Niels Rogge <nielsr@users.noreply.huggingface.co>

Files changed (1)
  1. README.md +9 -6
README.md CHANGED
@@ -1,6 +1,9 @@
 ---
 license: apache-2.0
+library_name: transformers
+pipeline_tag: text-generation
 ---
+
 **Paper**: [https://arxiv.org/pdf/2502.07780](https://arxiv.org/pdf/2502.07780)
 **Code**: https://github.com/IST-DASLab/DarwinLM
 **Models**: [DarwinLM-2.7B](https://huggingface.co/Shengkun/DarwinLM-2.7B), [DarwinLM-4.6B](https://huggingface.co/Shengkun/DarwinLM-4.6B), [DarwinLM-8.4B](https://huggingface.co/Shengkun/DarwinLM-8.4B)
@@ -9,14 +12,14 @@ license: apache-2.0
 ---
 
 This repository contains the weights of DarwinLM, an evolutionary structured pruning method for large language models, as introduced in our paper. DarwinLM builds upon an evolutionary search process, generating multiple offspring models in each generation through mutation, and selecting the fittest for survival.
-```
+```python
 # Please add trust_remote_code=True as the repo includes custom code to load and run DarwinLM
+from transformers import AutoModelForCausalLM
 model = AutoModelForCausalLM.from_pretrained("Shengkun/DarwinLM-8.4B", trust_remote_code=True)
 ```
 
 ## Downstream Tasks
 
-
 **2.7B**
 
 | Method | Param. | SciQ | PIQA | WG | ArcE | ArcC | HS | LogiQA | BoolQ | Avg |
@@ -52,14 +55,14 @@ model = AutoModelForCausalLM.from_pretrained("Shengkun/DarwinLM-8.4B", trust_rem
 | | **OLMO-0424 (2.05T)** | 7B | 96.1 | 80.1 | 72.1 | 73.8 | 49.2 | 78.0 | 29.3 | 80.8 | 52.1 | 67.9 |
 | | *DarwinLM (10.0B)* | 8.4B | 89.5 | 78.1 | 70.7 | 79.6 | 57.6 | 74.9 | 33.5 | 73.9 | 57.9 | 68.4 |
 
-
-
 ## Bibtex
-```
+```bibtex
 @article{tang2025darwinlm,
 title={DarwinLM: Evolutionary Structured Pruning of Large Language Models},
 author={Tang, Shengkun and Sieberling, Oliver and Kurtic, Eldar and Shen, Zhiqiang and Alistarh, Dan},
 journal={arXiv preprint arXiv:2502.07780},
 year={2025}
 }
-```
+```
+
+For any issues or questions, please open an issue or contact us directly. 🚀
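
The loading snippet in the updated README can be rounded out into an end-to-end generation sketch. This is a minimal illustration, not part of the model card: the prompt, the `generation_kwargs` helper, and the greedy decoding settings are assumptions; only the repo id and `trust_remote_code=True` come from the README.

```python
# Sketch of end-to-end text generation with DarwinLM-8.4B.
# Assumptions: the prompt, generation_kwargs(), and greedy decoding are
# illustrative; only the repo id and trust_remote_code=True are from the card.

REPO_ID = "Shengkun/DarwinLM-8.4B"

def generation_kwargs(max_new_tokens: int = 64) -> dict:
    # Greedy decoding keeps a quick sanity check deterministic; tune as needed.
    return {"max_new_tokens": max_new_tokens, "do_sample": False}

def generate(prompt: str, repo_id: str = REPO_ID) -> str:
    # Imported lazily so the helper above is usable without transformers installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    # trust_remote_code=True is required: the repo ships custom modeling code.
    tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)

    inputs = tokenizer(prompt, return_tensors="pt")
    output = model.generate(**inputs, **generation_kwargs())
    return tokenizer.decode(output[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(generate("Structured pruning of large language models"))
```

Switching `do_sample` to `True` (and adding `temperature`/`top_p`) would give varied completions; the deterministic defaults here are just for a first smoke test.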