Laomaodiaoyu committed · verified · Commit bbe4303 · 1 Parent(s): 471e0f7

Update README.md

Files changed (1):
  1. README.md +8 -5
README.md CHANGED
@@ -38,7 +38,7 @@ tags:
  These are the official pre-trained model weights and configuration files for **D<sup>2</sup>MoRA**, a novel **diversity-regulated asymmetric MoE-LoRA decomposition framework** for **parameter-efficient fine-tuning (PEFT)** of large language models in **multi-task adaptation** scenarios.
 
  🔗 **Paper:** [Accepted by AAAI 2026]
- 🔗 **GitHub Repository:** [softwavec/D2MoRA](https://github.com/softwavec/D2MoRA)
+ 🔗 **GitHub Repository:** [iLearn-Lab/AAAI26-D2MoRA](https://github.com/iLearn-Lab/AAAI26-D2MoRA)
 
  ---
 
@@ -96,7 +96,7 @@ These weights are designed to be used directly with the official **D<sup>2</sup>
  Clone the GitHub repository and install dependencies following the official repository instructions:
 
  ```bash
- git clone https://github.com/softwavec/D2MoRA.git
- cd D2MoRA
+ git clone https://github.com/iLearn-Lab/AAAI26-D2MoRA.git
+ cd AAAI26-D2MoRA
  ```
 
@@ -141,10 +141,13 @@ Please use the official repository scripts for training and evaluation.
  If you find our work or these model weights useful in your research, please consider leaving a **Star** ⭐️ on our GitHub repo and citing our paper:
 
  ```bibtex
- @inproceedings{INTENT,
-   title={INTENT: Invariance and Discrimination-aware Noise Mitigation for Robust Composed Image Retrieval},
-   author={Chen, Zhiwei and Hu, Yupeng and Fu, Zhiheng and Li, Zixu and Huang, Jiale and Huang, Qinlei and Wei, Yinwei},
+ @inproceedings{zuo2026d2mora,
+   title={D2MoRA: Diversity-Regulated Asymmetric MoE-LoRA Decomposition for Efficient Multi-Task Adaptation},
+   author={Zuo, Jianhui and Song, Xuemeng and Wen, Haokun and Liu, Meng and Hu, Yupeng and Wang, Jiuru and Nie, Liqiang},
    booktitle={Proceedings of the AAAI Conference on Artificial Intelligence},
+   volume={40},
+   number={34},
+   pages={29286--29294},
    year={2026}
  }
  ```
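
The README above describes D<sup>2</sup>MoRA as a diversity-regulated asymmetric MoE-LoRA decomposition. As a rough illustration of that *family* of designs — not the paper's actual formulation; the layer shapes, gating, and penalty below are all assumptions for illustration — here is a minimal NumPy sketch of an asymmetric MoE-LoRA layer with one shared down-projection `A`, per-expert up-projections `B_k`, and a pairwise-cosine diversity penalty on the experts:

```python
import numpy as np

# Illustrative sketch only: a generic asymmetric MoE-LoRA layer.
# "Asymmetric" here means one shared LoRA down-projection A and
# per-expert up-projections B_k; the diversity penalty is a simple
# mean pairwise cosine similarity between expert matrices. None of
# this is taken from the D2MoRA paper or repo — names are hypothetical.

rng = np.random.default_rng(0)

d_in, d_out, r, n_experts = 16, 16, 4, 3
W0 = rng.normal(size=(d_out, d_in))                  # frozen base weight
A = rng.normal(size=(r, d_in)) * 0.01                # shared down-projection
Bs = [rng.normal(size=(d_out, r)) * 0.01
      for _ in range(n_experts)]                     # per-expert up-projections
Wg = rng.normal(size=(n_experts, d_in))              # gating weights

def softmax(z):
    z = z - z.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def forward(x):
    """Base output plus gate-weighted sum of expert LoRA updates."""
    gates = softmax(x @ Wg.T)                        # (batch, n_experts)
    base = x @ W0.T                                  # frozen path
    low = x @ A.T                                    # shared rank-r projection
    updates = np.stack([low @ B.T for B in Bs], 1)   # (batch, K, d_out)
    return base + (gates[..., None] * updates).sum(axis=1)

def diversity_penalty():
    """Mean pairwise cosine similarity between flattened expert matrices;
    minimizing this pushes the experts apart."""
    flat = np.stack([B.ravel() for B in Bs])
    flat = flat / np.linalg.norm(flat, axis=1, keepdims=True)
    sim = flat @ flat.T
    return sim[~np.eye(n_experts, dtype=bool)].mean()

x = rng.normal(size=(2, d_in))
y = forward(x)
print(y.shape)  # (2, 16)
print(float(diversity_penalty()))
```

The actual objective, gating mechanism, and regularizer are defined in the paper and the repository scripts; this sketch only shows the general shape of such a layer.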