Update README.md
README.md
CHANGED
@@ -15,3 +15,15 @@ Mistral-Interact is a powerful and robust variant of Mistral, capable of judging
- **Comparable performance with closed-source GPT-4:** We show that smaller-scale model experts can approach or even exceed general-purpose large-scale models across various aspects, including vagueness judgment, comprehensiveness of summaries, and friendliness of interaction.
We utilize the [model-center](https://github.com/OpenBMB/ModelCenter) framework to conduct full-parameter fine-tuning of [Mistral-7B-v0.1](https://huggingface.co/mistralai/Mistral-7B-v0.1) on the [Intention-in-Interaction (IN3)](https://huggingface.co/datasets/hbx/IN3) dataset using two 80GB A800s. For full details and usage of this model, please read our [paper](https://arxiv.org/abs/2402.09205) and [repo](https://github.com/HBX-hbx/Mistral-Interact).
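
As a quick-start sketch (not part of the original README), the model can presumably be loaded with Hugging Face `transformers` as shown below. The repository ID `hbx/Mistral-Interact` and the plain-text prompt format are assumptions; consult the repo above for the exact model path and prompt template.

```python
# Minimal usage sketch, assuming the model is published as "hbx/Mistral-Interact".
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "hbx/Mistral-Interact"  # assumed Hugging Face repo id; check the repo above
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision to fit on a single GPU; adjust as needed
    device_map="auto",
)

# A deliberately vague task: the model is expected to judge its vagueness,
# query for missing details, or summarize the user's intention.
prompt = "Help me plan a trip."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```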
# Citation
Feel free to cite our paper if you find it useful.
```bibtex
@article{cheng2024tell,
  title={Tell Me More! Towards Implicit User Intention Understanding of Language Model Driven Agents},
  author={Cheng Qian and Bingxiang He and Zhong Zhuang and Jia Deng and Yujia Qin and Xin Cong and Zhong Zhang and Jie Zhou and Yankai Lin and Zhiyuan Liu and Maosong Sun},
  journal={arXiv preprint arXiv:2402.09205},
  year={2024}
}
```