---
base_model: Models/llama3-8b-instruct
library_name: peft
language:
- en
---
# PIP-KAG: Mitigating Knowledge Conflicts in Knowledge-Augmented Generation via Parametric Pruning
This is the official model for **[PIP-KAG: Mitigating Knowledge Conflicts in Knowledge-Augmented Generation via Parametric Pruning](https://arxiv.org/pdf/2502.15543)**.
PIP-KAG addresses **knowledge conflicts** in **knowledge-augmented generation** through a **parametric pruning** strategy, improving the **contextual faithfulness** of language models on knowledge-intensive generation tasks.
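To make the target failure mode concrete, here is a minimal illustration of a knowledge conflict: the retrieved passage contradicts what a pretrained model is likely to have memorized, and a contextually faithful model should answer from the passage. The passage, question, and prompt template below are hypothetical, not the format used in the paper or its evaluation data.

```python
# Hypothetical example of a knowledge conflict; the passage and prompt
# template are illustrative, not the format used in the PIP-KAG paper.
context = (
    "City records state that the Golden Gate Bridge was repainted "
    "matte gray in 2024 as part of a corrosion-control program."
)
question = "What color is the Golden Gate Bridge?"

# The model's parametric knowledge ("international orange") conflicts with
# the retrieved context ("matte gray"). A contextually faithful model
# should ground its answer in the context above.
prompt = f"Context: {context}\nQuestion: {question}\nAnswer:"
print(prompt)
```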
## **Paper**
For a detailed explanation of the methodology and experiments, please refer to our paper:
[**PIP-KAG: Mitigating Knowledge Conflicts in Knowledge-Augmented Generation via Parametric Pruning**](https://arxiv.org/abs/2502.15543)
## Reproduce the Results
To reproduce the experiments and benchmarks from the paper, follow the instructions provided in the official GitHub repository:
[GitHub: OpenBMB/PIP-KAG](https://github.com/OpenBMB/PIP-KAG).
## Model Details
- Model Name: PIP-KAG-7B
- Architecture: LLaMA3-8B-Instruct with Parametric Pruning
- Training Data: [CoConflictQA](https://huggingface.co/datasets/chengpingan/PIP-KAG) Dataset
- Intended Tasks: Knowledge-Augmented Generation, Contextual Faithfulness Evaluation
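Since the card lists `library_name: peft`, the checkpoint should load either as a standalone causal LM or as a PEFT adapter on top of the LLaMA3-8B-Instruct base. Below is a minimal loading sketch using the standard `transformers` and `peft` APIs; the repository id `chengpingan/PIP-KAG-7B` is an assumption inferred from the dataset link above, so verify the actual id on the Hub.

```python
# Minimal loading sketch. The model id below is an assumption; replace it
# with the actual Hugging Face repository id for this checkpoint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "chengpingan/PIP-KAG-7B"  # hypothetical id, verify before use

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# If the repository instead contains only a PEFT adapter, attach it to the
# LLaMA3-8B-Instruct base model:
# from peft import PeftModel
# base = AutoModelForCausalLM.from_pretrained("meta-llama/Meta-Llama-3-8B-Instruct")
# model = PeftModel.from_pretrained(base, model_id)

prompt = "Context: ...\nQuestion: ...\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```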
## Citation
If you use PIP-KAG in your work, please consider citing our paper:
```bibtex
@misc{huang2025pipkagmitigatingknowledgeconflicts,
title={PIP-KAG: Mitigating Knowledge Conflicts in Knowledge-Augmented Generation via Parametric Pruning},
author={Pengcheng Huang and Zhenghao Liu and Yukun Yan and Xiaoyuan Yi and Hao Chen and Zhiyuan Liu and Maosong Sun and Tong Xiao and Ge Yu and Chenyan Xiong},
year={2025},
eprint={2502.15543},
archivePrefix={arXiv},
primaryClass={cs.CL},
url={https://arxiv.org/abs/2502.15543},
}
```