---
license: mit
language:
  - en
pipeline_tag: text-generation
arxiv:
  - https://arxiv.org/abs/2508.06595
library_name: transformers
---

## Model Details

The best Meta-Llama-3-8B-Instruct checkpoint unlearned with RMU (Representation Misdirection for Unlearning) on the Keyword-Bio forget set. For more details, please see our paper: [LLM Unlearning Without an Expert Curated Dataset](https://arxiv.org/abs/2508.06595).
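
Below is a minimal usage sketch with 🤗 Transformers. The repo id is a placeholder inferred from this page's namespace, not a confirmed checkpoint id, and the generation settings are illustrative rather than those used in the paper.

```python
# Minimal usage sketch; the repo id below is a placeholder, not necessarily
# the exact id of this checkpoint.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "WhyTheMoon/Llama-3-8B-Instruct_RMU_Keyword-Bio"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 8B parameters; bf16 keeps memory manageable
    device_map="auto",
)

# Llama-3-Instruct checkpoints expect the chat template to be applied.
messages = [{"role": "user", "content": "Summarize what machine unlearning does."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```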

## Model Sources

- Paper: [LLM Unlearning Without an Expert Curated Dataset](https://arxiv.org/abs/2508.06595)

## Performance

| Model | WMDP-Bio | tinyMMLU | GSM8k | TriviaQA |
|---|---|---|---|---|
| Llama-3-8B-Instruct | 71.01 | 59.21 | 75.28 | 51.09 |
| Llama-3-8B-Instruct_RMU_Keyword-Bio | 70.30 | 60.42 | 75.97 | 51.31 |
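
The scores above can be reproduced with an evaluation harness such as EleutherAI's lm-evaluation-harness. The sketch below reuses the placeholder repo id from the usage example; task names (e.g. `tinyMMLU`) may differ across harness versions.

```python
# Sketch of an evaluation run with lm-evaluation-harness (pip install lm_eval).
# The repo id is a placeholder and task names may vary by harness version.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=WhyTheMoon/Llama-3-8B-Instruct_RMU_Keyword-Bio,dtype=bfloat16",
    tasks=["wmdp_bio", "tinyMMLU", "gsm8k", "triviaqa"],
    batch_size="auto",
)
for task, metrics in results["results"].items():
    print(task, metrics)
```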

## Citation

If you find this useful in your research, please consider citing our paper:

```bibtex
@misc{zhu2025llmunlearningexpertcurated,
      title={LLM Unlearning Without an Expert Curated Dataset},
      author={Xiaoyuan Zhu and Muru Zhang and Ollie Liu and Robin Jia and Willie Neiswanger},
      year={2025},
      eprint={2508.06595},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2508.06595},
}
```