---
language:
- bg
- cs
- zh
- de
- fi
- fr
- ru
- es
tags:
- generation
- question answering
- instruction tuning
license: cc-by-nc-4.0
---
### Model Description
This HF repository contains LLMs instruction-tuned with full-parameter fine-tuning, used to study whether monolingual or multilingual instruction tuning is more favourable.
* [GitHub](https://github.com/hplt-project/monolingual-multilingual-instruction-tuning/tree/main)
* [Paper](https://arxiv.org/abs/2309.08958)
#### Instruction tuning details
* Base model: [bigscience/bloom-7b1](https://huggingface.co/bigscience/bloom-7b1)
* Instruction tuning language: multilingual downsampled (Bulgarian, Czech, Chinese, German, Finnish, French, Russian, and Spanish); see the data-mixing sketch after this list.
* Training method: full-parameter fine-tuning.
* Best checkpoint: best cross-entropy on a validation set, trained for 3 epochs.
* Dataset: machine-translated from [yahma/alpaca-cleaned](https://huggingface.co/datasets/yahma/alpaca-cleaned). You can download our data [HERE](https://github.com/hplt-project/monolingual-multilingual-instruction-tuning/tree/main/training-data).
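
For illustration, here is a hedged sketch of how a downsampled multilingual mix of this kind could be assembled. The per-language file names, the roughly 52k target size, and the equal per-language share are assumptions for illustration only; the released training data linked above is the authoritative source.

```python
# Hedged sketch: build a downsampled multilingual instruction-tuning set.
# Assumption: each language contributes an equal share so the combined set is
# roughly the size of one monolingual Alpaca dataset; see the paper/GitHub
# repository for the exact ratios used.
import json
import random

LANGS = ["bg", "cs", "zh", "de", "fi", "fr", "ru", "es"]
TARGET_TOTAL = 52_000                 # approx. size of Alpaca (assumption)
PER_LANG = TARGET_TOTAL // len(LANGS)

random.seed(0)
mixed = []
for lang in LANGS:
    # one machine-translated Alpaca file per language (hypothetical filenames)
    with open(f"alpaca_{lang}.json", encoding="utf-8") as f:
        examples = json.load(f)
    mixed.extend(random.sample(examples, PER_LANG))

random.shuffle(mixed)
with open("alpaca_multilingual_downsampled.json", "w", encoding="utf-8") as f:
    json.dump(mixed, f, ensure_ascii=False)
```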
#### Usage
The model checkpoint should be loaded with the `transformers` library.
Please refer to our GitHub repository [HERE](https://github.com/hplt-project/monolingual-multilingual-instruction-tuning/tree/main/fpft) for inference and training instructions.
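
Below is a minimal inference sketch. The model ID placeholder and the Alpaca-style prompt template are assumptions (the training data is machine-translated from Alpaca); consult the GitHub repository for the exact prompt format and generation settings.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder: replace with this repository's model ID or a local path.
model_id = "path-or-id-of-this-repository"

tokenizer = AutoTokenizer.from_pretrained(model_id)
# torch_dtype="auto" picks the checkpoint's dtype; move the model to GPU
# (e.g. with device_map="auto", which requires accelerate) for faster inference.
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto")

# Alpaca-style instruction prompt (assumed format).
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nWhat are the official languages of Finland?\n\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
# Decode only the newly generated tokens.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```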
#### Citation
```
@inproceedings{chen-etal-2024-monolingual,
    title = "Monolingual or multilingual instruction tuning: Which makes a better {Alpaca}",
    author = "Pinzhen Chen and Shaoxiong Ji and Nikolay Bogoychev and Andrey Kutuzov and Barry Haddow and Kenneth Heafield",
    booktitle = "Findings of the Association for Computational Linguistics: EACL 2024",
    year = "2024",
}
```