
# Model Card for LOLA (Fine-Tuned on Multilingual Alpaca)

LOLA: Large and Open Source Multilingual Language Model

## Model Description

This model is a fine-tuned version of dice-research/lola_v1, trained for 2 epochs on the multilingual Alpaca dataset. The training data is available at https://github.com/hplt-project/monolingual-multilingual-instruction-tuning/tree/main/training-data. The following languages are covered: Bulgarian (bg), Czech (cs), English (en), German (de), Spanish (es), Finnish (fi), French (fr), Portuguese (pt), Russian (ru), and Chinese (zh).

- Model size: 7.46B parameters
- Tensor type: F32
- Weights format: Safetensors
## Inference

The serverless Inference API does not support model repositories that contain custom code, so this model has to be loaded locally.
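Below is a minimal local-inference sketch, not taken from the original card: it assumes the standard `transformers` loading path with `trust_remote_code=True` (needed for repositories that ship custom modeling code) and the standard Alpaca instruction template. The `model_id` is a placeholder; replace it with this repository's actual path.

```python
# Minimal local-inference sketch. Assumptions: the checkpoint loads through the
# standard transformers Auto* API with trust_remote_code=True, and prompts follow
# the standard Alpaca instruction template used by the multilingual Alpaca data.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "dice-research/lola_v1"  # placeholder: replace with this fine-tuned repo's ID

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,
    torch_dtype=torch.float32,  # weights are stored in F32
    device_map="auto",
)

# Alpaca-style prompt (assumed format; any of the ten covered languages can be used).
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nTranslate 'Good morning' into German.\n\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=64, do_sample=False)

# Decode only the newly generated tokens, dropping the echoed prompt.
print(tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```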