
DOLLOOM: Dolly 🐑 + BLOOMz 💮

Adapter Description

This adapter was created with the PEFT library by fine-tuning the base model BigScience/BLOOMz 7B1 on the Dolly dataset (translated to Spanish) using the LoRA method.
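The idea behind LoRA is that, instead of updating a full pretrained weight matrix W, training only learns a low-rank update ΔW = B·A (scaled by alpha / r), which drastically cuts the number of trainable parameters. The following is a minimal NumPy sketch of that mechanism; the matrix names, shapes, and initialisation follow the common LoRA convention but are illustrative, not taken from the PEFT source.

```python
import numpy as np

# LoRA (Low-Rank Adaptation) sketch: keep the pretrained weight W frozen
# and train two small matrices A (r x d_in) and B (d_out x r), so the
# effective weight becomes W + (alpha / r) * B @ A with rank r << d_in, d_out.

rng = np.random.default_rng(0)

d_in, d_out, r, alpha = 16, 16, 4, 8

W = rng.normal(size=(d_out, d_in))      # frozen pretrained weight
A = rng.normal(size=(r, d_in)) * 0.01   # trainable, small random init
B = np.zeros((d_out, r))                # trainable, zero init

def forward(x):
    # Base path plus the scaled low-rank update.
    return W @ x + (alpha / r) * (B @ (A @ x))

x = rng.normal(size=(d_in,))

# With B initialised to zero, the adapter is a no-op before training starts:
assert np.allclose(forward(x), W @ x)

# The adapter trains r * (d_in + d_out) parameters instead of d_in * d_out.
assert r * (d_in + d_out) < d_in * d_out
```

Because only A and B are saved, the resulting adapter checkpoint is tiny compared with the 7B-parameter base model it modifies.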

Model Description

BLOOMz 7B1 MT, an instruction-tuned version of BLOOM (BigScience Large Open-science Open-access Multilingual).

Training data

TBA

Supported Tasks and Leaderboards

TBA

Training procedure

TBA

How to use

TBA

Citation

@misc{manuel_romero_2023,
    author       = { {Manuel Romero} },
    title        = { dolloom (Revision 599b95a) },
    year         = 2023,
    url          = { https://huggingface.co/mrm8488/dolloom },
    doi          = { 10.57967/hf/0540 },
    publisher    = { Hugging Face }
}