---
base_model:
- tiiuae/falcon-11B
library_name: transformers
tags:
- mergekit
- merge
- lazymergekit
- tiiuae/falcon-11B
license: apache-2.0
language:
- es
- fr
- de
- 'no'
- sv
- da
- nl
- pt
- pl
- ro
- it
- cs
---
# sliced
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was pruned using the passthrough merge method.
### Models Merged
The following models were included in the merge:
* [tiiuae/falcon-11B](https://huggingface.co/tiiuae/falcon-11B)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
slices:
  - sources:
      - model: tiiuae/falcon-11B
        layer_range: [0, 24]   # keep layers 0-23 (end-exclusive range)
  - sources:
      - model: tiiuae/falcon-11B
        layer_range: [55, 59]  # keep layers 55-58; the block in between is removed
merge_method: passthrough
dtype: bfloat16
```
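Assuming mergekit's end-exclusive `layer_range` convention, this keeps the first 24 layers (0-23) plus 4 late layers (55-58), for 28 decoder layers in total. A quick sanity check on the merged output (the local path below is a placeholder):

```python
from transformers import AutoConfig

# Placeholder path: wherever mergekit wrote the merged model.
cfg = AutoConfig.from_pretrained("./falcon2-5.5B-sliced")
print(cfg.num_hidden_layers)  # expected: 28 (24 + 4 retained layers)
```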
[PruneMe](https://github.com/arcee-ai/PruneMe) was used to investigate layer similarity on the wikimedia/wikipedia subsets of the 11 supported languages, with 2,000 samples per language. The layer range to prune was chosen from the average of the per-language analyses, so as to reduce model size while maintaining performance.
![Layer Similarity Plot](https://cdn-uploads.huggingface.co/production/uploads/660c0a02cf274b3ab77dd6b7/47CiSRvJpmKGGfF-eUY6U.png)
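The exact PruneMe pipeline is not reproduced here, but the sketch below illustrates the underlying idea under some assumptions: redundancy of a block of consecutive layers is scored by the cosine distance between the hidden states entering and leaving it, `wiki_samples` is a hypothetical list of Wikipedia texts, and `device_map="auto"` requires `accelerate`.

```python
# Minimal sketch of a layer-similarity analysis in the spirit of PruneMe
# (not its exact implementation): for every candidate block of n_skip
# consecutive layers, measure how close the hidden states entering the
# block are to those leaving it. Blocks that barely change the
# representation are pruning candidates.
import torch
import torch.nn.functional as F
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "tiiuae/falcon-11B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.bfloat16, device_map="auto"
)
model.eval()

@torch.no_grad()
def block_distances(texts, n_skip):
    """Mean cosine distance between hidden states at layer l and l + n_skip."""
    num_layers = model.config.num_hidden_layers
    sums = torch.zeros(num_layers - n_skip + 1)
    for text in texts:
        inputs = tokenizer(
            text, return_tensors="pt", truncation=True, max_length=512
        ).to(model.device)
        # hidden_states has num_layers + 1 entries (embeddings + each layer).
        hs = model(**inputs, output_hidden_states=True).hidden_states
        for l in range(num_layers - n_skip + 1):
            cos = F.cosine_similarity(hs[l].float(), hs[l + n_skip].float(), dim=-1)
            sums[l] += 1.0 - cos.mean().item()
    return sums / len(texts)  # lowest value => most redundant block [l, l + n_skip)

# Usage: score blocks the size of the span dropped in this card's config,
# averaging over samples from each of the 11 languages.
# dists = block_distances(wiki_samples, n_skip=31)
# print(int(dists.argmin()))  # start layer of the most redundant block
```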
## Direct Use
Research on large language models, and as a foundation for further specialization and finetuning for specific use cases (e.g., summarization, text generation, chatbots).
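A minimal loading sketch with 🤗 Transformers (the repo id below is a placeholder for wherever this merge is hosted; `device_map="auto"` assumes `accelerate` is installed):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-namespace/falcon2-5.5B-sliced"  # placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the dtype in the merge config
    device_map="auto",
)

inputs = tokenizer("La capitale de la France est", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```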
## Out-of-Scope Use
Production use without adequate assessment of risks and mitigation; any use cases which may be considered irresponsible or harmful.
## Bias, Risks, and Limitations
Falcon2-5.5B is trained mostly on English, but also on German, Spanish, French, Italian, Portuguese, Polish, Dutch, Romanian, Czech, and Swedish. It will not generalize appropriately to other languages. Furthermore, as it is trained on large-scale corpora representative of the web, it will carry the stereotypes and biases commonly encountered online.
## Recommendations
We recommend that users of Falcon2-5.5B consider finetuning it for their specific set of tasks of interest, and that guardrails and appropriate precautions be taken for any production use.