Model Details
Model Description
- Developed by: Technology Innovation Institute (https://www.tii.ae)
- Model type: Causal decoder-only; instruct/chat version
- Architecture: Pure transformer; 1.58-bit quantized version
- Language(s) (NLP): Mainly English
- License: TII Falcon License 2.0
Training details
The model was trained following the training strategies from the recent 1-bit LLM Hugging Face blog post and the 1-bit LLM (BitNet b1.58) paper. For more details about the training protocol of this model, please refer to the Falcon-3 technical report, section Compression.
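For intuition, "1.58 bit" means each weight takes one of three values {-1, 0, +1} (log2(3) ≈ 1.58 bits per weight). Below is a minimal sketch of the absmean ternary quantization described in the BitNet b1.58 paper; it is an illustration of the scheme, not TII's actual training code:

```python
import torch

def absmean_ternary_quantize(w: torch.Tensor, eps: float = 1e-5) -> torch.Tensor:
    """Quantize a weight tensor to {-1, 0, +1} using the absmean scheme
    from the BitNet b1.58 paper (illustrative sketch only)."""
    scale = w.abs().mean().clamp(min=eps)   # per-tensor absmean scale
    w_q = (w / scale).round().clamp(-1, 1)  # RoundClip to ternary values
    return w_q * scale                      # rescale for the forward pass

w = torch.randn(4, 4)
print(absmean_ternary_quantize(w))
```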
Usage
Currently, you can run this model using the BitNet library. You can also play with the model using the falcon-1.58bit playground (only for the 7B instruct version).
BitNet
```bash
git clone https://github.com/microsoft/BitNet && cd BitNet
pip install -r requirements.txt
huggingface-cli download tiiuae/Falcon3-3B-Instruct-1.58bit-GGUF ggml-model-i2_s.gguf --local-dir models/Falcon3-3B-1.58bit/
# Build the inference binary for this quantization type
# (setup_env.py ships with the BitNet repo; without this step run_inference.py has nothing to run)
python setup_env.py -md models/Falcon3-3B-1.58bit/ -q i2_s
python run_inference.py -m models/Falcon3-3B-1.58bit/ggml-model-i2_s.gguf -p "You are a helpful assistant" -cnv
```
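Here, -m points to the downloaded GGUF file, -p sets the prompt (treated as the system prompt when chat mode is on), and -cnv enables conversational/chat mode. If you prefer to fetch the weights from Python rather than the CLI, a minimal sketch using huggingface_hub (repo and file names taken from the command above):

```python
from huggingface_hub import hf_hub_download

# Download the 1.58-bit GGUF weights into the directory the BitNet commands expect.
model_path = hf_hub_download(
    repo_id="tiiuae/Falcon3-3B-Instruct-1.58bit-GGUF",
    filename="ggml-model-i2_s.gguf",
    local_dir="models/Falcon3-3B-1.58bit",
)
print(model_path)  # pass this path to run_inference.py via -m
```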
Evaluation
We report in the following table our internal pipeline benchmarks. Note that the evaluation results are normalized scores from the leaderboard v2 tasks, whereas the results reported for the original models in the blog post are raw scores; a sketch of this normalization follows the table.
| Benchmark | Llama3-8B-1.58-100B-tokens | Falcon3-3B-Instruct-1.58bit |
|---|---|---|
| IFEval | 17.91 | 32.52 |
| MUSR | 4.87 | 2.23 |
| GPQA | 6.95 | 5.25 |
| BBH | 5.36 | 5.79 |
| MMLU-PRO | 2.78 | 3.41 |
| MATH | 0.26 | 0.77 |
| Average | 5.5 | 8.61 |
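For reference, "normalized" here plausibly refers to the Open LLM Leaderboard v2 convention of rescaling raw accuracy so that the random-guessing baseline maps to 0. A minimal sketch under that assumption (the formula and the example baseline are assumptions, not taken from this card):

```python
def normalize_v2(raw_score: float, random_baseline: float) -> float:
    """Rescale a 0-100 raw score so the random baseline maps to 0 and 100 to 100.
    Assumed to mirror the Open LLM Leaderboard v2 normalization."""
    return max(0.0, 100.0 * (raw_score - random_baseline) / (100.0 - random_baseline))

# Example: a raw 30% on a 4-way multiple-choice task (25% random baseline)
print(normalize_v2(30.0, 25.0))  # ~6.67
```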
Useful links
- View our release blog post.
- Feel free to join our Discord server if you have any questions or want to interact with our researchers and developers.
Citation
If the Falcon3 family of models was helpful to your work, feel free to cite it:
```bibtex
@misc{Falcon3,
  title  = {The Falcon 3 Family of Open Models},
  author = {Falcon-LLM Team},
  month  = {December},
  year   = {2024}
}
```