# meta-llama/Llama-3.2-3B - 4b_2n4m_128bs Compression
This is a compressed version of meta-llama/Llama-3.2-3B, produced with deltazip.

Resources: Paper, Compression Tool, Inference Engine (coming soon).
## Compression Configuration
- Base Model: meta-llama/Llama-3.2-3B
- Compression Scheme: 4b_2n4m_128bs (per the scheme naming: 4-bit quantization with 2:4 structured sparsity and a block size of 128)
- Dataset: HuggingFaceH4/ultrachat_200k
- Dataset Split: train_sft
- Max Sequence Length: 2048
- Number of Samples: 256
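For reference, a calibration set matching the configuration above can be assembled with the `datasets` library. This is a minimal sketch; the exact sampling and preprocessing used by the deltazip pipeline may differ:

```python
# Minimal sketch of the calibration data described above; the exact
# sampling and preprocessing used by the deltazip pipeline may differ.
from datasets import load_dataset
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("meta-llama/Llama-3.2-3B")

# HuggingFaceH4/ultrachat_200k, split train_sft, 256 samples
dataset = load_dataset("HuggingFaceH4/ultrachat_200k", split="train_sft")
samples = dataset.shuffle(seed=42).select(range(256))

# Flatten each multi-turn conversation and truncate to 2048 tokens
calibration = [
    tokenizer(
        "\n".join(m["content"] for m in row["messages"]),
        truncation=True,
        max_length=2048,
        return_tensors="pt",
    )
    for row in samples
]
```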
## Sample Output

Prompt:

```
<|begin_of_text|><|start_header_id|>system<|end_header_id|>
Cutting Knowledge Date: December 2023
Today Date: 04 Dec 2024
<|eot_id|><|start_header_id|>user<|end_header_id|>
Who is Alan Turing?<|eot_id|>
```
Output:

```
<|begin_of_text|><|start_header_id|>system<|end_header_id|>
Cutting Knowledge Date: December 2023
Today Date: 04 Dec 2024
<|eot_id|><|start_header_id|>user<|end_header_id|>
<|begin_of_text|><|start_header_id|>system<|end_header_id|>
Cutting Knowledge Date: December 2023
Today Date: 04 Dec 2024
<|eot_id|><|start_header_id|>user<|end_header_id|>
Who is Alan Turing?<|eot_id|><|eot_id|><|start_header_id|>assistant<|end_header_id|>
Alan Turing (1912-1954) was a British mathematician, computer scientist, logician, and philosopher. He is widely considered one of the most influential figures in the development of computer science and artificial intelligence.
Turing was born in London, England, and showed exceptional mathematical
```
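If the checkpoint can be loaded directly with `transformers`, the sample above can be reproduced roughly as follows. This is a sketch under that assumption; if the repository only ships deltazip deltas, the deltazip inference tooling is needed instead:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumption: the compressed checkpoint is loadable with vanilla
# transformers; otherwise use the deltazip inference engine.
repo = "eth-easl/meta-llama.Llama-3.2-3B-Instruct.4b_2n4m_128bs"
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo, torch_dtype=torch.bfloat16, device_map="auto"
)

messages = [{"role": "user", "content": "Who is Alan Turing?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(input_ids, max_new_tokens=64)
print(tokenizer.decode(output[0]))
```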
## Evaluation
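No evaluation results are reported here yet. One way to obtain them would be EleutherAI's lm-evaluation-harness; a hypothetical sketch (the task list is illustrative, not taken from this card):

```python
# Hypothetical sketch using lm-evaluation-harness (pip install lm-eval);
# the task selection below is illustrative only.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=eth-easl/meta-llama.Llama-3.2-3B-Instruct.4b_2n4m_128bs",
    tasks=["hellaswag", "arc_easy"],
)
print(results["results"])
```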