Update README.md
README.md CHANGED
@@ -28,6 +28,12 @@ This model is a <b>causal</b> language model for the <b>Italian</b> language, ba

The model has ~6.6B parameters and a vocabulary of 50,335 tokens. It is an instruction-based model, trained with low-rank adaptation, and it is mainly suitable for general-purpose prompt-based tasks involving natural language inputs and outputs.

+<h3>Example</h3>
+
+This is an example of the intended use of the model:
+
+![example](primo_screen.png)
+
<h3>Quantization</h3>

The released checkpoint is quantized to 8-bit so that it can easily be loaded and used for training and inference on ordinary hardware such as consumer GPUs. It requires the <b>transformers</b> library (version >= 4.30.1) and the <b>bitsandbytes</b> library (version >= 0.37.2).
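
As a loose illustration of the loading path the quantization paragraph describes, here is a minimal sketch of 8-bit loading and prompt-based generation through <b>transformers</b> and <b>bitsandbytes</b>. The repository id and the prompt are placeholders for illustration only, not values taken from this model card.

```python
# Minimal loading/inference sketch for an 8-bit quantized causal LM.
# "org/italian-causal-lm-8bit" is a placeholder repo id, not the actual checkpoint name.
# Requires: transformers >= 4.30.1, bitsandbytes >= 0.37.2, and accelerate for device_map.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "org/italian-causal-lm-8bit"  # placeholder

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # place layers automatically on the available GPU(s)/CPU
    load_in_8bit=True,   # load 8-bit weights via bitsandbytes
)

# Prompt-based use: pass an Italian instruction and generate a completion.
prompt = "Riassumi in una frase: la Torre di Pisa è famosa per la sua pendenza."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

Note that `device_map="auto"` also needs the <b>accelerate</b> package installed alongside the two libraries named above.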