Upload folder using huggingface_hub
- .gitattributes +5 -0
- README.md +44 -0
- zephyr-speakleash-007-pl-8192-32-16-0.05.Q3_K_S.gguf +3 -0
- zephyr-speakleash-007-pl-8192-32-16-0.05.Q4_K_M.gguf +3 -0
- zephyr-speakleash-007-pl-8192-32-16-0.05.Q5_K_M.gguf +3 -0
- zephyr-speakleash-007-pl-8192-32-16-0.05.Q6_K.gguf +3 -0
- zephyr-speakleash-007-pl-8192-32-16-0.05.Q8_0.gguf +3 -0
- zephyr-speakleash-007-pl-8192-32-16-0.05.fp16.bin +3 -0
.gitattributes
CHANGED
@@ -33,3 +33,8 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+zephyr-speakleash-007-pl-8192-32-16-0.05.Q3_K_S.gguf filter=lfs diff=lfs merge=lfs -text
+zephyr-speakleash-007-pl-8192-32-16-0.05.Q4_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+zephyr-speakleash-007-pl-8192-32-16-0.05.Q5_K_M.gguf filter=lfs diff=lfs merge=lfs -text
+zephyr-speakleash-007-pl-8192-32-16-0.05.Q6_K.gguf filter=lfs diff=lfs merge=lfs -text
+zephyr-speakleash-007-pl-8192-32-16-0.05.Q8_0.gguf filter=lfs diff=lfs merge=lfs -text
README.md
ADDED
@@ -0,0 +1,44 @@
---
license: openrail
pipeline_tag: text-generation
library_name: transformers
language:
- zh
- en
---

## Original model card

Buy me a coffee if you like this project ;)
<a href="https://www.buymeacoffee.com/s3nh"><img src="https://www.buymeacoffee.com/assets/img/guidelines/download-assets-sm-1.svg" alt=""></a>

#### Description

GGUF Format model files for [This project](https://huggingface.co/Nondzu/zephyr-speakleash-007-pl-8192-32-16-0.05).
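For reference, here is a minimal sketch of fetching one of the quantized files with `huggingface_hub`. The `repo_id` below is an assumed placeholder for this GGUF repository (it is not stated on the card); the filename matches the Q4_K_M file listed above.

```python
# Sketch: download a single quantized GGUF file from the Hub.
# NOTE: repo_id is an assumed placeholder for this repository, not a confirmed id.
from huggingface_hub import hf_hub_download

model_path = hf_hub_download(
    repo_id="s3nh/zephyr-speakleash-007-pl-8192-32-16-0.05-GGUF",  # placeholder
    filename="zephyr-speakleash-007-pl-8192-32-16-0.05.Q4_K_M.gguf",
)
print(model_path)  # local path to the downloaded ~4.4 GB file
```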
### GGUF Specs

GGUF is a format based on the existing GGJT, but makes a few changes to the format to make it more extensible and easier to use. The following features are desired:

- Single-file deployment: models can be easily distributed and loaded, and do not require any external files for additional information.
- Extensible: new features can be added to GGML-based executors, and new information can be added to GGUF models, without breaking compatibility with existing models.
- mmap compatibility: models can be loaded using mmap for fast loading and saving.
- Easy to use: models can be easily loaded and saved using a small amount of code, with no need for external libraries, regardless of the language used.
- Full information: all information needed to load a model is contained in the model file, and no additional information needs to be provided by the user.

The key difference between GGJT and GGUF is the use of a key-value structure for the hyperparameters (now referred to as metadata), rather than a list of untyped values. This allows new metadata to be added without breaking compatibility with existing models, and allows the model to be annotated with additional information that may be useful for inference or for identifying the model.
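To make the single-file, typed-metadata point concrete, here is a small sketch that reads only the fixed-size GGUF header. It assumes the GGUF v2/v3 header layout (little-endian: 4-byte magic `GGUF`, uint32 version, uint64 tensor count, uint64 metadata key-value count) and is illustrative rather than a full parser.

```python
# Sketch: peek at the fixed GGUF header (assumes the GGUF v2/v3 layout).
import struct

def read_gguf_header(path: str) -> dict:
    with open(path, "rb") as f:
        magic = f.read(4)
        if magic != b"GGUF":
            raise ValueError(f"not a GGUF file: magic={magic!r}")
        # uint32 version, uint64 tensor count, uint64 metadata key-value count
        version, n_tensors, n_kv = struct.unpack("<IQQ", f.read(20))
    return {"version": version, "tensors": n_tensors, "metadata_kv": n_kv}

print(read_gguf_header("zephyr-speakleash-007-pl-8192-32-16-0.05.Q4_K_M.gguf"))
```

The typed key-value metadata entries follow this header inside the same file, which is what allows new keys to be added without breaking readers that simply ignore them.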
### Inference

User: Tell me story about what is an quantization and what do we need to build.

I have a little bit of experience with the term “quantization” from physics, but not much. When I hear it, the first thing that comes to mind is some kind of measuring instrument, like a ruler or voltmeter. What does the phrase “quantized by 1024” mean? It sounds more mathematical than physical.

The term quantization comes from quantum mechanics and refers to a process whereby a continuous function is approximated by discrete values, that is, it is “discretized”. In this sense, we can say that the “quanta” are the differences between adjacent
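The exchange above can be reproduced locally with any GGUF-capable runtime. Below is a minimal sketch using `llama-cpp-python`; the prompt formatting and sampling settings are assumptions for illustration, not values published with this model.

```python
# Sketch: run one of the quantized files with llama-cpp-python.
# Prompt format and sampling parameters are illustrative assumptions.
from llama_cpp import Llama

llm = Llama(
    model_path="zephyr-speakleash-007-pl-8192-32-16-0.05.Q4_K_M.gguf",
    n_ctx=8192,  # matches the context length suggested by the model name
)

out = llm(
    "User: Tell me story about what is an quantization and what do we need to build.\n",
    max_tokens=256,
    temperature=0.7,
)
print(out["choices"][0]["text"])
```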
# Original model card
zephyr-speakleash-007-pl-8192-32-16-0.05.Q3_K_S.gguf
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:2d1bc6e8015ebacc38e47e6a03a96bd66d1cbdef4994c2b155c0883f67cd6842
size 3164567744

zephyr-speakleash-007-pl-8192-32-16-0.05.Q4_K_M.gguf
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:6e1ad13fb5668d35cd7c2a6e5890f448e58c49b26015ea65fd3185f010607d52
size 4368439488

zephyr-speakleash-007-pl-8192-32-16-0.05.Q5_K_M.gguf
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:ce64931fb877b5f94670d3d47ac1a6ad8ff1b00e10886aa512b4f5182d84a505
size 5131409600

zephyr-speakleash-007-pl-8192-32-16-0.05.Q6_K.gguf
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:2808e9f618bf68751b19926b03e0684b1715f9b13886a545eb1afbb6d545aa1c
size 5942065344

zephyr-speakleash-007-pl-8192-32-16-0.05.Q8_0.gguf
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:3e1442fa84612c9c8239753fd5d5a5252f57ffe9fdd9ce435d205a79e45f1400
size 7695857856

zephyr-speakleash-007-pl-8192-32-16-0.05.fp16.bin
ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:6b509d7153a7a53d78d217fde7d317e446d7287ac0d3601924fb8587e1ca24d7
size 14484732032