Nexesenex committed
Commit a9fc5ee
1 Parent(s): 0820c25

Update README.md

Files changed (1)
  1. README.md +1 -1
README.md CHANGED
@@ -1,6 +1,6 @@
 Requantizations of a Q5_K_M quant of a trending 70b model for which no better quant or FP16 is available, done through a Q8_0 intermediary step.
 
-Q3_K_M, Q2_K_S, and IQ3_XXS already available. IQ2_XS and Q3_K_S on the way. Miqudev provided Q5_K_M, Q4_K_M, and Q2_K.
+Q3_K_M, Q3_K_S, Q3_K_XS, Q2_K_S, IQ3_XXS, and IQ2_XS are available. Miqudev provided Q5_K_M, Q4_K_M, and Q2_K from his probable FP16.
 
 Miku 70b has a RoPE theta of 1,000,000, like CodeLlama, and not 10,000, like Llama 2 models usually have.
 To my knowledge, that sets it apart from all other Llama 2 models, except the CodeLlamas, which also have a theta of 1,000,000.
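
For reference, the two-step flow mentioned in the README's first line (Q5_K_M → Q8_0 intermediary → smaller quants) could be reproduced roughly as in the sketch below, assuming a local build of llama.cpp whose `quantize` tool accepts `--allow-requantize`; the file names and paths are placeholders, not necessarily the ones used for this repo.

```python
# Sketch of the Q5_K_M -> Q8_0 -> smaller-quants flow, assuming llama.cpp's
# `quantize` binary and its `--allow-requantize` flag; all names are placeholders.
import subprocess

QUANTIZE = "./quantize"                # assumed path to llama.cpp's quantize tool
SOURCE = "miqu-1-70b.q5_K_M.gguf"      # the Q5_K_M source quant (placeholder name)
INTERMEDIARY = "miqu-1-70b.q8_0.gguf"  # near-lossless Q8_0 intermediary

# Step 1: requantize the Q5_K_M source up to a Q8_0 intermediary.
subprocess.run([QUANTIZE, "--allow-requantize", SOURCE, INTERMEDIARY, "Q8_0"], check=True)

# Step 2: derive the smaller quants from the Q8_0 intermediary.
for target in ("Q3_K_M", "Q3_K_S", "Q3_K_XS", "IQ3_XXS"):
    output = f"miqu-1-70b.{target.lower()}.gguf"
    subprocess.run([QUANTIZE, "--allow-requantize", INTERMEDIARY, output, target], check=True)

# Q2_K_S and IQ2_XS generally also need an importance matrix (--imatrix <file>),
# which is not shown here.
```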
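The theta mentioned in the README's last two lines is the RoPE frequency base recorded in the GGUF metadata. Below is a minimal sketch for checking it, assuming the `gguf` Python package that ships with llama.cpp and the `llama.rope.freq_base` key used by llama-architecture models; the file name is a placeholder.

```python
# Sketch for checking the RoPE theta (frequency base) stored in a GGUF file,
# assuming the `gguf` Python package from llama.cpp; the file name is a placeholder.
from gguf import GGUFReader

reader = GGUFReader("miqu-1-70b.q5_k_m.gguf")

# Llama-architecture GGUFs are expected to store theta under this key (assumption).
field = reader.fields.get("llama.rope.freq_base")
if field is not None:
    # Scalar metadata values live in the last part of the field's raw data.
    print("rope freq_base:", float(field.parts[-1][0]))  # ~1,000,000 here vs ~10,000 for plain Llama 2
else:
    print("llama.rope.freq_base not set; the loader's default would apply")
```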