
GPTQ 4-bit Quantization with 128 Group Size and Act Order

Introduction

This model is a variant of the original parrot_en_es model, quantized with GPTQ to 4 bits using a group size of 128 and act-order quantization. It serves as an experiment to study the effects of quantization on language models.
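To make the quantization scheme concrete, here is a minimal, self-contained sketch of the two ideas named above: group-wise 4-bit quantization (one shared scale per group of 128 weights) and the act-order heuristic (quantizing columns in order of decreasing activation magnitude). This is an illustrative toy, not the GPTQ algorithm itself (which also applies Hessian-based error compensation), and all function names are hypothetical.

```python
# Toy sketch of group-wise 4-bit quantization with act ordering.
# Illustrative only; real GPTQ additionally corrects quantization error
# using second-order (Hessian) information.

def quantize_group(weights, bits=4):
    """Quantize one group of weights to signed ints with a shared scale."""
    qmax = 2 ** (bits - 1) - 1                 # 7 for 4-bit signed
    scale = max(abs(w) for w in weights) / qmax or 1.0
    q = [max(-qmax - 1, min(qmax, round(w / scale))) for w in weights]
    return q, scale

def quantize_act_order(weights, activations, group_size=128, bits=4):
    """Quantize `weights` in groups of `group_size`, visiting positions in
    decreasing activation magnitude (the "act order" / desc_act heuristic),
    so the most important weights are quantized first."""
    order = sorted(range(len(weights)), key=lambda i: -abs(activations[i]))
    dequant = [0.0] * len(weights)
    for start in range(0, len(order), group_size):
        idx = order[start:start + group_size]
        q, scale = quantize_group([weights[i] for i in idx], bits)
        for i, qv in zip(idx, q):
            dequant[i] = qv * scale            # reconstruct to inspect error
    return dequant
```

Even in this toy setting, the per-weight reconstruction error is bounded by half the group's scale, which shows why a smaller group size (more scales) trades model size for accuracy.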

Warning: Catastrophic Loss Through Quantization

Please be advised that the quantization process has resulted in a catastrophic loss of functionality. The model can no longer act as a translator: it struggles with context understanding and exhibits hallucinatory behavior.

Original Model

For detailed usage instructions and prompt formatting, please refer to the original model card.

Usage

Given the current limitations, this quantized model is not recommended for translation tasks. However, it may still be of interest for research purposes, specifically in studying the effects of quantization on neural networks.

Contributing

If you have insights or suggestions on how to improve the model or mitigate the issues caused by quantization, please feel free to open an issue or submit a pull request.
