---
license: apache-2.0
datasets:
- lambada
language:
- en
library_name: transformers
pipeline_tag: text-generation
tags:
- text-generation-inference
- causal-lm
- int8
- PyTorch
- PostTrainingStatic
- Intel® Neural Compressor
- neural-compressor
---

# INT8 GPT-J 6B

GPT-J 6B is a transformer model trained using Ben Wang's [Mesh Transformer JAX](https://github.com/kingoflolz/mesh-transformer-jax/). "GPT-J" refers to the class of model, while "6B" represents the number of trainable parameters.

This INT8 PyTorch model is generated by [Intel® Neural Compressor](https://github.com/intel/neural-compressor).
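
Below is a minimal sketch of loading and running this INT8 checkpoint, assuming the `optimum-intel` integration for Intel® Neural Compressor is installed (e.g. `pip install optimum[neural-compressor]`) and that the checkpoint is hosted on the Hugging Face Hub. The repo id used here is a hypothetical placeholder; substitute the actual id of this model.

```python
from transformers import AutoTokenizer
from optimum.intel import INCModelForCausalLM

# Hypothetical Hub repo id; replace with this model's actual id.
model_id = "Intel/gpt-j-6B-int8"

# The tokenizer is the standard GPT-J tokenizer; load it from this repo
# if bundled, or fall back to EleutherAI/gpt-j-6b otherwise.
tokenizer = AutoTokenizer.from_pretrained(model_id)

# INCModelForCausalLM restores a Neural Compressor-quantized
# causal-LM checkpoint for inference.
model = INCModelForCausalLM.from_pretrained(model_id)

# Run a short generation to verify the quantized model loads and runs.
inputs = tokenizer("Once upon a time,", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```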