Update README.md #2
by agoncharenko1992 - opened

README.md CHANGED
@@ -18,7 +18,7 @@ tags:
 
 GPT-J 6B is a transformer model trained using Ben Wang's [Mesh Transformer JAX](https://github.com/kingoflolz/mesh-transformer-jax/). "GPT-J" refers to the class of model, while "6B" represents the number of trainable parameters.
 
-This repository contains TensorRT engines with mixed precision int8 + fp32. You can find prebuilt engines for
+This repository contains TensorRT engines with mixed precision int8 + fp32. You can find prebuilt engines for the following GPUs:
 * RTX 4090
 * RTX 3080 Ti
 * RTX 2080 Ti
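
For context, a minimal sketch of how one of these prebuilt engines could be loaded with TensorRT's standard Python API; the engine filename below is a placeholder for illustration, not a file name confirmed by this repository.

```python
import tensorrt as trt

# Placeholder filename; pick the prebuilt engine matching your GPU (e.g. RTX 4090).
ENGINE_PATH = "gptj6b-int8-fp32-rtx4090.engine"

# Create a logger and runtime, then deserialize the prebuilt engine from disk.
TRT_LOGGER = trt.Logger(trt.Logger.WARNING)
runtime = trt.Runtime(TRT_LOGGER)

with open(ENGINE_PATH, "rb") as f:
    engine = runtime.deserialize_cuda_engine(f.read())

# An execution context holds the per-inference state needed to run the engine.
context = engine.create_execution_context()
```

Note that a TensorRT engine is built for a specific GPU architecture and TensorRT version, which is why separate engines are provided per card.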