Thank you so much!
#1 by itnomad - opened
This is the first text-generation model that isn't utterly dumb but also works with my setup (Radeon Instinct MI25 with ROCm & a CUDA wrapper). All the other models either don't really work with transformers or rely on libraries such as xformers, which also need native NVIDIA CUDA. Another challenge is that my MI25 only has 16 GB of VRAM, so 7B is the largest model that works at all...
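A quick back-of-the-envelope check on why 7B is roughly the ceiling for a 16 GB card (a sketch assuming fp16 weights and a round 7B parameter count; KV cache and activations need extra headroom on top):

```python
# Rough fp16 memory estimate for a 7B-parameter model (weights only;
# the KV cache and activations need additional VRAM beyond this).
params = 7_000_000_000      # assumed round 7B parameter count
bytes_per_param = 2         # fp16 = 2 bytes per parameter
weights_gib = params * bytes_per_param / 1024**3

print(f"{weights_gib:.1f} GiB")  # ~13.0 GiB, leaving only a few GiB of the MI25's 16 GiB free
```

A 13B model at the same precision would already need ~24 GiB for the weights alone, which is why 7B is the practical limit here.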
I'm now gonna look at your Patreon; looking forward to other models that fit into my GPU :-)
All the best from DE,
Alex.