Will this run on a 128GB RAM system (i9-13900K) with an RTX 4090?
#2 by clevnumb
I have 64GB RAM total, but will add more RAM if that's needed.
I'm wondering if a 2x3090 setup can use this with SLI.
One would have to repartition the model, I guess. Can that be done with these weights, after they have been quantized?
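For what it's worth, SLI isn't involved: frameworks such as Hugging Face transformers/accelerate simply shard the layers across whatever GPUs are visible. A minimal sketch of that approach, assuming the quantized weights load through transformers with `device_map` (the repo id and the memory caps below are placeholders, not this model's actual values):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id -- substitute the actual quantized checkpoint.
model_id = "some-org/llama-65b-4bit"

tokenizer = AutoTokenizer.from_pretrained(model_id)

# device_map="auto" lets accelerate split layers across both 3090s;
# max_memory caps each device so activations and KV-cache still fit.
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
    max_memory={0: "22GiB", 1: "22GiB", "cpu": "60GiB"},
)

prompt = "Hello, world"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda:0")
output = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Anything spilled to the `cpu` budget runs much slower, so whether this is usable depends on how much of the model actually fits in VRAM.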
There are claims that people can run it, albeit without being able to use the full context.
I doubt it, at least not on 65B.
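Rough arithmetic backs that doubt up; a back-of-envelope sketch (the bits-per-weight and KV-cache figures are assumptions, not measurements):

```python
# Back-of-envelope VRAM estimate for 65B at ~4-bit quantization.
params_b = 65           # billions of parameters
bits_per_weight = 4.5   # assumed: 4-bit weights plus group-wise scales/zeros

weights_gb = params_b * bits_per_weight / 8   # ~36.6 GB just for the weights
kv_cache_gb = 5                               # assumed: ~2k context, fp16 KV-cache
total_gb = weights_gb + kv_cache_gb

print(f"weights ~ {weights_gb:.1f} GB, total ~ {total_gb:.1f} GB")
# That exceeds a single 24 GB card, hence splitting across two 3090s,
# shrinking the context, or offloading part of the model to system RAM.
```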