ritvik77 
posted an update about 19 hours ago
Big companies are now training huge AI models with tons of data and billions of parameters, and the future seems to be about quantization: shrinking those models by representing their weights with lower-precision numbers, for example going from 32-bit floats to 8-bit integers, while losing almost no accuracy (on the order of +/- 0.01%). There should be a standard unit of measurement for the ratio of model size reduction to accuracy lost.
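
To make the idea concrete, here is a minimal Python sketch, assuming simple symmetric per-tensor int8 quantization with NumPy; the "size reduction to error" ratio at the end is just a hypothetical metric for the kind of standard measure the post is asking about, not an established one:

```python
import numpy as np

def quantize_int8(weights: np.ndarray) -> tuple[np.ndarray, float]:
    """Symmetric per-tensor quantization of float32 weights to int8."""
    scale = np.abs(weights).max() / 127.0
    q = np.clip(np.round(weights / scale), -127, 127).astype(np.int8)
    return q, scale

def dequantize(q: np.ndarray, scale: float) -> np.ndarray:
    """Map int8 codes back to approximate float32 values."""
    return q.astype(np.float32) * scale

# Toy example: one float32 weight tensor
w = np.random.randn(1024, 1024).astype(np.float32)
q, scale = quantize_int8(w)
w_hat = dequantize(q, scale)

size_reduction = w.nbytes / q.nbytes         # 4x for fp32 -> int8
mean_abs_error = np.mean(np.abs(w - w_hat))  # crude proxy for "accuracy lost"

# Hypothetical "compression-to-degradation" ratio
efficiency = size_reduction / (mean_abs_error + 1e-12)
print(f"size reduction: {size_reduction:.1f}x, "
      f"mean abs error: {mean_abs_error:.5f}, ratio: {efficiency:.1f}")
```

In practice the "accuracy lost" term would be measured on a real benchmark (e.g. perplexity or task accuracy before vs. after quantization) rather than raw weight error, but the shape of the metric is the same.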

What do you all think about this?

In the near future, yes, but there is also a change coming soon that is going to make the wind blow in quite a different direction.