its5Q posted an update 3 days ago
Am I missing something, or is there still no way to filter by model size when searching for models? It has been a requested feature since 2022, but I haven't seen any updates since! With the number of different models coming out, I think a size filter would be a great extension of the search functionality, especially when looking for smaller models, which are a lot less prevalent.

You see, model sizes like 0.5B, 1.5B, and 2.5-3.5B are all on the smaller side, so that is how I search: I go through the model list and filter it by those numbers in the name. Then you can look at the files to see the sizes; if you see something like 8 GB, it may fit even into a 4 GB GPU once quantized, and so on. You have to experiment to find which one fits you nicely.
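If you prefer to script this instead of clicking through the UI, the `huggingface_hub` library can do roughly the same thing: search by a size string in the model name, then check the actual file sizes. A minimal sketch; the search term, the example repo id, and the "sum all files" heuristic are my assumptions, and it ignores that a repo may ship several quantizations:

```python
from huggingface_hub import HfApi

api = HfApi()

# crude size filter: put the parameter count directly in the search query
for m in api.list_models(search="1.5B instruct", sort="downloads", limit=10):
    print(m.id)

# then check how big the files actually are (example repo id, adjust as needed)
info = api.model_info("Qwen/Qwen2.5-1.5B-Instruct", files_metadata=True)
total_gb = sum((s.size or 0) for s in info.siblings) / 1e9
print(f"~{total_gb:.1f} GB of files in the repo")
```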

My experience so far is that Phi-3.5-mini works very well as a Q3_K_M quant kept entirely on the GPU, and it serves my local needs nicely.
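For reference, here is a rough sketch of loading a Q3_K_M GGUF with llama-cpp-python and offloading layers to a small GPU; the file path and context size are placeholders, and you may need to lower `n_gpu_layers` if a 4 GB card runs out of memory:

```python
from llama_cpp import Llama

# load the Q3_K_M quant and offload layers to the GPU;
# reduce n_gpu_layers if you hit out-of-memory errors on a 4 GB card
llm = Llama(
    model_path="Phi-3.5-mini-instruct-Q3_K_M.gguf",  # placeholder path
    n_gpu_layers=-1,  # -1 tries to offload every layer
    n_ctx=4096,
)

out = llm("Summarize this page in two sentences:\n<page text>", max_tokens=128)
print(out["choices"][0]["text"])
```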

And I have thousands of website pages that get improved, summarized, turned into speech, and tagged with generated keywords for hyperlinking between pages and making them relevant to each other, including across domains. The local agent is written in GNU Emacs Lisp and runs while I, as the user, am idle, doing those tasks. I find it a great tool for sales and marketing.
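The actual agent is in GNU Emacs Lisp, so this is not that code, just a minimal Python sketch of the idle-time batch idea: wait until the user has been away for a while, then work through a queue of pages against the local model. Both helpers here are hypothetical stand-ins:

```python
import time

IDLE_THRESHOLD = 300  # seconds of no user input before the agent starts working

def system_idle_seconds() -> float:
    # hypothetical stand-in; a real agent would query the OS or window manager
    return 600.0

def summarize(text: str) -> str:
    # hypothetical stand-in; a real agent would call the local model here
    return text[:200]

def run_agent(pages: list[str]) -> None:
    for page in pages:
        # only burn GPU time while the user is away
        while system_idle_seconds() < IDLE_THRESHOLD:
            time.sleep(30)
        print(summarize(page))

run_agent(["<page one text>", "<page two text>"])
```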

And I have faced the same problem as you because of my low-end hardware: 16 GB RAM and a 4 GB Nvidia GTX 1050 Ti. Though I love that it all runs locally.
