8_0 Model and thank you so much for your work!!!

#2
by deleted - opened

Firstly, I just wanted to show my appreciation for your models! I think it's safe to say you make the best models available, and we all appreciate your efforts in bringing us closer to GPT-4 at home!

As for the 8_0 model, would it be possible to upload it to a different site that allows files larger than 50 GB? It would be interesting to test it against other models and the 5_1 quant.

Thank you!

deleted changed discussion status to closed
deleted changed discussion status to open

Who deletes their HF account!? :)

Anyway if you ever see this message: yes OK I will give that some thought next time I do a 65B. I could put it on Google Drive I guess.

GGML used to work with multiple files. I think the transition to a single file was due to the use of mmap. There shouldn't be any technical limitation to doing multiple mmaps instead of just one, though it would complicate the code a lot.
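For what it's worth, the multiple-mmap idea is easy to sketch. The file names and layout below are made up for illustration, not how llama.cpp actually does it; the point is just that a loader can hold one mapping per part file instead of a single mapping over one big file:

```python
import mmap
import os
import tempfile

# Fabricate two small "part" files standing in for model shards.
tmpdir = tempfile.mkdtemp()
paths = []
for i, payload in enumerate([b"part0-weights", b"part1-weights"]):
    path = os.path.join(tmpdir, f"model.bin.{i}")
    with open(path, "wb") as f:
        f.write(payload)
    paths.append(path)

# One read-only mmap per part file.
mappings = []
for path in paths:
    f = open(path, "rb")
    mappings.append(mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ))

# A tensor's location then becomes a (part_index, byte_offset) pair
# instead of a single offset into one mapping.
print(mappings[0][:5].decode())  # part0
print(mappings[1][:5].decode())  # part1
```

The extra bookkeeping (which part a given tensor lives in) is presumably the "complicates the code a lot" part.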

Yeah. Maybe it still does work with multiple files? It says "n_part" in the header info.

But the issue is I have no idea how to make multi-part GGMLs. I use the provided convert.py to make GGMLs, and it currently only supports producing single-part files.

It might not be too hard to update convert.py to make multi-part GGMLs, but I've not looked at it myself yet.
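For anyone curious what's actually at the front of a GGML file, here's a minimal header peek. The magic constants are the ones llama.cpp has used historically ('ggml', 'ggmf', 'ggjt'); treat the exact layout as an assumption rather than a spec, and note that n_parts may be derived by the loader rather than stored verbatim:

```python
import os
import struct
import tempfile

# Known historical llama.cpp file magics (assumption, not a spec).
MAGICS = {
    0x67676D6C: "ggml (unversioned)",
    0x67676D66: "ggmf (versioned)",
    0x67676A74: "ggjt (versioned, mmap-based loading)",
}

def peek_header(path):
    """Read the leading magic word, plus a version word if the format has one."""
    with open(path, "rb") as f:
        (magic,) = struct.unpack("<I", f.read(4))
        name = MAGICS.get(magic, "unknown")
        version = None
        if name.startswith(("ggmf", "ggjt")):
            (version,) = struct.unpack("<I", f.read(4))
        return name, version

# Demo on a fabricated file carrying the 'ggjt' magic and version 1.
demo = os.path.join(tempfile.mkdtemp(), "fake.ggml")
with open(demo, "wb") as f:
    f.write(struct.pack("<II", 0x67676A74, 1))
print(peek_header(demo))  # ('ggjt (versioned, mmap-based loading)', 1)
```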

I think you can split files with compression programs like 7zip. Have you considered that?

Yes, I have considered exactly that. I was talking with the llama.cpp team recently to see if there was any way to do it natively in llama.cpp. It may be possible, but the consensus was just to use a multi-part ZIP.

So I'll look into doing that soon when I have more time.
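The multi-part-archive approach boils down to a lossless byte-level split and rejoin, which is easy to sketch (the `.partN` naming here is just an illustration):

```python
import os
import tempfile

def split_file(path, chunk_bytes):
    """Split `path` into path.part0, path.part1, ... of at most chunk_bytes each."""
    parts = []
    with open(path, "rb") as src:
        i = 0
        while True:
            chunk = src.read(chunk_bytes)
            if not chunk:
                break
            part_path = f"{path}.part{i}"
            with open(part_path, "wb") as dst:
                dst.write(chunk)
            parts.append(part_path)
            i += 1
    return parts

def join_files(parts, out_path):
    """Concatenate the parts back into a single file, in order."""
    with open(out_path, "wb") as dst:
        for part_path in parts:
            with open(part_path, "rb") as src:
                dst.write(src.read())

# Round-trip demo with a small dummy "model".
workdir = tempfile.mkdtemp()
original = os.path.join(workdir, "model.ggml")
with open(original, "wb") as f:
    f.write(os.urandom(100_000))

parts = split_file(original, chunk_bytes=30_000)  # 30k, 30k, 30k, 10k
rejoined = original + ".rejoined"
join_files(parts, rejoined)

with open(original, "rb") as a, open(rejoined, "rb") as b:
    assert a.read() == b.read()
print(len(parts))  # 4
```

In practice something like `7z a -v48g split.7z model.ggml` (7-Zip's volume-size switch `-v`) produces the same kind of split with compression on top, and stays under a 50 GB per-file limit.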
