J.O.S.I.E.v4o

This will be the repository for J.O.S.I.E.v4o.

Like OpenAI's GPT-4o, it is natively multimodal. It builds on the NExT-GPT architecture combined with RoPE (rotary positional embeddings), RMS normalization, and a Mixture of Experts (MoE), paired with OpenAI's GPT-4o tokenizer. This is a future project and will take its time.
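For readers unfamiliar with these building blocks, here is a minimal, self-contained PyTorch sketch of what RMS normalization, RoPE, and a top-k MoE feed-forward layer look like in isolation. This is purely illustrative and is not the J.O.S.I.E.v4o implementation; all class names, dimensions, and hyperparameters below are hypothetical.

```python
import torch
import torch.nn as nn


class RMSNorm(nn.Module):
    """Root-mean-square layer normalization: scale by the RMS of the features."""

    def __init__(self, dim: int, eps: float = 1e-6):
        super().__init__()
        self.eps = eps
        self.weight = nn.Parameter(torch.ones(dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Normalize by the RMS over the last dimension, then apply a learned gain.
        rms = x.pow(2).mean(dim=-1, keepdim=True).add(self.eps).rsqrt()
        return x * rms * self.weight


def apply_rope(x: torch.Tensor, base: float = 10000.0) -> torch.Tensor:
    """Apply rotary positional embeddings to a (batch, seq, dim) tensor
    (half-split variant; dim must be even)."""
    _, seq_len, dim = x.shape
    half = dim // 2
    freqs = base ** (-torch.arange(0, half, dtype=torch.float32) / half)
    angles = torch.arange(seq_len, dtype=torch.float32)[:, None] * freqs[None, :]
    cos, sin = angles.cos(), angles.sin()
    x1, x2 = x[..., :half], x[..., half:]
    # Rotate each feature pair by a position-dependent angle.
    return torch.cat([x1 * cos - x2 * sin, x1 * sin + x2 * cos], dim=-1)


class TopKMoE(nn.Module):
    """Tiny top-k Mixture-of-Experts feed-forward layer (illustrative only)."""

    def __init__(self, dim: int, n_experts: int = 8, k: int = 2):
        super().__init__()
        self.k = k
        self.router = nn.Linear(dim, n_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.SiLU(), nn.Linear(4 * dim, dim))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Route each token to its k highest-scoring experts and mix the outputs.
        scores = self.router(x)                       # (batch, seq, n_experts)
        weights, idx = scores.topk(self.k, dim=-1)    # (batch, seq, k)
        weights = weights.softmax(dim=-1)
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = idx[..., slot] == e            # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[..., slot][mask, None] * expert(x[mask])
        return out
```

The GPT-4o tokenizer itself is publicly available through OpenAI's tiktoken library as the `o200k_base` encoding:

```python
import tiktoken

# "o200k_base" is the encoding tiktoken associates with GPT-4o.
enc = tiktoken.get_encoding("o200k_base")
tokens = enc.encode("Hello from J.O.S.I.E.!")
print(tokens)               # token ids
print(enc.decode(tokens))   # round-trips back to the original string
```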

I will also build a UI application around the model.

Further updates coming soon!

Source code and more information will be available in my GitHub repo.
