MGM-34B Model Card

Model details

The framework supports a series of dense and MoE large language models (LLMs) from 2B to 34B that handle HD image understanding, reasoning, and generation simultaneously. You can also try our other MGM series models:

Normal resolution setting: MGM-2B, MGM-7B, MGM-13B, MGM-8x7B

High resolution setting: MGM-7B-HD, MGM-13B-HD, MGM-8x7B-HD, MGM-34B-HD

Model type: MGM is an open-source chatbot trained by fine-tuning Nous-Hermes-2-Yi-34B on GPT-generated multimodal instruction-following data.

It empowers existing frameworks to support HD image understanding, reasoning, and generation simultaneously.

Model version: MGM with LLM Nous-Hermes-2-Yi-34B

Model date: MGM-34B was trained on 03/2024.

License

Nous-Hermes-2-Yi-34B is licensed under the Apache-2.0 license.

Where to send questions or comments about the model: https://github.com/dvlab-research/MGM/issues

Intended use

Primary intended uses: The primary use of MGM is research on large multimodal models and chatbots.

Primary intended users: The primary intended users of the model are researchers and hobbyists in computer vision, natural language processing, machine learning, and artificial intelligence.

Training data

This model is trained on the MGM-Instruction dataset; please refer to the GitHub repository for more details.

Acknowledgement

This project is not affiliated with Google LLC.
