Update @ 2024.03.13
This model is a DPO (Direct Preference Optimization) fine-tuned version of liminerity/M7-7b.
Model Developers: Chihoon Lee (chlee10), T3Q