---
license: apache-2.0
language:
  - en
tags:
  - code
  - QA
  - reasoning
library_name: transformers
---

# Model Card for Model ID

## Model Details

### Model Description

A powerful MoE 4x7b model (a Mixtral-style mixture of Mistral models) built for greater accuracy and precision in general reasoning, QA, and code. It combines the following four expert models:

- HuggingFaceH4/zephyr-7b-beta
- mistralai/Mistral-7B-Instruct-v0.2
- teknium/OpenHermes-2.5-Mistral-7B
- Intel/neural-chat-7b-v3-3
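As a hedged illustration only: a Mixtral-style 4x7b MoE combining these four experts could be assembled with a `mergekit-moe` config along the lines below. The card does not state how the model was actually built, so the tool choice, base model, gate mode, and positive prompts here are all assumptions.

```yaml
# Hypothetical mergekit-moe config; not the card's actual build recipe.
base_model: mistralai/Mistral-7B-Instruct-v0.2
gate_mode: hidden          # route tokens using hidden-state similarity to the prompts
dtype: bfloat16
experts:
  - source_model: HuggingFaceH4/zephyr-7b-beta
    positive_prompts: ["general reasoning"]
  - source_model: mistralai/Mistral-7B-Instruct-v0.2
    positive_prompts: ["instruction following", "QA"]
  - source_model: teknium/OpenHermes-2.5-Mistral-7B
    positive_prompts: ["code"]
  - source_model: Intel/neural-chat-7b-v3-3
    positive_prompts: ["chat"]
```

With a config like this, `mergekit-moe config.yml ./output-model` would produce a 4-expert Mixtral-architecture checkpoint loadable via `transformers`.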

- **Developed by:** NEXT AI
- **Funded by:** Zpay Labs Pvt Ltd.
- **Model type:** Mixtral of Mistral 4x7b
- **Language(s) (NLP):** Code-Reasoning-QA

## Model Sources

- **Demo:** https://nextai.co.in