---
license: apache-2.0
tags:
  - snowflake
  - arctic
  - moe
---

## Model Details

Arctic is a dense-MoE hybrid transformer architecture pre-trained from scratch by the Snowflake AI Research Team. We are releasing model checkpoints for both the base and instruct-tuned versions of Arctic under an Apache-2.0 license, which means you can use them freely in your own research, prototypes, and products. Please see our blog post, Snowflake Arctic: The Best LLM for Enterprise AI — Efficiently Intelligent, Truly Open, for more information on Arctic and links to other relevant resources, such as our series of cookbooks covering how to train your own custom MoE models, how to produce high-quality training data, and much more.

For the latest details about Snowflake Arctic, including tutorials, please refer to our GitHub repo.

**Model developers** Snowflake AI Research Team

**License** Apache-2.0

**Input** Models input text only.

**Output** Models generate text and code only.

**Model Release Date** April 24th, 2024.

## Model Architecture

Arctic combines a 10B dense transformer model with a residual 128x3.66B MoE MLP, resulting in 480B total and 17B active parameters chosen using top-2 gating. For more details about Arctic's model architecture, training process, data, etc., see our series of cookbooks.
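
As a quick illustration of how those headline numbers fit together, here is some back-of-the-envelope arithmetic using only the figures quoted above:

```python
# Back-of-the-envelope parameter count for Arctic's dense-MoE hybrid,
# using only the figures quoted above.
dense = 10e9         # 10B dense transformer backbone
experts = 128        # number of residual MoE MLP experts
per_expert = 3.66e9  # parameters per expert
top_k = 2            # top-2 gating: experts activated per token

total = dense + experts * per_expert
active = dense + top_k * per_expert

print(f"total:  ~{total / 1e9:.0f}B")   # ~478B, quoted as 480B
print(f"active: ~{active / 1e9:.1f}B")  # ~17.3B, quoted as 17B
```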

## Usage

As of 4/24/2024, we are actively working with the maintainers of transformers to include the Arctic model implementation. Until this support is released, please follow these instructions to get the required dependencies for using Arctic:

```bash
pip install git+https://github.com/Snowflake-Labs/transformers.git@arctic
```

Arctic leverages several features from DeepSpeed; you will need to install the latest version of DeepSpeed to get all of the required features:

pip install "deepspeed>=0.14.2"

## Inference

The Arctic GitHub page has several resources around running inference; a minimal example to get started is sketched below.
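
This is a minimal sketch, not the authoritative recipe from the GitHub resources. It assumes the Snowflake/snowflake-arctic-instruct Hub id and the standard transformers chat-template and generation APIs; the 480B checkpoint requires a large multi-GPU node, so treat the loading settings as illustrative rather than a tuned configuration.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hub id for the instruct-tuned checkpoint (see the Arctic GitHub
# page for the canonical, tuned inference setups).
model_id = "Snowflake/snowflake-arctic-instruct"

# trust_remote_code pulls in the Arctic modeling code until it is merged
# into a mainline transformers release.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,
    device_map="auto",          # shard across available GPUs
    torch_dtype=torch.bfloat16,
)

messages = [{"role": "user", "content": "What is 1 + 1?"}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```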