---
license: apache-2.0
---

# APUS-xDAN-4.0-MOE

## Introduction

APUS-xDAN-4.0-MOE is a transformer-based, decoder-only language model pretrained on a large corpus of data for robust performance.

For more comprehensive information, please visit our blog post and GitHub repository.

## Model Details

APUS-xDAN-4.0-MOE uses a Mixture of Experts (MoE) architecture built from dense language model components, inheriting its capabilities from the xDAN-L2 series. Of its 136 billion total parameters, about 30 billion are activated per forward pass, which keeps inference efficient relative to the model's size. With quantization, the open-source release occupies roughly 42GB, small enough to run on consumer-grade GPUs such as the RTX 4090 and RTX 3090.
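The 136B-total / 30B-active split comes from sparse expert routing: a learned router selects a small number of expert feed-forward networks for each token, so only those experts' parameters participate in that token's forward pass. The snippet below is a minimal, generic sketch of top-k routing for illustration only; the class name, layer sizes, expert count, and `k` are assumptions, not APUS-xDAN-4.0-MOE's actual implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoELayer(nn.Module):
    """Generic top-k Mixture-of-Experts feed-forward layer (illustrative only)."""

    def __init__(self, d_model: int, d_ff: int, n_experts: int, k: int = 2):
        super().__init__()
        self.k = k
        self.router = nn.Linear(d_model, n_experts, bias=False)
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        ])

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, d_model). The router scores every expert per token,
        # but only the top-k experts are actually evaluated.
        scores = F.softmax(self.router(x), dim=-1)
        weights, idx = scores.topk(self.k, dim=-1)             # (tokens, k)
        weights = weights / weights.sum(dim=-1, keepdim=True)  # renormalize over the k picks
        out = torch.zeros_like(x)
        for slot in range(self.k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, slot] == e  # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot, None] * expert(x[mask])
        return out

# Toy usage: 8 experts, 2 active per token.
layer = TopKMoELayer(d_model=512, d_ff=2048, n_experts=8, k=2)
print(layer(torch.randn(10, 512)).shape)  # torch.Size([10, 512])
```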

## Requirements

The codebase for APUS-xDAN-4.0-MOE is integrated into the latest Hugging Face `transformers` library. We recommend building `transformers` from source to ensure compatibility; otherwise, you may encounter an error when loading the model.
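To install from source:

```bash
pip install git+https://github.com/huggingface/transformers
```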


## Usage

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# NOTE: replace with this model's full Hugging Face Hub repo ID.
model_id = "APUS-xDAN-4.0-MOE"

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)

model = AutoModelForCausalLM.from_pretrained(
    model_id,
    trust_remote_code=True,
    device_map="auto",           # shard layers across available GPUs
    torch_dtype=torch.bfloat16,  # load weights in bfloat16
)
model.eval()

text = "Replace this with your text"
inputs = tokenizer(text, return_tensors="pt").to(model.device)

generate_kwargs = {}  # add generation options here if you want
outputs = model.generate(
    input_ids=inputs.input_ids,
    attention_mask=inputs.attention_mask,
    **generate_kwargs,
)
# Decode the generated token IDs back into text.
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
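The `generate_kwargs` dictionary accepts the standard `transformers` generation options. For example (the values below are illustrative, not tuned for this model):

```python
generate_kwargs = {
    "max_new_tokens": 256,  # cap the length of the completion
    "do_sample": True,      # sample instead of greedy decoding
    "temperature": 0.7,     # soften the next-token distribution
    "top_p": 0.9,           # nucleus sampling
}
```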

## License

APUS-xDAN-4.0-MOE is distributed under the LLAMA 2 Community License, Copyright (c) Meta Platforms, Inc. All Rights Reserved.