---
license: apache-2.0
---

# FloatLM 2.4B

The good ol' FP16 LLM with the LLaMA architecture.

```python
import transformers, torch

model_id = "SpectraSuite/FloatLM_2.4B"

# Please adjust temperature, repetition penalty, top_k, top_p and other sampling parameters to your needs.
pipeline = transformers.pipeline("text-generation", model=model_id, model_kwargs={"torch_dtype": torch.float16}, device_map="auto")

# This is a base (pretrained) LLM that is not instruction- or chat-tuned. You may need to adjust your prompt accordingly.
pipeline("Once upon a time")
```