
Chat Vector

Chat Vector: A Simple Approach to Equip LLMs with New Language Chat Capabilities — https://arxiv.org/pdf/2310.04799.pdf

With the advancements in conversational AI such as ChatGPT, this paper focuses on developing Large Language Models (LLMs) for non-English languages, with particular emphasis on alignment with human preferences. We introduce a computationally efficient method, the "chat vector," to synergize pre-existing knowledge and behaviors in LLMs, restructuring the conventional training paradigm from continual pretrain → SFT → RLHF to continual pretrain + chat vector. Our empirical studies, primarily focused on Traditional Chinese, employ LLaMA2 as the base model and acquire the chat vector by subtracting the pre-trained weights of LLaMA2 from the weights of LLaMA2-chat. Evaluation on three distinct facets (toxicity, instruction-following ability, and multi-turn dialogue) demonstrates the chat vector's superior efficacy in "chatting." To confirm the adaptability of our approach, we extend our experiments to models pre-trained in Korean and Simplified Chinese, illustrating the versatility of our methodology. Overall, we present an efficient solution for aligning LLMs with human preferences across various languages, accomplished by the chat vector.
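Concretely, the paper's recipe is plain weight arithmetic: compute the chat vector as the chat-tuned weights minus the base weights, then add it to a continually pre-trained (CP) model. Below is a minimal sketch in PyTorch/transformers, assuming all three checkpoints share an architecture and parameter names; the model paths are illustrative, not the exact ones used for this model.

```python
import torch
from transformers import AutoModelForCausalLM

# Illustrative checkpoints: a base model, its chat-tuned counterpart,
# and a continually pre-trained (CP) model built on the same base.
base = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-hf", torch_dtype=torch.float32)
chat = AutoModelForCausalLM.from_pretrained(
    "meta-llama/Llama-2-7b-chat-hf", torch_dtype=torch.float32)
cp = AutoModelForCausalLM.from_pretrained(
    "path/to/continually-pretrained-llama2", torch_dtype=torch.float32)

base_state = base.state_dict()
chat_state = chat.state_dict()
cp_state = cp.state_dict()

# Chat vector: element-wise difference between chat-tuned and base weights.
# Adding it to the CP model transfers chat behavior without SFT or RLHF.
# If the CP model extended the vocabulary, the embedding and lm_head
# tensors will not match shapes, so mismatched tensors are skipped here.
for name in cp_state:
    if name in base_state and base_state[name].shape == cp_state[name].shape:
        cp_state[name] = cp_state[name] + (chat_state[name] - base_state[name])

cp.load_state_dict(cp_state)
cp.save_pretrained("llama2-cp-plus-chat-vector")
```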

Merged LM

  • base model: Mistral 7B
  • chat vectors (a merge sketch follows this list):
    • neural-chat
    • marconroni
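
Applied to this card, the same arithmetic adds the chat vectors of the two chat models to the Mistral 7B base. The sketch below is an assumption of how such a merge could look: the exact checkpoints and mixing coefficients used for this model are not stated on the card, so the model IDs and the equal 0.5 weights are illustrative.

```python
import torch
from transformers import AutoModelForCausalLM

BASE = "mistralai/Mistral-7B-v0.1"        # assumed base checkpoint
CHAT_MODELS = [                           # assumed chat-tuned checkpoints
    "Intel/neural-chat-7b-v3-1",
    "AIDC-ai-business/Marcoroni-7B-v3",
]
COEFFS = [0.5, 0.5]                       # assumed equal weighting

base = AutoModelForCausalLM.from_pretrained(BASE, torch_dtype=torch.float32)
base_state = base.state_dict()
merged = {name: tensor.clone() for name, tensor in base_state.items()}

# Accumulate each scaled chat vector (chat weights minus base weights).
for model_id, coeff in zip(CHAT_MODELS, COEFFS):
    chat = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.float32)
    for name, tensor in chat.state_dict().items():
        merged[name] += coeff * (tensor - base_state[name])
    del chat  # release memory before loading the next checkpoint

base.load_state_dict(merged)
base.save_pretrained("mistral_tv-neural-marconroni")
```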

Open LLM Leaderboard Evaluation Results

Detailed results can be found here

| Metric                            | Value |
|-----------------------------------|-------|
| Avg.                              | 71.27 |
| AI2 Reasoning Challenge (25-shot) | 69.20 |
| HellaSwag (10-shot)               | 86.26 |
| MMLU (5-shot)                     | 65.07 |
| TruthfulQA (0-shot)               | 60.03 |
| Winogrande (5-shot)               | 80.90 |
| GSM8k (5-shot)                    | 66.19 |
Model size: 7.24B params (F32, Safetensors)
