arxiv:2312.16171

Principled Instructions Are All You Need for Questioning LLaMA-1/2, GPT-3.5/4

Published on Dec 26, 2023
· Featured in Daily Papers on Dec 27, 2023

Abstract

This paper introduces 26 guiding principles designed to streamline the process of querying and prompting large language models. Our goal is to simplify the underlying concepts of formulating questions for large language models of various scales, to examine their abilities, and to enhance user understanding of how models of different scales behave when given different prompts. Extensive experiments are conducted on LLaMA-1/2 (7B, 13B, and 70B) and GPT-3.5/4 to verify the effectiveness of the proposed principles for instruction and prompt design. We hope this work provides a better guide for researchers working on prompting large language models. The project page is available at https://github.com/VILA-Lab/ATLAS.
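To make the idea concrete, here is a minimal sketch of how a couple of the paper's principles could be applied in code: using explicit `###`-style section delimiters and stating the intended audience in the prompt. The helper function name and its defaults are hypothetical, not part of the paper's artifacts.

```python
# Hypothetical helper illustrating two of the paper's prompting principles:
# explicit section delimiters and audience targeting. Names are illustrative.
def build_prompt(instruction: str, question: str,
                 audience: str = "an expert") -> str:
    """Assemble a prompt with clearly delimited sections."""
    return (
        f"###Instruction###\n{instruction}\n"
        f"The intended audience is {audience}.\n"
        f"###Question###\n{question}\n"
    )

prompt = build_prompt(
    instruction="Explain the concept step by step.",
    question="Why does attention scale quadratically with sequence length?",
    audience="a beginner in machine learning",
)
print(prompt)
```

The delimiters make it unambiguous to the model (and to anyone reading the prompt) which part is the instruction and which is the question.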

Community

I just created a learning GPT for the 26 prompting principles in this paper: https://chat.openai.com/g/g-i8Bwtxjk1-papergpt-principled-instructions-are-all-you-need/c/6d9777f9-c871-4f5e-a7d1-be04bf72de2a

It incorporates the PDF of the informative paper "Principled Instructions Are All You Need for Questioning LLaMA-1/2, GPT-3.5/4" by Sondos Bsharat, Aidar Myrzakhan, and Zhiqiang Shen of MBZUAI (Mohamed bin Zayed University of Artificial Intelligence).

Enjoy!


Thanks @SondosMB @aidarmyrzakhan @Jason0214



Models citing this paper 0


Datasets citing this paper 0


Spaces citing this paper 1

Collections including this paper 22