---
license: apache-2.0
datasets:
- jeopardy
language:
- en
---

# Jeopardy Bot

Jeopardy Bot is a reasonably good and fast bot at answering Jeopardy questions. Jeopardy is a great format for language models because the query is typically very short and the answer is typically even shorter.

The https://huggingface.co/datasets/jeopardy dataset used to train this model is relatively incomplete at ~216K rows, but https://j-archive.com/ provides a rich evaluation set, with daily updated clues and answers totaling ~468K.

# Sample Responses

Here are some sample questions taken from recent episodes. We use Daily Doubles since those are often considered "harder".

```
Below is a Jeopardy clue paired with input providing the category of the clue. Write a concise response that best answers the clue given the category.

### Instruction:
In 2023 16th St. Baptist in this city marks 150 years of the congregation, a history marred by a 1963 bombing

### Input:
Take me to church

### Response:
what is Birmingham
```

Correct! Daily Double! https://j-archive.com/suggestcorrection.php?clue_id=461177

```
Below is a Jeopardy clue paired with input providing the category of the clue. Write a concise response that best answers the clue given the category.

### Instruction:
You can wear your Crocs as you see the gators in this national park established in 1947 at Florida's southwestern tip

### Input:
Florida Places

### Response:
what is Everglades National Park
```

Correct! Daily Double! https://j-archive.com/suggestcorrection.php?clue_id=461065

Let's just give it the basic Alpaca prompt with no input or even a category...

```
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
On May 10-11, 1927 he flew from San Diego to New York City, with an overnight stop in St. Louis

### Response:
what is Charles Lindbergh
```

Correct! Daily Double! https://j-archive.com/suggestcorrection.php?clue_id=459838

```
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
He was born March 30, 1970, in a Virginia stable

### Response:
what is George Washington
```

Womp womp womp! Let's give it the category inline...

```
Below is an instruction that describes a task. Write a response that appropriately completes the request.

### Instruction:
He was born March 30, 1970, in a Virginia stable. The Category is Sports GOATs.

### Response:
what is Michael Jordan
```

Well, even large language models aren't going to get these right all the time.

# Training

This model is a supervised fine-tune (SFT) of LLaMA 7B. The training hyperparameters are provided in the YML config, which can be used directly with Axolotl. Training was stopped after ~2 epochs, as the evaluation loss had started to increase beyond that point.

See the training run at https://wandb.ai/wing-lian/jeopardy-bot-7b
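
# Inference

Below is a minimal inference sketch using the Alpaca-style prompt format shown in the samples above. It is illustrative only: the model id is a placeholder that should be replaced with this repository's id or a local path to the weights, and the generation settings are arbitrary choices rather than tuned values.

```python
# Minimal inference sketch for Jeopardy Bot (illustrative; model id is a placeholder).
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "path/to/jeopardy-bot-7b"  # placeholder: replace with this repo's id or a local path

# Alpaca-style prompt with the clue as the instruction and the category as the input.
PROMPT_TEMPLATE = (
    "Below is a Jeopardy clue paired with input providing the category of the clue. "
    "Write a concise response that best answers the clue given the category.\n\n"
    "### Instruction:\n{clue}\n\n"
    "### Input:\n{category}\n\n"
    "### Response:\n"
)

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

prompt = PROMPT_TEMPLATE.format(
    clue=(
        "You can wear your Crocs as you see the gators in this national park "
        "established in 1947 at Florida's southwestern tip"
    ),
    category="Florida Places",
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32, do_sample=False)

# Strip the prompt tokens and print only the generated answer.
answer = tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(answer)  # expected (per the sample above): "what is Everglades National Park"
```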