Model Card for gemma-2b-it-Question-generation-en-sft-qlora

It is based on the Gemma-2B-IT model and has been fine-tuned specifically for question-generation tasks. It also handles other tasks besides question generation well.

Model Details

Model Description

This is the model card of a 🤗 transformers model that has been pushed to the Hub. This model card has been automatically generated.

  • Developed by: [Hyun Lee]
  • Model type: [LLM]
  • Language(s) (NLP): [English]
  • License: [N/A]
  • Finetuned from model: [google/gemma-2b-it]

How to use the model

from transformers import pipeline

pipe = pipeline("text-generation", model="Lucas-Hyun-Lee/gemma-2b-it-Question-generation-en-sft-qlora")

# For example, you can pass a document like this.
doc = """Graham returned to the WWWF in April 1977 after an agreement with promoter Vincent J. McMahon (Senior). Graham defeated Bruno Sammartino for the WWWF Heavyweight Championship on April 30, 1977, in Baltimore, Maryland. Graham held the title for nine and a half months.
During his reign, he wrestled across America and in Japan (February 1978), facing challengers such as former champion Bruno Sammartino, Jack Brisco, Dusty Rhodes, Pedro Morales, Don Muraco, Mil Mascaras, Strong Kobayashi and Riki Choshu. On 25 January 1978 in Miami, Florida
at the Orange Bowl football stadium, Graham wrestled against then-NWA World Heavyweight Champion Harley Race in a WWWF World Heavyweight Championship vs. NWA World Heavyweight Championship unification match which ended in a one-hour time-limit draw. Although a defeat by Bob Backlund,
who was to embody the virtuous junior "all-American" wrestler, had been written into Graham's current contract with the WWWF, Graham suggested another outcome to McMahon: that Ivan Koloff should turn on him, thus starting a feud that would make Graham a fan favorite.
McMahon refused because of the handshake deal to make Backlund the new fan favorite champion and he did not want to go back on his word.
It was also unheard of for a counter-cultural character like Graham to be a fan favorite, because McMahon and many old promoters saw Graham as a confirmed heel and therefore a negative role model.
Graham eventually "lost" the title to Backlund on February 20, 1978. Another feud Graham had as champion was with Dusty Rhodes,
which culminated in a Texas Bullrope match. His confrontations with Rhodes continued after Graham had been forced to drop the belt to Backlund. Rhodes himself,
a long-time friend of Graham's, recalled these matches with Graham in 1978 as among the most exciting and memorable of his career. Disillusioned by the premature loss of his belt, Graham left the WWWF in December 1978 and accepted an offer to join Paul Boesch's promotion in Houston,
Texas, lending himself out for other NWA events in California and Florida as well. In April 1979 he embarked on his third IWA tour of Japan, where he wrestled the same men he had worked with in 1974. In March 1979, the new Continental Wrestling Association (CWA) named Graham as their World Champion.
On November 8, 1979 Graham lost the belt to Jerry Lawler in Lexington, Kentucky. His following NWA engagements in Kentucky, Tennessee, Georgia and Texas became fewer and rarer until he stopped wrestling in April 1980. Graham wrestled only two matches (one in Canada and one in Los Angeles) in the whole of 1981.
He spent some time competing in Japan, where he added some martial arts techniques to his repertoire. CANNOTANSWER"""

messages = [{"role": "user", "content": f"Make some questions out of the context:\n\n{doc}"}]

# Build the chat-formatted prompt from the messages, then generate.
prompt = pipe.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)

outputs = pipe(prompt, do_sample=True, temperature=0.2, top_k=50, top_p=0.95, max_new_tokens=1000)

# Print only the newly generated text, stripping the echoed prompt.
print(outputs[0]["generated_text"][len(prompt):])
>>> Sure, here are some questions out of the context:

  1. In what city did Graham defeat Bruno Sammartino for the WWWF Heavyweight Championship?


  2. What was the outcome of Graham's match against Harley Race at the Orange Bowl football stadium in Miami, Florida?


  3. What was the result of Graham's feud with Dusty Rhodes?


  4. In what promotion did Graham join after leaving the WWWF?


  5. In what year did Graham embark on his third IWA tour of Japan?

A runnable example is available in this Colab notebook: https://colab.research.google.com/drive/1-elSI0MbgA-iLlYilQhWtKgLVwzQ2pg-?usp=sharing
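
Since the model was fine-tuned with QLoRA, the published checkpoint can also be loaded in 4-bit to reduce GPU memory. A minimal sketch, assuming bitsandbytes is installed; the quantization settings below are illustrative assumptions, not the configuration used in training:

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

model_id = "Lucas-Hyun-Lee/gemma-2b-it-Question-generation-en-sft-qlora"

# Illustrative 4-bit NF4 quantization config (an assumption, not the training setup).
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    quantization_config=bnb_config,
    device_map="auto",  # place layers on the available GPU(s)
)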

Downstream Use [optional]

[More Information Needed]

Out-of-Scope Use

[More Information Needed]

Bias, Risks, and Limitations

[More Information Needed]

Recommendations

Users (both direct and downstream) should be made aware of the risks, biases, and limitations of the model. More information is needed for further recommendations.

How to Get Started with the Model

Use the code below to get started with the model.
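A minimal sketch, condensed from the pipeline example in the "How to use the model" section above; the placeholder document is illustrative:

from transformers import pipeline

pipe = pipeline("text-generation", model="Lucas-Hyun-Lee/gemma-2b-it-Question-generation-en-sft-qlora")

doc = "Your context document here."  # replace with the passage to generate questions from
messages = [{"role": "user", "content": f"Make some questions out of the context:\n\n{doc}"}]
prompt = pipe.tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
outputs = pipe(prompt, do_sample=True, temperature=0.2, max_new_tokens=256)
print(outputs[0]["generated_text"][len(prompt):])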

Training Details

Training Data

[More Information Needed]

Training Procedure

Preprocessing [optional]

[More Information Needed]

Training Hyperparameters

  • Training regime: [More Information Needed]

Speeds, Sizes, Times [optional]

[More Information Needed]

Evaluation

Testing Data, Factors & Metrics

Testing Data

[More Information Needed]

Factors

[More Information Needed]

Metrics

[More Information Needed]

Results

[More Information Needed]

Summary

Model Examination [optional]

[More Information Needed]

Environmental Impact

Carbon emissions can be estimated using the Machine Learning Impact calculator presented in Lacoste et al. (2019).


CO2 Emissions Related to Experiments

Experiments were conducted using Google Cloud Platform in region northamerica-northeast1, which has a carbon efficiency of 0.03 kgCO2eq/kWh. A cumulative 3 hours of computation was performed on hardware of type A100 PCIe 40/80GB (TDP of 250W).

Total emissions are estimated to be 0.02 kgCO2eq, of which 100 percent was directly offset by the cloud provider.
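
As a sanity check, the figure above follows from the power draw, runtime, and grid carbon intensity just reported; a minimal sketch of the arithmetic in Python (variable names are illustrative):

# emissions (kgCO2eq) = power (kW) * time (h) * carbon intensity (kgCO2eq/kWh)
tdp_kw = 250 / 1000   # A100 PCIe TDP of 250 W, converted to kW
hours = 3             # cumulative computation time
intensity = 0.03      # kgCO2eq/kWh for northamerica-northeast1

emissions = tdp_kw * hours * intensity
print(f"{emissions:.4f} kgCO2eq")  # 0.0225, consistent with the ~0.02 figure reported above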


Estimations were conducted using the Machine Learning Impact calculator (https://mlco2.github.io/impact#compute) presented in Lacoste et al. (2019).

@article{lacoste2019quantifying,
  title={Quantifying the Carbon Emissions of Machine Learning},
  author={Lacoste, Alexandre and Luccioni, Alexandra and Schmidt, Victor and Dandres, Thomas},
  journal={arXiv preprint arXiv:1910.09700},
  year={2019}
}

Technical Specifications [optional]

Model Architecture and Objective

Decoder-only transformer language model based on Gemma-2B-IT (about 2.51B parameters, stored as FP16 safetensors), fine-tuned with supervised fine-tuning (SFT) and QLoRA for question generation.

Compute Infrastructure

[More Information Needed]

Hardware

[More Information Needed]

Software

[More Information Needed]

Citation [optional]

BibTeX:

[More Information Needed]

APA:

[More Information Needed]

Glossary [optional]

[More Information Needed]

More Information [optional]

[More Information Needed]

Model Card Authors [optional]

[More Information Needed]

Model Card Contact

[More Information Needed]
