datasets:
  - cot
  - cos_e
  - math_qa
  - CShorten/ML-ArXiv-Papers
  - gsm8k
  - code_x_glue_tc_text_to_code
  - Muennighoff/P3
  - HuggingFaceH4/self-instruct-seed
  - truthful_qa
  - empathetic_dialogues
inference:
  parameters:
    max_new_tokens: 32
    temperature: 1
    top_k: 1
license: apache-2.0
language:
  - en
pipeline_tag: text-generation
widget:
  - example_title: QA
    text: What is a BERT?
  - example_title: Open domain QA
    text: >-
      Please answer the following question. What is the boiling point of
      Nitrogen?
  - example_title: Theme text Generation
    text: Generate text about BERT

taskGPT2-xl v0.2a

Model Summary

I finetuned GPT2-xl on text2code, chain-of-thought (CoT), math, and FLAN tasks; on some of these tasks it performs better than GPT-JT.

I used a collection of open techniques and datasets to build taskGPT2-xl.

Quick Start

from transformers import pipeline
pipe = pipeline(model='AlexWortega/taskGPT2-xl')
print(pipe('''"I love this!" Is it positive? A:'''))

or

from transformers import AutoTokenizer, AutoModelForCausalLM
tokenizer = AutoTokenizer.from_pretrained("AlexWortega/taskGPT2-xl")
model = AutoModelForCausalLM.from_pretrained("AlexWortega/taskGPT2-xl")
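To actually generate text with the loaded model, here is a minimal sketch that mirrors the inference parameters declared in the metadata above (max_new_tokens=32; greedy decoding, equivalent to top_k=1), using one of the widget prompts:

```python
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("AlexWortega/taskGPT2-xl")
model = AutoModelForCausalLM.from_pretrained("AlexWortega/taskGPT2-xl")

# One of the widget prompts from the metadata above
prompt = "Please answer the following question. What is the boiling point of Nitrogen?"
inputs = tokenizer(prompt, return_tensors="pt")

# max_new_tokens=32 and greedy decoding mirror the card's inference settings
out = model.generate(**inputs, max_new_tokens=32, do_sample=False)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```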

License

The weights of taskGPT2-xl are licensed under version 2.0 of the Apache License.

Training Details

I used datasets from the Hugging Face Hub:

  • strategyqa_train
  • aqua_train
  • qed_train

Hyperparameters

I used Novograd with a learning rate of 2e-5 and a global batch size of 6 (3 per data-parallel worker), combining data parallelism and pipeline parallelism. During training, input sequences were truncated to 512 tokens; sequences shorter than 512 tokens were concatenated into one long sequence to improve data efficiency.
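The packing step described above can be sketched as follows (a minimal illustration assuming simple greedy concatenation of tokenized examples; the exact packing logic used in training is not specified in this card):

```python
def pack_sequences(token_seqs, max_len=512):
    """Concatenate tokenized examples into chunks of at most max_len tokens."""
    packed, current = [], []
    for seq in token_seqs:
        seq = seq[:max_len]  # truncate overlong inputs to max_len
        if len(current) + len(seq) > max_len:
            # current chunk is full: emit it and start a new one
            packed.append(current)
            current = []
        current.extend(seq)
    if current:
        packed.append(current)
    return packed

# Three short "tokenized" examples packed into 512-token chunks
chunks = pack_sequences([[1] * 300, [2] * 300, [3] * 100], max_len=512)
print([len(c) for c in chunks])  # → [300, 400]
```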

Metrics

SOON

BibTeX entry and citation info

@article{taskGPT2xl,
  title={GPT2xl is underrated task solver},
  author={Nickolich Aleksandr, Karina Romanova, Arseniy Shahmatov, Maksim Gersimenko},
  year={2023}
}