---
base_model: google/t5-v1_1-base
tags:
  - datadreamer
  - datadreamer-0.18.0
  - synthetic
  - openai-community/gpt2
  - openai-community/gpt2
  - text2text-generation
widget:
  - text: >-
      Note that not all scientists will apply, but there may be a handful.


      The abstract can be downloaded from the papers cited in the paper for use
      within your project. We also recommend posting the results of the
      experiment, using our mailing list format, on these pages.


      For other papers, see How to obtain the data from your source publication
      in NLP.


      This project was last reported with NLP 3.10.6. The journal publishes NLP
      3.10.6 once every seven years.
    example_title: Example 1
  - text: >-
      No supporting documents.


      The URL is http://csjn.acscentral.org/article/9780-1222-116600-3,
      arxiv.org/abs/12220153. Inline citations accepted.


      See http://sciencebook.org/reviews/2013/11/27/is-math-theory or
      https://www.npr.org/content/early/2012/5/17/209732.full


      Read more.


      Related articles and papers by Jonathan Blumberg.


      Books


      Gottfried Wernick (2013), The Mathematics of Arithmetic and Mathematical
      Environments. Journal of Linear Science, 1:1. ISBN 97803544-01-1 CrossRef
      Full Text


      Mikayla Sotjianis and Peter Kudzimbe (2008), Mathematical Applications of
      Arxiv: Arithmetic in the Riemann–Kosmogorov Puzzle: Results from A
      Simulation. Riemann–Kosmogorov Puzzle, 1:1. ISBN 978-1-415-4589-6 Google
      Scholar


      Thomas M. Leeson, Benjamin Gagnon, Paul E. Sowardson, Mark J. Alder,
      Robert F. Blanchard, Alan K. O'Brien, and Alan B. Caffey (2013),
      Statistical Analysis for Probabilistic Complexity. J. Prodd. Math, 6:3157.
      Google Scholar Crossref, ISI


      Schlott and Gee (2013), Theory of Differential Order and Complexity:
      Exploring the Complexness and Complexness of the Efficient and Operative
      Eigenvalues, 5th ed. Berkeley, CA: Google Scholar


      Cafu K. Nixen (1990), Computational Statistics with RISC for the
      Riemann–Kosmogorov Puzzle. L.Citation: 16352909


      Konrad, A. M., F. Gomes, J. J. Fortunini, and M. Mascariel (2011), The LSE
      and Kratz scale of polynomials (LSE = n polynomials). Environ., 36:3109.
      Google Scholar SAGE Journals, ISI


      Friesberg, P. A., E. R. Hirsch, F. M. Schubert, R. Oskarbrunner, L.
      Eckermeyer Cen. G. Ziemann, P. W. Ziemann (2015), Mathematical
      Mathematical Formulae. Proc. ICLS, 67, 471–482. doi: 10.1023/jpj.1516085
      PubMed Abstract | CrossRef Full Text | Google Scholar


      McNally, R. P., Gagnon, D. G. Trenberth, M. S., and E. P. Hildebrandt
      (2010), Analysis of the Arithmetic of K(−4)\. J. Probabil. Exp. Prob.
      Prod., 59:738–749. doi: 10.1308/JPM-C4S1020-0815.55509864 PubMed Abstract
      | CrossRef Full Text | Google Scholar
    example_title: Example 2
  - text: >-
      You will get:


      A short overview of NLP research paper A review of all scientific articles
      related to the subject (in alphabetical order). You will understand why
      authors of journals using NLP paper are using NLP papers. Authorly search
      for: A list of the papers cited. To add citations, include all of your
      abstracts on top. Review and publish published papers of NLP and arXiv
      papers in the subject for this NLP paper, as well as for all other papers
      submitted for publication.
    example_title: Example 3
pipeline_tag: text2text-generation
---

## Model Card

Add more information here

## Example Usage

```python
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer, pipeline

# Load the tokenizer and model from the Hugging Face Hub
tokenizer = AutoTokenizer.from_pretrained('cornelliusyudhawijaya/abstracts_to_post_model', revision=None)
model = AutoModelForSeq2SeqLM.from_pretrained('cornelliusyudhawijaya/abstracts_to_post_model', revision=None)

# Build a text2text-generation pipeline around the loaded model
pipe = pipeline('text2text-generation', model=model, tokenizer=tokenizer, pad_token_id=tokenizer.pad_token_id)

inputs = ['Note that not all scientists will apply, but there may be a handful.\n\nThe abstract can be downloaded from the papers cited in the paper for use within your project. We also recommend posting the results of the experiment, using our mailing list format, on these pages.\n\nFor other papers, see How to obtain the data from your source publication in NLP.\n\nThis project was last reported with NLP 3.10.6. The journal publishes NLP 3.10.6 once every seven years.']
print(pipe(inputs, max_length=512, do_sample=False))
```
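The example inputs above are multi-paragraph strings whose paragraphs are separated by blank lines. A small helper like the following (hypothetical, not part of the model or DataDreamer APIs) can assemble raw paragraphs into that format before passing them to the pipeline:

```python
def format_abstract(paragraphs):
    """Join non-empty paragraphs with blank lines, matching the
    paragraph-separated format used in the example inputs above."""
    return "\n\n".join(p.strip() for p in paragraphs if p.strip())


# Build one pipeline input from separate paragraphs; empty entries are dropped.
text = format_abstract([
    "Note that not all scientists will apply, but there may be a handful.",
    "",
    "For other papers, see How to obtain the data from your source publication in NLP.",
])
```

The resulting string can then be placed in the `inputs` list shown above.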

This model was trained on a synthetic dataset generated with DataDreamer 🤖💤. The synthetic dataset card and model card can be found here. The training arguments can be found here.