cornelliusyudhawijaya committed · Commit 51933ef · Parent(s): 4c67951 · Pushed by DataDreamer
README.md CHANGED
---
base_model: google/t5-v1_1-base

tags:
- datadreamer
- datadreamer-0.18.0
- synthetic
- openai-community/gpt2
- openai-community/gpt2
- text2text-generation

widget:
- text: "Note that not all scientists will apply, but there may be a handful.\n\nThe abstract can be downloaded from the papers cited in the paper for use within your project. We also recommend posting the results of the experiment, using our mailing list format, on these pages.\n\nFor other papers, see How to obtain the data from your source publication in NLP.\n\nThis project was last reported with NLP 3.10.6. The journal publishes NLP 3.10.6 once every seven years."
  example_title: "Example 1"
- text: "No supporting documents.\n\nThe URL is http://csjn.acscentral.org/article/9780-1222-116600-3, arxiv.org/abs/12220153. Inline citations accepted.\n\nSee http://sciencebook.org/reviews/2013/11/27/is-math-theory or https://www.npr.org/content/early/2012/5/17/209732.full\n\nRead more.\n\nRelated articles and papers by Jonathan Blumberg.\n\nBooks\n\nGottfried Wernick (2013), The Mathematics of Arithmetic and Mathematical Environments. Journal of Linear Science, 1:1. ISBN 97803544-01-1 CrossRef Full Text\n\nMikayla Sotjianis and Peter Kudzimbe (2008), Mathematical Applications of Arxiv: Arithmetic in the Riemann\u2013Kosmogorov Puzzle: Results from A Simulation. Riemann\u2013Kosmogorov Puzzle, 1:1. ISBN 978-1-415-4589-6 Google Scholar\n\nThomas M. Leeson, Benjamin Gagnon, Paul E. Sowardson, Mark J. Alder, Robert F. Blanchard, Alan K. O'Brien, and Alan B. Caffey (2013), Statistical Analysis for Probabilistic Complexity. J. Prodd. Math, 6:3157. Google Scholar Crossref, ISI\n\nSchlott and Gee (2013), Theory of Differential Order and Complexity: Exploring the Complexness and Complexness of the Efficient and Operative Eigenvalues, 5th ed. Berkeley, CA: Google Scholar\n\nCafu K. Nixen (1990), Computational Statistics with RISC for the Riemann\u2013Kosmogorov Puzzle. L.Citation: 16352909\n\nKonrad, A. M., F. Gomes, J. J. Fortunini, and M. Mascariel (2011), The LSE and Kratz scale of polynomials (LSE = n polynomials). Environ., 36:3109. Google Scholar SAGE Journals, ISI\n\nFriesberg, P. A., E. R. Hirsch, F. M. Schubert, R. Oskarbrunner, L. Eckermeyer Cen. G. Ziemann, P. W. Ziemann (2015), Mathematical Mathematical Formulae. Proc. ICLS, 67, 471\u2013482. doi: 10.1023/jpj.1516085 PubMed Abstract | CrossRef Full Text | Google Scholar\n\nMcNally, R. P., Gagnon, D. G. Trenberth, M. S., and E. P. Hildebrandt (2010), Analysis of the Arithmetic of K(\u22124)\\. J. Probabil. Exp. Prob. Prod., 59:738\u2013749. doi: 10.1308/JPM-C4S1020-0815.55509864 PubMed Abstract | CrossRef Full Text | Google Scholar"
  example_title: "Example 2"
- text: "You will get:\n\nA short overview of NLP research paper A review of all scientific articles related to the subject (in alphabetical order). You will understand why authors of journals using NLP paper are using NLP papers. Authorly search for: A list of the papers cited. To add citations, include all of your abstracts on top. Review and publish published papers of NLP and arXiv papers in the subject for this NLP paper, as well as for all other papers submitted for publication."
  example_title: "Example 3"
pipeline_tag: text2text-generation
---
# Model Card

[Add more information here](https://huggingface.co/templates/model-card-example)

## Example Usage

```python3
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer, pipeline

tokenizer = AutoTokenizer.from_pretrained('cornelliusyudhawijaya/abstracts_to_post_model', revision=None) # Load tokenizer
model = AutoModelForSeq2SeqLM.from_pretrained('cornelliusyudhawijaya/abstracts_to_post_model', revision=None) # Load model
pipe = pipeline('text2text-generation', model=model, tokenizer=tokenizer, pad_token_id=tokenizer.pad_token_id)

inputs = ['Note that not all scientists will apply, but there may be a handful.\n\nThe abstract can be downloaded from the papers cited in the paper for use within your project. We also recommend posting the results of the experiment, using our mailing list format, on these pages.\n\nFor other papers, see How to obtain the data from your source publication in NLP.\n\nThis project was last reported with NLP 3.10.6. The journal publishes NLP 3.10.6 once every seven years.']
print(pipe(inputs, max_length=512, do_sample=False))
```
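
The call above uses `do_sample=False`, so decoding is greedy and deterministic; set `do_sample=True` (optionally with `temperature`/`top_p`) for more varied outputs. If you would rather not download the weights, the same request can be sent to the Hub's serverless Inference API. The snippet below is a hypothetical sketch: it assumes the model is publicly hosted, that the serverless API is available for it, and that a valid access token is exposed through an `HF_TOKEN` environment variable.

```python3
import os

import requests

# Serverless Inference API endpoint for this repository (assumed to be available).
API_URL = "https://api-inference.huggingface.co/models/cornelliusyudhawijaya/abstracts_to_post_model"
headers = {"Authorization": f"Bearer {os.environ['HF_TOKEN']}"}  # HF_TOKEN is an assumed environment variable

payload = {
    "inputs": "Note that not all scientists will apply, but there may be a handful.",
    "parameters": {"max_length": 512, "do_sample": False},  # forwarded to the text2text-generation pipeline
}

response = requests.post(API_URL, headers=headers, json=payload)
print(response.json())  # typically a list like [{"generated_text": "..."}]
```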

---
This model was trained with a synthetic dataset with [DataDreamer 🤖💤](https://datadreamer.dev). The synthetic dataset card and model card can be found [here](datadreamer.json). The training arguments can be found [here](training_args.json).
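
The DataDreamer run is fully described by the linked `datadreamer.json` and `training_args.json` files. As a rough, hypothetical illustration of the underlying step (not the actual DataDreamer pipeline), the sketch below fine-tunes `google/t5-v1_1-base` on (abstract, post) pairs with plain `transformers`; the `pairs` data and every hyperparameter are placeholders rather than the values used for this model.

```python3
# Hypothetical sketch: seq2seq fine-tuning of google/t5-v1_1-base on (abstract, post) pairs.
# This is NOT the DataDreamer pipeline used for this model; see training_args.json for the real settings.
from datasets import Dataset
from transformers import (
    AutoModelForSeq2SeqLM,
    AutoTokenizer,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("google/t5-v1_1-base")
model = AutoModelForSeq2SeqLM.from_pretrained("google/t5-v1_1-base")

# Placeholder pairs; the real training data was generated synthetically with openai-community/gpt2.
pairs = Dataset.from_dict({
    "abstract": ["Note that not all scientists will apply, but there may be a handful."],
    "post": ["A short post summarizing the abstract."],
})

def preprocess(batch):
    # Tokenize abstracts as inputs and posts as labels.
    model_inputs = tokenizer(batch["abstract"], max_length=512, truncation=True)
    labels = tokenizer(text_target=batch["post"], max_length=512, truncation=True)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = pairs.map(preprocess, batched=True, remove_columns=pairs.column_names)

trainer = Seq2SeqTrainer(
    model=model,
    args=Seq2SeqTrainingArguments(
        output_dir="./abstracts_to_post_model",  # placeholder output path
        num_train_epochs=3,                      # placeholder hyperparameters
        per_device_train_batch_size=8,
        learning_rate=1e-3,
    ),
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```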