---
language:
- en
tags:
- hupd
- t5
- summarization
- conditional-generation
- patents
license: cc-by-sa-4.0
datasets:
- HUPD/hupd
---

# HUPD T5-Small Summarization Model

This model is a T5-Small checkpoint fine-tuned for patent summarization on the Harvard USPTO Patent Dataset (HUPD). It was originally introduced in [this paper](TBD).

For more information about the Harvard USPTO Patent Dataset, please visit the [project website](https://patentdataset.org/) or the [project's GitHub repository](https://github.com/suzgunmirac/hupd).

### How to Use

You can use this model directly with a summarization pipeline:

```python
from transformers import pipeline

summarizer = pipeline(task="summarization", model="turingmachine/hupd-t5-small")

TEXT = "1. An optical coherent receiver for an optical communication network, said optical coherent receiver being configured to receive a modulated optical signal and to process said modulated optical signal for generating an in-phase component and a quadrature component, said in-phase component and said quadrature component being electrical signals, said optical coherent receiver comprising a power adjuster in turn comprising: a multiplying unit configured to multiply said in-phase component by an in-phase gain thereby providing a power-adjusted in-phase component, and to multiply said quadrature component by a quadrature gain thereby providing a power-adjusted quadrature component; and a digital circuit connected between output and input of said multiplying unit and configured to compute: a common gain indicative of a sum of a power of said power-adjusted in-phase component and a power of said power-adjusted quadrature component, and a differential gain indicative of a difference between said power of said power-adjusted in-phase component and said power of said power-adjusted quadrature component; and said in-phase gain as a product between said common gain and said differential gain, and said quadrature gain as a ratio between said common gain and said differential gain. 2. An optical coherent receiver according to claim 1, wherein it further comprises an analog-to-digital unit connected at the input of said power adjuster, said analog-to-digital unit being configured to ..."

summarizer(TEXT)
```

Here is the output:
```python
[{'summary_text': 'An optical coherent receiver for an optical communication network includes a power adjuster and a digital circuit connected between output and input of the multiplying unit and configured to compute a common gain indicative of a sum of the power of an in-phase component and the power-adjusted quadrature component, and the differential gain as a product between the common gain and the diffractive gain.'}]
```
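
Summary length can also be tuned. This is a minimal sketch assuming the standard `transformers` pipeline API, which forwards generation arguments such as `max_length` and `min_length` to `model.generate`; the argument values and the shortened claim below are illustrative, not from the original card:

```python
from transformers import pipeline

summarizer = pipeline(task="summarization", model="turingmachine/hupd-t5-small")

# A shortened claim for illustration; real patent claims are much longer.
TEXT = ("1. An optical coherent receiver for an optical communication network, "
        "said optical coherent receiver being configured to receive a modulated "
        "optical signal and to process said modulated optical signal for "
        "generating an in-phase component and a quadrature component.")

# max_length / min_length bound the summary in tokens; truncation=True
# guards against inputs longer than the model's context window.
result = summarizer(TEXT, max_length=64, min_length=10, truncation=True)
summary = result[0]["summary_text"]
```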

Alternatively, you can load the model and use it as follows:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Run on GPU if available, otherwise CPU
device = 'cuda' if torch.cuda.is_available() else 'cpu'
tokenizer = AutoTokenizer.from_pretrained("turingmachine/hupd-t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("turingmachine/hupd-t5-small").to(device)

# TEXT is the patent claim from the pipeline example above
inputs = tokenizer(TEXT, return_tensors="pt").to(device)

with torch.no_grad():
    outputs = model.generate(inputs.input_ids, max_new_tokens=256)

generated_text = tokenizer.decode(outputs[0], skip_special_tokens=True)
```
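
One practical caveat (an assumption based on the standard T5 setup, not something stated in this card): T5-Small's encoder accepts at most 512 tokens, and full patent claims often exceed that, so truncating at tokenization time avoids input overflow. A minimal sketch:

```python
import torch
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

device = 'cuda' if torch.cuda.is_available() else 'cpu'
tokenizer = AutoTokenizer.from_pretrained("turingmachine/hupd-t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("turingmachine/hupd-t5-small").to(device)

# An artificially long input to demonstrate truncation.
long_claim = "1. An optical coherent receiver for an optical communication network. " * 200

# truncation / max_length cap the input at the model's 512-token context window.
inputs = tokenizer(long_claim, return_tensors="pt", truncation=True, max_length=512).to(device)

with torch.no_grad():
    output_ids = model.generate(inputs.input_ids, max_new_tokens=128)

summary = tokenizer.decode(output_ids[0], skip_special_tokens=True)
```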

## Citation

For more information, please take a look at the original paper.

* Paper: [The Harvard USPTO Patent Dataset: A Large-Scale, Well-Structured, and Multi-Purpose Corpus of Patent Applications](TBD)

* Authors: *Mirac Suzgun, Luke Melas-Kyriazi, Suproteem K. Sarkar, Scott Duke Kominers, and Stuart M. Shieber*

* BibTeX:
```
@article{suzgun2022hupd,
  title={The Harvard USPTO Patent Dataset: A Large-Scale, Well-Structured, and Multi-Purpose Corpus of Patent Applications},
  author={Suzgun, Mirac and Melas-Kyriazi, Luke and Sarkar, Suproteem K and Kominers, Scott Duke and Shieber, Stuart M},
  year={2022}
}
```