
PGT

PGT is a GPT-2 prompt-based model trained to facilitate three patent generation-related tasks, namely: part-of-patent generation, part-of-patent editing and patent coherence check. For more information about the dataset and the training procedure, we refer the reader to our paper.

The task is specified by appending a short sentence to the end of a given input. The general format is:

input <|sep|> task specific prompt <|sep|>

In all cases, the generated output ends with the special token <|endoftext|> to facilitate postprocessing.
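
For illustration, prompt assembly and output cleanup can be wrapped in small helpers. The sketch below is ours; the helper names and the SEP/EOS constants are not part of the model or its tooling:

SEP = "<|sep|>"
EOS = "<|endoftext|>"

def build_prompt(input_text: str, task_prompt: str) -> str:
    # Append the task-specific prompt to the input in the expected format.
    return f"{input_text} {SEP} {task_prompt} {SEP}"

def clean_output(decoded: str) -> str:
    # Keep only the text up to the end-of-text token.
    return decoded.split(EOS)[0].strip()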

Supported tasks

Part-of-patent generation attempts to generate a part of a patent given another, already existing part of it as input. The model has been trained to perform title-to-abstract and abstract-to-claim generation, as well as their inverses. For the claim case, the model was only exposed to independent claims during training. Input example for part-of-patent generation for the abstract-to-title case:

An interesting patent abstract. <|sep|> Given the above abstract, suggest a title <|sep|>
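
For instance, with the hypothetical build_prompt helper sketched above, this prompt could be assembled as:

prompt = build_prompt(
    "An interesting patent abstract.",
    "Given the above abstract, suggest a title",
)
# -> "An interesting patent abstract. <|sep|> Given the above abstract, suggest a title <|sep|>"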

Part-of-patent editing attempts to suggest alternatives for highlighted parts of a patent abstract or claim. These parts are marked in the input with the special [MASK] token. The expected size of a masked part ranges from a single word to a small phrase. If more than one mask is given in the input, the generated suggestions are separated in the output by the special <|mask_sep|> token. Input example for part-of-patent editing on a claim input:

An interesting patent claim with a [MASK] part. <|sep|> Replace the [MASK] tokens in the above claim <|sep|>
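
A minimal postprocessing sketch for an editing output with two masks, assuming the raw completion looks like the hypothetical string below:

# Hypothetical raw completion for an input containing two [MASK] tokens.
completion = "first suggestion <|mask_sep|> second suggestion <|endoftext|>"

# Drop the end-of-text marker, then split the per-mask suggestions.
suggestions = [
    part.strip()
    for part in completion.split("<|endoftext|>")[0].split("<|mask_sep|>")
]
print(suggestions)  # ['first suggestion', 'second suggestion']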

The coherence check assesses the quality of a patent by examining whether two given parts could belong to the same patent in terms of content and syntax. The input patent parts can be a title, abstract or claim. The expected output is Yes or No. Input example for the coherence check task with a title and a claim as input:

A patent title <|sep|> An interesting patent claim. <|sep|> Do the above title and claim belong to the same patent? <|sep|>
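
As a sketch, the coherence check can be run end to end as follows; the greedy decoding settings are our choice, not prescribed by the model card:

from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("christofid/pgt")
model = AutoModelForCausalLM.from_pretrained("christofid/pgt")

prompt = (
    "A patent title <|sep|> An interesting patent claim. <|sep|> "
    "Do the above title and claim belong to the same patent? <|sep|>"
)
encoded = tokenizer.encode(prompt, return_tensors="pt")

# A short greedy continuation is enough since only Yes or No is expected.
output = model.generate(
    encoded,
    max_length=encoded.shape[1] + 5,
    pad_token_id=tokenizer.eos_token_id,
)
answer = tokenizer.decode(output[0][encoded.shape[1]:]).split("<|endoftext|>")[0].strip()
print(answer)  # "Yes" or "No"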

Further prompts and tasks can be tried in a zero-shot fashion.

The model and the tasks are also integrated and available via the GT4SD Python library.

Example

A full example of part-of-patent generation:

from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("christofid/pgt")
model = AutoModelForCausalLM.from_pretrained("christofid/pgt")

text = "Automated patent generation <|sep|> Given the above title, suggest an abstract <|sep|>"

# Encode the prompt and sample three candidate abstracts.
text_encoded = tokenizer.encode(text, return_tensors="pt")

generated = model.generate(
    text_encoded,
    do_sample=True,
    top_k=50,
    num_return_sequences=3,
    max_length=512,
    pad_token_id=tokenizer.eos_token_id,  # avoids the missing-pad-token warning
)

# Keep only the text up to the end-of-text token for each candidate.
generated_text = [tokenizer.decode(case).split("<|endoftext|>")[0].strip() for case in generated]

BibTeX entry and citation info

@inproceedings{christofidellis2022pgt,
  title={PGT: a prompt based generative transformer for the patent domain},
  author={Christofidellis, Dimitrios and Torres, Antonio Berrios and Dave, Ashish and Roveri, Manuel and Schmidt, Kristin and Swaminathan, Sarath and Vandierendonck, Hans and Zubarev, Dmitry and Manica, Matteo},
  booktitle={ICML 2022 Workshop on Knowledge Retrieval and Language Models},
  year={2022}
}