Elron committed on
Commit
560c546
1 Parent(s): 51f54c2

Update README.md

Files changed (1)
  1. README.md +6 -4
README.md CHANGED
@@ -2,6 +2,9 @@
 tags:
 - text-2-text-generation
 - t5
+- augmentation
+- paraphrase
+- paraphrasing
 ---
 
 # Model Card for qcpg-sentences
@@ -11,7 +14,7 @@ tags:
 # Quality Controlled Paraphrase Generation (ACL 2022)
 > Paraphrase generation has been widely used in various downstream tasks. Most tasks benefit mainly from high quality paraphrases, namely those that are semantically similar to, yet linguistically diverse from, the original sentence. Generating high-quality paraphrases is challenging as it becomes increasingly hard to preserve meaning as linguistic diversity increases. Recent works achieve nice results by controlling specific aspects of the paraphrase, such as its syntactic tree. However, they do not allow to directly control the quality of the generated paraphrase, and suffer from low flexibility and scalability.
 
-<img src="/assets/images/ilus.jpg" width="40%">
+<img src="https://github.com/IBM/quality-controlled-paraphrase-generation/raw/main/assets/images/ilus.jpg" width="40%">
 
 > Here we propose `QCPG`, a quality-guided controlled paraphrase generation model, that allows directly controlling the quality dimensions. Furthermore, we suggest a method that given a sentence, identifies points in the quality control space that are expected to yield optimal generated paraphrases. We show that our method is able to generate paraphrases which maintain the original meaning while achieving higher diversity than the uncontrolled baseline.
 
@@ -20,7 +23,7 @@ The code for training, evaluation and inference for both `QCPG` and `QP` is loca
 
 Make sure to run `QCPG/scripts/prepare_data.sh` and set the missing datasets directories accordingly before training!
 
-<img src="/assets/images/arch.png" width="90%">
+<img src="https://github.com/IBM/quality-controlled-paraphrase-generation/raw/main/assets/images/arch.png" width="90%">
 
 ## Trained Models
 
@@ -285,5 +288,4 @@ model = AutoModelForSeq2SeqLM.from_pretrained("ibm/qcpg-sentences")
 
 
 ```
-</details>
-
+</details>
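
For context, a minimal sketch of loading the checkpoint referenced in the last hunk with Hugging Face `transformers`. The example sentence and generation settings are illustrative assumptions; the exact input format (the control tokens QCPG uses to steer the quality dimensions) is defined in the QCPG repository, not in this commit.

```python
# Minimal usage sketch, not the official QCPG inference script.
# Assumption: the sentence is passed as plain text; the QCPG repo defines the
# control-token prefix used to set the quality dimensions of the paraphrase.
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("ibm/qcpg-sentences")
model = AutoModelForSeq2SeqLM.from_pretrained("ibm/qcpg-sentences")

sentence = "Paraphrase generation has been widely used in various downstream tasks."
inputs = tokenizer(sentence, return_tensors="pt")

# Illustrative generation settings (beam search); adjust as needed.
outputs = model.generate(**inputs, max_length=64, num_beams=4)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```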