qbao775 committed on
Commit
5d1a424
1 Parent(s): fdc9854

Update README.md

Files changed (1)
  1. README.md +15 -0
README.md CHANGED
@@ -23,6 +23,21 @@ Project: https://github.com/Strong-AI-Lab/Logical-Equivalence-driven-AMR-Data-Au
 Leaderboard: https://eval.ai/web/challenges/challenge-page/503/leaderboard/1347
 
 In this repository, we trained DeBERTa-V2-XXLarge on sentence pairs constructed by our AMR-LE. We use AMR with one logical equivalence law (the `Contraposition law`) to construct logically equivalent and inequivalent sentence pairs.
+
+## How to interact with the model on this page?
+Here are some test examples that you can copy and paste into the user input area on the right.
+The expected answer for the following example is 0: the two sentences are logically inequivalent. Apply the contraposition law `(If A then B <=> If not B then not A)` to show that the following pair is not equivalent.
+```
+If Alice is happy, then Bob is smart.
+If Alice is not happy, then Bob is smart.
+```
+
+The expected answer for the following example is 1: the two sentences are logically equivalent. Apply the contraposition law `(If A then B <=> If not B then not A)` to show that the following pair is equivalent.
+```
+If Alice is happy, then Bob is smart.
+If Bob is not smart, then Alice is not happy.
+```
+
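The equivalence claims in the examples above can be sanity-checked with a small truth-table comparison, independently of the model. This is a minimal sketch; the helper names (`implies`, `equivalent`, `s1`) are illustrative and not part of the repository:

```python
from itertools import product

def implies(a, b):
    # Material implication: "If A then B" is false only when A is true and B is false.
    return (not a) or b

def equivalent(f, g):
    # Two propositional formulas over (A, B) are logically equivalent
    # iff they agree on every truth assignment.
    return all(f(a, b) == g(a, b) for a, b in product([False, True], repeat=2))

# A = "Alice is happy", B = "Bob is smart"
s1 = lambda a, b: implies(a, b)                    # If Alice is happy, then Bob is smart.
contrapositive = lambda a, b: implies(not b, not a)  # If Bob is not smart, then Alice is not happy.
negated_antecedent = lambda a, b: implies(not a, b)  # If Alice is not happy, then Bob is smart.

print(int(equivalent(s1, contrapositive)))      # 1: logically equivalent
print(int(equivalent(s1, negated_antecedent)))  # 0: logically inequivalent
```

The contrapositive agrees with the original implication on all four truth assignments, while the negated-antecedent variant differs (e.g. when Alice is unhappy and Bob is not smart), matching the expected labels 1 and 0 above.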
 ## How to load the model weight?
 ```
 from transformers import AutoModel