zjiayao committed on
Commit aa4d28d
1 Parent(s): 5cf26b0

Update README.md

Files changed (1)
  1. README.md +11 -1
README.md CHANGED
@@ -1,5 +1,8 @@
---
license: mit
+ widget:
+ - text: "The man turned on the faucet <mask> water flows out."
+ - text: "The woman received her pension <mask> she retired."
---
# roberta-temporal-predictor
A RoBERTa-base model that is fine-tuned on the [The New York Times Annotated Corpus](https://catalog.ldc.upenn.edu/LDC2008T19)
@@ -8,7 +11,14 @@ in our ROCK framework for reasoning about commonsense causality. See our [paper]

# Usage

- For simplicity, we implement the following TempPredictor class. Example usage using the ``TempPredictor`` class:
+ You can use this model directly for fill-mask tasks, as shown in the example widget.
+ However, for better temporal inference, it is recommended to symmetrize the outputs as
+ $$
+ P(E_1 \prec E_2) = \frac{1}{2} (f(E_1,E_2) + f(E_2,E_1))
+ $$
+ where ``f(E_1,E_2)`` denotes the predicted probability for ``E_1`` to occur preceding ``E_2``.
+ For simplicity, we implement the following ``TempPredictor`` class, which incorporates this symmetrization automatically.
+ Below is an example usage of the ``TempPredictor`` class:
```python
from transformers import (RobertaForMaskedLM, RobertaTokenizer)
from src.temp_predictor import TempPredictor
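
For reference, the symmetrization added in this commit can also be sketched without the repository's ``TempPredictor`` class, using only the ``transformers`` fill-mask pipeline. This is a minimal illustration, not the repository's implementation: the model id, the ``"{e1} <mask> {e2}"`` prompt template, and the choice of the connectives ``before``/``after`` are assumptions made here for the example.

```python
from transformers import pipeline

# Assumed model path for illustration; substitute this repository's actual id.
MODEL_ID = "CogComp/roberta-temporal-predictor"

fill_mask = pipeline("fill-mask", model=MODEL_ID)

def connective_score(e1: str, e2: str, connective: str) -> float:
    """Probability mass the model assigns to `connective` filling the mask in '{e1} <mask> {e2}'."""
    preds = fill_mask(f"{e1} <mask> {e2}", top_k=50)
    return sum(p["score"] for p in preds if p["token_str"].strip() == connective)

def p_precedes(e1: str, e2: str) -> float:
    """Symmetrized estimate of P(E1 precedes E2):
    average the 'before' score in the forward order with the 'after' score in the reversed order."""
    return 0.5 * (connective_score(e1, e2, "before") + connective_score(e2, e1, "after"))

print(p_precedes("The man turned on the faucet", "water flows out."))
```

A fuller implementation would presumably aggregate over a larger set of temporal connectives, which is what the ``TempPredictor`` class in the README example is meant to handle.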