update links from sgnlp to sgnlp-models
README.md
CHANGED
```diff
@@ -72,8 +72,8 @@ The train and evaluation datasets were derived from the Twitter15, Twitter16 and
 - **Training Time:** ~6 hours on a single V100 GPU.

 # Model Parameters
-- **Model Weights:** [link](https://storage.googleapis.com/sgnlp/models/rumour_detection_twitter/pytorch_model.bin)
-- **Model Config:** [link](https://storage.googleapis.com/sgnlp/models/rumour_detection_twitter/config.json)
+- **Model Weights:** [link](https://storage.googleapis.com/sgnlp-models/models/rumour_detection_twitter/pytorch_model.bin)
+- **Model Config:** [link](https://storage.googleapis.com/sgnlp-models/models/rumour_detection_twitter/config.json)
 - **Model Inputs:** Thread of tweets. The first tweet should be the target tweet.
 - **Model Outputs:** Array of logits for each class (True, False, Unverified, Non-Rumour). This can be converted into probabilities using the softmax function.
 - **Model Size:** ~60mb
```
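The **Model Outputs** entry says the per-class logits can be converted into probabilities with the softmax function. A minimal sketch of that conversion — the logit values below are invented for illustration, not real model output:

```python
import math

# Hypothetical logits in the class order the README lists:
# (True, False, Unverified, Non-Rumour).
logits = [2.0, 0.5, -1.0, 0.1]
labels = ["True", "False", "Unverified", "Non-Rumour"]

def softmax(xs):
    # Subtract the max before exponentiating for numerical stability.
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    total = sum(exps)
    return [e / total for e in exps]

probs = softmax(logits)
# The predicted class is the one with the highest probability.
predicted = labels[max(range(len(probs)), key=probs.__getitem__)]
```

The same conversion is available as `torch.softmax(logits, dim=-1)` when working with the model's tensors directly.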