---
license: apache-2.0
task_categories:
  - text-classification
  - text-generation
language:
  - en
tags:
  - finance
pretty_name: FinFact
size_categories:
  - 1K<n<10K
dataset_info:
  - config_name: generation
    features:
      - name: url
        dtype: string
      - name: claim
        dtype: string
      - name: author
        dtype: string
      - name: posted
        dtype: string
      # - name: sci_digest
      #   sequence: string
      # - name: justification
      #   sequence: string
      # - name: issues
      #   dtype: string
      # - name: image_data
      #   sequence:
      #     - name: image_src
      #       dtype: string
      #     - name: image_caption
      #       dtype: string
      # - name: evidence
      #   sequence:
      #     - name: sentence
      #       dtype: string
      #     - name: hrefs
      #       dtype: string
      # - name: label
      #   dtype: string
      # - name: visualization_bias
      #   dtype: int32

---

<h1 align="center">Fin-Fact - Financial Fact-Checking Dataset</h1>


## Table of Contents

- [Overview](#overview)
- [Dataset Description](#dataset-description)
- [Dataset Usage](#dataset-usage)
- [Leaderboard](#leaderboard)
- [Dependencies](#dependencies)
- [Run models for paper metrics](#run-models-for-paper-metrics)
- [Citation](#citation)
- [Contribution](#contribution)
- [License](#license)
- [Contact](#contact)

## Overview

Welcome to the Fin-Fact repository! Fin-Fact is a comprehensive dataset designed specifically for financial fact-checking and explanation generation. This README provides an overview of the dataset, how to use it, and other relevant information. [Click here](https://arxiv.org/abs/2309.08793) to access the paper.

## Dataset Description

- **Name**: Fin-Fact
- **Purpose**: Fact-checking and explanation generation in the financial domain.
- **Labels**: Each claim is annotated with fields including Claim, Author, Posted Date, Sci-digest, Justification, Evidence, Evidence href, Image href, Image Caption, Visualisation Bias Label, Issues, and Claim Label.
- **Size**: The dataset consists of 3121 claims spanning multiple financial sectors.
- **Additional Features**: The dataset goes beyond textual claims and incorporates visual elements, including images and their captions.

## Dataset Usage

Fin-Fact is a valuable resource for researchers, data scientists, and fact-checkers in the financial domain. Here's how you can use it:

1. **Download the Dataset**: You can download the Fin-Fact dataset [here](https://github.com/IIT-DM/Fin-Fact/blob/FinFact/finfact.json).

2. **Exploratory Data Analysis**: Perform exploratory data analysis to understand the dataset's structure, distribution, and any potential biases.

3. **Natural Language Processing (NLP) Tasks**: Utilize the dataset for various NLP tasks such as fact-checking, claim verification, and explanation generation.

4. **Fact Checking Experiments**: Train and evaluate machine learning models, including text and image analysis, using the dataset to enhance the accuracy of fact-checking systems.
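Once downloaded, the dataset can be inspected with the standard `json` module. The sketch below is a minimal starting point for exploratory analysis, assuming each record is a dict carrying the fields listed above (e.g. `claim`, `label`); adjust the field names to the actual schema of `finfact.json`:

```python
import json
from collections import Counter

def label_distribution(path):
    """Count claim labels in a Fin-Fact-style JSON file.

    Assumes the file holds a list of records, each a dict with a
    'label' field (as described in the dataset card); records without
    a label are counted as 'unknown'.
    """
    with open(path, encoding="utf-8") as f:
        records = json.load(f)
    return Counter(r.get("label", "unknown") for r in records)
```

A quick call like `label_distribution("finfact.json")` gives a first view of class balance before training or evaluation.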

## Leaderboard

## Dependencies
We recommend creating an Anaconda environment:

`conda create --name finfact python=3.6 conda-build`

Then, install Python requirements:

`pip install -r requirements.txt`


## Run models for paper metrics

We provide scripts that let you run our dataset on existing state-of-the-art models and re-create the metrics published in the paper. You should be able to reproduce our results by following these instructions; please open an issue if you cannot.
To run existing ANLI models for fact-checking:

### Run:
1. BART
```bash
python anli.py --model_name 'ynie/bart-large-snli_mnli_fever_anli_R1_R2_R3-nli' --data_file finfact.json --threshold 0.5
```
2. RoBERTa
```bash
python anli.py --model_name 'ynie/roberta-large-snli_mnli_fever_anli_R1_R2_R3-nli' --data_file finfact.json --threshold 0.5
```
3. ELECTRA
```bash
python anli.py --model_name 'ynie/electra-large-discriminator-snli_mnli_fever_anli_R1_R2_R3-nli' --data_file finfact.json --threshold 0.5
```
4. ALBERT
```bash
python anli.py --model_name 'ynie/albert-xxlarge-v2-snli_mnli_fever_anli_R1_R2_R3-nli' --data_file finfact.json --threshold 0.5
```
5. XLNet
```bash
python anli.py --model_name 'ynie/xlnet-large-cased-snli_mnli_fever_anli_R1_R2_R3-nli' --data_file finfact.json --threshold 0.5
```
6. GPT-2
```bash
python gpt2_nli.py --model_name 'fractalego/fact-checking' --data_file finfact.json
```
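These ANLI models output entailment/neutral/contradiction probabilities for each claim, and the `--threshold` flag presumably gates those scores when mapping them to a verdict. The helper below is an illustrative sketch of such a mapping, not the actual `anli.py` implementation:

```python
def verdict_from_nli(probs, threshold=0.5):
    """Map NLI class probabilities to a fact-checking verdict.

    probs: dict with 'entailment', 'neutral', and 'contradiction'
    scores. Hypothetical logic for illustration only; the real
    anli.py script may apply the threshold differently.
    """
    if probs["entailment"] >= threshold:
        return "supported"
    if probs["contradiction"] >= threshold:
        return "refuted"
    return "not enough info"
```

With the default `threshold=0.5`, a claim is only labeled "supported" or "refuted" when the corresponding class probability clears the cutoff; everything else falls back to "not enough info".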


## Citation

```
@misc{rangapur2023finfact,
      title={Fin-Fact: A Benchmark Dataset for Multimodal Financial Fact Checking and Explanation Generation}, 
      author={Aman Rangapur and Haoran Wang and Kai Shu},
      year={2023},
      eprint={2309.08793},
      archivePrefix={arXiv},
      primaryClass={cs.AI}
}
```

## Contribution

We welcome contributions from the community to help improve Fin-Fact. If you have suggestions, bug reports, or want to contribute code or data, please check our [CONTRIBUTING.md](CONTRIBUTING.md) file for guidelines.

## License

Fin-Fact is released under the [MIT License](/LICENSE). Please review the license before using the dataset.

## Contact
For questions, feedback, or inquiries related to Fin-Fact, please contact `arangapur@hawk.iit.edu`.

We hope you find Fin-Fact valuable for your research and fact-checking endeavors. Happy fact-checking!