rajistics committed on
Commit 66225e4
1 Parent(s): 0904c79

added codebook

Files changed (1)
  1. README.md +6 -13
README.md CHANGED
@@ -19,7 +19,7 @@ task_ids:
 paperswithcode_id: null
 pretty_name: Auditor_Review
 ---
-# Dataset Card for [Dataset Name]
+# Dataset Card for Auditor_Review
 
 ## Table of Contents
 - [Table of Contents](#table-of-contents)
@@ -49,6 +49,7 @@ Auditor review data collected by News Department
 
 - **Point of Contact:**
 Talked to COE for Auditing, currently sue@demo.org
+
 ### Dataset Summary
 
 Auditor sentiment dataset of sentences from financial news. The dataset consists of 3500 sentences from English language financial news categorized by sentiment. The dataset is divided by the agreement rate of 5-8 annotators.
@@ -75,6 +76,8 @@ English
 - sentence: a tokenized line from the dataset
 - label: a label corresponding to the class as a string: 'positive' - (2), 'neutral' - (1), or 'negative' - (0)
 
+Complete data code is [available here](https://www.datafiles.samhsa.gov/get-help/codebooks/what-codebook)
+
 ### Data Splits
 
 A train/test split was created randomly with a 75/25 split
@@ -83,7 +86,7 @@ A train/test split was created randomly with a 75/25 split
 
 ### Curation Rationale
 
-To gather our auditor evaluations into one dataset. Previous attempts using off the shelf sentiment had only 70% F1, this dataset was an attempt to improve upon that performance.
+To gather our auditor evaluations into one dataset. Previous attempts using off-the-shelf sentiment had only 70% F1, this dataset was an attempt to improve upon that performance.
 
 ### Source Data
 
@@ -101,7 +104,7 @@ The source data was written by various auditors.
 
 This release of the auditor reviews covers a collection of 4840
 sentences. The selected collection of phrases was annotated by 16 people with
-adequate background knowledge on financial markets. The subset here is where interannotation agreement was greater than 75%.
+adequate background knowledge of financial markets. The subset here is where inter-annotation agreement was greater than 75%.
 
 #### Who are the annotators?
 
@@ -113,10 +116,6 @@ There is no personal or sensitive information in this dataset.
 
 ## Considerations for Using the Data
 
-### Social Impact of Dataset
-
-[More Information Needed]
-
 ### Discussion of Biases
 
 All annotators were from the same institution and so interannotator agreement
@@ -126,12 +125,6 @@ should be understood with this taken into account.
 
 [More Information Needed]
 
-## Additional Information
-
-### Dataset Curators
-
-[More Information Needed]
-
 ### Licensing Information
 
 License: Demo.Org Proprietary - DO NOT SHARE
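The card above describes three mechanics: a string-to-integer label mapping ('negative' - 0, 'neutral' - 1, 'positive' - 2), a filter keeping only sentences with inter-annotator agreement above 75%, and a random 75/25 train/test split. A minimal sketch of those steps in plain Python follows; the example rows and the helper names (`agreement_rate`, `filter_by_agreement`, `train_test_split_75_25`) are illustrative assumptions, not the dataset's actual build code.

```python
import random
from collections import Counter

# Label mapping as described in the dataset card.
LABEL2ID = {"negative": 0, "neutral": 1, "positive": 2}

def agreement_rate(votes):
    """Fraction of annotators who chose the majority label."""
    majority_count = Counter(votes).most_common(1)[0][1]
    return majority_count / len(votes)

def filter_by_agreement(rows, threshold=0.75):
    """Keep sentences whose inter-annotator agreement exceeds the threshold."""
    return [r for r in rows if agreement_rate(r["votes"]) > threshold]

def train_test_split_75_25(rows, seed=0):
    """Random 75/25 train/test split, as described under 'Data Splits'."""
    shuffled = rows[:]
    random.Random(seed).shuffle(shuffled)
    cut = int(len(shuffled) * 0.75)
    return shuffled[:cut], shuffled[cut:]

# Hypothetical annotated rows: each sentence carries 5-8 annotator votes.
rows = [
    {"sentence": "Operating profit rose to EUR 13.1 mn.", "votes": ["positive"] * 6},
    {"sentence": "The company issued no guidance.", "votes": ["neutral"] * 4 + ["negative"]},
    {"sentence": "Sales fell sharply.", "votes": ["negative", "neutral", "positive", "neutral", "negative"]},
]

kept = filter_by_agreement(rows)          # drops the 40%-agreement row
for r in kept:
    majority = Counter(r["votes"]).most_common(1)[0][0]
    r["label"] = LABEL2ID[majority]       # e.g. "positive" -> 2

train, test = train_test_split_75_25(kept)
```

Note the filter runs before the split, matching the card's description that the released subset is restricted to high-agreement sentences first and then split randomly.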