lhoestq (HF staff) committed
Commit
5d3b2b7
1 Parent(s): 1e08e92

add dataset_info in dataset metadata

Files changed (1)
  1. README.md +106 -0
README.md CHANGED
@@ -20,6 +20,112 @@ task_ids:
 - text-scoring
 tags:
 - toxicity-prediction
+dataset_info:
+  features:
+  - name: target
+    dtype: float32
+  - name: comment_text
+    dtype: string
+  - name: severe_toxicity
+    dtype: float32
+  - name: obscene
+    dtype: float32
+  - name: identity_attack
+    dtype: float32
+  - name: insult
+    dtype: float32
+  - name: threat
+    dtype: float32
+  - name: asian
+    dtype: float32
+  - name: atheist
+    dtype: float32
+  - name: bisexual
+    dtype: float32
+  - name: black
+    dtype: float32
+  - name: buddhist
+    dtype: float32
+  - name: christian
+    dtype: float32
+  - name: female
+    dtype: float32
+  - name: heterosexual
+    dtype: float32
+  - name: hindu
+    dtype: float32
+  - name: homosexual_gay_or_lesbian
+    dtype: float32
+  - name: intellectual_or_learning_disability
+    dtype: float32
+  - name: jewish
+    dtype: float32
+  - name: latino
+    dtype: float32
+  - name: male
+    dtype: float32
+  - name: muslim
+    dtype: float32
+  - name: other_disability
+    dtype: float32
+  - name: other_gender
+    dtype: float32
+  - name: other_race_or_ethnicity
+    dtype: float32
+  - name: other_religion
+    dtype: float32
+  - name: other_sexual_orientation
+    dtype: float32
+  - name: physical_disability
+    dtype: float32
+  - name: psychiatric_or_mental_illness
+    dtype: float32
+  - name: transgender
+    dtype: float32
+  - name: white
+    dtype: float32
+  - name: created_date
+    dtype: string
+  - name: publication_id
+    dtype: int32
+  - name: parent_id
+    dtype: float32
+  - name: article_id
+    dtype: int32
+  - name: rating
+    dtype:
+      class_label:
+        names:
+          0: rejected
+          1: approved
+  - name: funny
+    dtype: int32
+  - name: wow
+    dtype: int32
+  - name: sad
+    dtype: int32
+  - name: likes
+    dtype: int32
+  - name: disagree
+    dtype: int32
+  - name: sexual_explicit
+    dtype: float32
+  - name: identity_annotator_count
+    dtype: int32
+  - name: toxicity_annotator_count
+    dtype: int32
+  splits:
+  - name: test_private_leaderboard
+    num_bytes: 49188921
+    num_examples: 97320
+  - name: test_public_leaderboard
+    num_bytes: 49442360
+    num_examples: 97320
+  - name: train
+    num_bytes: 914264058
+    num_examples: 1804874
+  download_size: 0
+  dataset_size: 1012895339
 ---
 
 # Dataset Card for Jigsaw Unintended Bias in Toxicity Classification
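The `rating` feature is declared with a `class_label` dtype rather than a plain integer, so the values 0 and 1 carry the names `rejected` and `approved`. A minimal plain-Python sketch of that declared mapping (illustrative only; in practice the `datasets` library's `ClassLabel` feature provides this behavior):

```python
# Stand-in for the class_label mapping declared in the metadata above:
#   0: rejected, 1: approved
RATING_NAMES = {0: "rejected", 1: "approved"}

def int2str(label_id: int) -> str:
    """Return the name for an integer rating label."""
    return RATING_NAMES[label_id]

def str2int(name: str) -> int:
    """Return the integer label for a rating name."""
    inverse = {v: k for k, v in RATING_NAMES.items()}
    return inverse[name]

print(int2str(1))           # → approved
print(str2int("rejected"))  # → 0
```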
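As a sanity check on the new metadata, `dataset_size` is exactly the sum of the three splits' `num_bytes` (the `download_size` of 0 is a separate field and presumably reflects that the source data is not fetched automatically):

```python
# num_bytes per split, copied from the dataset_info block above.
split_bytes = {
    "test_private_leaderboard": 49_188_921,
    "test_public_leaderboard": 49_442_360,
    "train": 914_264_058,
}

dataset_size = sum(split_bytes.values())
print(dataset_size)  # → 1012895339, matching dataset_size in the metadata
```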