Dataset Card for the Korean HateSpeech Dataset

Dataset Summary

The Korean HateSpeech Dataset consists of 8,367 human-labeled entertainment news comments from a popular Korean news aggregation platform. Each comment is annotated for social bias (labels: gender, others, none), hate speech (labels: hate, offensive, none), and gender bias (labels: True, False). The dataset was created to support the identification of toxic comments on online platforms where users can remain anonymous.

Supported Tasks and Leaderboards

  • multi-label classification: The dataset can be used to train a model for hate speech detection. A model such as BERT can be presented with a Korean entertainment news comment and asked to label whether it contains social bias, gender bias, and hate speech. Users can participate in the associated Kaggle leaderboard; a minimal loading and fine-tuning sketch is shown below.
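
The following sketch loads the corpus and fine-tunes a BERT-style classifier on the hate label. It assumes the dataset is available from the Hugging Face Hub under the kor_hate identifier with train and test splits, that the label fields are stored as integer class labels, and that klue/bert-base is a reasonable Korean encoder; none of these details are confirmed by this card.

# Sketch only: dataset id, split names, and encoder checkpoint are assumptions.
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

dataset = load_dataset("kor_hate")  # assumed splits: "train" and "test"
tokenizer = AutoTokenizer.from_pretrained("klue/bert-base")  # assumed Korean encoder

def tokenize(batch):
    # "comments" holds the raw text; "hate" is assumed to be a 3-way integer label
    enc = tokenizer(batch["comments"], truncation=True,
                    padding="max_length", max_length=128)
    enc["labels"] = batch["hate"]
    return enc

encoded = dataset.map(tokenize, batched=True)
model = AutoModelForSequenceClassification.from_pretrained("klue/bert-base", num_labels=3)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="kor-hate-bert",
                           per_device_train_batch_size=16,
                           num_train_epochs=3),
    train_dataset=encoded["train"],
    eval_dataset=encoded["test"],
)
trainer.train()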

Languages

The text in the dataset is in Korean and the associated BCP-47 code is ko-KR.

Dataset Structure

Data Instances

An example data instance contains a comments field with the text of the news comment, along with labels for each of the following fields: contain_gender_bias, bias and hate.

{'comments': '์„ค๋งˆ ใ…ˆ ํ˜„์ • ์ž‘๊ฐ€ ์•„๋‹ˆ์ง€??',
 'contain_gender_bias': 'True',
 'bias': 'gender',
 'hate': 'hate'
}

Data Fields

  • comments: text from the Korean news comment
  • contain_gender_bias: a binary True/False label for the presence of gender bias
  • bias: determines the type of social bias, which can be:
    • gender: the text includes bias related to gender roles, sexual orientation, sexual identity, or other thoughts on gender-related acts
    • others: the text includes social bias that is not gender-related, such as bias based on race, background, nationality, ethnic group, political stance, skin color, religion, disability, age, appearance, wealth, occupation, or the absence of military service experience
    • none: the comment does not contain social bias
  • hate: determines how aggressive the comment is, which can be:
    • hate: the text expresses an aggressive stance towards individuals or groups with certain characteristics (gender role, sexual orientation, sexual identity, thoughts on gender-related acts, race, background, nationality, ethnic group, political stance, skin color, religion, disability, age, appearance, wealth, occupation, the absence of military service experience, etc.)
    • offensive: the text contains rude or aggressive content, expresses sarcasm through rhetorical questions or irony, includes unethical expressions, or conveys unverified rumors
    • none: the comment contains neither hate nor offensive language
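
In the hosted version of the dataset, these label fields are stored as integer class labels rather than strings (an assumption based on the Hub release of kor_hate, not stated in this card). A short decoding sketch:

# Sketch only: assumes kor_hate is hosted on the Hub with ClassLabel-typed fields.
from datasets import load_dataset

train = load_dataset("kor_hate", split="train")
example = train[0]

# ClassLabel.int2str converts a stored integer back to its human-readable name.
print(example["comments"])
print("contain_gender_bias:", example["contain_gender_bias"])  # binary True/False label
print("bias:", train.features["bias"].int2str(example["bias"]))
print("hate:", train.features["hate"].int2str(example["hate"]))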

Data Splits

The data is split into a training and a development (test) set. The 8,367 annotated comments are divided into 7,896 comments in the training set and 471 comments in the test set.

Dataset Creation

Curation Rationale

The dataset was created to provide the first human-labeled Korean corpus for toxic speech detection, drawn from a Korean online entertainment news aggregator. Recently, two young Korean celebrities suffered a series of tragic incidents that led two major Korean web portals to close the comments sections on their platforms. However, this only serves as a temporary solution, and the fundamental issue has not been solved. This dataset aims to improve Korean hate speech detection.

Source Data

Initial Data Collection and Normalization

A total of 10.4 million comments were collected from an online Korean entertainment news aggregator between Jan. 1, 2018 and Feb. 29, 2020. 1,580 articles were drawn using stratified sampling, and for each article the top 20 comments were extracted, ranked by their Wilson score on the downvote. Duplicate comments, single-token comments, and comments longer than 100 characters were removed (the latter because they could convey several opinions at once). From the remaining pool, 10K comments were randomly chosen for annotation.
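
For readers unfamiliar with the ranking criterion, a common formulation of the Wilson score lower bound is sketched below; the card does not specify the exact variant the platform used, so this is purely illustrative.

# Illustrative only: the exact Wilson score variant used for ranking is not specified here.
import math

def wilson_lower_bound(positive: int, total: int, z: float = 1.96) -> float:
    """Lower bound of the Wilson score interval for a proportion (95% by default)."""
    if total == 0:
        return 0.0
    p = positive / total
    denom = 1 + z * z / total
    centre = p + z * z / (2 * total)
    margin = z * math.sqrt((p * (1 - p) + z * z / (4 * total)) / total)
    return (centre - margin) / denom

# Example: 80 downvotes out of 100 votes -> lower bound on the downvote proportion
print(round(wilson_lower_bound(80, 100), 3))  # ~0.711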

Who are the source language producers?

The language producers are users of the Korean online news platform between 2018 and 2020.

Annotations

Annotation process

Each comment was assigned to three random annotators, and the final label was determined by majority vote. Annotators were allowed to skip comments they found too ambiguous. See Appendix A of the paper for the detailed annotation guidelines.

Who are the annotators?

Annotation was performed by 32 annotators, consisting of 29 annotators from the crowdsourcing platform DeepNatural AI and three NLP researchers.

Personal and Sensitive Information

[N/A]

Considerations for Using the Data

Social Impact of Dataset

The purpose of this dataset is to address the social issue of toxic comments posted on online platforms by improving the automatic detection of such comments.

Discussion of Biases

[More Information Needed]

Other Known Limitations

[More Information Needed]

Additional Information

Dataset Curators

This dataset is curated by Jihyung Moon, Won Ik Cho and Junbum Lee.

Licensing Information

[N/A]

Citation Information

@inproceedings{moon-et-al-2020-beep,
    title = "{BEEP}! {K}orean Corpus of Online News Comments for Toxic Speech Detection",
    author = "Moon, Jihyung  and
      Cho, Won Ik  and
      Lee, Junbum",
    booktitle = "Proceedings of the Eighth International Workshop on Natural Language Processing for Social Media",
    month = jul,
    year = "2020",
    address = "Online",
    publisher = "Association for Computational Linguistics",
    url = "https://www.aclweb.org/anthology/2020.socialnlp-1.4",
    pages = "25--31",
    abstract = "Toxic comments in online platforms are an unavoidable social issue under the cloak of anonymity. Hate speech detection has been actively done for languages such as English, German, or Italian, where manually labeled corpus has been released. In this work, we first present 9.4K manually labeled entertainment news comments for identifying Korean toxic speech, collected from a widely used online news platform in Korea. The comments are annotated regarding social bias and hate speech since both aspects are correlated. The inter-annotator agreement Krippendorff{'}s alpha score is 0.492 and 0.496, respectively. We provide benchmarks using CharCNN, BiLSTM, and BERT, where BERT achieves the highest score on all tasks. The models generally display better performance on bias identification, since the hate speech detection is a more subjective issue. Additionally, when BERT is trained with bias label for hate speech detection, the prediction score increases, implying that bias and hate are intertwined. We make our dataset publicly available and open competitions with the corpus and benchmarks.",
}

Contributions

Thanks to @stevhliu for adding this dataset.
