qgyd2021 committed on
Commit
5516d69
1 Parent(s): d3d09b9
README.md CHANGED
@@ -11,54 +11,22 @@ license: apache-2.0
  The datasets were collected and organized from the web as follows:
  | Dataset | Language | Task type | Original data / project address | Sample count | Original data description | Alternative data download address |
  | :--- | :---: | :---: | :---: | :---: | :---: | :---: |
- | email_spam | English | Spam SMS classification | [NotShrirang/email-spam-filter](https://huggingface.co/datasets/NotShrirang/email-spam-filter) | ham: 3672; spam: 1499 | | |
- | enron_spam | English | Spam email classification | [enron_spam_data](https://github.com/MWiechmann/enron_spam_data); [Enron-Spam](https://www2.aueb.gr/users/ion/data/enron-spam/); [spam-mails-dataset](https://www.kaggle.com/datasets/venky73/spam-mails-dataset) | ham: 16545; spam: 17171 | The Enron-Spam dataset is an excellent resource collected by V. Metsis, I. Androutsopoulos, and G. Paliouras. | [SetFit/enron_spam](https://huggingface.co/datasets/SetFit/enron_spam) |
  | sms_spam | English | Spam SMS classification | [SMS Spam Collection](https://archive.ics.uci.edu/dataset/228/sms+spam+collection); [SMS Spam Collection Dataset](https://www.kaggle.com/datasets/uciml/sms-spam-collection-dataset) | ham: 4827; spam: 747 | The SMS Spam Collection is a public set of labeled SMS messages collected for mobile-phone spam research. | [sms_spam](https://huggingface.co/datasets/sms_spam) |
- | spam_assassin | English | Spam email classification | [datasets-spam-assassin](https://github.com/stdlib-js/datasets-spam-assassin); [Apache SpamAssassin’s public datasets](https://spamassassin.apache.org/old/publiccorpus/); [Spam or Not Spam Dataset](https://www.kaggle.com/datasets/ozlerhakan/spam-or-not-spam-dataset) | ham: 3795; spam: 6954 | A set of mail messages suitable for testing spam-filtering systems. Note: the text field is messy, so this version is not recommended. | [talby/SpamAssassin](https://huggingface.co/datasets/talby/spamassassin) |
  | spam_detection | English | Spam SMS classification | [Deysi/spam-detection-dataset](https://huggingface.co/datasets/Deysi/spam-detection-dataset) | ham: 5400; spam: 5500 | | |
  | spam_message | Chinese | Spam SMS classification | [SpamMessage](https://github.com/hrwhisper/SpamMessage) | ham: 720000; spam: 80000 | The spam samples are genuine but were desensitized (e.g. 招生电话:xxxxxxxxxxx), so the masked x characters may become a dominant spurious feature. The ham samples look like fragments truncated from ordinary text; using this data is not recommended. | |


- ### Sample examples


- <details>
- <summary>email_spam sample examples</summary>
- <pre><code>------------
- Subject: photoshop , windows , office . cheap . main trending
- abasements darer prudently fortuitous undergone
- lighthearted charm orinoco taster
- railroad affluent pornographic cuvier
- irvin parkhouse blameworthy chlorophyll
- robed diagrammatic fogarty clears bayda
- inconveniencing managing represented smartness hashish
- academies shareholders unload badness
- danielson pure caffein
- spaniard chargeable levin
- <br>
- spam
- ------------
- Subject: re : indian springs
- this deal is to book the teco pvr revenue . it is my understanding that teco
- just sends us a check , i haven ' t received an answer as to whether there is a
- predermined price associated with this deal or if teco just lets us know what
- we are giving . i can continue to chase this deal down if you need .
- ham
- ------------
- Subject: report 01405 !
- wffur attion brom est inst siupied 1 pgst our riwe asently rest .
- tont to presyou tew cons of benco 4 . yee : fater 45 y . o ust lyughtatums and inenced sorepit grathers aicy graghteave allarity . oarity wow to yur coons , as were then 60 ve mers of oite .
- ithat yoit ? ! berst thar ! enth excives 2004 . . .
- <br>
- spam
- ------------
- Subject: nominations for oct . 21 - 23 , 2000
- ( see attached file : hplnl 021 . xls )
- - hplnl 021 . xls
- ham
- ------------
- </code></pre>
- </details>


  <details>
@@ -99,6 +67,74 @@ ham
  </details>


  <details>
  <summary>sms_spam sample examples</summary>
  <pre><code>------------
@@ -224,6 +260,37 @@ ham
  </details>


  ### Reference sources

  <details>
 
  The datasets were collected and organized from the web as follows:
  | Dataset | Language | Task type | Original data / project address | Sample count | Original data description | Alternative data download address |
  | :--- | :---: | :---: | :---: | :---: | :---: | :---: |
+ | enron_spam | English | Spam email classification | [enron_spam_data](https://github.com/MWiechmann/enron_spam_data); [Enron-Spam](https://www2.aueb.gr/users/ion/data/enron-spam/); [spam-mails-dataset](https://www.kaggle.com/datasets/venky73/spam-mails-dataset) | ham: 16545; spam: 17171 | The Enron-Spam dataset is an excellent resource collected by V. Metsis, I. Androutsopoulos, and G. Paliouras. | [SetFit/enron_spam](https://huggingface.co/datasets/SetFit/enron_spam); [enron-spam](https://www.kaggle.com/datasets/wanderfj/enron-spam) |
+ | enron_spam_subset | English | Spam email classification | [email-spam-dataset](https://www.kaggle.com/datasets/nitishabharathi/email-spam-dataset) | ham: 5000; spam: 5000 | | |
+ | ling_spam | English | Spam email classification | [email-spam-dataset](https://www.kaggle.com/datasets/nitishabharathi/email-spam-dataset) | ham: 2172; spam: 433 | | |
  | sms_spam | English | Spam SMS classification | [SMS Spam Collection](https://archive.ics.uci.edu/dataset/228/sms+spam+collection); [SMS Spam Collection Dataset](https://www.kaggle.com/datasets/uciml/sms-spam-collection-dataset) | ham: 4827; spam: 747 | The SMS Spam Collection is a public set of labeled SMS messages collected for mobile-phone spam research. | [sms_spam](https://huggingface.co/datasets/sms_spam) |
+ | spam_assassin | English | Spam email classification | [datasets-spam-assassin](https://github.com/stdlib-js/datasets-spam-assassin); [Apache SpamAssassin’s public datasets](https://spamassassin.apache.org/old/publiccorpus/); [Spam or Not Spam Dataset](https://www.kaggle.com/datasets/ozlerhakan/spam-or-not-spam-dataset) | ham: 4150; spam: 1896 | This version is derived from the completeSpamAssassin.csv file of [email-spam-dataset](https://www.kaggle.com/datasets/nitishabharathi/email-spam-dataset). | [email-spam-dataset](https://www.kaggle.com/datasets/nitishabharathi/email-spam-dataset); [talby/SpamAssassin](https://huggingface.co/datasets/talby/spamassassin); [spamassassin-2002](https://www.kaggle.com/datasets/cesaber/spam-email-data-spamassassin-2002) |
  | spam_detection | English | Spam SMS classification | [Deysi/spam-detection-dataset](https://huggingface.co/datasets/Deysi/spam-detection-dataset) | ham: 5400; spam: 5500 | | |
+ | sms_spam_collection | English | Spam SMS classification | [spam-emails](https://www.kaggle.com/datasets/abdallahwagih/spam-emails) | ham: 4825; spam: 747 | This dataset contains a collection of email messages. | [email-spam-detection-dataset-classification](https://www.kaggle.com/datasets/shantanudhakadd/email-spam-detection-dataset-classification); [spam-identification](https://www.kaggle.com/datasets/amirdhavarshinis/spam-identification); [sms-spam-collection](https://www.kaggle.com/datasets/thedevastator/sms-spam-collection-a-more-diverse-dataset); [spam-or-ham](https://www.kaggle.com/datasets/arunasivapragasam/spam-or-ham) |
  | spam_message | Chinese | Spam SMS classification | [SpamMessage](https://github.com/hrwhisper/SpamMessage) | ham: 720000; spam: 80000 | The spam samples are genuine but were desensitized (e.g. 招生电话:xxxxxxxxxxx), so the masked x characters may become a dominant spurious feature. The ham samples look like fragments truncated from ordinary text; using this data is not recommended. | |
+ | spam_message_lr | Chinese | Spam SMS classification | [SpamMessagesLR](https://github.com/x-hacker/SpamMessagesLR) | ham: 3983; spam: 6990 | | |
+ | trec_2007 | English | Spam email classification | [2007 TREC Public Spam Corpus](https://plg.uwaterloo.ca/~gvcormac/treccorpus07/); [Spam Track](https://trec.nist.gov/data/spam.html) | (sample count not listed) | 2007 TREC Public Spam Corpus | [trec07p.tar.gz](https://pan.baidu.com/s/1jC9CxVaxwizFCvGtI1JvJA?pwd=g72z) |
+ | youtube_spam_collection | English | Spam comment classification | [youtube+spam+collection](https://archive.ics.uci.edu/dataset/380/youtube+spam+collection); [YouTube Spam Collection Data Set](https://www.kaggle.com/datasets/lakshmi25npathi/images) | ham: 951; spam: 1005 | A public set of comments collected for spam research. | |


+ ### Sample examples


  <details>
 
  </details>


+ <details>
+ <summary>enron_spam_subset sample examples</summary>
+ <pre><code>------------
+ Subject: edrugs online
+ viagra - proven step to start something all over again .
+ nothing is more useful than silence .
+ teachers open the door . you enter by yourself .
+ how sharper than a serpent ' s tooth it isto have a thankless child !
+ spam
+ ------------
+ Subject: start date : 12 / 13 / 01 ; hourahead hour : 5 ;
+ start date : 12 / 13 / 01 ; hourahead hour : 5 ; no ancillary schedules awarded . no variances detected .
+ log messages :
+ parsing file - - > > o : \ portland \ westdesk \ california scheduling \ iso final schedules \ 2001121305 . txt
+ ham
+ ------------
+ Subject: cheapestt medss !
+ mediccationns at lowesst pricess everyy !
+ over 80 . % offf , pricess wontt get lowerr
+ we selll vic ' od ( in v , ia . gra x , ana . x
+ http : / / www . pr 3 sdlugs . com / ? refid = 87
+ <br>
+ spam
+ ------------
+ Subject: fw : picture
+ >
+ >
+ > the following is an aerial photo of the wtc area . it kinda brings on
+ > vertigo , but is a phenomenal shot .
+ >
+ > http : / / userwww . service . emory . edu / ~ rdgarr / wtc . htm
+ ham
+ ------------
+ </code></pre>
+ </details>
+
+
+ <details>
+ <summary>ling_spam sample examples</summary>
+ <pre><code>------------
+ Subject: internet specialist 007 - the spy
+ <br>
+ internet specialist 007 - the spy learn everything about your friends , neighbors , enemies , employees or anyone else ! - - even your boss ! - - even yourself ! this mammoth snoop collection of internet sites will provide you the newest and most current addresses available on the net today . . . = 20 * track down an old debt , or anyone else that has done you wrong ! it 's incredible , and so many new data sites have come online in the past 90 days . . . * over 300 giant resources to look up people , credit , social security , current or past employment , mail order purchases , = 20 addresses , phone numbers , maps to city locations . . . * investigate your family history ! check birth , death , adoption or social security records check service records or army , navy , air force or = 20 marine corps . * locate an old friend ( or an enemy who is hiding ) or a lost = 20 love - - find e-mail , telephone or address information on anyone ! = 20 even look up * unlisted * phone numbers ! * find work by searching classified ads all over the world ! * screen prospective employees - - check credit , driving or criminal records verify income or educational accomplishments = 20 * check out your daughter 's new boyfriend ! * find trial transcripts and court orders ! * enjoy the enchantment of finding out a juicy tid-bit about a co-worker . the internet is a powerful megasource of information , = 20 if you only know where to look . i tell you how to find = 20 out nearly anything about anybody , and tell you exactly where to find it ! you will be amazed to find out what personal information = 20 other people can find out about you ! check your credit = 20 report so you can correct wrong information that may be = 20 used to deny you credit . research yourself first ! you ' ll be horrified , as i was , = 20 at how much data has been accumulated about you . any my huge collection is only the beginning ! once you = 20 locate these free private , college and government web sites , you ' ll find even more links to even more = 20 information search engines ! = 20 if you believe ( like i do ) that the information that is stored about each one of us should be freely accessible , you ' ll want to see the snoop collection i ' ve compiled . verify your own records , or find out what you need to = 20 know about others . i ' m telling you , it 's incredible what you can find out using the internet ! we will accept checks by fax at 813-269 - 9651 or > > > send $ 14 . 95 cash , check or money order to : > > > the coldwell group > > > p . o . box 3787 > > > dept 1007 > > > petersburg , va 23805 i will rush back to you my snoop information for fastest service include your * e-mail * address . = 20 * what information is available - - and exact url to get there ! * exactly where to look for - - and the clever way to use - - = 20 the above search engines , and tons more ! * my easy-to - browse categorized megacenter of information has my own description of how to use each site , and what you ' ll find when you get there - - and tricky tips on how to = 20 extract the best data ! you can know everything about everybody with this internet specialist collection ! * * soon to be available - - the most complete international internet spy = 20 sites available on the web today * * don ' t miss this one or you ' ll be sorry = 20 to be removed from our list please fax your address to 813-269 - 9651 . l = e3 = 01 @ u = 0b
+ <br>
+ spam
+ ------------
+ Subject: usage - based models - symposium
+ <br>
+ announcing the sixth biennial symposium of the rice university department of linguistics usage-based models of language rice university march 15-18 , 1995 invited speakers : mira ariel tel aviv university joan bybee university of new mexico john du bois university of california , santa barbara michael israel university of california , san diego sydney lamb rice university ronald langacker university of california , san diego tom givon university of oregon brian macwhinney carnegie - mellon university janet pierrehumbert northwestern university john sinclair university of birmingham ( u . k . ) arie verhagen university of utrecht description : the goal of this symposium is to explore approaches to linguistic theory that have in common the aim of accounting for linguistic usage . the empirical data for such theories is not restricted to linguistic intuitions about acceptibility , but comes from usage events of varied types . the focus is on the patterns found in the various sorts of usage data examined , and how those patterns can be extracted , represented , and used by the human mind . research from a variety of traditions will be represented , including corpus-based analyses , discourse studies , experimental studies of language processing and language acquisition , and instrumental phonetics . the approaches taken can be called data-driven , rather than model-driven , in that the fewest possible prior assumptions are made about what types of data are relevant , and that large sets of usage events are observed so that the detailed patterns found in actual usage can emerge . moreover , the various approaches taken show signs of converging toward a view of language as a dynamic system in which linguistic knowledge is not separate from its processing in language use . the linguistic models representing this view are usage-based by virtue of three factors : ( 1 ) the importance placed on usage data for theory construction ; ( 2 ) the direct incorporation of processing ( production and comprehension ) into linguistic theory ; and ( 3 ) the requirement that the models arrived at , whatever the direct source of evidence , must be testable with reference to language use . registration : no charge . symposium attendance on a space-available basis . for further information , contact suzanne kemmer ( kemmer @ ruf . rice . edu ) or michael barlow ( barlow @ ruf . rice . edu ) snailmail : dept . of linguistics , rice university , houston tx 77251-1892 .
+ <br>
+ ham
+ ------------
+ Subject: domani
+ <br>
+ new improved with free software , free bulk e mail system , free web site = to do what you wish , ongoing support ( optional ) , and a lot more ! all = included . . . . . . . . . . . this is a one time mailing . . . . . . . . . . . . . . . \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ $ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ \ = \ \ \ \ \ you are about to make at least $ 50 , 000 in less than 90 days read the enclosed program . . . then read it again . . . / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / / = / / / / / / / dear friend , the enclosed information is something i almost let slip through my fingers . fortunately , sometime later i re-read everything and gave some thought and study to it . my name is christopher erickson . two years ago , the corporation i worked at for the past twelve years down-sized and my position was eliminated . after unproductive job interviews , i decided to open my own business . over the past year , i incurred many unforeseen financial problems . i owed my family , friends , and creditors over $ 35 , 000 . the economy was taking a toll on my business and i just could n't seem to make ends meet . i had to refinance and borrow against my home to support my family and struggling business . i truly believe it was wrong for me to be in debt like this . at that moment something significant happened in my life and i am writing to share my experience in hopes that this will change your life forever . . . . financially ! ! ! in mid - december , i received this program via email . six months prior to receiving this program i had been sending away for information on various business opportunities . all of the programs i received , in my opinion , were not cost effective . they were either too difficult for me to comprehend or the initial investment was too much for me to risk to see if they worked or not . one claimed i 'd make a million dollars in one year . . . it did n't tell me i 'd have to write a book to make it . but like i was saying , in december of ' 92 i received this program . i did n't send for it , or ask for it , they just got my name off a mailing list . thank goodness for that ! ! ! after reading it several times , to = make sure i was reading it correctly , i could n't believe my eyes . = 20 here was a money-making phenomenon . i could invest as much as i wanted = to start , without putting me further in debt . after i got a pencil and paper and figured it out , i would at least get my money back . after determining that the program is legal and not a chain letter , i decided " why not " . initially i sent out 10 , 000 emails . it only cost me about $ 15 . 00 for my time on-line . the great thing about email is that i did n't need any money for printing to send out the program , only the cost to fulfill my orders . i am telling you like it is , i hope it does n't turn you off , but i promised myself that i would not " rip-off " anyone , no matter how much money it cost me ! . in less than one week , i was starting to receive orders for report # 1 . by january 13th , i had received 26 orders for report # 1 . when you read the guarantee in the program , you will see that " you must receive = 15 to 20 orders for report # 1 within two weeks . if you don ' t , send out = more programs until you do ! " my first step in making $ 50 , 000 in 20 to = 90 days was done . 
by january 30th , i had received 196 orders for report = # 2 . if you go back to the guarantee , " you must receive 100 or more orders for report # 2 within two weeks . if not , send out more = programs until you do . once you have 100 orders , the rest is easy , = relax , you will make your $ 50 , 000 goal . " well , i had 196 orders for = report # 2 , 96 more than i needed . so i sat back and relaxed . by march = 19th , of my emailing of 10 , 000 , i received $ 58 , 000 with more coming in = every day . i paid off all my debts and bought a much needed new car . please take time to read the attached program , it will change your life forever ! remember , it wont work if you do n't try it . this program does work , but you must follow it exactly ! especially the rules of not trying to place your name in a different place . it does n't work , you ' ll lose out on a lot of money ! report # 2 explains this . = 20 always follow the guarantee , 15 to 20 orders for report # 1 , and 100 or more orders for report # 2 and you will make $ 50 , 000 or more in 20 to 90 days . i am living proof that it works ! ! ! if you choose not to participate in this program , i ' m sorry . it really is a great opportunity with little cost or risk to you . if you choose to participate , follow the program and you will be on your way to financial security . if you are a fellow business owner and you are in financial trouble like i was , or you want to start your own business , consider this a sign . i did ! sincerely , christopher erickson ps do you have any idea what 11 , 700 $ 5 bills ( $ 58 , 000 ) look like piled up on a kitchen table ? it ' s awesome ! " threw it away " " i had received this program before . i threw it away , but later wondered if i should n't have given it a try . of course , i had no idea who to contact to get a copy , so i had to wait until i was emailed another copy of the program . eleven months passed , then it came . i didn ' t throw this one away . i made $ 41 , 000 on the first try . " dawn w . , evansville , in " no free lunch " " my late father always told me , ' remember , alan , there is no free lunch in life . you get out of life what you put into it . ' through trial and error and a somewhat slow frustrating start , i finally figured it out . the program works very well , i just had to find the right target group of people to email it to . so far this year , i have made over $ 63 , 000 using this program . i know my dad would have been very proud of me . " alan b . , philadelphia , pa a personal note from the originator of this program by the time you have read the enclosed information and looked over the enclosed program and reports , you should have concluded that such a program , and one that is legal , could not have been created by an amateur . let me tell you a little about myself . i had a profitable business for ten years . then in 1979 my business began falling off . i was doing the same things that were previously successful for me , but it was n't working . finally , i figured it out . it was n't me , it was the economy . inflation and recession had replaced the stable economy that had been with us since 1945 . i do n't have to tell you what happened to the unemployment rate . . . because many of you know from first hand experience . there were more failures and bankruptcies than ever before . the middle class was vanishing . those who knew what they were doing = invested wisely and moved up . 
those who did not , including those who = never had anything to save or invest , were moving down into the ranks of = the poor . as the saying goes , " the rich get richer and the poor get = poorer . " the traditional methods of making money will never allow you = to " move up " or " get rich " , inflation will see to that . you have just received information that can give you financial freedom for the rest of your life , with " no risk " and " just a little bit of effort . " you can make more money in the next few months than you have = ever imagined . i should also point out that i will not see a penny of your money , nor anyone else who has provided a testimonial for this program . i have already made over four million dollars ! i have retired from the program after sending out over 16 , 000 programs . now i have several offices which market this and several other programs here in the us and overseas . by the spring , we wish to market the ' internet ' by a partnership with america on line . follow the program exactly as instructed . do not change it in any way . = it works exceedingly well as it is now . remember to email a copy of = this exciting program to everyone that you can think of . one of the people you send this to may send out 50 , 000 . . . and your name will be on every one of them ! . remember though , the more you send out , the = more potential customers you will reach . so my friend , i have given you the ideas , information , materials and opportunity to become financially independent , it is up to you now ! " think about it " before you delete this program from your mailbox , as i almost did , take a little time to read it and really think about it . get a pencil and figure out what could happen when you participate . figure out the worst possible response and no matter how you calculate it , you will still make a lot of money ! definitely get back what you invested . = 20 any doubts you have will vanish when your first orders come in . it works ! paul johnson , raleigh , nc here ' s how this amazing program will make you $ $ $ $ $ $ let 's say that you decide to start small , just to see how it goes , and we ' ll assume you and all those involved send out 2 , 000 programs each . let 's also assume that the mailing receives a . 5 % response . using a good list the response could be much better . also many people will send out hundreds of thousands of programs instead of 2 , 000 . but continuing with this example , you send out only 2 , 000 programs . with a . 5 % response , that is only 10 orders for report # 1 . those 10 people respond by sending out 2 , 000 programs each for a total of 20 , 000 . out of those . 5 % , 100 people respond and order report # 2 . those 100 mail out 2 , 000 programs each for a total of 200 , 000 . the . 5 % response to that is 1 , 000 orders for report # 3 . those 1 , 000 send out 2 , 000 programs each for a 2 , 000 , 000 total . the . 5 % response to that is 10 , 000 orders for report # 4 . that 's 10 , 000 five dollar bills for you . cash ! ! ! ! your total income in this example is $ 50 + $ 500 + $ 5000 + $ 50 , 000 for a total of $ 55 , 550 ! ! ! ! remember friend , this is assuming 1 , 990 out of 2 , 000 people you mail to = will do absolutely nothing . . . and trash this program ! dare to think for = a moment what would happen if everyone or half sent out 100 , 000 programs instead of only 2 , 000 . believe me , many people will do = that and more ! by the way , your cost to participate in this is = practically nothing . 
you obviously already have an internet connection and email is free ! ! ! report # 3 will show you the best methods for bulk emailing and purchasing email lists . this is a legitimate , legal , money making opportunity . it does not require you to come in contact with people , do any hard work , and best of all , you never have to leave the house except to get the mail . if you believe that someday you ' ll get that big break that you ' ve been waiting for , this is it ! simply follow the instructions , and your dream will come true . this multi-level email order marketing program works perfectly . . . 100 % every time . email is the sales tool of the future . take advantage of this non-commercialized method of advertising now ! ! the longer you wait , the more people will be doing business using email . get your piece of this action ! ! multi-level marketing ( mlm ) has finally gained respectability . it is = being taught in the harvard business school , and both stanford research and the wall street journal have stated that between 50 % and = 65 % of all goods and services will be sold throughout multi - level methods by the mid to late 1990 's . this is a multi - billion dollar industry and of the 500 , 000 millionaires in the us , 20 % ( 100 , 000 ) made their fortune in the last several years in mlm . moreover , statistics show 45 people become millionaires everyday through multi - level marketing . instructions we at erris mail order marketing business , have a method of raising capital that really works 100 % every time . i am sure that you could use = $ 50 , 000 to $ 125 , 000 in the next 20 to 90 days . before you say " bull " , please read the program carefully . this is not a chain letter , but a perfectly legal money making opportunity . basically , this is what we do : as with all multi-level business , we build our business by recruiting new partners and selling our products . every state in the usa allows you to recruit new multi - level business partners , and we offer a product for every dollar sent . your orders come and are filled through the mail , so you are not = involved in personal selling . you do it privately in your own home , = store or office . this is the greatest multi - level mail order marketing anywhere : step ( 1 ) order all four 4 reports listed by name and number . dothis by ordering the report from each of the four 4 names listed on the next page . for each report , send $ 5 cash and a self - addressed , stamped envelope ( business size # 10 ) = to the person listed for the specific report . international = = 20 orders should also include $ 2 extra for postage . it is essential that you specify the name and number of the report requested to the person you are ordering from . you will need all four 4 reports because you will be reprinting and reselling them . do not alter the names or sequence other than what the instructions say . important : always provide same-day service on all orders . step ( 2 ) replace the name and address under report # 1 with yours , moving the one that was there down to report # 2 . drop the name and address under report # 2 to report # 3 , moving the one that was there to report # 4 . the name and address that was under report # 4 is dropped from the list and this party is no doubt on the way to the bank . when doing this , make certain you type the names and addresses accurately ! do not mix up moving product / report positions ! ! ! step ( 3 ) having made the required changes in the name list , save it as a text ( . 
txt ) file in it 's own directory to be used with whatever email program you like . again , report # 3 will tell you the best methods of bulk emailing and acquiring email lists . step ( 4 ) email a copy of the entire program ( all of this is very important ) to everyone whose address you can get your hands on . start with friends and relatives since you can encourage them to take advantage of this fabulous = 20 money-making opportunity . that 's what i did . and they love me now , more than ever . then , email to anyone and everyone ! use your imagination ! you can get email addresses from companies on the internet who specialize in email mailing lists . these are very cheap , 100 , 000 addresses for around $ 35 . 00 . important : you won't get a good response if you use an old list , so always request a fresh , new list . you will find out where to purchase these lists when you order the four 4 reports . always provide same-day service on all orders ! ! ! required reports * * * order each report by number and name * * * always send a self-addressed , stamped envelope and $ 5 usd cash for each order requesting the specific report by name and number ( international orders should also include $ 2 usd extra for postage ) = 20 add you e amil address when sending in for your report this is for = updated information and continueing support ( optional ) that will be = handed down by you sponcers . _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ report # 1 " how to make $ 250 , 000 through multi-level sales " order report # 1 from : a . siegmund # 57 trakehnenstr . 13 53332 bornheim , germany _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ report # 2 " major corporations and multi-level sales " j . maz 15774 s . lagrange rd suite # 312 orland pk , il 60462 usa _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ order report # 2 from : a . siegmund # 57 trakehnenstr . 13 53332 bornheim , germany _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ report # 3 " sources for the best mailing lists " order report # 3 from : b . thompson 13504 greencaslte ridge tr . 404 burtonsville md . 20866 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ report # 4 " evaluating multi-level sales plans " order report # 4 from : muw # 2 po box 71442 salt lake city , ut 84171-0442 _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ conclusion . i am enjoying my fortune that i made by sending out this program . you too , will be making money in 20 to 90 days , if you follow the simple steps outlined in this mailing . to be financially independent is to be free . free to make financial decisions as never before . go into business , get into investments , retire or take a vacation . = 20 = = = = = = 20
+ <br>
+ spam
+ ------------
+ Subject: linguistic datasources
+ <br>
+ at the request of subscribers , we ' ve been collecting the addresses of linguistic datasources which can be reached through world wide web . these addresses are now available to any of you who have web access on the linguist web server at the following url : http : / / engserve . tamu . edu / files / linguistics / linguist / datasources . html this file is also available , to those of you who read web - linguist , through the " linguistic datasources " link . we 'd be delighted to hear any comments anyone would care to make . and if there ' re any addresses we need to add , please let us know what they are . we 'd like to emphasize that we 'd be happy to include sites where individual linguists keep data they would like to make available to their colleagues . since the web allows us to share not merely text , but pictures and sound-recordings , we can now begin an interchange of linguistic information that is of a very different nature from that which was possible in the past . anthony & helen
+ <br>
+ ham
+ ------------
+ </code></pre>
+ </details>
+
+
  <details>
  <summary>sms_spam sample examples</summary>
  <pre><code>------------
 
  </details>


+ <details>
+ <summary>spam_message_lr sample examples</summary>
+ <pre><code>------------
+ 3G小贴士提醒您可不要让您的流量白白浪费了哦,快来唤醒吧!与您分享杨子黎2013全新单曲《爱人好累》MV 详情点击:http://yuny.com.cn:3042/tpo/SU/NjiYby
+ spam
+ ------------
+ 巫倩云:来周总办公室现在
+ ham
+ ------------
+ 结婚娶亲本是一件高兴事,新郎却因一辆加长林肯车而当场落泪!这是为什么?详情请点击 http://10006.co/lbJ5
+ spam
+ ------------
+ PP提醒你来认证啦!在电脑上登录PP租车官方网站(www.ppzuche.com)或下载PP租车手机客户端(www.ppzuche.com/get-app)上传身份证和驾驶证照片,即可完成租客身份认证。600余款车型,低于市场价30%,随时随地取车,开启便捷用车新时代!【PP租车】
+ ham
+ ------------
+ 【联通飞影】对美女自作多情的后果… http://fql.cc/pub/view/iid-48305
+ spam
+ ------------
+ 您已成功添加王然(13811083077)为好友,可以接收对方的飞信消息。回复本短信可直接与对方聊天。
+ ham
+ ------------
+ 棕盛商业广场一期5月18号火爆认筹,孟庭苇邀您亲见世界500强签约蕲春,VIP卡全城免费热办中。0713-7300000【棕盛商业地产】
+ spam
+ ------------
+ 信用卡1707于10月23日16:19消费人民币360.07,记账金额以账单显示为准。加“浦发银行信用卡中心”官方微信查询/待续【浦发银行】
+ ham
+ ------------
+ </code></pre>
+ </details>
+
+
  ### Reference sources

  <details>
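Any of the configurations listed in the table above can be loaded through the repository's `spam_detect.py` loading script, the same way `main.py` further down in this commit does. A minimal sketch, assuming it is run from the repository root so that `spam_detect.py` and the `data/` directory resolve:

```python
from datasets import load_dataset

# The config name must be one of the keys of the _urls dict in spam_detect.py,
# e.g. "ling_spam", "spam_message_lr", or "youtube_spam_collection".
dataset = load_dataset(
    "spam_detect.py",
    name="spam_message_lr",
    split="train",
)
print(dataset[0])  # one row: text, label ("spam"/"ham"), category, data_source, split
```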
data/enron_spam_subset.jsonl ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d81b6d1b3a46df6b832e57960292a8d755f528f7668aeedcc81c8a111b0d159c
+ size 16895439
data/ling_spam.jsonl ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:0dbdea28dd55b2aba08fa12a02d95c57c47849997b11c3cd00c23de02d1834db
+ size 25950867
data/{email_spam.jsonl → sms_spam_collection.jsonl} RENAMED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:4b530e530e6251b7755b2329e15482a95332d71caad0b8c2fb85a7433a335fd7
- size 5919435
+ oid sha256:3fae437ae98e46a252af5293bcac84b467c5474c5f7f307d72a717a4b55265fe
+ size 985500
data/spam_assassin.jsonl CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:48753da2e31d5d65efcacf31f12e2f3b3f68b02a1a625222173a7bc8fb86435c
- size 41867894
+ oid sha256:0133bf8ef2505d9ffc34cddb8cea67efc64376b6bc91b33ca27de7c10d0485e2
+ size 11549868
data/spam_message_lr.jsonl ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:ade197753c9d391530c019dc1ccf39add26fea13c9119fa13f9600d4d93ef863
+ size 2977972
data/youtube_spam_collection.jsonl ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:17264121e0c8a12e33db75c781019998273f6a922a42907d17e1347d1edac66d
+ size 427008
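The `data/*.jsonl` files added or changed above are tracked with Git LFS, so the diffs show only pointer stubs (`oid`, `size`) rather than the JSON lines themselves. A minimal sketch of fetching one resolved file and checking its row schema; the repo id `qgyd2021/spam_detect` is an assumption inferred from the committer name and the `spam_detect.py` script, since the commit itself does not state it:

```python
import json

from huggingface_hub import hf_hub_download

# hf_hub_download resolves the LFS pointer and returns a local file path.
path = hf_hub_download(
    repo_id="qgyd2021/spam_detect",  # assumed repo id; not stated in this commit
    repo_type="dataset",
    filename="data/enron_spam_subset.jsonl",
)

with open(path, "r", encoding="utf-8") as f:
    first_row = json.loads(next(f))

# Every preprocess script in this commit writes rows with the same five keys.
print(sorted(first_row))  # ['category', 'data_source', 'label', 'split', 'text']
```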
examples/preprocess/process_enron_spam_subset.py ADDED
@@ -0,0 +1,74 @@
+ #!/usr/bin/python3
+ # -*- coding: utf-8 -*-
+ import argparse
+ from collections import defaultdict
+ import json
+ import os
+ from pathlib import Path
+ import random
+ import re
+ import sys
+
+ pwd = os.path.abspath(os.path.dirname(__file__))
+ sys.path.append(os.path.join(pwd, '../../'))
+
+ from datasets import load_dataset
+ import pandas as pd
+ from tqdm import tqdm
+
+ from project_settings import project_path
+
+
+ def get_args():
+     parser = argparse.ArgumentParser()
+
+     parser.add_argument("--data_file", default="data/email_spam/enronSpamSubset.csv", type=str)
+
+     parser.add_argument(
+         "--output_file",
+         default=(project_path / "data/enron_spam_subset.jsonl"),
+         type=str
+     )
+     args = parser.parse_args()
+     return args
+
+
+ def main():
+     args = get_args()
+
+     df = pd.read_csv(args.data_file)
+
+     with open(args.output_file, "w", encoding="utf-8") as f:
+         for i, row in tqdm(df.iterrows(), total=len(df)):
+             # print(row)
+             text = row["Body"]
+             label = row["Label"]
+
+             label = "spam" if label == 1 else "ham"
+
+             if label not in ("spam", "ham"):
+                 raise AssertionError
+
+             num = random.random()
+             if num < 0.9:
+                 split = "train"
+             elif num < 0.95:
+                 split = "validation"
+             else:
+                 split = "test"
+
+             row = {
+                 "text": text,
+                 "label": label,
+                 "category": None,
+                 "data_source": "enron_spam_subset",
+                 "split": split
+             }
+             row = json.dumps(row, ensure_ascii=False)
+             f.write("{}\n".format(row))
+
+     return
+
+
+ if __name__ == '__main__':
+     main()
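One caveat about the script above (and the other preprocess scripts in this commit): the train/validation/test assignment comes from the module-level `random.random()` with no fixed seed, so split membership changes every time a script is rerun. A minimal sketch of a reproducible variant of the same 90/5/5 draw, using a locally seeded generator (the seed value is arbitrary and not part of this commit):

```python
import random


def assign_split(rng: random.Random) -> str:
    """Reproduce the 90/5/5 train/validation/test draw used in the scripts."""
    num = rng.random()
    if num < 0.9:
        return "train"
    if num < 0.95:
        return "validation"
    return "test"


rng = random.Random(0)  # arbitrary fixed seed; reruns now yield identical splits
print([assign_split(rng) for _ in range(10)])
```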
examples/preprocess/process_ling_spam.py ADDED
@@ -0,0 +1,73 @@
+ #!/usr/bin/python3
+ # -*- coding: utf-8 -*-
+ import argparse
+ from collections import defaultdict
+ import json
+ import os
+ from pathlib import Path
+ import random
+ import re
+ import sys
+
+ pwd = os.path.abspath(os.path.dirname(__file__))
+ sys.path.append(os.path.join(pwd, '../../'))
+
+ import pandas as pd
+ from tqdm import tqdm
+
+ from project_settings import project_path
+
+
+ def get_args():
+     parser = argparse.ArgumentParser()
+
+     parser.add_argument("--data_file", default="data/email_spam/lingSpam.csv", type=str)
+
+     parser.add_argument(
+         "--output_file",
+         default=(project_path / "data/ling_spam.jsonl"),
+         type=str
+     )
+     args = parser.parse_args()
+     return args
+
+
+ def main():
+     args = get_args()
+
+     df = pd.read_csv(args.data_file)
+
+     with open(args.output_file, "w", encoding="utf-8") as f:
+         for i, row in tqdm(df.iterrows(), total=len(df)):
+             # print(row)
+             text = row["Body"]
+             label = row["Label"]
+
+             label = "spam" if label == 1 else "ham"
+
+             if label not in ("spam", "ham"):
+                 raise AssertionError
+
+             num = random.random()
+             if num < 0.9:
+                 split = "train"
+             elif num < 0.95:
+                 split = "validation"
+             else:
+                 split = "test"
+
+             row = {
+                 "text": text,
+                 "label": label,
+                 "category": None,
+                 "data_source": "ling_spam",
+                 "split": split
+             }
+             row = json.dumps(row, ensure_ascii=False)
+             f.write("{}\n".format(row))
+
+     return
+
+
+ if __name__ == '__main__':
+     main()
examples/preprocess/process_sms_spam_collection.py ADDED
@@ -0,0 +1,71 @@
+ #!/usr/bin/python3
+ # -*- coding: utf-8 -*-
+ import argparse
+ from collections import defaultdict
+ import json
+ import os
+ from pathlib import Path
+ import random
+ import re
+ import sys
+
+ pwd = os.path.abspath(os.path.dirname(__file__))
+ sys.path.append(os.path.join(pwd, '../../'))
+
+ import pandas as pd
+ from tqdm import tqdm
+
+ from project_settings import project_path
+
+
+ def get_args():
+     parser = argparse.ArgumentParser()
+
+     parser.add_argument("--data_file", default="data/sms_spam_collection/spam.csv", type=str)
+
+     parser.add_argument(
+         "--output_file",
+         default=(project_path / "data/sms_spam_collection.jsonl"),
+         type=str
+     )
+     args = parser.parse_args()
+     return args
+
+
+ def main():
+     args = get_args()
+
+     df = pd.read_csv(args.data_file)
+
+     with open(args.output_file, "w", encoding="utf-8") as f:
+         for i, row in tqdm(df.iterrows(), total=len(df)):
+             # print(row)
+             text = row["Message"]
+             label = row["Category"]
+
+             if label not in ("spam", "ham"):
+                 raise AssertionError
+
+             num = random.random()
+             if num < 0.9:
+                 split = "train"
+             elif num < 0.95:
+                 split = "validation"
+             else:
+                 split = "test"
+
+             row = {
+                 "text": text,
+                 "label": label,
+                 "category": None,
+                 "data_source": "sms_spam_collection",
+                 "split": split
+             }
+             row = json.dumps(row, ensure_ascii=False)
+             f.write("{}\n".format(row))
+
+     return
+
+
+ if __name__ == '__main__':
+     main()
examples/preprocess/process_spam_assassin.py CHANGED
@@ -13,6 +13,7 @@ pwd = os.path.abspath(os.path.dirname(__file__))
  sys.path.append(os.path.join(pwd, '../../'))

  from datasets import load_dataset
+ import pandas as pd
  from tqdm import tqdm

  from project_settings import project_path
@@ -21,12 +22,8 @@ from project_settings import project_path
  def get_args():
      parser = argparse.ArgumentParser()

-     parser.add_argument("--dataset_path", default="talby/spamassassin", type=str)
-     parser.add_argument(
-         "--dataset_cache_dir",
-         default=(project_path / "hub_datasets").as_posix(),
-         type=str
-     )
+     parser.add_argument("--data_file", default="data/email_spam/completeSpamAssassin.csv", type=str)
+
      parser.add_argument(
          "--output_file",
          default=(project_path / "data/spam_assassin.jsonl"),
@@ -39,34 +36,36 @@ def get_args():
  def main():
      args = get_args()

-     dataset_dict = load_dataset(
-         path=args.dataset_path,
-         cache_dir=args.dataset_cache_dir,
-     )
-     print(dataset_dict)
+     df = pd.read_csv(args.data_file)

      with open(args.output_file, "w", encoding="utf-8") as f:
-         for split, dataset in dataset_dict.items():
-             for sample in tqdm(dataset):
-                 # print(sample)
-                 text = sample["text"]
-                 group = sample["group"]
-                 label = sample["label"]
-
-                 label = "spam" if label == 1 else "ham"
-
-                 if label not in ("spam", "ham"):
-                     raise AssertionError
-
-                 row = {
-                     "text": text,
-                     "label": label,
-                     "category": group,
-                     "data_source": "spam_assassin",
-                     "split": split
-                 }
-                 row = json.dumps(row, ensure_ascii=False)
-                 f.write("{}\n".format(row))
+         for i, row in tqdm(df.iterrows(), total=len(df)):
+             # print(row)
+             text = row["Body"]
+             label = row["Label"]
+
+             label = "spam" if label == 1 else "ham"
+
+             if label not in ("spam", "ham"):
+                 raise AssertionError
+
+             num = random.random()
+             if num < 0.9:
+                 split = "train"
+             elif num < 0.95:
+                 split = "validation"
+             else:
+                 split = "test"
+
+             row = {
+                 "text": text,
+                 "label": label,
+                 "category": None,
+                 "data_source": "spam_assassin",
+                 "split": split
+             }
+             row = json.dumps(row, ensure_ascii=False)
+             f.write("{}\n".format(row))

      return
examples/preprocess/process_spam_message_lr.py ADDED
@@ -0,0 +1,77 @@
+ #!/usr/bin/python3
+ # -*- coding: utf-8 -*-
+ import argparse
+ from collections import defaultdict
+ import json
+ import os
+ from pathlib import Path
+ import random
+ import re
+ import sys
+
+ pwd = os.path.abspath(os.path.dirname(__file__))
+ sys.path.append(os.path.join(pwd, '../../'))
+
+ from datasets import load_dataset
+ from tqdm import tqdm
+
+ from project_settings import project_path
+
+
+ def get_args():
+     parser = argparse.ArgumentParser()
+
+     parser.add_argument("--data_file", default="data/spam_message_lr/train.txt", type=str)
+     parser.add_argument(
+         "--output_file",
+         default=(project_path / "data/spam_message_lr.jsonl"),
+         type=str
+     )
+     args = parser.parse_args()
+     return args
+
+
+ def main():
+     args = get_args()
+
+     with open(args.output_file, "w", encoding="utf-8") as fout:
+         with open(args.data_file, "r", encoding="utf-8") as fin:
+             for row in fin:
+                 row = str(row).rstrip("\n")
+                 row = row.split("\t", maxsplit=1)
+
+                 if len(row) != 2:
+                     print(row)
+                     raise AssertionError
+
+                 label = row[0]
+                 text = row[1]
+
+                 label = "spam" if label == "1" else "ham"
+
+                 if label not in ("spam", "ham"):
+                     raise AssertionError
+
+                 num = random.random()
+                 if num < 0.9:
+                     split = "train"
+                 elif num < 0.95:
+                     split = "validation"
+                 else:
+                     split = "test"
+
+                 row = {
+                     "text": text,
+                     "label": label,
+                     "category": None,
+                     "data_source": "spam_message_lr",
+                     "split": split
+                 }
+                 row = json.dumps(row, ensure_ascii=False)
+                 fout.write("{}\n".format(row))
+
+     return
+
+
+ if __name__ == '__main__':
+     main()
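For reference, the per-line transformation this script applies to `train.txt` can be sketched in isolation. The "<label>\t<text>" input format with label "1" meaning spam is what the parsing code above expects; the example line itself is hypothetical:

```python
line = "1\t恭喜您中奖了,请点击链接领取。"  # hypothetical tab-separated input row
label, text = line.rstrip("\n").split("\t", maxsplit=1)

row = {
    "text": text,
    "label": "spam" if label == "1" else "ham",
    "category": None,
    "data_source": "spam_message_lr",
    "split": "train",  # the script assigns this at random with 90/5/5 odds
}
print(row)
```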
examples/preprocess/{process_email_spam.py → process_youtube_spam_collection.py} RENAMED
@@ -12,7 +12,7 @@ import sys
  pwd = os.path.abspath(os.path.dirname(__file__))
  sys.path.append(os.path.join(pwd, '../../'))

- from datasets import load_dataset
+ import pandas as pd
  from tqdm import tqdm

  from project_settings import project_path
@@ -21,15 +21,11 @@ from project_settings import project_path
  def get_args():
      parser = argparse.ArgumentParser()

-     parser.add_argument("--dataset_path", default="NotShrirang/email-spam-filter", type=str)
-     parser.add_argument(
-         "--dataset_cache_dir",
-         default=(project_path / "hub_datasets").as_posix(),
-         type=str
-     )
+     parser.add_argument("--data_dir", default="data/youtube_spam_collection", type=str)
+
      parser.add_argument(
          "--output_file",
-         default=(project_path / "data/email_spam.jsonl"),
+         default=(project_path / "data/youtube_spam_collection.jsonl"),
          type=str
      )
      args = parser.parse_args()
@@ -39,27 +35,37 @@ def get_args():
  def main():
      args = get_args()

-     dataset_dict = load_dataset(
-         path=args.dataset_path,
-         cache_dir=args.dataset_cache_dir,
-     )
-     print(dataset_dict)
+     data_dir = Path(args.data_dir)

      with open(args.output_file, "w", encoding="utf-8") as f:
-         for split, dataset in dataset_dict.items():
-             for sample in tqdm(dataset):
-                 # print(sample)
-                 text = sample["text"]
-                 label = sample["label"]
+         for filename in data_dir.glob("*.csv"):
+             df = pd.read_csv(filename.as_posix())
+
+             for i, row in tqdm(df.iterrows(), total=len(df)):
+                 # print(row)
+                 text = row["CONTENT"]
+                 label = row["CLASS"]
+
+                 text = text.replace("\ufeff", "")  # strip BOM characters; the literal was invisible in the original rendering, so "\ufeff" is assumed
+
+                 label = "spam" if label == 1 else "ham"

                  if label not in ("spam", "ham"):
                      raise AssertionError

+                 num = random.random()
+                 if num < 0.9:
+                     split = "train"
+                 elif num < 0.95:
+                     split = "validation"
+                 else:
+                     split = "test"
+
                  row = {
                      "text": text,
                      "label": label,
-                     "category": None,
-                     "data_source": "email_spam",
+                     "category": filename.stem,
+                     "data_source": "youtube_spam_collection",
                      "split": split
                  }
                  row = json.dumps(row, ensure_ascii=False)
examples/preprocess/samples_count.py CHANGED
@@ -6,12 +6,16 @@ from datasets import load_dataset, DownloadMode

  dataset_dict = load_dataset(
      "../../spam_detect.py",
-     # name="email_spam",
      # name="enron_spam",
+     # name="enron_spam_subset",
+     # name="ling_spam",
      # name="sms_spam",
      # name="spam_assassin",
      # name="spam_detection",
-     name="spam_message",
+     # name="sms_spam_collection",
+     # name="spam_message",
+     # name="spam_message_lr",
+     name="youtube_spam_collection",
      split=None,
      cache_dir=None,
      download_mode=DownloadMode.FORCE_REDOWNLOAD
main.py CHANGED
@@ -5,12 +5,16 @@ from datasets import load_dataset, DownloadMode

  dataset = load_dataset(
      "spam_detect.py",
-     # name="email_spam",
      # name="enron_spam",
+     # name="enron_spam_subset",
+     # name="ling_spam",
      # name="sms_spam",
      # name="spam_assassin",
      # name="spam_detection",
-     name="spam_message",
+     # name="sms_spam_collection",
+     # name="spam_message",
+     # name="spam_message_lr",
+     name="youtube_spam_collection",
      split="train",
      cache_dir=None,
      download_mode=DownloadMode.FORCE_REDOWNLOAD
spam_detect.py CHANGED
@@ -11,12 +11,16 @@ import datasets


  _urls = {
-     "email_spam": "data/email_spam.jsonl",
      "enron_spam": "data/enron_spam.jsonl",
+     "enron_spam_subset": "data/enron_spam_subset.jsonl",
+     "ling_spam": "data/ling_spam.jsonl",
      "sms_spam": "data/sms_spam.jsonl",
      "spam_assassin": "data/spam_assassin.jsonl",
      "spam_detection": "data/spam_detection.jsonl",
+     "sms_spam_collection": "data/sms_spam_collection.jsonl",
      "spam_message": "data/spam_message.jsonl",
+     "spam_message_lr": "data/spam_message_lr.jsonl",
+     "youtube_spam_collection": "data/youtube_spam_collection.jsonl",

  }
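Since each key of `_urls` doubles as a loadable config name (that is how `main.py` and `samples_count.py` pass `name=`), a quick way to sanity-check the registry after this change is to enumerate the configs the loading script exposes. A minimal sketch, run from the repository root and assuming the script's builder configs mirror the `_urls` keys as the `name=` usage above suggests:

```python
from datasets import get_dataset_config_names

# Lists the config names defined by the local loading script.
configs = get_dataset_config_names("spam_detect.py")
print(configs)  # expected to include the new keys, e.g. "ling_spam", "spam_message_lr"
```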