gyigit committed on
Commit dd49f8a · 1 Parent(s): bc3198a
src/bin/.DS_Store ADDED
Binary file (6.15 kB)
 
src/bin/LICENSE ADDED
@@ -0,0 +1,674 @@
+ GNU GENERAL PUBLIC LICENSE
+ Version 3, 29 June 2007
+
+ Copyright (C) 2007 Free Software Foundation, Inc. <http://fsf.org/>
+ Everyone is permitted to copy and distribute verbatim copies
+ of this license document, but changing it is not allowed.
+
+ Preamble
+
+ The GNU General Public License is a free, copyleft license for
+ software and other kinds of works.
+
+ The licenses for most software and other practical works are designed
+ to take away your freedom to share and change the works. By contrast,
+ the GNU General Public License is intended to guarantee your freedom to
+ share and change all versions of a program--to make sure it remains free
+ software for all its users. We, the Free Software Foundation, use the
+ GNU General Public License for most of our software; it applies also to
+ any other work released this way by its authors. You can apply it to
+ your programs, too.
+
+ When we speak of free software, we are referring to freedom, not
+ price. Our General Public Licenses are designed to make sure that you
+ have the freedom to distribute copies of free software (and charge for
+ them if you wish), that you receive source code or can get it if you
+ want it, that you can change the software or use pieces of it in new
+ free programs, and that you know you can do these things.
+
+ To protect your rights, we need to prevent others from denying you
+ these rights or asking you to surrender the rights. Therefore, you have
+ certain responsibilities if you distribute copies of the software, or if
+ you modify it: responsibilities to respect the freedom of others.
+
+ For example, if you distribute copies of such a program, whether
+ gratis or for a fee, you must pass on to the recipients the same
+ freedoms that you received. You must make sure that they, too, receive
+ or can get the source code. And you must show them these terms so they
+ know their rights.
+
+ Developers that use the GNU GPL protect your rights with two steps:
+ (1) assert copyright on the software, and (2) offer you this License
+ giving you legal permission to copy, distribute and/or modify it.
+
+ For the developers' and authors' protection, the GPL clearly explains
+ that there is no warranty for this free software. For both users' and
+ authors' sake, the GPL requires that modified versions be marked as
+ changed, so that their problems will not be attributed erroneously to
+ authors of previous versions.
+
+ Some devices are designed to deny users access to install or run
+ modified versions of the software inside them, although the manufacturer
+ can do so. This is fundamentally incompatible with the aim of
+ protecting users' freedom to change the software. The systematic
+ pattern of such abuse occurs in the area of products for individuals to
+ use, which is precisely where it is most unacceptable. Therefore, we
+ have designed this version of the GPL to prohibit the practice for those
+ products. If such problems arise substantially in other domains, we
+ stand ready to extend this provision to those domains in future versions
+ of the GPL, as needed to protect the freedom of users.
+
+ Finally, every program is threatened constantly by software patents.
+ States should not allow patents to restrict development and use of
+ software on general-purpose computers, but in those that do, we wish to
+ avoid the special danger that patents applied to a free program could
+ make it effectively proprietary. To prevent this, the GPL assures that
+ patents cannot be used to render the program non-free.
+
+ The precise terms and conditions for copying, distribution and
+ modification follow.
+
+ TERMS AND CONDITIONS
+
+ 0. Definitions.
+
+ "This License" refers to version 3 of the GNU General Public License.
+
+ "Copyright" also means copyright-like laws that apply to other kinds of
+ works, such as semiconductor masks.
+
+ "The Program" refers to any copyrightable work licensed under this
+ License. Each licensee is addressed as "you". "Licensees" and
+ "recipients" may be individuals or organizations.
+
+ To "modify" a work means to copy from or adapt all or part of the work
+ in a fashion requiring copyright permission, other than the making of an
+ exact copy. The resulting work is called a "modified version" of the
+ earlier work or a work "based on" the earlier work.
+
+ A "covered work" means either the unmodified Program or a work based
+ on the Program.
+
+ To "propagate" a work means to do anything with it that, without
+ permission, would make you directly or secondarily liable for
+ infringement under applicable copyright law, except executing it on a
+ computer or modifying a private copy. Propagation includes copying,
+ distribution (with or without modification), making available to the
+ public, and in some countries other activities as well.
+
+ To "convey" a work means any kind of propagation that enables other
+ parties to make or receive copies. Mere interaction with a user through
+ a computer network, with no transfer of a copy, is not conveying.
+
+ An interactive user interface displays "Appropriate Legal Notices"
+ to the extent that it includes a convenient and prominently visible
+ feature that (1) displays an appropriate copyright notice, and (2)
+ tells the user that there is no warranty for the work (except to the
+ extent that warranties are provided), that licensees may convey the
+ work under this License, and how to view a copy of this License. If
+ the interface presents a list of user commands or options, such as a
+ menu, a prominent item in the list meets this criterion.
+
+ 1. Source Code.
+
+ The "source code" for a work means the preferred form of the work
+ for making modifications to it. "Object code" means any non-source
+ form of a work.
+
+ A "Standard Interface" means an interface that either is an official
+ standard defined by a recognized standards body, or, in the case of
+ interfaces specified for a particular programming language, one that
+ is widely used among developers working in that language.
+
+ The "System Libraries" of an executable work include anything, other
+ than the work as a whole, that (a) is included in the normal form of
+ packaging a Major Component, but which is not part of that Major
+ Component, and (b) serves only to enable use of the work with that
+ Major Component, or to implement a Standard Interface for which an
+ implementation is available to the public in source code form. A
+ "Major Component", in this context, means a major essential component
+ (kernel, window system, and so on) of the specific operating system
+ (if any) on which the executable work runs, or a compiler used to
+ produce the work, or an object code interpreter used to run it.
+
+ The "Corresponding Source" for a work in object code form means all
+ the source code needed to generate, install, and (for an executable
+ work) run the object code and to modify the work, including scripts to
+ control those activities. However, it does not include the work's
+ System Libraries, or general-purpose tools or generally available free
+ programs which are used unmodified in performing those activities but
+ which are not part of the work. For example, Corresponding Source
+ includes interface definition files associated with source files for
+ the work, and the source code for shared libraries and dynamically
+ linked subprograms that the work is specifically designed to require,
+ such as by intimate data communication or control flow between those
+ subprograms and other parts of the work.
+
+ The Corresponding Source need not include anything that users
+ can regenerate automatically from other parts of the Corresponding
+ Source.
+
+ The Corresponding Source for a work in source code form is that
+ same work.
+
+ 2. Basic Permissions.
+
+ All rights granted under this License are granted for the term of
+ copyright on the Program, and are irrevocable provided the stated
+ conditions are met. This License explicitly affirms your unlimited
+ permission to run the unmodified Program. The output from running a
+ covered work is covered by this License only if the output, given its
+ content, constitutes a covered work. This License acknowledges your
+ rights of fair use or other equivalent, as provided by copyright law.
+
+ You may make, run and propagate covered works that you do not
+ convey, without conditions so long as your license otherwise remains
+ in force. You may convey covered works to others for the sole purpose
+ of having them make modifications exclusively for you, or provide you
+ with facilities for running those works, provided that you comply with
+ the terms of this License in conveying all material for which you do
+ not control copyright. Those thus making or running the covered works
+ for you must do so exclusively on your behalf, under your direction
+ and control, on terms that prohibit them from making any copies of
+ your copyrighted material outside their relationship with you.
+
+ Conveying under any other circumstances is permitted solely under
+ the conditions stated below. Sublicensing is not allowed; section 10
+ makes it unnecessary.
+
+ 3. Protecting Users' Legal Rights From Anti-Circumvention Law.
+
+ No covered work shall be deemed part of an effective technological
+ measure under any applicable law fulfilling obligations under article
+ 11 of the WIPO copyright treaty adopted on 20 December 1996, or
+ similar laws prohibiting or restricting circumvention of such
+ measures.
+
+ When you convey a covered work, you waive any legal power to forbid
+ circumvention of technological measures to the extent such circumvention
+ is effected by exercising rights under this License with respect to
+ the covered work, and you disclaim any intention to limit operation or
+ modification of the work as a means of enforcing, against the work's
+ users, your or third parties' legal rights to forbid circumvention of
+ technological measures.
+
+ 4. Conveying Verbatim Copies.
+
+ You may convey verbatim copies of the Program's source code as you
+ receive it, in any medium, provided that you conspicuously and
+ appropriately publish on each copy an appropriate copyright notice;
+ keep intact all notices stating that this License and any
+ non-permissive terms added in accord with section 7 apply to the code;
+ keep intact all notices of the absence of any warranty; and give all
+ recipients a copy of this License along with the Program.
+
+ You may charge any price or no price for each copy that you convey,
+ and you may offer support or warranty protection for a fee.
+
+ 5. Conveying Modified Source Versions.
+
+ You may convey a work based on the Program, or the modifications to
+ produce it from the Program, in the form of source code under the
+ terms of section 4, provided that you also meet all of these conditions:
+
+ a) The work must carry prominent notices stating that you modified
+ it, and giving a relevant date.
+
+ b) The work must carry prominent notices stating that it is
+ released under this License and any conditions added under section
+ 7. This requirement modifies the requirement in section 4 to
+ "keep intact all notices".
+
+ c) You must license the entire work, as a whole, under this
+ License to anyone who comes into possession of a copy. This
+ License will therefore apply, along with any applicable section 7
+ additional terms, to the whole of the work, and all its parts,
+ regardless of how they are packaged. This License gives no
+ permission to license the work in any other way, but it does not
+ invalidate such permission if you have separately received it.
+
+ d) If the work has interactive user interfaces, each must display
+ Appropriate Legal Notices; however, if the Program has interactive
+ interfaces that do not display Appropriate Legal Notices, your
+ work need not make them do so.
+
+ A compilation of a covered work with other separate and independent
+ works, which are not by their nature extensions of the covered work,
+ and which are not combined with it such as to form a larger program,
+ in or on a volume of a storage or distribution medium, is called an
+ "aggregate" if the compilation and its resulting copyright are not
+ used to limit the access or legal rights of the compilation's users
+ beyond what the individual works permit. Inclusion of a covered work
+ in an aggregate does not cause this License to apply to the other
+ parts of the aggregate.
+
+ 6. Conveying Non-Source Forms.
+
+ You may convey a covered work in object code form under the terms
+ of sections 4 and 5, provided that you also convey the
+ machine-readable Corresponding Source under the terms of this License,
+ in one of these ways:
+
+ a) Convey the object code in, or embodied in, a physical product
+ (including a physical distribution medium), accompanied by the
+ Corresponding Source fixed on a durable physical medium
+ customarily used for software interchange.
+
+ b) Convey the object code in, or embodied in, a physical product
+ (including a physical distribution medium), accompanied by a
+ written offer, valid for at least three years and valid for as
+ long as you offer spare parts or customer support for that product
+ model, to give anyone who possesses the object code either (1) a
+ copy of the Corresponding Source for all the software in the
+ product that is covered by this License, on a durable physical
+ medium customarily used for software interchange, for a price no
+ more than your reasonable cost of physically performing this
+ conveying of source, or (2) access to copy the
+ Corresponding Source from a network server at no charge.
+
+ c) Convey individual copies of the object code with a copy of the
+ written offer to provide the Corresponding Source. This
+ alternative is allowed only occasionally and noncommercially, and
+ only if you received the object code with such an offer, in accord
+ with subsection 6b.
+
+ d) Convey the object code by offering access from a designated
+ place (gratis or for a charge), and offer equivalent access to the
+ Corresponding Source in the same way through the same place at no
+ further charge. You need not require recipients to copy the
+ Corresponding Source along with the object code. If the place to
+ copy the object code is a network server, the Corresponding Source
+ may be on a different server (operated by you or a third party)
+ that supports equivalent copying facilities, provided you maintain
+ clear directions next to the object code saying where to find the
+ Corresponding Source. Regardless of what server hosts the
+ Corresponding Source, you remain obligated to ensure that it is
+ available for as long as needed to satisfy these requirements.
+
+ e) Convey the object code using peer-to-peer transmission, provided
+ you inform other peers where the object code and Corresponding
+ Source of the work are being offered to the general public at no
+ charge under subsection 6d.
+
+ A separable portion of the object code, whose source code is excluded
+ from the Corresponding Source as a System Library, need not be
+ included in conveying the object code work.
+
+ A "User Product" is either (1) a "consumer product", which means any
+ tangible personal property which is normally used for personal, family,
+ or household purposes, or (2) anything designed or sold for incorporation
+ into a dwelling. In determining whether a product is a consumer product,
+ doubtful cases shall be resolved in favor of coverage. For a particular
+ product received by a particular user, "normally used" refers to a
+ typical or common use of that class of product, regardless of the status
+ of the particular user or of the way in which the particular user
+ actually uses, or expects or is expected to use, the product. A product
+ is a consumer product regardless of whether the product has substantial
+ commercial, industrial or non-consumer uses, unless such uses represent
+ the only significant mode of use of the product.
+
+ "Installation Information" for a User Product means any methods,
+ procedures, authorization keys, or other information required to install
+ and execute modified versions of a covered work in that User Product from
+ a modified version of its Corresponding Source. The information must
+ suffice to ensure that the continued functioning of the modified object
+ code is in no case prevented or interfered with solely because
+ modification has been made.
+
+ If you convey an object code work under this section in, or with, or
+ specifically for use in, a User Product, and the conveying occurs as
+ part of a transaction in which the right of possession and use of the
+ User Product is transferred to the recipient in perpetuity or for a
+ fixed term (regardless of how the transaction is characterized), the
+ Corresponding Source conveyed under this section must be accompanied
+ by the Installation Information. But this requirement does not apply
+ if neither you nor any third party retains the ability to install
+ modified object code on the User Product (for example, the work has
+ been installed in ROM).
+
+ The requirement to provide Installation Information does not include a
+ requirement to continue to provide support service, warranty, or updates
+ for a work that has been modified or installed by the recipient, or for
+ the User Product in which it has been modified or installed. Access to a
+ network may be denied when the modification itself materially and
+ adversely affects the operation of the network or violates the rules and
+ protocols for communication across the network.
+
+ Corresponding Source conveyed, and Installation Information provided,
+ in accord with this section must be in a format that is publicly
+ documented (and with an implementation available to the public in
+ source code form), and must require no special password or key for
+ unpacking, reading or copying.
+
+ 7. Additional Terms.
+
+ "Additional permissions" are terms that supplement the terms of this
+ License by making exceptions from one or more of its conditions.
+ Additional permissions that are applicable to the entire Program shall
+ be treated as though they were included in this License, to the extent
+ that they are valid under applicable law. If additional permissions
+ apply only to part of the Program, that part may be used separately
+ under those permissions, but the entire Program remains governed by
+ this License without regard to the additional permissions.
+
+ When you convey a copy of a covered work, you may at your option
+ remove any additional permissions from that copy, or from any part of
+ it. (Additional permissions may be written to require their own
+ removal in certain cases when you modify the work.) You may place
+ additional permissions on material, added by you to a covered work,
+ for which you have or can give appropriate copyright permission.
+
+ Notwithstanding any other provision of this License, for material you
+ add to a covered work, you may (if authorized by the copyright holders of
+ that material) supplement the terms of this License with terms:
+
+ a) Disclaiming warranty or limiting liability differently from the
+ terms of sections 15 and 16 of this License; or
+
+ b) Requiring preservation of specified reasonable legal notices or
+ author attributions in that material or in the Appropriate Legal
+ Notices displayed by works containing it; or
+
+ c) Prohibiting misrepresentation of the origin of that material, or
+ requiring that modified versions of such material be marked in
+ reasonable ways as different from the original version; or
+
+ d) Limiting the use for publicity purposes of names of licensors or
+ authors of the material; or
+
+ e) Declining to grant rights under trademark law for use of some
+ trade names, trademarks, or service marks; or
+
+ f) Requiring indemnification of licensors and authors of that
+ material by anyone who conveys the material (or modified versions of
+ it) with contractual assumptions of liability to the recipient, for
+ any liability that these contractual assumptions directly impose on
+ those licensors and authors.
+
+ All other non-permissive additional terms are considered "further
+ restrictions" within the meaning of section 10. If the Program as you
+ received it, or any part of it, contains a notice stating that it is
+ governed by this License along with a term that is a further
+ restriction, you may remove that term. If a license document contains
+ a further restriction but permits relicensing or conveying under this
+ License, you may add to a covered work material governed by the terms
+ of that license document, provided that the further restriction does
+ not survive such relicensing or conveying.
+
+ If you add terms to a covered work in accord with this section, you
+ must place, in the relevant source files, a statement of the
+ additional terms that apply to those files, or a notice indicating
+ where to find the applicable terms.
+
+ Additional terms, permissive or non-permissive, may be stated in the
+ form of a separately written license, or stated as exceptions;
+ the above requirements apply either way.
+
+ 8. Termination.
+
+ You may not propagate or modify a covered work except as expressly
+ provided under this License. Any attempt otherwise to propagate or
+ modify it is void, and will automatically terminate your rights under
+ this License (including any patent licenses granted under the third
+ paragraph of section 11).
+
+ However, if you cease all violation of this License, then your
+ license from a particular copyright holder is reinstated (a)
+ provisionally, unless and until the copyright holder explicitly and
+ finally terminates your license, and (b) permanently, if the copyright
+ holder fails to notify you of the violation by some reasonable means
+ prior to 60 days after the cessation.
+
+ Moreover, your license from a particular copyright holder is
+ reinstated permanently if the copyright holder notifies you of the
+ violation by some reasonable means, this is the first time you have
+ received notice of violation of this License (for any work) from that
+ copyright holder, and you cure the violation prior to 30 days after
+ your receipt of the notice.
+
+ Termination of your rights under this section does not terminate the
+ licenses of parties who have received copies or rights from you under
+ this License. If your rights have been terminated and not permanently
+ reinstated, you do not qualify to receive new licenses for the same
+ material under section 10.
+
+ 9. Acceptance Not Required for Having Copies.
+
+ You are not required to accept this License in order to receive or
+ run a copy of the Program. Ancillary propagation of a covered work
+ occurring solely as a consequence of using peer-to-peer transmission
+ to receive a copy likewise does not require acceptance. However,
+ nothing other than this License grants you permission to propagate or
+ modify any covered work. These actions infringe copyright if you do
+ not accept this License. Therefore, by modifying or propagating a
+ covered work, you indicate your acceptance of this License to do so.
+
+ 10. Automatic Licensing of Downstream Recipients.
+
+ Each time you convey a covered work, the recipient automatically
+ receives a license from the original licensors, to run, modify and
+ propagate that work, subject to this License. You are not responsible
+ for enforcing compliance by third parties with this License.
+
+ An "entity transaction" is a transaction transferring control of an
+ organization, or substantially all assets of one, or subdividing an
+ organization, or merging organizations. If propagation of a covered
+ work results from an entity transaction, each party to that
+ transaction who receives a copy of the work also receives whatever
+ licenses to the work the party's predecessor in interest had or could
+ give under the previous paragraph, plus a right to possession of the
+ Corresponding Source of the work from the predecessor in interest, if
+ the predecessor has it or can get it with reasonable efforts.
+
+ You may not impose any further restrictions on the exercise of the
+ rights granted or affirmed under this License. For example, you may
+ not impose a license fee, royalty, or other charge for exercise of
+ rights granted under this License, and you may not initiate litigation
+ (including a cross-claim or counterclaim in a lawsuit) alleging that
+ any patent claim is infringed by making, using, selling, offering for
+ sale, or importing the Program or any portion of it.
+
+ 11. Patents.
+
+ A "contributor" is a copyright holder who authorizes use under this
+ License of the Program or a work on which the Program is based. The
+ work thus licensed is called the contributor's "contributor version".
+
+ A contributor's "essential patent claims" are all patent claims
+ owned or controlled by the contributor, whether already acquired or
+ hereafter acquired, that would be infringed by some manner, permitted
+ by this License, of making, using, or selling its contributor version,
+ but do not include claims that would be infringed only as a
+ consequence of further modification of the contributor version. For
+ purposes of this definition, "control" includes the right to grant
+ patent sublicenses in a manner consistent with the requirements of
+ this License.
+
+ Each contributor grants you a non-exclusive, worldwide, royalty-free
+ patent license under the contributor's essential patent claims, to
+ make, use, sell, offer for sale, import and otherwise run, modify and
+ propagate the contents of its contributor version.
+
+ In the following three paragraphs, a "patent license" is any express
+ agreement or commitment, however denominated, not to enforce a patent
+ (such as an express permission to practice a patent or covenant not to
+ sue for patent infringement). To "grant" such a patent license to a
+ party means to make such an agreement or commitment not to enforce a
+ patent against the party.
+
+ If you convey a covered work, knowingly relying on a patent license,
+ and the Corresponding Source of the work is not available for anyone
+ to copy, free of charge and under the terms of this License, through a
+ publicly available network server or other readily accessible means,
+ then you must either (1) cause the Corresponding Source to be so
+ available, or (2) arrange to deprive yourself of the benefit of the
+ patent license for this particular work, or (3) arrange, in a manner
+ consistent with the requirements of this License, to extend the patent
+ license to downstream recipients. "Knowingly relying" means you have
+ actual knowledge that, but for the patent license, your conveying the
+ covered work in a country, or your recipient's use of the covered work
+ in a country, would infringe one or more identifiable patents in that
+ country that you have reason to believe are valid.
+
+ If, pursuant to or in connection with a single transaction or
+ arrangement, you convey, or propagate by procuring conveyance of, a
+ covered work, and grant a patent license to some of the parties
+ receiving the covered work authorizing them to use, propagate, modify
+ or convey a specific copy of the covered work, then the patent license
+ you grant is automatically extended to all recipients of the covered
+ work and works based on it.
+
+ A patent license is "discriminatory" if it does not include within
+ the scope of its coverage, prohibits the exercise of, or is
+ conditioned on the non-exercise of one or more of the rights that are
+ specifically granted under this License. You may not convey a covered
+ work if you are a party to an arrangement with a third party that is
+ in the business of distributing software, under which you make payment
+ to the third party based on the extent of your activity of conveying
+ the work, and under which the third party grants, to any of the
+ parties who would receive the covered work from you, a discriminatory
+ patent license (a) in connection with copies of the covered work
+ conveyed by you (or copies made from those copies), or (b) primarily
+ for and in connection with specific products or compilations that
+ contain the covered work, unless you entered into that arrangement,
+ or that patent license was granted, prior to 28 March 2007.
+
+ Nothing in this License shall be construed as excluding or limiting
+ any implied license or other defenses to infringement that may
+ otherwise be available to you under applicable patent law.
+
+ 12. No Surrender of Others' Freedom.
+
+ If conditions are imposed on you (whether by court order, agreement or
+ otherwise) that contradict the conditions of this License, they do not
+ excuse you from the conditions of this License. If you cannot convey a
+ covered work so as to satisfy simultaneously your obligations under this
+ License and any other pertinent obligations, then as a consequence you may
+ not convey it at all. For example, if you agree to terms that obligate you
+ to collect a royalty for further conveying from those to whom you convey
+ the Program, the only way you could satisfy both those terms and this
+ License would be to refrain entirely from conveying the Program.
+
+ 13. Use with the GNU Affero General Public License.
+
+ Notwithstanding any other provision of this License, you have
+ permission to link or combine any covered work with a work licensed
+ under version 3 of the GNU Affero General Public License into a single
+ combined work, and to convey the resulting work. The terms of this
+ License will continue to apply to the part which is the covered work,
+ but the special requirements of the GNU Affero General Public License,
+ section 13, concerning interaction through a network will apply to the
+ combination as such.
+
+ 14. Revised Versions of this License.
+
+ The Free Software Foundation may publish revised and/or new versions of
+ the GNU General Public License from time to time. Such new versions will
+ be similar in spirit to the present version, but may differ in detail to
+ address new problems or concerns.
+
+ Each version is given a distinguishing version number. If the
+ Program specifies that a certain numbered version of the GNU General
572
+ Public License "or any later version" applies to it, you have the
573
+ option of following the terms and conditions either of that numbered
574
+ version or of any later version published by the Free Software
575
+ Foundation. If the Program does not specify a version number of the
576
+ GNU General Public License, you may choose any version ever published
577
+ by the Free Software Foundation.
578
+
579
+ If the Program specifies that a proxy can decide which future
580
+ versions of the GNU General Public License can be used, that proxy's
581
+ public statement of acceptance of a version permanently authorizes you
582
+ to choose that version for the Program.
583
+
584
+ Later license versions may give you additional or different
585
+ permissions. However, no additional obligations are imposed on any
586
+ author or copyright holder as a result of your choosing to follow a
587
+ later version.
588
+
589
+ 15. Disclaimer of Warranty.
590
+
591
+ THERE IS NO WARRANTY FOR THE PROGRAM, TO THE EXTENT PERMITTED BY
592
+ APPLICABLE LAW. EXCEPT WHEN OTHERWISE STATED IN WRITING THE COPYRIGHT
593
+ HOLDERS AND/OR OTHER PARTIES PROVIDE THE PROGRAM "AS IS" WITHOUT WARRANTY
594
+ OF ANY KIND, EITHER EXPRESSED OR IMPLIED, INCLUDING, BUT NOT LIMITED TO,
595
+ THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR
596
+ PURPOSE. THE ENTIRE RISK AS TO THE QUALITY AND PERFORMANCE OF THE PROGRAM
597
+ IS WITH YOU. SHOULD THE PROGRAM PROVE DEFECTIVE, YOU ASSUME THE COST OF
598
+ ALL NECESSARY SERVICING, REPAIR OR CORRECTION.
599
+
600
+ 16. Limitation of Liability.
601
+
602
+ IN NO EVENT UNLESS REQUIRED BY APPLICABLE LAW OR AGREED TO IN WRITING
603
+ WILL ANY COPYRIGHT HOLDER, OR ANY OTHER PARTY WHO MODIFIES AND/OR CONVEYS
604
+ THE PROGRAM AS PERMITTED ABOVE, BE LIABLE TO YOU FOR DAMAGES, INCLUDING ANY
605
+ GENERAL, SPECIAL, INCIDENTAL OR CONSEQUENTIAL DAMAGES ARISING OUT OF THE
606
+ USE OR INABILITY TO USE THE PROGRAM (INCLUDING BUT NOT LIMITED TO LOSS OF
607
+ DATA OR DATA BEING RENDERED INACCURATE OR LOSSES SUSTAINED BY YOU OR THIRD
608
+ PARTIES OR A FAILURE OF THE PROGRAM TO OPERATE WITH ANY OTHER PROGRAMS),
609
+ EVEN IF SUCH HOLDER OR OTHER PARTY HAS BEEN ADVISED OF THE POSSIBILITY OF
610
+ SUCH DAMAGES.
611
+
612
+ 17. Interpretation of Sections 15 and 16.
613
+
614
+ If the disclaimer of warranty and limitation of liability provided
615
+ above cannot be given local legal effect according to their terms,
616
+ reviewing courts shall apply local law that most closely approximates
617
+ an absolute waiver of all civil liability in connection with the
618
+ Program, unless a warranty or assumption of liability accompanies a
619
+ copy of the Program in return for a fee.
620
+
621
+ END OF TERMS AND CONDITIONS
622
+
623
+ How to Apply These Terms to Your New Programs
624
+
625
+ If you develop a new program, and you want it to be of the greatest
626
+ possible use to the public, the best way to achieve this is to make it
627
+ free software which everyone can redistribute and change under these terms.
628
+
629
+ To do so, attach the following notices to the program. It is safest
630
+ to attach them to the start of each source file to most effectively
631
+ state the exclusion of warranty; and each file should have at least
632
+ the "copyright" line and a pointer to where the full notice is found.
633
+
634
+ <one line to give the program's name and a brief idea of what it does.>
635
+ Copyright (C) <year> <name of author>
636
+
637
+ This program is free software: you can redistribute it and/or modify
638
+ it under the terms of the GNU General Public License as published by
639
+ the Free Software Foundation, either version 3 of the License, or
640
+ (at your option) any later version.
641
+
642
+ This program is distributed in the hope that it will be useful,
643
+ but WITHOUT ANY WARRANTY; without even the implied warranty of
644
+ MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
645
+ GNU General Public License for more details.
646
+
647
+ You should have received a copy of the GNU General Public License
648
+ along with this program. If not, see <http://www.gnu.org/licenses/>.
649
+
650
+ Also add information on how to contact you by electronic and paper mail.
651
+
652
+ If the program does terminal interaction, make it output a short
653
+ notice like this when it starts in an interactive mode:
654
+
655
+ <program> Copyright (C) <year> <name of author>
656
+ This program comes with ABSOLUTELY NO WARRANTY; for details type `show w'.
657
+ This is free software, and you are welcome to redistribute it
658
+ under certain conditions; type `show c' for details.
659
+
660
+ The hypothetical commands `show w' and `show c' should show the appropriate
661
+ parts of the General Public License. Of course, your program's commands
662
+ might be different; for a GUI interface, you would use an "about box".
663
+
664
+ You should also get your employer (if you work as a programmer) or school,
665
+ if any, to sign a "copyright disclaimer" for the program, if necessary.
666
+ For more information on this, and how to apply and follow the GNU GPL, see
667
+ <http://www.gnu.org/licenses/>.
668
+
669
+ The GNU General Public License does not permit incorporating your program
670
+ into proprietary programs. If your program is a subroutine library, you
671
+ may consider it more useful to permit linking proprietary applications with
672
+ the library. If this is what you want to do, use the GNU Lesser General
673
+ Public License instead of this License. But first, please read
674
+ <http://www.gnu.org/philosophy/why-not-lgpl.html>.
src/bin/PROBE.py ADDED
@@ -0,0 +1,62 @@
+ import yaml
+ import pandas as pd
+ import tqdm
+ import semantic_similarity_infer as ssi
+ import target_family_classifier as tfc
+ import function_predictor as fp
+ import binding_affinity_estimator as bae
+
+ print("\n\nPROBE (Protein RepresentatiOn Benchmark) run has started...\n\n")
+
+ with open('probe_config.yaml') as f:
+     args = yaml.load(f, Loader=yaml.FullLoader)
+
+ if args["benchmark"] not in ["similarity","family","function","affinity","all"]:
+     # The configuration comes from YAML, not argparse, so there is no `parser` object to call here.
+     raise ValueError('The "benchmark" option should be one of "similarity", "family", "function", "affinity" or "all"')
+
+ print(args)
+
+ def load_representation(multi_col_representation_vector_file_path):
+     multi_col_representation_vector = pd.read_csv(multi_col_representation_vector_file_path)
+     vals = multi_col_representation_vector.iloc[:,1:(len(multi_col_representation_vector.columns))]
+     original_values_as_df = pd.DataFrame({'Entry': pd.Series([], dtype='str'),'Vector': pd.Series([], dtype='object')})
+     for index, row in tqdm.tqdm(vals.iterrows(), total = len(vals)):
+         list_of_floats = [float(item) for item in list(row)]
+         original_values_as_df.loc[index] = [multi_col_representation_vector.iloc[index]['Entry']] + [list_of_floats]
+     return original_values_as_df
+
+ if args["benchmark"] in ["similarity","function","all"]:
+     print("\nRepresentation vectors are loading...\n")
+     representation_dataframe = load_representation(args["representation_file_human"])
+
+ if args["benchmark"] in ["similarity","all"]:
+     print("\nSemantic similarity Inference Benchmark is running...\n")
+     ssi.representation_dataframe = representation_dataframe
+     ssi.representation_name = args["representation_name"]
+     ssi.protein_names = ssi.representation_dataframe['Entry'].tolist()
+     ssi.similarity_tasks = args["similarity_tasks"]
+     ssi.detailed_output = args["detailed_output"]
+     ssi.calculate_all_correlations()
+ if args["benchmark"] in ["function","all"]:
+     print("\n\nOntology-based protein function prediction benchmark is running...\n")
+     fp.aspect_type = args["function_prediction_aspect"]
+     fp.dataset_type = args["function_prediction_dataset"]
+     fp.representation_dataframe = representation_dataframe
+     fp.representation_name = args["representation_name"]
+     fp.detailed_output = args["detailed_output"]
+     fp.pred_output()
+ if args["benchmark"] in ["family","all"]:
+     print("\n\nDrug target protein family classification benchmark is running...\n")
+     tfc.representation_path = args["representation_file_human"]
+     tfc.representation_name = args["representation_name"]
+     tfc.detailed_output = args["detailed_output"]
+     for dataset in args["family_prediction_dataset"]:
+         tfc.score_protein_rep(dataset)
+ if args["benchmark"] in ["affinity","all"]:
+     print("\n\nProtein-protein binding affinity estimation benchmark is running...\n")
+     bae.skempi_vectors_path = args["representation_file_affinity"]
+     bae.representation_name = args["representation_name"]
+     bae.predict_affinities_and_report_results()
+ print("\n\nPROBE (Protein RepresentatiOn Benchmark) run has finished...\n")
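[Editor's note] `load_representation` above builds the two-column (`Entry`, `Vector`) dataframe one row at a time with `iterrows`, which is slow for large representation files. A vectorized sketch of the same transformation — the `load_representation_fast` name and the toy dataframe are illustrative, not part of the repository:

```python
import pandas as pd

def load_representation_fast(path_or_df):
    # Wide input: the first column holds the protein identifier ('Entry'),
    # each remaining column holds one dimension of the representation vector.
    df = path_or_df if isinstance(path_or_df, pd.DataFrame) else pd.read_csv(path_or_df)
    id_col = df.columns[0]
    # Cast all dimension columns to float and collapse each row into one list.
    vectors = df.iloc[:, 1:].astype(float).values.tolist()
    return pd.DataFrame({"Entry": df[id_col].astype(str), "Vector": vectors})

# Toy example: two proteins with a 2-dimensional representation.
wide = pd.DataFrame({"Entry": ["P12345", "Q67890"], "d0": [0.1, 0.3], "d1": [0.2, 0.4]})
out = load_representation_fast(wide)
```

The output has the same shape the benchmarks expect (one list-valued `Vector` cell per protein), but avoids the per-row `DataFrame.loc` assignment.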
src/bin/__init__.py ADDED
File without changes
src/bin/binding_affinity_estimator.py ADDED
@@ -0,0 +1,189 @@
+ import tqdm
+ import multiprocessing
+ import pandas as pd
+ import numpy as np
+ import scipy.stats
+
+ from sklearn import linear_model
+ from sklearn.model_selection import KFold
+ from sklearn.metrics import mean_squared_error,mean_absolute_error
+ from sklearn.ensemble import RandomForestRegressor
+ from sklearn.preprocessing import MinMaxScaler
+
+ skempi_vectors_path = None
+ representation_name = None
+
+ def load_representation(multi_col_representation_vector_file_path):
+     print("\nLoading representation vectors...\n")
+     multi_col_representation_vector = pd.read_csv(multi_col_representation_vector_file_path)
+     vals = multi_col_representation_vector.iloc[:,1:(len(multi_col_representation_vector.columns))]
+     original_values_as_df = pd.DataFrame({'PDB_ID': pd.Series([], dtype='str'),'Vector': pd.Series([], dtype='object')})
+     for index, row in tqdm.tqdm(vals.iterrows(), total = len(vals)):
+         list_of_floats = [float(item) for item in list(row)]
+         original_values_as_df.loc[index] = [multi_col_representation_vector.iloc[index]['PDB_ID']] + [list_of_floats]
+     return original_values_as_df
+
+ def calc_train_error(X_train, y_train, model):
+     '''returns in-sample error for already fit model.'''
+     predictions = model.predict(X_train)
+     mse = mean_squared_error(y_train, predictions)
+     mae = mean_absolute_error(y_train, predictions)
+     corr = scipy.stats.pearsonr(y_train, predictions)
+     return mse,mae,corr
+
+ def calc_validation_error(X_test, y_test, model):
+     '''returns out-of-sample error for already fit model.'''
+     predictions = model.predict(X_test)
+     mse = mean_squared_error(y_test, predictions)
+     mae = mean_absolute_error(y_test, predictions)
+     corr = scipy.stats.pearsonr(y_test, predictions)
+     return mse,mae,corr
+
+ def calc_metrics(X_train, y_train, X_test, y_test, model):
+     '''fits model and returns the metrics for in-sample error and out-of-sample error'''
+     model.fit(X_train, y_train)
+     train_mse_error,train_mae_error,train_corr = calc_train_error(X_train, y_train, model)
+     val_mse_error,val_mae_error,val_corr = calc_validation_error(X_test, y_test, model)
+     return train_mse_error, val_mse_error, train_mae_error, val_mae_error,train_corr,val_corr
+
+ def report_results(
+     train_mse_error_list,
+     validation_mse_error_list,
+     train_mae_error_list,
+     validation_mae_error_list,
+     train_corr_list,
+     validation_corr_list,
+     train_corr_pval_list,
+     validation_corr_pval_list,
+ ):
+     result_df = pd.DataFrame(
+         {
+             "train_mse_error": round(np.mean(train_mse_error_list) * 100, 4),
+             "train_mse_std": round(np.std(train_mse_error_list) * 100, 4),
+             "val_mse_error": round(np.mean(validation_mse_error_list) * 100, 4),
+             "val_mse_std": round(np.std(validation_mse_error_list) * 100, 4),
+             "train_mae_error": round(np.mean(train_mae_error_list) * 100, 4),
+             "train_mae_std": round(np.std(train_mae_error_list) * 100, 4),
+             "val_mae_error": round(np.mean(validation_mae_error_list) * 100, 4),
+             "val_mae_std": round(np.std(validation_mae_error_list) * 100, 4),
+             "train_corr": round(np.mean(train_corr_list), 4),
+             "train_corr_pval": round(np.mean(train_corr_pval_list), 4),
+             "validation_corr": round(np.mean(validation_corr_list), 4),
+             "validation_corr_pval": round(np.mean(validation_corr_pval_list), 4),
+         },
+         index=[0],
+     )
+
+     result_detail_df = pd.DataFrame(
+         {
+             "train_mse_errors": list(np.multiply(train_mse_error_list, 100)),
+             "val_mse_errors": list(np.multiply(validation_mse_error_list, 100)),
+             "train_mae_errors": list(np.multiply(train_mae_error_list, 100)),
+             "val_mae_errors": list(np.multiply(validation_mae_error_list, 100)),
+             "train_corrs": list(np.multiply(train_corr_list, 100)),
+             "train_corr_pvals": list(np.multiply(train_corr_pval_list, 100)),
+             "validation_corr": list(np.multiply(validation_corr_list, 100)),
+             "validation_corr_pval": list(np.multiply(validation_corr_pval_list, 100)),
+         },
+         index=range(len(train_mse_error_list)),
+     )
+     return result_df, result_detail_df
+
+
+ def predictAffinityWithModel(regressor_model,multiplied_vectors_df):
+     K = 10
+     kf = KFold(n_splits=K, shuffle=True, random_state=42)
+
+     train_mse_error_list = []
+     validation_mse_error_list = []
+     train_mae_error_list = []
+     validation_mae_error_list = []
+     train_corr_list = []
+     validation_corr_list = []
+     train_corr_pval_list = []
+     validation_corr_pval_list = []
+
+     data = np.array(np.asarray(multiplied_vectors_df["Vector"].tolist()), dtype=float)
+     ppi_affinity_filtered_df = ppi_affinity_df\
+         [ppi_affinity_df['Protein1'].isin(multiplied_vectors_df['Protein1']) &\
+          ppi_affinity_df['Protein2'].isin(multiplied_vectors_df['Protein2'])]
+     target = np.array(ppi_affinity_filtered_df["Affinity"])
+     scaler = MinMaxScaler()
+     scaler.fit(target.reshape(-1, 1))
+     target = scaler.transform(target.reshape(-1, 1))[:, 0]
+     for train_index, val_index in tqdm.tqdm(kf.split(data, target), total=K):
+
+         # split data
+         X_train, X_val = data[train_index], data[val_index]
+         y_train, y_val = target[train_index], target[val_index]
+
+         # instantiate model
+         reg = regressor_model #linear_model.BayesianRidge()
+
+         # calculate error_list
+         (
+             train_mse_error,
+             val_mse_error,
+             train_mae_error,
+             val_mae_error,
+             train_corr,
+             val_corr,
+         ) = calc_metrics(X_train, y_train, X_val, y_val, reg)
+
+         # append to appropriate list
+         train_mse_error_list.append(train_mse_error)
+         validation_mse_error_list.append(val_mse_error)
+
+         train_mae_error_list.append(train_mae_error)
+         validation_mae_error_list.append(val_mae_error)
+
+         train_corr_list.append(train_corr[0])
+         validation_corr_list.append(val_corr[0])
+
+         train_corr_pval_list.append(train_corr[1])
+         validation_corr_pval_list.append(val_corr[1])
+
+     return report_results(
+         train_mse_error_list,
+         validation_mse_error_list,
+         train_mae_error_list,
+         validation_mae_error_list,
+         train_corr_list,
+         validation_corr_list,
+         train_corr_pval_list,
+         validation_corr_pval_list,
+     )
+
+ ppi_affinity_file = "../data/auxilary_input/skempi_pipr/SKEMPI_all_dg_avg.txt"
+ ppi_affinity_df = pd.read_csv(ppi_affinity_file,sep="\t",header=None)
+ ppi_affinity_df.columns = ['Protein1', 'Protein2', 'Affinity']
+
+ #Calculate vector element-wise multiplication as described in https://academic.oup.com/bioinformatics/article/35/14/i305/5529260
+
+ def calculate_vector_multiplications(skempi_vectors_df):
+     multiplied_vectors = pd.DataFrame({'Protein1': pd.Series([], dtype='str'),\
+                                        'Protein2': pd.Series([], dtype='str'),\
+                                        'Vector': pd.Series([], dtype='object')})
+     print("Element-wise vector multiplications are being calculated")
+     rep_prot_list = list(skempi_vectors_df['PDB_ID'])
+     for index,row in tqdm.tqdm(ppi_affinity_df.iterrows()):
+         if row['Protein1'] in rep_prot_list and row['Protein2'] in rep_prot_list:
+             vec1 = list(skempi_vectors_df[skempi_vectors_df['PDB_ID']\
+                 == row['Protein1']]['Vector'])[0]
+             vec2 = list(skempi_vectors_df[skempi_vectors_df['PDB_ID']\
+                 == row['Protein2']]['Vector'])[0]
+             multiplied_vec = np.multiply(vec1,vec2)
+
+             multiplied_vectors = multiplied_vectors.\
+                 append({'Protein1':row['Protein1'], 'Protein2':row['Protein2'],\
+                         'Vector':multiplied_vec},ignore_index = True)
+     return multiplied_vectors
+
+ def predict_affinities_and_report_results():
+     skempi_vectors_df = load_representation(skempi_vectors_path)
+     multiplied_vectors_df = calculate_vector_multiplications(skempi_vectors_df)
+     model = linear_model.BayesianRidge()
+     result_df, result_detail_df = predictAffinityWithModel(model,multiplied_vectors_df)
+     result_df.to_csv(r"../results/Affinity_prediction_skempiv1_{0}.csv".format(representation_name),index=False)
+     result_detail_df.to_csv(r"../results/Affinity_prediction_skempiv1_{0}_detail.csv".format(representation_name),index=False)
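[Editor's note] `calculate_vector_multiplications` turns each SKEMPI protein pair into a single feature vector by multiplying the two protein vectors element-wise (the PIPR-style pairing cited in the comment above). A minimal sketch of that pairing step — the `pair_feature` name and the toy vectors are illustrative:

```python
import numpy as np

def pair_feature(vec1, vec2):
    # Element-wise (Hadamard) product of the two protein vectors.
    # It is symmetric, so the pair (A, B) and the pair (B, A)
    # map to the same feature vector for the regressor.
    return np.multiply(vec1, vec2)

v1 = np.array([1.0, 2.0, 3.0])
v2 = np.array([0.5, 0.5, 2.0])
feat = pair_feature(v1, v2)  # array([0.5, 1.0, 6.0])
```

The symmetry is the point of this design choice: the affinity of a complex should not depend on which partner is listed first.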
src/bin/function_predictor.py ADDED
@@ -0,0 +1,216 @@
+ # -*- coding: utf-8 -*-
+ import pandas as pd
+ import numpy as np
+ from datetime import datetime
+ import pickle
+ import os
+ import multiprocessing
+ from tqdm import tqdm
+
+ from sklearn.svm import SVC
+ from sklearn.linear_model import SGDClassifier
+ from sklearn.model_selection import cross_val_predict, KFold
+ from skmultilearn.problem_transform import BinaryRelevance
+ from sklearn.metrics import accuracy_score, f1_score, precision_score, recall_score, hamming_loss
+
+
+ aspect_type = ""
+ dataset_type = ""
+ representation_dataframe = ""
+ representation_name = ""
+ detailed_output = False
+
+ def warn(*args, **kwargs):
+     pass
+ import warnings
+ warnings.warn = warn
+
+ def check_for_at_least_two_class_sample_exits(y):
+     for column in y:
+         column_sum = np.sum(y[column].array)
+         if column_sum < 2:
+             print('At least 2 positive samples are required for each class; the {0} class has {1} positive samples'.format(column,column_sum))
+             return False
+     return True
+
+ def create_valid_kfold_object_for_multilabel_splits(X,y,kf):
+     check_for_at_least_two_class_sample_exits(y)
+     sample_class_occurance = dict(zip(y.columns,np.zeros(len(y.columns))))
+     for column in y:
+         for fold_train_index,fold_test_index in kf.split(X,y):
+             fold_col_sum = np.sum(y.iloc[fold_test_index,:][column].array)
+             if fold_col_sum > 0:
+                 sample_class_occurance[column] += 1
+
+     for key in sample_class_occurance:
+         value = sample_class_occurance[key]
+         if value < 2:
+             random_state = np.random.randint(1000)
+             print("Random state is changed since at least two positive samples are required in different train/test folds."
+                   "\nHowever, only one fold exists with positive samples for class {0}".format(key))
+             print("Selected random state is {0}".format(random_state))
+             kf = KFold(n_splits=5, shuffle=True, random_state=random_state)
+             # Return the recursive result; the original call discarded it,
+             # so the caller received None whenever a reshuffle was needed.
+             return create_valid_kfold_object_for_multilabel_splits(X,y,kf)
+     return kf
+
+ def MultiLabelSVC_cross_val_predict(representation_name, dataset, X, y, classifier):
+     #dataset split, estimator, cv
+     clf = classifier
+     Xn = np.array(np.asarray(X.values.tolist()), dtype=float)
+     kf_init = KFold(n_splits=5, shuffle=True, random_state=42)
+     kf = create_valid_kfold_object_for_multilabel_splits(X,y,kf_init)
+     y_pred = cross_val_predict(clf, Xn, y, cv=kf)
+
+     if detailed_output:
+         with open(r"../results/Ontology_based_function_prediction_{1}_{0}_model.pkl".format(representation_name,dataset.split(".")[0]),"wb") as file:
+             pickle.dump(clf,file)
+
+     acc_cv = []
+     f1_mi_cv = []
+     f1_ma_cv = []
+     f1_we_cv = []
+     pr_mi_cv = []
+     pr_ma_cv = []
+     pr_we_cv = []
+     rc_mi_cv = []
+     rc_ma_cv = []
+     rc_we_cv = []
+     hamm_cv = []
+     for fold_train_index,fold_test_index in kf.split(X,y):
+         acc = accuracy_score(y.iloc[fold_test_index,:],y_pred[fold_test_index])
+         acc_cv.append(np.round(acc,decimals=5))
+         f1_mi = f1_score(y.iloc[fold_test_index,:],y_pred[fold_test_index],average="micro")
+         f1_mi_cv.append(np.round(f1_mi,decimals=5))
+         f1_ma = f1_score(y.iloc[fold_test_index,:],y_pred[fold_test_index],average="macro")
+         f1_ma_cv.append(np.round(f1_ma,decimals=5))
+         f1_we = f1_score(y.iloc[fold_test_index,:],y_pred[fold_test_index],average="weighted")
+         f1_we_cv.append(np.round(f1_we,decimals=5))
+         pr_mi = precision_score(y.iloc[fold_test_index,:],y_pred[fold_test_index],average="micro")
+         pr_mi_cv.append(np.round(pr_mi,decimals=5))
+         pr_ma = precision_score(y.iloc[fold_test_index,:],y_pred[fold_test_index],average="macro")
+         pr_ma_cv.append(np.round(pr_ma,decimals=5))
+         pr_we = precision_score(y.iloc[fold_test_index,:],y_pred[fold_test_index],average="weighted")
+         pr_we_cv.append(np.round(pr_we,decimals=5))
+         rc_mi = recall_score(y.iloc[fold_test_index,:],y_pred[fold_test_index],average="micro")
+         rc_mi_cv.append(np.round(rc_mi,decimals=5))
+         rc_ma = recall_score(y.iloc[fold_test_index,:],y_pred[fold_test_index],average="macro")
+         rc_ma_cv.append(np.round(rc_ma,decimals=5))
+         rc_we = recall_score(y.iloc[fold_test_index,:],y_pred[fold_test_index],average="weighted")
+         rc_we_cv.append(np.round(rc_we,decimals=5))
+         hamm = hamming_loss(y.iloc[fold_test_index,:],y_pred[fold_test_index])
+         hamm_cv.append(np.round(hamm,decimals=5))
+
+     means = list(np.mean([acc_cv,f1_mi_cv,f1_ma_cv,f1_we_cv,pr_mi_cv,pr_ma_cv,pr_we_cv,rc_mi_cv,rc_ma_cv,rc_we_cv,hamm_cv], axis=1))
+     means = [np.round(i,decimals=5) for i in means]
+
+     stds = list(np.std([acc_cv,f1_mi_cv,f1_ma_cv,f1_we_cv,pr_mi_cv,pr_ma_cv,pr_we_cv,rc_mi_cv,rc_ma_cv,rc_we_cv,hamm_cv], axis=1))
+     stds = [np.round(i,decimals=5) for i in stds]
+
+     return ([representation_name+"_"+dataset,acc_cv,f1_mi_cv,f1_ma_cv,f1_we_cv,pr_mi_cv,pr_ma_cv,pr_we_cv,rc_mi_cv,rc_ma_cv,rc_we_cv,hamm_cv],\
+             [representation_name+"_"+dataset]+means,\
+             [representation_name+"_"+dataset]+stds,\
+             y_pred)
+
+ def ProtDescModel():
+     #desc_file = pd.read_csv(r"protein_representations\final\{0}_dim{1}.tsv".format(representation_name,desc_dim),sep="\t")
+     datasets = os.listdir(r"../data/auxilary_input/GO_datasets")
+     if dataset_type == "All_Data_Sets" and aspect_type == "All_Aspects":
+         filtered_datasets = datasets
+     elif dataset_type == "All_Data_Sets":
+         filtered_datasets = [dataset for dataset in datasets if aspect_type in dataset]
+     elif aspect_type == "All_Aspects":
+         filtered_datasets = [dataset for dataset in datasets if dataset_type in dataset]
+     else:
+         filtered_datasets = [dataset for dataset in datasets if aspect_type in dataset and dataset_type in dataset]
+     cv_results = []
+     cv_mean_results = []
+     cv_std_results = []
+
+     for dt in tqdm(filtered_datasets,total=len(filtered_datasets)):
+         print(r"Protein function prediction is started for the dataset: {0}".format(dt.split(".")[0]))
+         dt_file = pd.read_csv(r"../data/auxilary_input/GO_datasets/{0}".format(dt),sep="\t")
+         dt_merge = dt_file.merge(representation_dataframe,left_on="Protein_Id",right_on="Entry")
+
+         dt_X = dt_merge['Vector']
+         dt_y = dt_merge.iloc[:,1:-2]
+         if check_for_at_least_two_class_sample_exits(dt_y) == False:
+             print(r"No function will be predicted for the dataset: {0}".format(dt.split(".")[0]))
+             continue
+         #print("raw dt vs. dt_merge: {} - {}".format(len(dt_file),len(dt_merge)))
+         #print("Calculating predictions for " + dt.split(".")[0])
+         #model = MultiLabelSVC_cross_val_predict(representation_name, dt.split(".")[0], dt_X, dt_y, classifier=BinaryRelevance(SVC(kernel="linear", random_state=42)))
+         cpu_number = multiprocessing.cpu_count()
+         model = MultiLabelSVC_cross_val_predict(representation_name, dt.split(".")[0], dt_X, dt_y, classifier=BinaryRelevance(SGDClassifier(n_jobs=cpu_number, random_state=42)))
+         cv_results.append(model[0])
+         cv_mean_results.append(model[1])
+         cv_std_results.append(model[2])
+
+         predictions = dt_merge.iloc[:,:6]
+         predictions["predicted_values"] = list(model[3].toarray())
+         if detailed_output:
+             predictions.to_csv(r"../results/Ontology_based_function_prediction_{1}_{0}_predictions.tsv".format(representation_name,dt.split(".")[0]),sep="\t",index=None)
+
+     return (cv_results, cv_mean_results,cv_std_results)
+
+ #def pred_output(representation_name, desc_dim):
+ def pred_output():
+     model = ProtDescModel()
+     cv_result = model[0]
+     df_cv_result = pd.DataFrame({"Model": pd.Series([], dtype='str') ,"Accuracy": pd.Series([], dtype='float'),"F1_Micro": pd.Series([], dtype='float'),\
+         "F1_Macro": pd.Series([], dtype='float'),"F1_Weighted": pd.Series([], dtype='float'),"Precision_Micro": pd.Series([], dtype='float'),\
+         "Precision_Macro": pd.Series([], dtype='float'),"Precision_Weighted": pd.Series([], dtype='float'),"Recall_Micro": pd.Series([], dtype='float'),\
+         "Recall_Macro": pd.Series([], dtype='float'),"Recall_Weighted": pd.Series([], dtype='float'),"Hamming_Distance": pd.Series([], dtype='float')})
+     for i in cv_result:
+         df_cv_result.loc[len(df_cv_result)] = i
+     if detailed_output:
+         df_cv_result.to_csv(r"../results/Ontology_based_function_prediction_5cv_{0}.tsv".format(representation_name),sep="\t",index=None)
+
+     cv_mean_result = model[1]
+     df_cv_mean_result = pd.DataFrame({"Model": pd.Series([], dtype='str') ,"Accuracy": pd.Series([], dtype='float'),"F1_Micro": pd.Series([], dtype='float'),\
+         "F1_Macro": pd.Series([], dtype='float'),"F1_Weighted": pd.Series([], dtype='float'),"Precision_Micro": pd.Series([], dtype='float'),\
+         "Precision_Macro": pd.Series([], dtype='float'),"Precision_Weighted": pd.Series([], dtype='float'),"Recall_Micro": pd.Series([], dtype='float'),\
+         "Recall_Macro": pd.Series([], dtype='float'),"Recall_Weighted": pd.Series([], dtype='float'),"Hamming_Distance": pd.Series([], dtype='float')})
+
+     for j in cv_mean_result:
+         df_cv_mean_result.loc[len(df_cv_mean_result)] = j
+     df_cv_mean_result.to_csv(r"../results/Ontology_based_function_prediction_5cv_mean_{0}.tsv".format(representation_name),sep="\t",index=None)
+
+     #save std deviation of scores to file
+     cv_std_result = model[2]
+     df_cv_std_result = pd.DataFrame({"Model": pd.Series([], dtype='str') ,"Accuracy": pd.Series([], dtype='float'),"F1_Micro": pd.Series([], dtype='float'),\
+         "F1_Macro": pd.Series([], dtype='float'),"F1_Weighted": pd.Series([], dtype='float'),"Precision_Micro": pd.Series([], dtype='float'),\
+         "Precision_Macro": pd.Series([], dtype='float'),"Precision_Weighted": pd.Series([], dtype='float'),"Recall_Micro": pd.Series([], dtype='float'),\
+         "Recall_Macro": pd.Series([], dtype='float'),"Recall_Weighted": pd.Series([], dtype='float'),"Hamming_Distance": pd.Series([], dtype='float')})
+
+     for k in cv_std_result:
+         df_cv_std_result.loc[len(df_cv_std_result)] = k
+     df_cv_std_result.to_csv(r"../results/Ontology_based_function_prediction_5cv_std_{0}.tsv".format(representation_name),sep="\t",index=None)
+
+     print(datetime.now())
+
+
+ # tcga = pred_output("tcga","50")
+ # protvec = pred_output("protvec","100")
+ # unirep = pred_output("unirep","5700")
+ # gene2vec = pred_output("gene2vec","200")
+ # learned_embed = pred_output("learned_embed","64")
+ # mut2vec = pred_output("mut2vec","300")
+ # seqvec = pred_output("seqvec","1024")
+
+ #bepler = pred_output("bepler","100")
+ # resnet_rescaled = pred_output("resnet-rescaled","256")
+ # transformer_avg = pred_output("transformer","768")
+ # transformer_pool = pred_output("transformer-pool","768")
+
+ # apaac = pred_output("apaac","80")
+ #ksep = pred_output("ksep","400")
+
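[Editor's note] `create_valid_kfold_object_for_multilabel_splits` reshuffles the 5-fold split until every GO class has positive samples in at least two test folds. The core per-label check can be isolated as below — the `folds_with_positives` helper and the toy label column are illustrative, assuming scikit-learn's `KFold`:

```python
import numpy as np
from sklearn.model_selection import KFold

def folds_with_positives(X, y_col, kf):
    # Count the test folds that contain at least one positive sample of this label.
    return sum(int(y_col[test_idx].sum() > 0) for _, test_idx in kf.split(X))

X = np.zeros((10, 4))                             # 10 samples, 4 dummy features
y_col = np.array([1, 1, 0, 0, 0, 0, 0, 0, 0, 0])  # a label with only two positives
kf = KFold(n_splits=5, shuffle=True, random_state=0)
n = folds_with_positives(X, y_col, kf)            # 1 or 2, depending on the shuffle
```

With only two positives and five folds, the two positives land in either one or two test folds; whenever the count is below two, the code above in the diff draws a new random state and splits again.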
src/bin/probe_config.yaml ADDED
@@ -0,0 +1,37 @@
1
+ #Representation name (used for naming output files):
2
+ representation_name: AAC
3
+ #representation_name: LEARNED-VEC
4
+ #representation_name: T5
5
+
6
+ #Benchmark to run (should be one of "similarity", "family", "function", "affinity", or "all"):
7
+ # "similarity" for running protein semantic similarity inference benchmark
8
+ # "function" for running ontology-based function prediction benchmark
9
+ # "family" for running drug target protein family classification benchmark
10
+ # "affinity" for running protein-protein binding affinity estimation benchmark
11
+ # "all" for running all benchmarks
12
+ benchmark: all
13
+
14
+ #Path of the file containing representation vectors of UniProtKB/Swiss-Prot human proteins:
15
+ representation_file_human: ../data/representation_vectors/AAC_UNIPROT_HUMAN.csv
16
+ #representation_file_human: ../data/representation_vectors/LEARNED-VEC_UNIPROT_HUMAN.csv
17
+ #representation_file_human: ../data/representation_vectors/T5_UNIPROT_HUMAN.csv
18
+
19
+ #Path of the file containing representation vectors of samples in the SKEMPI dataset:
20
+ representation_file_affinity: ../data/representation_vectors/skempi_aac_representation_multi_col.csv
21
+ #representation_file_affinity: ../data/representation_vectors/skempi_learned-vec_representation_multi_col.csv
22
+ #representation_file_affinity: ../data/representation_vectors/skempi_t5_representation_multi_col.csv
23
+
24
+ #Semantic similarity inference benchmark dataset (should be a list that includes any combination of "Sparse", "200", and "500"):
25
+ similarity_tasks: ["Sparse","200","500"]
26
+
27
+ #Ontology-based function prediction benchmark dataset in terms of GO aspect (should be one of the following: "MF", "BP", "CC", or "All_Aspects"):
28
+ function_prediction_aspect: All_Aspects
29
+
30
+ #Ontology-based function prediction benchmark dataset in terms of size-based-splits (should be one of the following: "High", "Middle", "Low", or "All_Data_Sets")
31
+ function_prediction_dataset: All_Data_Sets
32
+
33
+ #Drug target protein family classification benchmark dataset in terms of similarity-based splits (should be a list that includes any combination of "nc", "uc50", "uc30", and "mm15")
34
+ family_prediction_dataset: ["nc","uc50","uc30","mm15"]
35
+
36
+ #Detailed results (can be True or False)
37
+ detailed_output: False
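The options in this config file are plain key/value settings; a script consuming it would typically load the YAML (e.g. with PyYAML's `yaml.safe_load`, which is assumed and not shown here) and sanity-check the values. A minimal, hypothetical validation helper for an already-loaded config dict, mirroring the allowed values documented in the comments above:

```python
# Allowed values taken from the comments in probe_config.yaml above.
VALID_BENCHMARKS = {"similarity", "family", "function", "affinity", "all"}
VALID_SIMILARITY_TASKS = {"Sparse", "200", "500"}

def validate_probe_config(cfg):
    # cfg: dict parsed from the YAML file (loading itself is assumed elsewhere).
    if cfg.get("benchmark") not in VALID_BENCHMARKS:
        raise ValueError("benchmark must be one of %s" % sorted(VALID_BENCHMARKS))
    bad = set(cfg.get("similarity_tasks", [])) - VALID_SIMILARITY_TASKS
    if bad:
        raise ValueError("unknown similarity tasks: %s" % sorted(bad))
    return cfg
```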
src/bin/semantic_similarity_infer.py ADDED
@@ -0,0 +1,160 @@
1
+ #!/usr/bin/env python
2
+ # coding: utf-8
3
+
4
+ import pandas as pd
5
+ import numpy as np
6
+ import gzip
7
+ import itertools
8
+ import multiprocessing
9
+ import csv
10
+ import pickle
11
+ import random
12
+ from sklearn.metrics.pairwise import cosine_similarity as cosine
13
+ from sklearn.metrics import mean_squared_error as mse
14
+ from tqdm import tqdm, tqdm_notebook
15
+ from multiprocessing import Manager, Pool
16
+ from scipy.spatial.distance import cdist
17
+ from numpy.linalg import norm
18
+ from scipy.stats import spearmanr, pearsonr
19
+ from functools import partial
20
+
21
+ manager = Manager()
22
+ similarity_list = manager.list()
23
+ proteinListNew = manager.list()
24
+
25
+ representation_dataframe = ""
26
+ protein_names = ""
27
+ # define similarity_list and proteinList as global variables
28
+ representation_name = ""
29
+ similarity_tasks = ""
30
+ detailed_output = False
31
+
32
+ def parallelSimilarity(paramList):
33
+ protein_embedding_dataframe = representation_dataframe
34
+ i = paramList[0]
35
+ j = paramList[1]
36
+ aspect = paramList[2]
37
+ if j>i:
38
+ protein1 = proteinListNew[i]
39
+ protein2 = proteinListNew[j]
40
+ if protein1 in protein_names and protein2 in protein_names:
41
+ prot1vec = np.asarray(protein_embedding_dataframe.query("Entry == @protein1")['Vector'].item())
42
+ prot2vec = np.asarray(protein_embedding_dataframe.query("Entry == @protein2")['Vector'].item())
43
+ # cosine_similarity returns a matrix shaped by the inputs' first dimensions; .item() extracts the scalar
44
+ cos = cosine(prot1vec.reshape(1,-1),prot2vec.reshape(1,-1)).item()
45
+ manhattanDist = cdist(prot1vec.reshape(1,-1), prot2vec.reshape(1,-1), 'cityblock')
46
+ manhattanDistNorm = manhattanDist/(norm(prot1vec,1) + norm(prot2vec,1))
47
+ manhattanSim = 1-manhattanDistNorm.item()
48
+ if (norm(prot1vec,1)==0 and norm(prot2vec,1) == 0):
49
+ manhattanSim = 1.0
50
+ #print((protein1,protein2))
51
+ #print(manhattanDist)
52
+ #print(norm(prot1vec,1))
53
+ #print(norm(prot2vec,1))
54
+ euclideanDist = cdist(prot1vec.reshape(1,-1), prot2vec.reshape(1,-1), 'euclidean')
55
+ euclideanDistNorm = euclideanDist/(norm(prot1vec,2) + norm(prot2vec,2))
56
+ euclidianSim = 1-euclideanDistNorm.item()
57
+ if (norm(prot1vec,1)==0 and norm(prot2vec,1) == 0):
58
+ euclidianSim = 1.0
59
+ real = paramList[3]
60
+ # Real and computed similarities are appended together as one tuple so they stay aligned; they are decoupled later
61
+ similarity_list.append((real,cos,manhattanSim ,euclidianSim))
62
+ return similarity_list
63
+
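For intuition, the three measures computed in `parallelSimilarity` are cosine similarity, a Manhattan distance normalized by the sum of the vectors' L1 norms, and a Euclidean distance normalized by the sum of their L2 norms (each distance turned into a similarity as `1 - normalized distance`). A plain-Python sketch of the same math on toy vectors, without the scipy/sklearn calls:

```python
import math

# Toy re-implementation of the similarity measures used above.
# Vectors here are invented for illustration only.
def cosine_sim(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def manhattan_sim(a, b):
    # 1 - cityblock distance normalized by the summed L1 norms.
    dist = sum(abs(x - y) for x, y in zip(a, b))
    denom = sum(abs(x) for x in a) + sum(abs(y) for y in b)
    return 1.0 if denom == 0 else 1 - dist / denom

def euclidean_sim(a, b):
    # 1 - euclidean distance normalized by the summed L2 norms.
    dist = math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    denom = math.sqrt(sum(x * x for x in a)) + math.sqrt(sum(y * y for y in b))
    return 1.0 if denom == 0 else 1 - dist / denom
```

Identical vectors score 1.0 under all three measures, matching the zero-norm special case handled in `parallelSimilarity`.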
64
+ def calculateCorrelationforOntology(aspect,matrix_type):
65
+ print("\n\nSemantic similarity correlation calculation for aspect: " + aspect + " using matrix/dataset: " + matrix_type + " ...\n")
66
+ #Clear lists before each aspect
67
+ similarity_list[:] = []
68
+ proteinListNew[:] = []
69
+
70
+ similarityMatrixNameDict = {}
71
+ similarityMatrixNameDict["All"] = "../data/preprocess/human_"+aspect+"_proteinSimilarityMatrix.csv"
72
+ similarityMatrixNameDict["500"] = "../data/preprocess/human_"+aspect+"_proteinSimilarityMatrix_for_highest_annotated_500_proteins.csv"
73
+ similarityMatrixNameDict["Sparse"] = "../data/preprocess/human_"+aspect+"_proteinSimilarityMatrix_for_highest_annotated_500_proteins.csv"
74
+ similarityMatrixNameDict["200"] = "../data/preprocess/human_"+aspect+"_proteinSimilarityMatrix_for_highest_annotated_200_proteins.csv"
75
+
76
+ similarityMatrixFileName = similarityMatrixNameDict[matrix_type]
77
+
78
+ human_proteinSimilarityMatrix = pd.read_csv(similarityMatrixFileName)
79
+ human_proteinSimilarityMatrix.set_index(human_proteinSimilarityMatrix.columns, inplace = True)
80
+ proteinList = human_proteinSimilarityMatrix.columns
81
+
82
+ # proteinListNew is a shared list created via multiprocessing.Manager
83
+ for prot in proteinList:
84
+ proteinListNew.append(prot)
85
+ if matrix_type == "Sparse":
86
+ #sparsified_similarities = np.load("SparsifiedSimilarites_for_highest_500.npy")
87
+ sparsified_similarity_coordinates = np.load("../data/auxilary_input/SparsifiedSimilarityCoordinates_"+aspect+"_for_highest_500.npy")
88
+ protParamList = sparsified_similarity_coordinates
89
+ else:
90
+ i = range(len(proteinList))
91
+ j = range(len(proteinList))
92
+ protParamList = list(itertools.product(i,j))
93
+ protParamListNew = []
94
+ # Prepare parameters for parallel processing; these parameters will be
95
+ # used concurrently by different processes
96
+ for tup in tqdm(protParamList):
97
+ i = tup[0]
98
+ j = tup[1]
99
+
100
+ if matrix_type == "Sparse":
101
+ protein1 = proteinListNew[i]
102
+ protein2 = proteinListNew[j]
103
+ real = human_proteinSimilarityMatrix.loc[protein1,protein2]
104
+ tupNew = (tup[0],tup[1],aspect,real)
105
+ protParamListNew.append(tupNew)
106
+ else:
107
+ if j > i:
108
+ protein1 = proteinListNew[i]
109
+ protein2 = proteinListNew[j]
110
+ real = human_proteinSimilarityMatrix.loc[protein1,protein2]
111
+ tupNew = (tup[0],tup[1],aspect,real)
112
+ protParamListNew.append(tupNew)
113
+
114
+ total_task_num=len(protParamListNew)
115
+ pool = Pool()
116
+ similarity_listRet = []
117
+ #parallelSimilarityPartial = partial(parallelSimilarity,protein_embedding_type)
118
+ for similarity_listRet in tqdm(pool.imap_unordered(parallelSimilarity,protParamListNew), total=total_task_num , position=0, leave=True ):
119
+ pass
120
+ #time.sleep(0.1)
121
+ pool.close()
122
+ pool.join()
+ # Collect results from the shared Manager list; the loop variable above can
+ # end up as None for pairs that parallelSimilarity skipped (j <= i or missing proteins).
+ similarity_listRet = list(similarity_list)
123
+
124
+ real_distance_list = [value[0] for value in similarity_listRet]
125
+ cosine_distance_list = [value[1] for value in similarity_listRet]
126
+ manhattan_distance_list = [value[2] for value in similarity_listRet]
127
+ euclidian_distance_list = [value[3] for value in similarity_listRet]
128
+
129
+ distance_lists = [real_distance_list,cosine_distance_list,manhattan_distance_list,euclidian_distance_list]
130
+ if detailed_output:
131
+ report_detailed_distance_scores(representation_name,matrix_type,aspect,distance_lists)
132
+
133
+ cosineCorr = spearmanr(real_distance_list, cosine_distance_list)
134
+ manhattanCorr = spearmanr(real_distance_list, manhattan_distance_list)
135
+ euclidianCorr = spearmanr(real_distance_list, euclidian_distance_list)
136
+
137
+ #print("Cosine Correlation for "+aspect+" is " + str(cosineCorr))
138
+ #print("Manhattan Correlation for "+aspect+" is " + str(manhattanCorr))
139
+ #print("Euclidian Correlation for "+aspect+" is " + str(euclidianCorr))
140
+
141
+ return (cosineCorr,manhattanCorr,euclidianCorr)
142
+
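`calculateCorrelationforOntology` scores each similarity measure with `scipy.stats.spearmanr`, i.e. the Pearson correlation of the rank-transformed values. A minimal, tie-free sketch of that computation (an illustration only; the real code keeps using scipy, which also returns the p-value):

```python
def spearman_no_ties(xs, ys):
    # Spearman rho for tie-free data via the classic 1 - 6*sum(d^2)/(n(n^2-1)) formula.
    def ranks(v):
        order = sorted(range(len(v)), key=lambda i: v[i])
        r = [0] * len(v)
        for rank, i in enumerate(order, start=1):
            r[i] = rank
        return r

    rx, ry = ranks(xs), ranks(ys)
    n = len(xs)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))
```

A perfectly monotone relationship gives rho = 1.0 and a perfectly reversed one gives -1.0, which is why rank correlation is a natural fit for comparing embedding similarities against ontology-based similarity scores.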
143
+ def report_detailed_distance_scores(representation_name,similarity_matrix_type,aspect,distance_lists):
144
+ saveFileName = "../results/Semantic_sim_inference_detailed_distance_scores"+aspect+"_"+similarity_matrix_type+"_"+representation_name+".pkl"
145
+ with open(saveFileName, "wb") as f:
146
+ pickle.dump(distance_lists, f)
147
+
148
+ def calculate_all_correlations():
149
+ for similarity_matrix_type in similarity_tasks:
150
+ saveFileName = "../results/Semantic_sim_inference_"+similarity_matrix_type+"_"+representation_name+".csv"
151
+ buffer = "Semantic Aspect,CosineSim_Correlation,CosineSim_Correlation p-value,ManhattanSim_Correlation,ManhattanSim_Correlation p-value,EuclidianSim_Correlation,EuclidianSim_Correlation p-value\n"
152
+ f = open(saveFileName,'w')
153
+ f.write(buffer)
+ f.close()  # flush the header before the per-aspect appends below
154
+ for aspect in ["MF","BP","CC"]:
155
+ corr = calculateCorrelationforOntology(aspect,similarity_matrix_type)
156
+ buffer = "" + aspect + ","+ str(round(corr[0][0],5))+ ","+ str(round(corr[0][1],5))+ ","+ str(round(corr[1][0],5))\
157
+ + ","+ str(round(corr[1][1],5))+ ","+ str(round(corr[2][0],5))+ ","+str(round(corr[2][1],5))+"\n"
158
+ f = open(saveFileName,'a')
159
+ f.write(buffer)
160
+ f.close()
src/bin/target_family_classifier.py ADDED
@@ -0,0 +1,226 @@
1
+ # -*- coding: utf-8 -*-
2
+ """
3
+ Created on Mon Jun 8 09:32:26 2020
4
+
5
+ @author: Muammer
6
+ """
7
+
8
+ import numpy as np
9
+ from sklearn.model_selection import cross_validate
10
+ from sklearn.model_selection import cross_val_predict
11
+ from sklearn.metrics import matthews_corrcoef
12
+ from sklearn.metrics import classification_report
13
+ from sklearn.multiclass import OneVsRestClassifier
14
+ from sklearn import linear_model
15
+ from sklearn.metrics import f1_score
16
+ from sklearn.metrics import confusion_matrix
17
+ from sklearn.model_selection import train_test_split
18
+ import pandas as pd
19
+ from numpy import save
20
+ from sklearn.metrics import precision_recall_fscore_support
21
+ from tqdm import tqdm
22
+ from sklearn.metrics import accuracy_score
23
+ import math
24
+
25
+
26
+ representation_name = ""
27
+ representation_path = ""
28
+ dataset = "nc"
29
+ detailed_output = False
30
+
31
+ def convert_dataframe_to_multi_col(representation_dataframe):
32
+ entry = pd.DataFrame(representation_dataframe['Entry'])
33
+ vector = pd.DataFrame(list(representation_dataframe['Vector']))
34
+ multi_col_representation_vector = pd.merge(left=entry,right=vector,left_index=True, right_index=True)
35
+ return multi_col_representation_vector
36
+
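`convert_dataframe_to_multi_col` turns a two-column frame (one `Entry` per row, with the whole embedding as a list in a single `Vector` cell) into a wide frame with one column per vector dimension. A small self-contained demonstration on made-up data (the entries and values here are invented):

```python
import pandas as pd

# Tiny fabricated representation dataframe in the single-'Vector'-column layout.
df = pd.DataFrame({
    "Entry": ["P12345", "Q67890"],
    "Vector": [[0.1, 0.2, 0.3], [0.4, 0.5, 0.6]],
})

def to_multi_col(representation_dataframe):
    # Same logic as convert_dataframe_to_multi_col above: expand the lists
    # into columns 0..n-1 and merge back on the row index.
    entry = pd.DataFrame(representation_dataframe["Entry"])
    vector = pd.DataFrame(list(representation_dataframe["Vector"]))
    return pd.merge(left=entry, right=vector, left_index=True, right_index=True)

wide = to_multi_col(df)  # columns: Entry, 0, 1, 2
```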
37
+ def class_based_scores(c_report, c_matrix):
38
+ c_report = pd.DataFrame(c_report).transpose()
39
+ #print(c_report)
40
+ c_report = c_report.drop(['precision', 'recall'], axis=1)
41
+ c_report = c_report.drop(labels=['accuracy', 'macro avg', 'weighted avg'], axis=0)
42
+ cm = c_matrix.astype('float') / c_matrix.sum(axis=1)[:, np.newaxis]
43
+ #print(c_report)
44
+ accuracy = cm.diagonal()
45
+
46
+ #print(accuracy)
47
+ #if len(accuracy) == 6:
48
+ # accuracy = np.delete(accuracy, 5)
49
+
50
+ accuracy = pd.Series(accuracy, index=c_report.index)
51
+ c_report['accuracy'] = accuracy
52
+
53
+ total = c_report['support'].sum()
54
+ #print(total)
55
+ num_classes = np.shape(c_matrix)[0]
56
+ mcc = np.zeros(shape=(num_classes,), dtype='float32')
57
+ weights = np.sum(c_matrix, axis=0)/np.sum(c_matrix)
58
+ total_tp = 0
59
+ total_fp = 0
60
+ total_fn = 0
61
+ total_tn = 0
62
+
63
+ for j in range(num_classes):
64
+ tp = np.sum(c_matrix[j, j])
65
+ fp = np.sum(c_matrix[j, np.concatenate((np.arange(0, j), np.arange(j+1, num_classes)))])
66
+ fn = np.sum(c_matrix[np.concatenate((np.arange(0, j), np.arange(j+1, num_classes))), j])
67
+ tn = int(total - tp - fp - fn)
68
+ total_tp = total_tp + tp
69
+ total_fp = total_fp + fp
70
+ total_fn = total_fn + fn
71
+ total_tn = total_tn + tn
72
+ #print(tp,fp,fn,tn)
73
+ denom = math.sqrt((tp+fp)*(tp+fn)*(tn+fp)*(tn+fn))
+ mcc[j] = 0.0 if denom == 0 else ((tp*tn)-(fp*fn))/denom  # MCC is 0 by convention when the denominator vanishes
74
+ #print(mcc)
75
+ #if len(mcc) == 6:
76
+ # mcc = np.delete(mcc, 5)
77
+
78
+ mcc = pd.Series(mcc, index=c_report.index)
79
+ c_report['mcc'] = mcc
80
+ #c_report.to_excel('../results/resultss_class_based_'+dataset+'.xlsx')
81
+ #print(c_report)
82
+ return c_report, total_tp, total_fp, total_fn, total_tn
83
+
84
+
85
+
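The per-class loop in `class_based_scores` derives tp/fp/fn/tn for each class from the multi-class confusion matrix and plugs them into the MCC formula. The same bookkeeping, condensed into a standalone function on a plain nested list (toy data only; the committed code works on the numpy matrix from `confusion_matrix`):

```python
import math

def per_class_mcc(c_matrix):
    # c_matrix[i][j]: count of class-i samples predicted as class j.
    total = sum(sum(row) for row in c_matrix)
    n = len(c_matrix)
    out = []
    for j in range(n):
        tp = c_matrix[j][j]
        fp = sum(c_matrix[j][k] for k in range(n) if k != j)
        fn = sum(c_matrix[k][j] for k in range(n) if k != j)
        tn = total - tp - fp - fn
        denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
        # MCC is symmetric under swapping fp and fn, so the row/column
        # convention does not change the result; 0 when denom vanishes.
        out.append(0.0 if denom == 0 else (tp * tn - fp * fn) / denom)
    return out
```

A diagonal confusion matrix yields an MCC of 1.0 for every class, while a uniform matrix (predictions independent of the truth) yields 0.0.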
86
+ def score_protein_rep(dataset):
87
+ #def score_protein_rep(pkl_data_path):
88
+
89
+ vecsize = 0
90
+ #protein_list = pd.read_csv('../data/auxilary_input/entry_class.csv')
91
+ protein_list = pd.read_csv('../data/preprocess/entry_class_nn.csv')
92
+ dataframe = pd.read_csv(representation_path)
93
+ #dataframe = convert_dataframe_to_multi_col(dataframe)
94
+ #dataframe = pd.read_pickle(pkl_data_path)
95
+ vecsize = dataframe.shape[1]-1
96
+ x = np.empty([0, vecsize])
97
+ xemp = np.zeros((1, vecsize), dtype=float)
98
+ y = []
99
+ ne = []
100
+
101
+ print("\n\nPreprocessing data for drug-target protein family prediction...\n ")
102
+ for index, row in tqdm(protein_list.iterrows(), total=len(protein_list)):
103
+ pdrow = dataframe.loc[dataframe['Entry'] == row['Entry']]
104
+ if len(pdrow) != 0:
105
+ a = pdrow.loc[ : , pdrow.columns != 'Entry']
106
+ a = np.array(a)
107
+ a.shape = (1,vecsize)
108
+ x = np.append(x, a, axis=0)
109
+ y.append(row['Class'])
110
+ else:
111
+ ne.append(index)
112
+ x = np.append(x, xemp, axis=0,)
113
+ y.append(0.0)
114
+ #print(index)
115
+
116
+ x = x.astype(np.float64)
117
+ y = np.array(y)
118
+ y = y.astype(np.float64)
119
+ #print(len(y))
120
+ scoring = ['precision_weighted', 'recall_weighted', 'f1_weighted', 'accuracy']
121
+ target_names = ['Enzyme', 'Membrane receptor', 'Transcription factor', 'Ion channel', 'Other']
122
+ labels = [1.0, 11.0, 12.0, 1005.0, 2000.0]
123
+
124
+ f1 = []
125
+ accuracy = []
126
+ mcc = []
127
+ f1_perclass = []
128
+ ac_perclass = []
129
+ mcc_perclass = []
130
+ sup_perclass = []
131
+ report_list = []
132
+ train_index = pd.read_csv('../data/preprocess/indexes/'+dataset+'_trainindex.csv')
133
+ test_index = pd.read_csv('../data/preprocess/indexes/testindex_family.csv')
134
+ train_index = train_index.dropna(axis=1)
135
+ test_index = test_index.dropna(axis=1)
136
+ #print(train_index)
137
+ #for index in ne:
138
+
139
+
140
+ conf = pd.DataFrame()
141
+
142
+ print('Producing protein family predictions...\n')
143
+ for i in tqdm(range(10)):
144
+ clf = linear_model.SGDClassifier(class_weight="balanced", loss="log", penalty="elasticnet", max_iter=1000, tol=1e-3,random_state=i,n_jobs=-1)
145
+ clf2 = OneVsRestClassifier(clf,n_jobs=-1)
146
+ #print(test_index)
147
+ train_indexx = train_index.iloc[i].astype(int)
148
+ test_indexx = test_index.iloc[i].astype(int)
149
+ #print(train_indexx)
150
+ #train_indexx.drop(labels=ne)
151
+ #print(type(train_indexx))
152
+ for index in ne:
153
+
154
+ train_indexx = train_indexx[train_indexx!=index]
155
+ test_indexx = test_indexx[test_indexx!=index]
156
+
157
+
158
+
159
+ train_X, test_X = x[train_indexx], x[test_indexx]
160
+ train_y, test_y = y[train_indexx], y[test_indexx]
161
+
162
+ clf2.fit(train_X, train_y)
163
+
164
+ #print(train_X)
165
+ y_pred = clf2.predict(test_X)
166
+
167
+ #y_pred = cross_val_predict(clf2, x, y, cv=10, n_jobs=-1)
168
+ #mcc.append(matthews_corrcoef(test_y, y_pred, sample_weight = test_y))
169
+ f1_ = f1_score(test_y, y_pred, average='weighted')
170
+ f1.append(f1_)
171
+ ac = accuracy_score(test_y, y_pred)
172
+ accuracy.append(ac)
173
+ c_report = classification_report(test_y, y_pred, target_names=target_names, output_dict=True)
174
+ c_matrix = confusion_matrix(test_y, y_pred, labels=labels)
175
+
176
+ conf = pd.concat([conf, pd.DataFrame(c_matrix, columns=['Enzymes', 'Membrane receptor', 'Transcription factor', 'Ion channel', 'Other'])], ignore_index=True)  # DataFrame.append was removed in pandas 2.0
177
+ class_report, tp, fp, fn, tn = class_based_scores(c_report, c_matrix)
178
+
179
+ #print(total_tp)
180
+ mcc.append(((tp*tn)-(fp*fn))/math.sqrt((tp+fp)*(tp+fn)*(tn+fp)*(tn+fn)))
181
+
182
+
183
+ f1_perclass.append(class_report['f1-score'])
184
+ ac_perclass.append(class_report['accuracy'])
185
+ mcc_perclass.append(class_report['mcc'])
186
+ sup_perclass.append(class_report['support'])
187
+ report_list.append(class_report)
188
+
189
+ if detailed_output:
190
+ conf.to_csv('../results/Drug_target_protein_family_classification_confusion_'+dataset+'_'+representation_name+'.csv', index=None)
191
+
192
+ f1_perclass = pd.concat(f1_perclass, axis=1)
193
+ ac_perclass = pd.concat(ac_perclass, axis=1)
194
+ mcc_perclass = pd.concat(mcc_perclass, axis=1)
195
+ sup_perclass = pd.concat(sup_perclass, axis=1)
196
+
197
+ report_list = pd.concat(report_list, axis=1)
198
+ report_list.to_csv('../results/Drug_target_protein_family_classification_class_based_results_'+dataset+'_'+representation_name+'.csv')
199
+
200
+ report = pd.DataFrame()
201
+ f1mean = np.mean(f1, axis=0)
202
+ #print(f1mean)
203
+ f1mean = f1mean.round(decimals=5)
204
+ f1std = np.std(f1).round(decimals=5)
205
+ acmean = np.mean(accuracy, axis=0).round(decimals=5)
206
+ acstd = np.std(accuracy).round(decimals=5)
207
+ mccmean = np.mean(mcc, axis=0).round(decimals=5)
208
+ mccstd = np.std(mcc).round(decimals=5)
209
+ labels = ['Average Score', 'Standard Deviation']
210
+ report['Protein Family'] = labels
211
+ report['F1_score'] = [f1mean, f1std]
212
+ report['Accuracy'] = [acmean, acstd]
213
+ report['MCC'] = [mccmean, mccstd]
214
+
215
+ report.to_csv('../results/Drug_target_protein_family_classification_mean_results_'+dataset+'_'+representation_name+'.csv',index=False)
216
+ #report.to_csv('scores_general.csv')
217
+ #print(report)
218
+ if detailed_output:
219
+ save('../results/Drug_target_protein_family_classification_f1_'+dataset+'_'+representation_name+'.npy', f1)
220
+ save('../results/Drug_target_protein_family_classification_accuracy_'+dataset+'_'+representation_name+'.npy', accuracy)
221
+ save('../results/Drug_target_protein_family_classification_mcc_'+dataset+'_'+representation_name+'.npy', mcc)
222
+ save('../results/Drug_target_protein_family_classification_class_based_f1_'+dataset+'_'+representation_name+'.npy', f1_perclass)
223
+ save('../results/Drug_target_protein_family_classification_class_based_accuracy_'+dataset+'_'+representation_name+'.npy', ac_perclass)
224
+ save('../results/Drug_target_protein_family_classification_class_based_mcc_'+dataset+'_'+representation_name+'.npy', mcc_perclass)
225
+ save('../results/Drug_target_protein_family_classification_class_based_support_'+dataset+'_'+representation_name+'.npy', sup_perclass)
226
+