Al John Lexter Lozano committed on
Commit
ead2dcb
1 Parent(s): d945311

Initial commit: make model an importable module and add a simple Gradio interface

Files changed (13)
  1. LICENSE +21 -0
  2. README.md +2 -0
  3. README.rst +47 -0
  4. app.py +46 -0
  5. bad.txt +5 -0
  6. big.txt +0 -0
  7. demo_bad.txt +5 -0
  8. demo_blank.csv +5 -0
  9. gib_detect.py +12 -0
  10. gib_detect_module.py +11 -0
  11. gib_detect_train.py +75 -0
  12. gib_model.pki +764 -0
  13. good.txt +6 -0
LICENSE ADDED
@@ -0,0 +1,21 @@
+ The MIT License (MIT)
+
+ Copyright (c) 2015 Rob Renaud
+
+ Permission is hereby granted, free of charge, to any person obtaining a copy
+ of this software and associated documentation files (the "Software"), to deal
+ in the Software without restriction, including without limitation the rights
+ to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+ copies of the Software, and to permit persons to whom the Software is
+ furnished to do so, subject to the following conditions:
+
+ The above copyright notice and this permission notice shall be included in
+ all copies or substantial portions of the Software.
+
+ THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+ IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+ FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+ AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+ LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+ OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN
+ THE SOFTWARE.
README.md CHANGED
@@ -10,3 +10,5 @@ pinned: false
  ---
 
  Check out the configuration reference at https://huggingface.co/docs/hub/spaces#reference
+
+ A gibberish detection program based on https://github.com/rrenaud/Gibberish-Detector deployed in Gradio.
README.rst ADDED
@@ -0,0 +1,47 @@
+ Overview
+ ========
+
+ A sample program I wrote to detect gibberish. It uses a two-character Markov chain.
+
+ http://en.wikipedia.org/wiki/Markov_chain
+
+ This is a nice (IMO) answer to this guy's question on Stack Overflow:
+ http://stackoverflow.com/questions/6297991/is-there-any-way-to-detect-strings-like-putjbtghguhjjjanika/6298040#comment-7360747
+
+ Usage
+ =====
+
+ First train the model:
+
+ python gib_detect_train.py
+
+ Then try it on some sample input:
+
+ python gib_detect.py
+
+ my name is rob and i like to hack True
+
+ is this thing working? True
+
+ i hope so True
+
+ t2 chhsdfitoixcv False
+
+ ytjkacvzw False
+
+ yutthasxcvqer False
+
+ seems okay True
+
+ yay! True
+
+ How it works
+ ============
+ The Markov chain first 'trains' or 'studies' a few MB of English text, recording how often characters appear next to each other. E.g., given the text "Rob likes hacking", it sees Ro, ob, b[space], [space]l, ... and simply counts these pairs. After it has finished reading through the training data, it normalizes the counts. Each character then has a probability distribution over 27 follow-up characters (26 letters + space) given the initial character.
+
+ So, given a string, it measures the probability of generating that string under the model by multiplying out the probabilities of the adjacent pairs of characters in it. E.g., for the string "Rob likes hacking" it computes prob['r']['o'] * prob['o']['b'] * prob['b'][' '] ... This probability measures the amount of 'surprise' the model assigns to the string, given the data it observed during training. If there is funny business in the input string, it will pass through pairs that had very low counts in the training phase, and hence have low probability/high surprise.
+
+ I then look at the amount of surprise per character for a few known good strings and a few known bad strings, and pick a threshold between the most surprising good string and the least surprising bad string. That threshold is then used to classify any new piece of text.
+
+ Peter Norvig, the director of Research at Google, gives a nice talk on "The unreasonable effectiveness of data" here: http://www.youtube.com/watch?v=9vR8Vddf7-s. The insight is really not to try to do something complicated: just write a small program that utilizes a bunch of data, and you can do cool things.
+
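The scheme described in "How it works" can be sketched as a small self-contained Python program. This is a toy illustration, not the repo's code: the two-line corpus below stands in for big.txt, and all names are illustrative.

```python
import math

ACCEPTED = 'abcdefghijklmnopqrstuvwxyz '
POS = {c: i for i, c in enumerate(ACCEPTED)}

def bigrams(text):
    # Keep only accepted characters, then yield adjacent pairs.
    filtered = [c.lower() for c in text if c.lower() in ACCEPTED]
    return zip(filtered, filtered[1:])

def train(corpus):
    # Start every pair count at 10 as a smoothing prior (as in
    # gib_detect_train.py), then convert each row to log probabilities.
    k = len(ACCEPTED)
    counts = [[10.0] * k for _ in range(k)]
    for line in corpus:
        for a, b in bigrams(line):
            counts[POS[a]][POS[b]] += 1
    for row in counts:
        s = sum(row)
        for j in range(k):
            row[j] = math.log(row[j] / s)
    return counts

def avg_transition_prob(text, mat):
    # exp of the mean log prob = geometric-mean transition probability.
    log_prob, n = 0.0, 0
    for a, b in bigrams(text):
        log_prob += mat[POS[a]][POS[b]]
        n += 1
    return math.exp(log_prob / (n or 1))

mat = train(["rob likes hacking",
             "the quick brown fox jumps over the lazy dog"] * 50)
print(avg_transition_prob("the dog", mat) > avg_transition_prob("zxqwkv", mat))  # True
```

English-looking input scores a higher geometric-mean transition probability than "zxqwkv", whose pairs only ever hit the smoothing prior.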
app.py ADDED
@@ -0,0 +1,46 @@
+ from fastapi import File
+ import gradio as gr
+ from gib_detect_module import detect
+ import csv
+
+ def greet(name):
+     return "Hello " + name + "!!"
+
+ def detect_gibberish(line, f):
+     if line:
+         if detect(line):
+             return "Valid!!!!", None
+         else:
+             return "Bollocks Gibberish", None
+     elif f:
+         return None, annotate_csv(f)
+
+ def annotate_csv(f):
+     with open(f.name) as csvfile:
+         creader = csv.reader(csvfile, delimiter=',', quotechar='"')
+
+         with open('out.csv', 'w', newline='') as csvout:
+             cwriter = csv.writer(csvout, delimiter=',',
+                                  quotechar='"', quoting=csv.QUOTE_MINIMAL)
+             for row in creader:
+                 print(row)
+                 row.append(str(detect(row[0])))
+                 cwriter.writerow(row)
+
+     return "out.csv"
+
+ inputFile = gr.inputs.File(file_count="single", type="file", label="File to Annotate", optional=True)
+ outputFile = gr.outputs.File(label="Annotated CSV")
+
+ examples = [
+     ["quetzalcoatl", "demo_blank.csv"],
+     ["Shinkansen", "demo_blank.csv"],
+     ["aasdf", "demo_blank.csv"],
+     ["Covfefe", "demo_blank.csv"]
+ ]
+
+ iface = gr.Interface(fn=detect_gibberish, inputs=["text", inputFile], outputs=["text", outputFile], examples=examples, allow_flagging='never')
+
+ iface.launch()
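The annotate_csv flow in app.py can be exercised without Gradio or the trained model. Here fake_detect is a hypothetical stand-in for gib_detect_module.detect (a crude vowel-ratio rule, purely for illustration), and the CSV round-trip uses an in-memory buffer instead of files:

```python
import csv
import io

def fake_detect(text):
    # Hypothetical stand-in for gib_detect_module.detect: treat text as
    # "valid" when vowels make up more than a quarter of its characters.
    vowels = sum(c in 'aeiou' for c in text.lower())
    return vowels * 4 > len(text)

def annotate(rows, detect=fake_detect):
    # Mirrors annotate_csv: append the detector's verdict to each row.
    out = io.StringIO()
    cwriter = csv.writer(out, delimiter=',', quotechar='"',
                         quoting=csv.QUOTE_MINIMAL)
    for row in rows:
        cwriter.writerow(row + [str(detect(row[0]))])
    return out.getvalue()

print(annotate([["hello world"], ["zxcvwerjasc"]]))
```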
bad.txt ADDED
@@ -0,0 +1,5 @@
+ zxcvwerjasc
+ nmnjcviburili,<>
+ zxcvnadtruqe
+ ertrjiloifdfyyoiu
+ grty iuewdiivjh
big.txt ADDED
The diff for this file is too large to render. See raw diff
 
demo_bad.txt ADDED
@@ -0,0 +1,5 @@
+ "zxcvwerjasc"
+ "nmnjcviburili,<>"
+ "zxcvnadtruqe"
+ "ertrjiloifdfyyoiu"
+ "grty iuewdiivjh"
demo_blank.csv ADDED
@@ -0,0 +1,5 @@
+ "zxcvwerjasc"
+ "nmnjcviburili,<>"
+ "zxcvnadtruqe"
+ "ertrjiloifdfyyoiu"
+ "grty iuewdiivjh"
gib_detect.py ADDED
@@ -0,0 +1,12 @@
+ #!/usr/bin/python
+
+ import pickle
+ import gib_detect_train
+
+ model_data = pickle.load(open('gib_model.pki', 'rb'))
+
+ while True:
+     l = input()
+     model_mat = model_data['mat']
+     threshold = model_data['thresh']
+     print(gib_detect_train.avg_transition_prob(l, model_mat) > threshold)
gib_detect_module.py ADDED
@@ -0,0 +1,11 @@
+ #!/usr/bin/python
+
+ import pickle
+ import gib_detect_train
+
+ model_data = pickle.load(open('gib_model.pki', 'rb'))
+
+ def detect(text):
+     model_mat = model_data['mat']
+     threshold = model_data['thresh']
+     return gib_detect_train.avg_transition_prob(text, model_mat) > threshold
gib_detect_train.py ADDED
@@ -0,0 +1,75 @@
+ #!/usr/bin/python
+
+ import math
+ import pickle
+
+ accepted_chars = 'abcdefghijklmnopqrstuvwxyz '
+
+ pos = dict([(char, idx) for idx, char in enumerate(accepted_chars)])
+
+ def normalize(line):
+     """ Return only the subset of chars from accepted_chars.
+     This helps keep the model relatively small by ignoring punctuation,
+     infrequent symbols, etc. """
+     return [c.lower() for c in line if c.lower() in accepted_chars]
+
+ def ngram(n, l):
+     """ Return all n grams from l after normalizing """
+     filtered = normalize(l)
+     for start in range(0, len(filtered) - n + 1):
+         yield ''.join(filtered[start:start + n])
+
+ def train():
+     """ Write a simple model as a pickle file """
+     k = len(accepted_chars)
+     # Assume we have seen 10 of each character pair. This acts as a kind of
+     # prior or smoothing factor. This way, if we see a character transition
+     # live that we've never observed in the past, we won't assume the entire
+     # string has 0 probability.
+     counts = [[10 for i in range(k)] for i in range(k)]
+
+     # Count transitions from big text file, taken
+     # from http://norvig.com/spell-correct.html
+     for line in open('big.txt'):
+         for a, b in ngram(2, line):
+             counts[pos[a]][pos[b]] += 1
+
+     # Normalize the counts so that they become log probabilities.
+     # We use log probabilities rather than straight probabilities to avoid
+     # numeric underflow issues with long texts.
+     # This contains a justification:
+     # http://squarecog.wordpress.com/2009/01/10/dealing-with-underflow-in-joint-probability-calculations/
+     for i, row in enumerate(counts):
+         s = float(sum(row))
+         for j in range(len(row)):
+             row[j] = math.log(row[j] / s)
+
+     # Find the probability of generating a few arbitrarily chosen good and
+     # bad phrases.
+     good_probs = [avg_transition_prob(l, counts) for l in open('good.txt')]
+     bad_probs = [avg_transition_prob(l, counts) for l in open('bad.txt')]
+
+     # Assert that we actually are capable of detecting the junk.
+     assert min(good_probs) > max(bad_probs)
+
+     # And pick a threshold halfway between the worst good and best bad inputs.
+     thresh = (min(good_probs) + max(bad_probs)) / 2
+     pickle.dump({'mat': counts, 'thresh': thresh}, open('gib_model.pki', 'wb'))
+
+ def avg_transition_prob(l, log_prob_mat):
+     """ Return the average transition prob from l through log_prob_mat. """
+     log_prob = 0.0
+     transition_ct = 0
+     for a, b in ngram(2, l):
+         log_prob += log_prob_mat[pos[a]][pos[b]]
+         transition_ct += 1
+     # The exponentiation translates from log probs to probs.
+     return math.exp(log_prob / (transition_ct or 1))
+
+ if __name__ == '__main__':
+     train()
+
+
+
+
+
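The threshold-picking step at the end of train() can be illustrated in isolation. The scores below are made up for the example; the real values come from avg_transition_prob over the lines of good.txt and bad.txt:

```python
# Hypothetical per-string scores; real ones come from avg_transition_prob.
good_probs = [0.062, 0.071, 0.055]
bad_probs = [0.011, 0.009, 0.014]

# The model is only usable if every good string outscores every bad one.
assert min(good_probs) > max(bad_probs)

# Pick the threshold halfway between the worst good and the best bad score.
thresh = (min(good_probs) + max(bad_probs)) / 2
print(round(thresh, 4))  # 0.0345
```

Any new string whose average transition probability falls below thresh is flagged as gibberish.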
gib_model.pki ADDED
The diff for this file is too large to render usefully: it is a protocol-0 pickle of a dict with 'thresh' = 0.018782003473122023 and 'mat', a 27x27 matrix of log transition probabilities.
good.txt ADDED
@@ -0,0 +1,6 @@
+ rob
+ two models
+ some long sentence, might suck?
+ Project Gutenberg
+ a b c
+